Installation

Get ReasonKit’s five ThinkTools for structured AI reasoning:

| Tool | Purpose | Use When |
|------|---------|----------|
| GigaThink | Expansive thinking, 10+ perspectives | Need creative solutions, brainstorming |
| LaserLogic | Precision reasoning, fallacy detection | Validating arguments, logical analysis |
| BedRock | First-principles decomposition | Foundational decisions, axiom building |
| ProofGuard | Multi-source verification | Fact-checking, claim validation |
| BrutalHonesty | Adversarial self-critique | Reality checks, finding flaws |

Quick Install

Linux / macOS

curl -fsSL https://get.reasonkit.sh | bash

Windows (PowerShell)

irm https://get.reasonkit.sh/windows | iex

Prerequisites

  • Git (for building from source)
  • Rust 1.70+ (auto-installed if missing)
  • An LLM API key (Anthropic, OpenAI, OpenRouter, or local Ollama)
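
You can check the first two prerequisites from a shell before running the installer (a quick sketch; `git` and `cargo` are the standard binary names):

```shell
# Report whether each build prerequisite is already on PATH
for tool in git cargo; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool found: $("$tool" --version | head -n1)"
  else
    echo "$tool missing (the installer can set up Rust for you)"
  fi
done
```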

Installation Methods

The installer auto-detects your OS and architecture:

# Linux/macOS
curl -fsSL https://get.reasonkit.sh | bash

# Windows PowerShell
irm https://get.reasonkit.sh/windows | iex

This will:

  1. Install Rust if not present
  2. Clone and build ReasonKit
  3. Add rk-core to your PATH
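
If `rk-core` is not found after installation, you can add its directory to `PATH` yourself. The install location below is an assumption (`~/.local/bin` is a common choice); adjust it to wherever the installer placed the binary:

```shell
# Prepend the assumed install directory to PATH for this session
export PATH="$HOME/.local/bin:$PATH"

# Persist it for future sessions (bash shown; use ~/.zshrc for zsh)
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc
```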

Cargo

For Rust developers:

cargo install reasonkit-core

From Source

For development or customization:

git clone https://github.com/reasonkit/reasonkit-core
cd reasonkit-core
cargo build --release
./target/release/rk-core --help

Verify Installation

rk-core --version
# reasonkit-core 0.1.0

rk-core --help

LLM Provider Setup

ReasonKit requires an LLM provider. Choose one:

Anthropic

Best quality reasoning:

export ANTHROPIC_API_KEY="sk-ant-..."

OpenAI

export OPENAI_API_KEY="sk-..."

OpenRouter (300+ Models)

Access to many models through one API:

export OPENROUTER_API_KEY="sk-or-..."

# Specify a model
rk-core think "question" --model anthropic/claude-3-opus

Google Gemini

export GOOGLE_API_KEY="..."

Groq (Fast Inference)

export GROQ_API_KEY="..."

Local Models (Ollama)

For privacy-sensitive use cases:

ollama serve
rk-core think "question" --provider ollama --model llama3
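
Before pointing ReasonKit at Ollama, you can confirm the server is up. Ollama listens on port 11434 by default and exposes an `/api/tags` endpoint that lists pulled models:

```shell
# Probe the default Ollama endpoint; prints one status line either way
curl -s http://localhost:11434/api/tags >/dev/null 2>&1 \
  && echo "Ollama is running" \
  || echo "Ollama is not reachable on localhost:11434"
```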

Quick Test

Try each ThinkTool:

# GigaThink - Get 10+ perspectives
rk-core think "Should I start a business?" --tool gigathink

# LaserLogic - Check reasoning
rk-core think "This investment guarantees 50% returns" --tool laserlogic

# BedRock - Find first principles
rk-core think "What makes a good leader?" --tool bedrock

# ProofGuard - Verify claims
rk-core think "Coffee causes cancer" --tool proofguard

# BrutalHonesty - Reality check
rk-core think "My startup idea is perfect" --tool brutalhonesty

Configuration File

Create ~/.config/reasonkit/config.toml:

[default]
provider = "anthropic"
model = "claude-3-sonnet-20240229"
profile = "balanced"

[providers.anthropic]
api_key_env = "ANTHROPIC_API_KEY"

[providers.openai]
api_key_env = "OPENAI_API_KEY"
model = "gpt-4-turbo-preview"

[output]
format = "pretty"
color = true
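
The snippet below creates that file non-interactively with a heredoc (same contents as above; API keys are still read from environment variables, so no secrets land on disk):

```shell
# Write the default ReasonKit config (overwrites any existing file)
mkdir -p ~/.config/reasonkit
cat > ~/.config/reasonkit/config.toml <<'EOF'
[default]
provider = "anthropic"
model = "claude-3-sonnet-20240229"
profile = "balanced"

[providers.anthropic]
api_key_env = "ANTHROPIC_API_KEY"

[output]
format = "pretty"
color = true
EOF
```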

Docker

docker run -e ANTHROPIC_API_KEY=$ANTHROPIC_API_KEY \
  ghcr.io/reasonkit/reasonkit-core \
  think "Should I buy a house?"

Troubleshooting

“API key not found”

Make sure your API key is exported:

echo $ANTHROPIC_API_KEY  # Should print your key

“Rate limited”

Use a different provider or wait. Consider OpenRouter for high volume.
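
For scripted use, a small retry wrapper with exponential backoff softens transient rate limits. This is a generic shell sketch, not a built-in ReasonKit feature:

```shell
# Retry a command up to 5 times, doubling the delay after each failure
retry() {
  delay=1
  for attempt in 1 2 3 4 5; do
    "$@" && return 0
    echo "attempt $attempt failed; retrying in ${delay}s" >&2
    sleep "$delay"
    delay=$((delay * 2))
  done
  return 1
}

# Usage: retry rk-core think "question"
```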

“Model not available”

Check that your provider supports the requested model:

rk-core models list  # Show available models

Next Steps