Cipher Installation Guide

Cipher is an agentic memory runtime that provides intelligent, stateful agents with persistent memory capabilities. This guide covers all installation methods from quick Docker setup to development builds.

Quick Start

For the fastest setup, see the Quickstart Guide which covers installation, API setup, and connecting to your coding assistant in under 5 minutes.

Prerequisites

System Requirements

  • Node.js: ≥ 20.0.0 (specified in engines)
  • pnpm: ≥ 9.14.0 (npm is also supported)
  • Operating System: Linux, macOS, or Windows
  • Memory: Minimum 4GB RAM (8GB+ recommended for better performance)
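
To confirm your toolchain meets these requirements, check the installed versions:

node --version    # should report v20.0.0 or newer
pnpm --version    # should report 9.14.0 or newer (or check npm --version)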

Optional External Services

  • Neo4j: For advanced knowledge graph memory (optional)
  • Redis: For distributed caching (optional, defaults to in-memory)
  • Qdrant/Milvus: For external vector storage (optional, defaults to in-memory)
  • Ollama: For local LLM hosting (optional)
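
If you want to try the optional services locally, one common approach is to run them with Docker. The commands below are a minimal sketch using the public images and their default ports; pointing Cipher at these services is covered in the Configuration Guide:

# Qdrant vector store (default port 6333)
docker run -d -p 6333:6333 qdrant/qdrant

# Redis cache (default port 6379)
docker run -d -p 6379:6379 redis:7

# Neo4j knowledge graph (replace the example password)
docker run -d -p 7474:7474 -p 7687:7687 -e NEO4J_AUTH=neo4j/changeme neo4j:5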

Installation Methods

1. NPM Package

# Install globally
npm install -g @byterover/cipher

# Or install locally in your project
npm install @byterover/cipher
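
After a global install, the cipher binary is available on your PATH; with a project-local install you can invoke it through npx (a quick check, assuming the package exposes a cipher bin, as the rest of this guide implies):

# Global install
cipher --version

# Local install
npx cipher --version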

2. Docker Compose

# Clone repository
git clone https://github.com/campfirein/cipher.git
cd cipher

# Setup environment
cp .env.example .env
# Edit .env with your API keys (see Environment Setup below)

# Start with Docker Compose
docker-compose up -d

# Verify installation
curl http://localhost:3000/health
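
If the health check does not respond, a quick way to inspect the stack (service names depend on the compose file) is:

# Check container status
docker-compose ps

# Follow logs for all services
docker-compose logs -f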

3. Build from Source (Development)

# Clone repository
git clone https://github.com/campfirein/cipher.git
cd cipher

# Install dependencies and build
pnpm install && pnpm run build && npm link

# Setup environment
cp .env.example .env
# Configure your API keys (see Environment Setup below)

# Verify installation
cipher --version

Environment Setup

Required Configuration

You need at least one LLM provider API key. For source and Docker installations, copy the environment template:

# Source/Docker installations
cp .env.example .env

# Edit .env with at least one API provider:
OPENAI_API_KEY=your_openai_api_key_here          # For GPT models
ANTHROPIC_API_KEY=your_anthropic_api_key_here    # For Claude models  
OPENROUTER_API_KEY=your_openrouter_api_key_here  # For 200+ models
OLLAMA_BASE_URL=http://localhost:11434/v1        # For self-hosted models

Get API keys from your chosen provider (OpenAI, Anthropic, or OpenRouter). For advanced configuration options, see the Configuration Guide.

Verification

Basic Testing

cipher --version                 # Show version
cipher "Hello, how are you?"     # One-shot test
cipher --mode mcp                # Start as MCP server

Advanced Testing

For API mode testing and integration examples, see the Connections Guide.
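
As a minimal smoke test of API mode (a sketch that assumes the REST server is started with --mode api and listens on the default port 3000):

# Start the REST server
cipher --mode api

# In another terminal, check the health endpoint
curl http://localhost:3000/health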

Development Setup

For Contributors

# Development workflow
pnpm install                # Install dependencies
pnpm run build             # Build project
pnpm run typecheck         # Type checking
pnpm run lint:fix          # Lint and fix
pnpm run format            # Format code
pnpm run test              # Run tests

# Development mode with file watching
pnpm run dev               # TypeScript compilation with watch mode

Quality Assurance Commands

pnpm run precommit         # Full pre-commit checks
pnpm run test:unit         # Unit tests only
pnpm run test:integration  # Integration tests only

Architecture Overview

Runtime Modes

  • CLI Mode: Interactive command-line interface (default)
  • API Mode: REST server for programmatic access
  • MCP Mode: Model Context Protocol server for tool integration
  • One-shot Mode: Execute single commands and exit
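
The commands below sketch how each mode is typically launched; the api value for --mode is assumed by analogy with --mode mcp, so check cipher --help for the exact flags:

cipher                        # CLI mode (interactive, default)
cipher "summarize this repo"  # One-shot mode
cipher --mode api             # API mode (REST server)
cipher --mode mcp             # MCP mode (tool integration)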

Key Components

  • Memory System: Persistent storage with vector search and knowledge graphs
  • Multi-LLM Support: OpenAI, Anthropic, OpenRouter, Ollama integration
  • MCP Integration: Connect to external tools and development environments

For detailed architecture information, see the Overview and Memory Overview pages.

Troubleshooting

Common Issues

  1. Node.js Version: Ensure Node.js ≥20.0.0
  2. API Keys: At least one LLM provider API key required
  3. Build Errors (source builds): Run pnpm run typecheck to identify issues
  4. Port Conflicts: Use the --port flag to change the default port (3000)
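
For example (the port value is illustrative; --port applies when running a server mode such as API):

cipher --mode api --port 8080
curl http://localhost:8080/health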

Getting Help