Conare AI vs Terraphim: Context Engineering Comparison
Executive Summary
Conare AI is a macOS-only context management tool for Claude Code ($59 lifetime). Terraphim is an open-source, cross-platform AI assistant that can replicate and exceed Conare's functionality using semantic search, knowledge graphs, and the Model Context Protocol (MCP).
This guide shows how to use Terraphim to achieve superior context engineering compared to Conare, while maintaining full privacy and extensibility.
Feature Comparison
| Feature | Conare AI | Terraphim | Advantage |
|---------|-----------|-----------|-----------|
| Context Items | Upload docs/PDFs/websites once, reuse | Knowledge Graph + Haystacks (local, GitHub, Notion, email, Reddit, Rust docs) | Terraphim: Multiple sources, semantic relationships |
| Vibe-Rules | Store coding rules and patterns | Knowledge Graph with rule nodes + Role-based system prompts | Terraphim: Hierarchical rules with graph relationships |
| File References | "@" referencing with line numbers | MCP tools: autocomplete, paragraph extraction, code search | Terraphim: More powerful with semantic context |
| Token Tracking | Monitor uploaded material tokens | Built-in document metadata and indexing stats | Terraphim: Full document analytics |
| Privacy | Local (uses Claude Code) | Fully local with multiple backend options | Equal: Both privacy-first |
| Platform | macOS only | Linux, macOS, Windows | Terraphim: Cross-platform |
| Price | $59 lifetime | Free, open source | Terraphim: Free |
| LLM Support | Claude only | Ollama, OpenRouter, any compatible provider | Terraphim: Multiple providers |
| Search | Basic context retrieval | BM25, BM25F, BM25Plus, TerraphimGraph with semantic expansion | Terraphim: Advanced relevance algorithms |
| Context Management | Load/unload rules, collections | Role switching with per-role knowledge graphs | Terraphim: More flexible with roles |
Core Concepts Mapping
Conare → Terraphim
1. Context Items → Knowledge Graph + Haystacks
Conare Approach:
- Upload a document once
- Reference it across conversations
- Track token usage
Terraphim Approach:
Terraphim indexes documents into a knowledge graph with:
- Nodes: Concepts extracted from documents
- Edges: Relationships between concepts
- Documents: Full-text indexed with BM25 relevance
- Thesaurus: Semantic mappings (synonyms, related terms)
Search once, get semantically related results automatically.
2. Vibe-Rules → Knowledge Graph Rules + System Prompts
Conare Approach:
- Store coding rules
- Load/unload rule sets
- Global vs local rules
Terraphim Approach:
Create rules as knowledge graph documents with special tags:
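A minimal sketch (the file name, tags, and rule text are illustrative) of a rule stored as a markdown document whose hashtags become knowledge graph terms:

```markdown
<!-- docs/vibe-rules/rust/error-handling.md -->
# Error Handling Rules

#rust #error-handling #anyhow #thiserror

- Use thiserror for library error types and anyhow for application code.
- Never unwrap() on production paths; propagate with ? instead.
```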
System prompt per role:
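How the per-role prompt is stored depends on your Terraphim version; treat the `system_prompt` key below as a placeholder for wherever your build keeps role-level LLM instructions:

```json
{
  "roles": {
    "Context Engineer": {
      "system_prompt": "You are a context engineer. Prefer rules tagged #rust and #async from the vibe-rules haystack, and cite the rule file you used."
    }
  }
}
```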
Advantages:
- Rules are searchable: "Show me async patterns" → finds related rules
- Rules have relationships: "async-pattern" → "tokio-spawn" → "structured-concurrency"
- Hierarchical: Global rules (all roles) + role-specific rules
- Version control: Rules are just markdown files in git
3. File References → MCP Tools
Conare Approach:
- "@" instant file referencing
- Shows line numbers
- Provides full context
Terraphim Approach:
MCP server already exposes powerful tools:
// Autocomplete with context
autocomplete_terms(prefix: string, limit: number) → [{term, snippet}]
// Extract paragraphs starting at matched term
extract_paragraphs_from_automata(text: string, term: string) → [{paragraph, line_number}]
// Search with semantic expansion
search(query: string, role: string, limit: number) → [Document]
// Graph connectivity
is_all_terms_connected_by_path(terms: string[]) → boolean

Usage in Claude Desktop:
Add Terraphim to your claude_desktop_config.json:
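A minimal sketch; the command path is a placeholder for wherever your Terraphim MCP server binary lives:

```json
{
  "mcpServers": {
    "terraphim": {
      "command": "/path/to/terraphim_mcp_server",
      "args": []
    }
  }
}
```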
Now Claude can:
- Search your codebase semantically
- Extract relevant paragraphs with line numbers
- Understand relationships between concepts
- Navigate graph connections
Implementation Guide
Step 1: Create Context Engineer Role
Create terraphim_server/default/context_engineer_config.json:
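The schema below is an illustrative sketch rather than the exact format for every Terraphim version: it shows a role whose haystacks are the local context library and vibe-rules directories, ranked with the graph-based relevance function. Adjust field names to match your build.

```json
{
  "roles": {
    "Context Engineer": {
      "relevance_function": "terraphim-graph",
      "haystacks": [
        { "location": "docs/context-library", "service": "Ripgrep" },
        { "location": "docs/vibe-rules", "service": "Ripgrep" }
      ],
      "kg": {
        "knowledge_graph_local": { "input_type": "markdown", "path": "docs/vibe-rules" }
      }
    }
  }
}
```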
Step 2: Create Context Library Structure
# Create directory structure
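# (one possible layout; adjust paths to your project)
mkdir -p docs/context-library docs/vibe-rules/rust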
# Example structure:
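# (illustrative; mirrors the paths used in Steps 3-4)
docs/
├── context-library/      # Reference docs, specs, converted PDFs
└── vibe-rules/
    └── rust/
        └── async-patterns.md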
Step 3: Create Vibe-Rules as Knowledge Graph
Example docs/vibe-rules/rust/async-patterns.md:
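A hedged example of what this file might contain; the tags and snippets are illustrative, chosen to match the nodes, edges, and prompts used later in this guide (bounded channels, tokio::spawn):

````markdown
# Async Patterns

#async #rust #tokio #concurrency #channels #backpressure #cancellation

## Channels

Prefer bounded channels so producers get backpressure instead of unbounded memory growth.

```rust
use tokio::sync::mpsc;

let (tx, mut rx) = mpsc::channel::<String>(100); // bounded: capacity 100
```

## Spawning

Use tokio::spawn for independent work and keep the JoinHandle so the task can be awaited or cancelled.

```rust
let handle = tokio::spawn(async move {
    // do work; observe cancellation via a shutdown channel if needed
});
handle.await?;
```
````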
This markdown becomes a knowledge graph automatically:
- Nodes: "async", "rust", "tokio", "concurrency", "channels", "backpressure", "cancellation"
- Edges: "async" → "tokio", "tokio" → "tokio::spawn", "channels" → "backpressure"
- Document: Indexed with full text, searchable by tags
Step 4: Build Knowledge Graph
# Run Terraphim server with Context Engineer role
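# One way to start it from the repo (a sketch; binary and flag names vary by version, check --help)
cargo run -p terraphim_server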
# The server will:
# 1. Index all markdown files in docs/context-library and docs/vibe-rules
# 2. Build thesaurus from hashtags and terms
# 3. Create automata for fast matching
# 4. Generate knowledge graph with nodes and edges
# 5. Enable semantic search across all documents

Step 5: Use MCP Server with Claude Desktop
Configure Claude Desktop to use Terraphim MCP (see the claude_desktop_config.json sketch above under "Usage in Claude Desktop").
Now Claude can:
User: "Show me async patterns"
Claude: [Uses search tool] → Returns documents with #async tags
User: "What's the best practice for channels?"
Claude: [Uses autocomplete_terms("channel")] → Finds "bounded channels" rule
User: "Extract the tokio::spawn example"
Claude: [Uses extract_paragraphs_from_automata] → Returns exact code with line numbers

Advanced Usage
Context Collections
Create named collections by organizing rules into directories:
Switch collections by changing the haystacks location in your role config.
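For instance (collection names are illustrative; the `location` field mirrors the haystack sketch from Step 1):

```
docs/collections/
├── backend-rules/      # Rust, database, and API rules
├── frontend-rules/     # UI and accessibility rules
└── infra-rules/        # CI, deployment, and observability rules
```

Point the role's haystack at the collection you want active, for example `"location": "docs/collections/backend-rules"`, then reindex.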
Token Tracking
Terraphim tracks document metadata automatically:
// In your code (stats field names are illustrative; check the actual struct in terraphim_service)
let stats = service.get_graph_stats().await?;
println!("Nodes: {}", stats.nodes);
println!("Edges: {}", stats.edges);
println!("Documents: {}", stats.documents);

For token counting, add to your documents:
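One lightweight convention (not a built-in Terraphim field, just metadata you maintain yourself) is frontmatter with an approximate token count:

```markdown
---
title: Async Patterns
tags: [async, rust, tokio]
tokens: 1250   # approximate, e.g. from wc -w or your tokenizer of choice
---
```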
Hierarchical Rules
Implement global + role-specific rules:
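One way to wire this up, reusing the illustrative haystack fields from Step 1, is to give each role the global rules plus its own directory:

```json
{
  "haystacks": [
    { "location": "docs/vibe-rules/00-global", "service": "Ripgrep" },
    { "location": "docs/vibe-rules/10-language/rust", "service": "Ripgrep" }
  ]
}
```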
Search priority: Role-specific rules rank higher than global rules.
Version Control Integration
Since rules are markdown files:
# Track rule changes
git add docs/vibe-rules/ && git commit -m "Update vibe rules"
# Share rules with team
git push
# Team members pull latest rules
git pull
# Terraphim rebuilds knowledge graph automatically

Migration from Conare
If you're currently using Conare:
- Export context items: Copy your uploaded documents to `docs/context-library/`
- Export vibe-rules: Copy your rules to `docs/vibe-rules/` as markdown with hashtags
- Configure Terraphim: Create `context_engineer_config.json` with your preferences
- Run Terraphim: Start the server and MCP server
- Configure Claude Desktop: Point to Terraphim MCP server
- Test: Search for rules, verify autocomplete works
Best Practices
1. Use Hashtags for Tagging
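For example, one line of hashtags near the top of a rule file is enough for those terms to become graph nodes:

```markdown
#rust #async #tokio #error-handling
```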
2. Create Cross-References
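A hedged convention (how links form depends on your thesaurus setup): mention related terms inside the rule body so the graph connects them, as in the async → tokio-spawn → structured-concurrency chain shown earlier:

```markdown
See also: #tokio-spawn and #structured-concurrency for how this pattern interacts with task lifetimes.
```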
3. Include Code Examples
Always include runnable code snippets in rules:
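For instance, an error-handling rule might embed a compilable snippet (the `anyhow` crate here is just an illustrative choice):

```rust
use anyhow::{Context, Result};
use std::fs;

/// Propagate errors with context instead of calling unwrap().
fn read_config(path: &str) -> Result<String> {
    fs::read_to_string(path).with_context(|| format!("failed to read {path}"))
}
```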
4. Maintain Rule Hierarchy
docs/vibe-rules/
├── 00-global/ # Priority 1: Global rules
├── 10-language/ # Priority 2: Language-specific
├── 20-framework/ # Priority 3: Framework-specific
└── 30-project/      # Priority 4: Project-specific

API Reference
MCP Tools Available
All tools are automatically exposed via the Terraphim MCP server:
Search Tools
- `search(query, role, limit, skip)` - Semantic search with knowledge graph expansion
- `autocomplete_terms(prefix, limit)` - Fast autocomplete from knowledge graph
- `fuzzy_autocomplete_search(query, max_distance)` - Fuzzy matching with Jaro-Winkler
Context Tools
- `extract_paragraphs_from_automata(text, term)` - Extract paragraphs starting at matched term
- `find_matches(text, role)` - Find all concept matches in text
- `is_all_terms_connected_by_path(terms)` - Check if terms are related in graph
Knowledge Graph Tools
- `load_thesaurus(role)` - Load knowledge graph for role
- `get_term_context(term, depth)` - Get related concepts with depth traversal
Performance Comparison
| Operation | Conare | Terraphim | Notes |
|-----------|--------|-----------|-------|
| Initial indexing | ~1s | ~2-3s | Terraphim builds full knowledge graph |
| Context retrieval | <100ms | <50ms | Terraphim uses Aho-Corasick automata |
| Semantic search | N/A | ~200ms | Terraphim expands queries via graph |
| Token counting | Real-time | Metadata-based | Both provide usage info |
| Memory usage | Unknown | ~50MB per role | Terraphim caches automata in memory |
Troubleshooting
Knowledge Graph Not Building
# Check if markdown files exist
find docs/context-library docs/vibe-rules -name "*.md"
# Verify config path
cat terraphim_server/default/context_engineer_config.json | jq .
# Run with debug logging (binary and flag names may differ in your checkout)
RUST_LOG=debug cargo run -p terraphim_server

MCP Server Not Connecting
# Test MCP server manually (path is a placeholder for your build of the Terraphim MCP server)
/path/to/terraphim_mcp_server
# Check Claude Desktop config (macOS path shown)
cat ~/Library/Application\ Support/Claude/claude_desktop_config.json
# Verify MCP server is running
ps aux | grep terraphim

Search Returns No Results
# Verify documents are indexed
# Check haystack configuration
# Test search directly
Conclusion
Terraphim provides a superior alternative to Conare AI by offering:
- Open Source: No licensing fees, full customization
- Cross-Platform: Works on Linux, macOS, Windows
- Knowledge Graphs: Semantic relationships between concepts
- Multiple LLMs: Ollama, OpenRouter, custom providers
- Advanced Search: BM25, semantic expansion, graph traversal
- Version Control: Rules and context are just markdown in git
- MCP Integration: Native support for Claude Desktop and other MCP clients
- Privacy: Runs entirely locally with no external dependencies
By using Terraphim as your "Context Engineer", you gain all of Conare's benefits plus advanced knowledge graph capabilities for true semantic code understanding.