The Prosthetic Brain

Why I built it. How it works. What it reveals.

366K messages · 256ms queries · 440MB LanceDB

Why I Built It

"The hyperfocus to hyperproductive idea is born out of me turning 0.1% of my hyperfocus into tangible value."

2025-07 | Cognitive Analysis System Architecture | chatgpt

Monotropic focus. When I'm deep in something, I'm completely there. When I switch, the previous context dies.

"When the tunnel moves, everything in the previous tunnel dies."

This isn't a deficiency. It's a different architecture. But it needs infrastructure. So I built queryable memory.

"My cognitive architecture—deep, narrow, intense—is a feature set. The only real constraint is being stuck in the moment deep—very deep but very narrow. That's why I am building this prosthetic."

2025-12 | intellectual-dna | claude-code

The Game Changer

"About a year and a half ago, somewhere in the middle of 2024, I was finally able to stop and completely complete the project within a single hyperfocus session. This was a game changer for me."

2025-10 | Untitled | claude-code

Before: hyperfocus produced value, but switching killed it. 0.1% survived.

After: the prosthetic makes previous work SAFE to access. No context loss. Single-session completion became possible.

"If you can't measure it, you can't improve it. If you can't query it, you don't own it."

2025-12 | Building Brain MCP | claude-code

The Numbers

366K messages · 15,378 conversations
108K embedded · 81% of user messages
256ms query time · LanceDB + HNSW
440MB storage · 32x smaller than before
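The 440MB figure lines up with a quick back-of-envelope on raw vector storage (a sketch; the exact on-disk layout is LanceDB's, and index plus metadata add overhead on top):

```python
# Back-of-envelope: raw float32 storage for 108K embeddings at 768 dimensions.
vectors = 108_000
dims = 768
bytes_per_float = 4  # float32

raw_mb = vectors * dims * bytes_per_float / 1024**2
print(round(raw_mb))  # ≈ 316 MB of raw vectors, before HNSW index + metadata
```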

How It Works

// The Stack
MCP Brain Server (Python)
LanceDB (440MB, 32x smaller)
nomic-embed-text-v1.5 (768 dim)
HNSW Index (256ms queries)

All conversations are exported from ChatGPT, Claude, and Gemini into a unified schema. User messages are embedded with sentence-transformers and HNSW-indexed for fast retrieval.
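A minimal sketch of what that unification step might look like. The per-source field names below are assumptions for illustration, not the project's actual schema:

```python
# Hypothetical unification step: map each vendor's export format onto one schema.
UNIFIED_FIELDS = ("source", "conversation_id", "role", "text", "timestamp")

def unify_message(source: str, raw: dict) -> dict:
    """Normalize one exported message into the unified schema."""
    if source == "chatgpt":
        return {
            "source": source,
            "conversation_id": raw["conversation_id"],
            "role": raw["author"]["role"],              # assumed export layout
            "text": " ".join(raw["content"]["parts"]),  # parts -> single string
            "timestamp": raw["create_time"],
        }
    if source == "claude":
        return {
            "source": source,
            "conversation_id": raw["uuid"],             # assumed export layout
            "role": raw["sender"],
            "text": raw["text"],
            "timestamp": raw["created_at"],
        }
    raise ValueError(f"unknown source: {source}")
```

Once every message shares one schema, the embedding and indexing passes only have to be written once.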

90+ MCP Tools

MCP tools for querying the brain, used in Claude Code and Claude Desktop. Here's a sample:

semantic_search()

Find conceptually similar messages across 108K vectors

search_conversations()

Full-text search across 366K messages

thinking_trajectory()

Track how an idea evolved over time

first_mention()

Find when you first thought about something

what_was_i_thinking()

Time-travel to a specific month

youtube_search()

Search 31K consumed videos

query_signature_phrases()

432 recurring expressions

query_correction_patterns()

1,778 tracked corrections
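Under the hood, a tool like semantic_search() reduces to nearest-neighbor ranking over the embedding vectors. A brute-force sketch of that ranking (the real system delegates it to LanceDB's HNSW index, which avoids scanning all 108K vectors):

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def semantic_search(query_vec, corpus, k=3):
    """Rank (message, vector) pairs by similarity to the query embedding."""
    ranked = sorted(corpus, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [msg for msg, _ in ranked[:k]]
```

HNSW trades this O(n) scan for an approximate graph walk, which is how 108K vectors come back in 256ms.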

The Insight

Building this revealed patterns I didn't know I had. Signature phrases that repeat. Correction patterns that show how I steer.

Phrase            Count   Meaning
"give me the"     789x    Direct request, expects output not process
"no no no"        333x    Strong disagreement, immediate redirect
"step by step"    244x    Sequential, monotropic processing
"make sure to"    206x    Precision focus, need for certainty

1,778 corrections tracked. "no no no" 333 times. Active steering, not passive acceptance.
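Counts like these fall out of a simple n-gram pass over user messages. A minimal sketch; the real pipeline's thresholds and curation are assumptions:

```python
from collections import Counter

def phrase_counts(messages, n=3):
    """Count every n-word phrase across a list of messages."""
    counts = Counter()
    for msg in messages:
        words = msg.lower().split()
        for i in range(len(words) - n + 1):
            counts[" ".join(words[i : i + n])] += 1
    return counts
```

Run over 366K messages, a pass like this surfaces the recurring expressions; picking the 432 signature phrases out of the raw counts is a curation step on top.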

Full data receipts

Open Source

The full system is open source: DuckDB schema, MCP server, embedding pipeline.

intellectual-dna on GitHub