Cross-references

Cross-referencing in a wiki links related content within or across documents to improve navigation and understanding. It is particularly useful for maintaining a coherent, interconnected knowledge base.

Karpathy’s LLM Wiki Pattern

  • Introduced by Andrej Karpathy as an alternative to traditional Retrieval-Augmented Generation (RAG) pipelines.
  • Emphasizes a persistent, compounding wiki maintained entirely by a Large Language Model (LLM).
  • Addresses the challenge of effectively managing and growing personal knowledge bases over time.
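The compounding-wiki idea above can be sketched in a few lines. This is a minimal illustration, not Karpathy's actual implementation: the `merge_note` function is a hypothetical stand-in for an LLM call that would rewrite a page; here it simply files each new note under its topic heading so the knowledge base grows across calls.

```python
# Minimal sketch of an LLM-maintained compounding wiki (illustrative only).
# `merge_note` stands in for an LLM that would merge and rewrite pages;
# here it just appends notes under topic headings so state accumulates.

def merge_note(wiki: dict[str, list[str]], topic: str, note: str) -> dict[str, list[str]]:
    """Fold a new note into the persistent wiki, keyed by topic."""
    updated = {k: list(v) for k, v in wiki.items()}  # copy: the wiki persists across calls
    updated.setdefault(topic, []).append(note)
    return updated

def render(wiki: dict[str, list[str]]) -> str:
    """Render the wiki as plain text, one heading per topic."""
    sections = []
    for topic in sorted(wiki):
        body = "\n".join(f"  - {n}" for n in wiki[topic])
        sections.append(f"{topic}\n{body}")
    return "\n\n".join(sections)

wiki: dict[str, list[str]] = {}
wiki = merge_note(wiki, "RAG", "combines retrieval with generation")
wiki = merge_note(wiki, "RAG", "retrieval is usually vector-based")
wiki = merge_note(wiki, "LLM", "trained on large text corpora")
print(render(wiki))
```

Because each call returns the full updated wiki, successive interactions compound: later notes land next to earlier ones on the same topic instead of being retrieved fresh each time.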

Key Concepts

  • Retrieval-Augmented Generation (RAG): A technique that combines vector-based document retrieval with language generation, injecting the retrieved passages into the model's prompt at query time.
  • Persistent Knowledge Base: A system that accumulates information continuously, allowing for the gradual refinement and expansion of data.
  • Large Language Model (LLM): Advanced AI models trained on vast amounts of text data to generate human-like responses.

Advantages

  • Dynamic updating without manual intervention.
  • Continuous compounding of knowledge through interactions with users.
  • Enhanced accuracy and depth as the model learns from its own outputs.

Integration

The video “Karpathy’s LLM Wiki: Watch Me Build a Knowledge Base From Scratch!” highlights how this approach can be implemented to create an effective, self-maintaining personal wiki.

2026-04-10 Karpathy’s LLM Wiki: Beyond RAG for Persistent Knowledge Bases