- "ai"
- "image-generation"
- "flux"
- "lora"
- "black-forest-labs"
- "low-vram"
- "ai-image-generation"
- "flux-models"
- "lora-training"
- "rag-mechanics"
- "rag-embedding"
- "embedding-models"
- "domain-specific-data"
- "fine-tuning"
- "text-chunking"
- "chromadb"
- "quantisation"
- "large-language-models"
- "nvidia"
- "nemotron"
- "parameters"
- "2026 04 14 Adam Lucek quantisation of LLM"
summary: "Adam Lucek specializes in AI image generation using FLUX.1 LoRA adapters and provides tutorials on RAG mechanics, embedding model fine-tuning, domain-specific data optimization, and optimal RAG chunking strategies with ChromaDB. Additionally, he covers quantisation techniques for large language models (LLMs), including implementations for models like NVIDIA's Llama 3.1 Nemotron 70B." updated: 2026-04-14
Adam Lucek is known for his practical implementations in AI image generation, particularly with FLUX.1 models.
- Quantisation of LLMs
- Detailed overview in video: https://www.youtube.com/watch?v=3EDI4akymhA
- Explains quantisation for LLMs like NVIDIA's Llama 3.1 Nemotron 70B (70.6 billion parameters)
- Covers storage requirements (e.g., 30+ GB)
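The storage figures follow from simple arithmetic on parameter count and bit width. A back-of-the-envelope sketch (the helper function and the specific bit widths are illustrative, not taken from the video; figures cover weights only, ignoring KV cache and activation overhead):

```python
def weights_size_gb(n_params: float, bits_per_param: int) -> float:
    """Approximate storage for model weights alone, in GB (10^9 bytes)."""
    return n_params * bits_per_param / 8 / 1e9

NEMOTRON_70B = 70.6e9  # parameter count cited in the video

for label, bits in [("FP16", 16), ("INT8", 8), ("INT4", 4)]:
    print(f"{label}: {weights_size_gb(NEMOTRON_70B, bits):.1f} GB")
# FP16 comes to ~141 GB and 4-bit to ~35 GB, which is consistent
# with the "30+ GB" figure noted above for a quantised model.
```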
Source Notes
- 2026-04-14: Fine Tuning RAG - Adam Lucek (https://www.youtube.com/watch?v=hztWQcoUbt0). This video demonstrates how to fine-tune embedding models to optimize the document retrieval step in a Retrie… (Fine Tuning RAG - Adam Lucek)
- 2026-04-14: Knowledge Graph or Vector Database RAG comparison (https://www.youtube.com/watch?v=6vG_amAshTk). Video by Adam Lucek. This video provides a detailed introduction to Knowledge… (Knowledge Graph or Vector Database RAG comparison)