LlamaIndex

LlamaIndex is a high-level framework for building retrieval-augmented generation (RAG) and agentic applications with large language models (LLMs) in Python and TypeScript.

It helps you ingest and parse data from a variety of sources, build indexes over that data, retrieve relevant context, and synthesize grounded answers. LlamaIndex integrates with many hosted LLM APIs and also supports local models, for example via Ollama.
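The ingest, index, retrieve, and synthesize steps above can be sketched in plain Python. This is a deliberately simplified illustration of the RAG pattern that LlamaIndex automates, not LlamaIndex's actual API: the functions, the word-overlap scoring, and the sample chunks are all invented for the example.

```python
import re

def build_index(chunks):
    # "Index" each chunk as a bag of lowercase words.
    # (Real frameworks use embeddings; word sets keep the sketch simple.)
    return [(chunk, set(re.findall(r"\w+", chunk.lower()))) for chunk in chunks]

def retrieve(index, question, top_k=1):
    # Rank chunks by how many words they share with the question.
    q_words = set(re.findall(r"\w+", question.lower()))
    ranked = sorted(index, key=lambda item: len(item[1] & q_words), reverse=True)
    return [chunk for chunk, _ in ranked[:top_k]]

def make_prompt(question, context_chunks):
    # Synthesis step: ground the LLM's answer in the retrieved context.
    context = "\n".join(context_chunks)
    return f"Context:\n{context}\n\nAnswer using only the context: {question}"

chunks = [
    "LlamaIndex builds indexes over your documents.",
    "Ollama runs large language models locally.",
]
index = build_index(chunks)
best = retrieve(index, "How can I run models locally?")
print(make_prompt("How can I run models locally?", best))
```

In a real application, an LLM call replaces the final `print`, and embedding-based similarity replaces the word-overlap score, but the retrieve-then-synthesize shape is the same.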

Official website: developers.llamaindex.ai

Tutorial

LlamaIndex in Python: A RAG Guide With Examples

Learn how to set up LlamaIndex, choose an LLM, load your data, build and persist an index, and run queries that return grounded, reliable answers, with worked examples along the way.
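The build, persist, and query loop that the tutorial covers can also be sketched without the library. LlamaIndex itself persists and reloads indexes through its own storage machinery; the JSON file and helper functions below are purely illustrative stand-ins.

```python
import json
from pathlib import Path

def build_index(chunks):
    # Toy "index": map an ID to each text chunk.
    return {str(i): chunk for i, chunk in enumerate(chunks)}

def persist(index, path):
    # Write the index to disk so later runs can skip rebuilding it.
    Path(path).write_text(json.dumps(index))

def load(path):
    # Reload a previously persisted index.
    return json.loads(Path(path).read_text())

index = build_index([
    "LlamaIndex persists indexes to disk.",
    "Queries reuse the stored index.",
])
persist(index, "toy_index.json")
reloaded = load("toy_index.json")
print(reloaded == index)  # True: the reloaded index matches the original
```

Persisting matters because building a real index (parsing documents and computing embeddings) is the expensive step; queries against a reloaded index skip it entirely.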


By Leodanis Pozo Ramos • Updated Dec. 5, 2025 • Reviewed by Martin Breuss