
# 🦜 LangChain Playground for LLMs

**The most complete, hands-on LangChain learning repository on GitHub**



Build, learn, and ship production-ready LLM applications — from your first prompt to full-scale RAG pipelines and autonomous AI agents.


⭐ If this repo helps you, please give it a star — it helps others discover it too! ⭐


## 🚀 Why This Repository?

Most LangChain resources are scattered across tutorials, notebooks, and outdated blog posts. This repository brings everything into one place — structured, runnable, and ready for production.

### 🎯 What You Get

- 📚 **Structured learning path** — progress from zero to LLM expert in a logical sequence
- 💻 **100% runnable code** — every concept has a working example you can run immediately
- 🔍 **RAG pipelines** — full retrieval-augmented generation with Chroma, FAISS, and more
- 🤖 **AI agents** — ReAct-pattern agents with custom tools and LangGraph workflows
- 🏠 **Local LLM support** — run everything offline with Ollama (zero API costs)
- 🌐 **Multi-provider** — OpenAI, Google Gemini, Hugging Face, and Ollama side by side
- 📝 **Rich notes & docs** — detailed Markdown notes for every topic covered
- 🔧 **Production patterns** — LCEL, tool calling, structured output, and memory

## Learning Roadmap

The content is organized to be followed in order:

| # | Topic | Description |
|---|-------|-------------|
| 1 | Chat Models | LLM initialization with Ollama, Gemini, and OpenAI (`chatmodel/`) |
| 2 | Messages | `SystemMessage`, `HumanMessage`, `AIMessage`, and chat history (`Messages/`) |
| 3 | Prompt Engineering | Prompt templates, dynamic generation, and a Streamlit prompt UI (`Prompts/`) |
| 4 | Structured Output | Enforcing JSON Schema, Pydantic, and TypedDict outputs (`Structured Output/`) |
| 5 | Output Parsers | JSON, Pydantic, string, and structured output parsers (`Output_Parsers/`) |
| 6 | Chains | Sequential, parallel, and conditional chains (`Chain/`) |
| 7 | Runnables (LCEL) | LangChain Expression Language — sequences, parallelism, branching, lambdas, passthrough (`Runnables/`) |
| 8 | Embeddings | Document similarity using embedding models (`EmbeddingModel/`) |
| 9 | RAG Components | Document loaders, text splitters, vector stores (Chroma), and retrievers (`Rag Components/`) |
| 10 | Tools & Agents | Custom tool creation, tool calling, built-in tools, and agent workflows (`Tools/`) |
| 11 | Mini Projects | Applied projects — AI agents, currency conversion, agent templates (`Mini_projects/`) |

## Repository Structure

```text
langchain-playground--for-llms-/
├── Chain/                    # Sequential, parallel, and conditional chain examples
├── chatmodel/                # LLM initialization (Ollama, Gemini)
├── chroma_db/                # Persisted Chroma vector database
├── EmbeddingModel/           # Document similarity with embeddings
├── Messages/                 # Chat message types and prompt templates
├── Mini_projects/            # End-to-end LLM applications
├── Output_Parsers/           # JSON, Pydantic, string, and structured parsers
├── Prompts/                  # Prompt templates and Streamlit UI
├── Rag Components/           # RAG pipeline modules
│   ├── Document Loaders/     #   PDF, CSV, text, web, and directory loaders
│   ├── Retrievers/           #   MultiQuery, contextual compression, MMR
│   ├── Text Splitters/       #   Length, markdown, code, and semantic splitting
│   └── Vectors Stores/       #   Chroma vector database integration
├── Runnables/                # LCEL runnables (sequence, parallel, branch, lambda)
├── Sample Data/              # Reference PDFs for RAG examples
├── Structured Output/        # JSON schema, Pydantic, and TypedDict outputs
├── Tools/                    # Custom and built-in tool calling, agent toolkits
├── notes.md                  # Core LangChain concept notes (Chat Models, Prompts, Messages)
├── requirements.txt          # Python dependencies
├── test.py                   # Quick-start test script
├── sample.pdf                # Sample document for loader testing
├── sample_data.md            # Sample text data for RAG testing
├── Titanic-Dataset.csv       # Sample dataset for data-loading examples
└── README.MD
```

## Notes Index

All concept notes are organized by topic below.

### 🧠 Foundations

| Topic | Notes File |
|-------|------------|
| Chat Models, Prompts & Messages | `notes.md` |
| Chat History | `Messages/chat_history.md` |

### ⛓️ Chains

| Topic | Notes File |
|-------|------------|
| Simple & Sequential Chains | `Chain/chain.md` |
| Parallel Chain | `Chain/parallel_chain.md` |
| Conditional Chain (IF/ELSE routing) | `Chain/conditional_Chain.md` |

### 🔄 Runnables (LCEL)

| Topic | Notes File |
|-------|------------|
| Runnables, LCEL, Primitive vs Task-Specific | `Runnables/Runnables.md` |

### 📤 Output Parsers & Structured Output

| Topic | Notes File |
|-------|------------|
| Output Parsers Overview & Ollama | `Output_Parsers/notes1.md` |
| String Output Parser | `Output_Parsers/stringoutputParser.md` |
| JSON Output Parser | `Output_Parsers/jsonoutput_parser.md` |
| Pydantic Output Parser | `Output_Parsers/PydanticOutputParser.md` |
| Structured Output Parser (legacy) | `Output_Parsers/Structured Output Parser.md` |
| TypedDict vs Pydantic vs JSON Schema | `Structured Output/structured.md` |

### 📚 RAG Components

| Topic | Notes File |
|-------|------------|
| Document Loaders (PDF, Web, CSV, Directory) | `Rag Components/rag_notes.md` |
| Text Splitters | `Rag Components/text_splitter_langchain_notes.md` |
| Vector Stores Overview | `Rag Components/Vectors Stores/vector_store_notes.md` |
| Chroma DB | `Rag Components/Vectors Stores/chroma_db_notes.md` |
| Chroma Internal Architecture | `Rag Components/Vectors Stores/chroma_internal_architecture.md` |
| FAISS vs Chroma vs Pinecone Comparison | `Rag Components/Vectors Stores/vector_db_architecture_comparison.md` |
| Retrievers Overview | `Rag Components/Retrievers/retriver.md` |
| MMR Retriever | `Rag Components/Retrievers/mmr.md` |
| MultiQuery Retriever | `Rag Components/Retrievers/MultiQuery_Retriever_Notes.md` |
| Contextual Compression Retriever | `Rag Components/Retrievers/Contextual_Compression_Retriever_Notes.md` |

### 🔧 Tools & Agents

| Topic | Notes File |
|-------|------------|
| Tools Overview | `Tools/tools.md` |
| Tool Binding | `Tools/tool_binding.md` |
| Tool Calling | `Tools/tools_calling .md` |
| Custom Tools | `Tools/custom_tools.md` |
| Toolkits | `Tools/toolkit.md` |
| AI Agents (ReAct Pattern) | `Mini_projects/agent_notes.md` |

### 📄 Sample Data

| File | Description |
|------|-------------|
| `sample_data.md` | AI overview text for RAG testing |
| `sample.pdf` | Sample PDF for document loader testing |
| `Titanic-Dataset.csv` | Sample CSV for data-loading examples |

## Getting Started

### Prerequisites

- Python 3.10 or higher
- Ollama (optional, for local LLM inference)

### 1. Clone the Repository

```bash
git clone https://github.com/himanshu231204/langchain-playground--for-llms-.git
cd langchain-playground--for-llms-
```

### 2. Create a Virtual Environment

```bash
python -m venv venv
source venv/bin/activate        # macOS / Linux
venv\Scripts\activate           # Windows
```

### 3. Install Dependencies

```bash
pip install -r requirements.txt
```

### 4. Configure Environment Variables

Create a `.env` file in the project root:

```bash
OPENAI_API_KEY=your_openai_key
GOOGLE_API_KEY=your_google_api_key
HUGGINGFACEHUB_API_TOKEN=your_hf_token
```

> **Note:** The `.env` file is listed in `.gitignore` and should never be committed.


## Local LLM Support (Ollama)

This repository supports local LLM inference via Ollama, allowing experimentation without API costs or internet access.

### Setup

```bash
# Install Ollama from https://ollama.com, then:
ollama pull llama3
ollama run llama3
```

### Usage with LangChain

```python
from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3")
response = llm.invoke("Explain Retrieval Augmented Generation in simple words")
print(response.content)
```

Benefits of local inference:

- No API costs
- Full data privacy
- Offline development
- Faster iteration for prototyping

## Security

- API keys are managed through environment variables (`.env`)
- `.env` is excluded from version control via `.gitignore`
- No secrets are hard-coded in source files

## Tech Stack

| Layer | Technologies |
|-------|--------------|
| LLM Providers | OpenAI GPT-4, Google Gemini, Ollama (Llama 3, Mistral), Hugging Face |
| Frameworks | LangChain v1.0, LangGraph v1.0, LangChain Expression Language (LCEL) |
| Vector Stores | Chroma DB, FAISS |
| Embeddings | OpenAI Embeddings, Hugging Face Sentence Transformers |
| Output Validation | Pydantic v2, TypedDict, JSON Schema |
| UI | Streamlit |
| Data | Pandas, CSV, PDF, and Markdown document loaders |
| Language | Python 3.10+ |

## Future Enhancements

- Multi-agent systems with LangGraph
- Conversational memory integration
- Evaluation and monitoring pipelines
- Deployment-ready GenAI applications
- Streamlit-based interactive demos

## Topics & Keywords

This repository covers the following areas of Generative AI and LLM development:

langchain llm generative-ai rag retrieval-augmented-generation langchain-python ai-agents openai google-gemini ollama vector-database chromadb pydantic lcel langchain-expression-language huggingface streamlit python machine-learning nlp large-language-models prompt-engineering ai deep-learning chatbot langsmith langgraph embeddings transformers tutorial


## Contributing

Contributions are welcome. To get started:

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/my-feature`)
3. Commit your changes (`git commit -m "Add my feature"`)
4. Push to the branch (`git push origin feature/my-feature`)
5. Open a pull request



🌟 Found this useful? Star the repo and share it with fellow AI enthusiasts!


Made with ❤️ by Himanshu — Happy Learning! 🚀
