A conversational AI assistant built using LangGraph, LangChain, and the blazing-fast Groq API. This project uses LLaMA3 (70B) to create a responsive, memory-enabled chatbot that runs in a loop and gracefully exits on command.
- Stateful, node-based conversation flow using LangGraph
- Supports memory with `ConversationBufferWindowMemory`
- Runs LLaMA 3 via the Groq API
- Graceful exit on `Ctrl+C` or by typing `exit` or `quit`
- Clean, modular Python code with type annotations
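The loop, memory window, and graceful-exit behavior described above can be sketched roughly as follows. This is an illustrative outline only: the model call is stubbed out, and all names (`ChatState`, `chat_turn`, the window size of 10) are assumptions, not the project's actual code.

```python
# Minimal sketch of the chat loop's control flow. A stub stands in for the
# real LLaMA 3 / Groq call; names and the window size are illustrative.
from typing import Optional, TypedDict


class ChatState(TypedDict):
    history: list  # rolling window of past turns (the "memory")


def stub_llm(prompt: str, state: ChatState) -> str:
    # Placeholder for the Groq-backed LLaMA 3 invocation.
    return f"echo: {prompt}"


def chat_turn(state: ChatState, user_input: str) -> Optional[str]:
    # Typing "exit" or "quit" ends the conversation gracefully.
    if user_input.strip().lower() in {"exit", "quit"}:
        return None
    reply = stub_llm(user_input, state)
    state["history"].extend([user_input, reply])
    # Keep only the most recent messages, mimicking a buffer-window memory.
    state["history"] = state["history"][-10:]
    return reply
```

In the real project, the turn function would be a LangGraph node and the state dict the graph's shared state; the sketch only shows the shape of the loop.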
- Set your Groq API key: `os.environ["GROQ_API_KEY"] = "your-groq-api-key-here"`
- Pick the model you want to use: `model_name="your-desired-model-name"`
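As a rough sketch of where these two settings plug in (assuming the `langchain-groq` package is installed; the model name below is only an example):

```python
import os

# Prefer exporting the key in your shell; hardcoding is for local testing only.
os.environ["GROQ_API_KEY"] = "your-groq-api-key-here"

from langchain_groq import ChatGroq

# model_name is an example value; substitute your desired Groq-hosted model.
llm = ChatGroq(model_name="llama3-70b-8192")
```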
Contributions are welcome! Feel free to fork, create issues, or submit pull requests with improvements.
Clone the repo:

```bash
git clone https://github.com/yourusername/llama3-chatbot-graph.git
cd llama3-chatbot-graph
```