
A powerful AI chatbot that leverages Groq's Llama 3.1 8B Instant API and ChromaDB to provide intelligent, context-aware conversations grounded in your custom documents, making it well suited for website assistants.


Company Specialized Chatbot



Overview

This is a sophisticated, personalized chatbot application powered by a FastAPI backend and a modern, responsive frontend. It features Retrieval-Augmented Generation (RAG) for knowledge-based responses and ultra-low latency inference via Groq LPU and Llama 3.


Features

  • Modern UI: Clean, mobile-friendly, responsive chat interface built with vanilla HTML, CSS, and JavaScript.
  • RAG-based Intelligence: Combines local document retrieval with LLM response generation.
  • FastAPI Backend: Async Python backend for low-latency interactions.
  • High Performance: Powered by Groq's LPU and Meta's Llama 3.
  • Modular Design: Easy to extend and maintain with separate layers for parsing, prompting, and vector DB.
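
The RAG flow described above can be sketched as follows. This is an illustrative outline, not the repository's actual code: the collection name "documents" and the storage path are assumptions, while the chromadb and groq calls use those libraries' real client APIs. Prompt assembly is split into a pure helper.

```python
# Minimal RAG sketch: retrieve similar chunks from ChromaDB, then ask the
# Groq-hosted Llama 3.1 8B Instant model to answer from that context.
import os


def build_prompt(context_chunks: list[str], query: str) -> list[dict]:
    """Assemble a chat payload that grounds the model in retrieved text."""
    context = "\n\n".join(context_chunks)
    return [
        {"role": "system",
         "content": "Answer only from the context below.\n\n" + context},
        {"role": "user", "content": query},
    ]


def answer(query: str) -> str:
    import chromadb
    from groq import Groq

    # 1. Retrieve the chunks most similar to the query from the persistent
    #    vector store (path and collection name are illustrative).
    store = chromadb.PersistentClient(path="backend/data")
    hits = store.get_or_create_collection("documents").query(
        query_texts=[query], n_results=3
    )

    # 2. Generate a grounded answer with Groq's low-latency inference.
    client = Groq(api_key=os.environ["GROQ_API_KEY"])
    chat = client.chat.completions.create(
        model="llama-3.1-8b-instant",
        messages=build_prompt(hits["documents"][0], query),
    )
    return chat.choices[0].message.content
```

Keeping retrieval and generation behind one function makes it easy to swap the vector store or model without touching the FastAPI routing layer.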

Tech Stack

Layer      Technologies
Frontend   HTML, CSS, JavaScript
Backend    Python, FastAPI, Uvicorn, Pydantic, Groq (Llama 3), PyMuPDF, ChromaDB
Tools      uv (Python package management), make, Streamlit

Project Structure

personalized-chatbot/
├── backend/
│   ├── src/                # FastAPI app, routing and orchestration logic
│   ├── database/           # Vector store (ChromaDB) integration
│   ├── parser/             # Document parsing logic
│   ├── prompts/            # Prompt templates
│   ├── data/               # ChromaDB persistent storage
│   ├── .env.example        # Environment config sample
│   ├── load_documents.py   # Load files to vector store
│   ├── pyproject.toml      # Backend deps (uv format)
│   ├── uv.lock             # Lockfile
│   └── requirements.txt    # Backend deps (pip format)
├── frontend/              
│   ├── index.html          # Main UI
│   ├── chatbot.js          # Core chat logic
│   ├── script.js           # Additional behavior
│   ├── chatbot.css         # Chat styles
│   └── styles.css          # General styles
├── .gitignore             
├── .python-version        
├── Makefile               
├── .env                   # API keys and environment values
└── README.md              

Getting Started

Prerequisites

  • uv (Astral) - Fast Python package manager
  • make utility (available by default on Linux/macOS)

To install uv:

Linux/macOS

curl -Ls https://astral.sh/uv/install.sh | sh

Windows

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

Setup Instructions

Step                  With make           Without make
Clone repository      git clone https://github.com/devrahulbanjara/personalized-chatbot && cd personalized-chatbot (same for both)
Install dependencies  make install        cd backend && uv sync
Load documents        make run-loaddoc    cd backend && source .venv/bin/activate && streamlit run load_documents.py
Run backend server    make run-backend    cd backend && source .venv/bin/activate && python -m src.main
Open frontend         Open frontend/index.html in a browser (same for both)

Note: Loading documents to the knowledge base is a mandatory step before using the chatbot.


Sample API Query with CURL

Once the backend is running:

curl -X 'POST' \
  'http://localhost:8000/api/chat' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{
  "query": "Where does KP Oli live?",
  "session_id": "some_random_string"
}'
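
The same request can be made from Python using only the standard library. The request body matches the curl example above; the shape of the JSON response is not documented here, so inspect what src.main actually returns before relying on specific keys.

```python
# Call the /api/chat endpoint with the standard library (no extra deps).
import json
import urllib.request


def build_payload(query: str, session_id: str) -> bytes:
    """JSON body matching the /api/chat request schema."""
    return json.dumps({"query": query, "session_id": session_id}).encode()


def chat(query: str, session_id: str,
         base: str = "http://localhost:8000") -> dict:
    req = urllib.request.Request(
        f"{base}/api/chat",
        data=build_payload(query, session_id),
        headers={"Content-Type": "application/json",
                 "accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```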

Makefile Commands

Command           Description
make install      Install all backend dependencies using uv
make run-backend  Start the FastAPI backend
make run-loaddoc  Load documents (starts the backend and the Streamlit loader)
make clean        Remove the virtual environment and reset state
make help         Show all make commands

Contributing

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/my-feature
  3. Commit your changes: git commit -m "Add my feature"
  4. Push to the branch: git push origin feature/my-feature
  5. Open a pull request

License

This project is licensed under the GNU General Public License (GPL).
