Publication: https://app.readytensor.ai/publications/medical-rag-bot-42AUw4scRGDl
MedLLM is a medical chatbot that answers questions grounded in the PDFs placed in the `data` folder. It provides a user-friendly chat interface built with HTML, CSS, and Flask, uses Pinecone for vector storage, and calls the LLM through the Groq API.
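The retrieval step behind the bot can be sketched in a few lines. The snippet below is a minimal, dependency-free illustration of vector retrieval: in the real app the embeddings come from all-MiniLM-L6-v2 and are stored in a Pinecone index, while here a toy in-memory dict and hand-picked vectors stand in for both (`index`, `retrieve`, and the example vectors are hypothetical).

```python
import math

def cosine_similarity(a, b):
    # dot product divided by the product of the vector norms
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy "index": chunk text -> embedding vector. In the real app these
# vectors are produced by all-MiniLM-L6-v2 and live in Pinecone.
index = {
    "Aspirin is used to reduce fever.": [0.9, 0.1, 0.0],
    "Insulin regulates blood sugar.":   [0.1, 0.9, 0.2],
}

def retrieve(query_vec, k=1):
    # Rank stored chunks by cosine similarity to the query embedding
    # and return the text of the top-k matches.
    ranked = sorted(index.items(),
                    key=lambda kv: cosine_similarity(query_vec, kv[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]
```

The retrieved chunks are then passed to the LLM (via Groq) as context for answering the user's question.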
Clone this repository to your local machine and create the conda environment:

```bash
git clone https://github.com/yourusername/medLLM.git
cd medLLM
conda create -n medibot python=3.10 -y
conda init
conda activate medibot
```

Install the dependencies:

```bash
pip install -r requirements.txt
```

Embedding model -> https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2
Create a `.env` file in the project root with your API keys:

```
GROQ_API_KEY = "xxxxxx"
PINECONE_API_KEY = "xxxxxx"
```

Build the Pinecone index from the PDFs, then start the app:

```bash
python store_index.py
python app.py
```

Features:

- PDF Ingestion: answers queries based on content from PDFs in the `data` folder.
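Both keys must be present in the environment before indexing or serving. A minimal stdlib-only guard (the `require_key` helper is hypothetical, not part of the repo) could fail fast with a clear message:

```python
import os

def require_key(name):
    # Raise a clear error if a required API key is missing or empty.
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing environment variable: {name}")
    return value

# The app expects both keys, e.g.:
# groq_key = require_key("GROQ_API_KEY")
# pinecone_key = require_key("PINECONE_API_KEY")
```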
- Embedding & Retrieval: uses the all-MiniLM-L6-v2 embedding model and Pinecone for vector storage.
- Interactive UI: chat interface built with HTML, CSS, and Flask.
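The Flask side of the UI can be sketched as a single chat endpoint. This is a minimal sketch, not the repo's actual `app.py`: the route path, the `msg` form field, and the `answer_question` placeholder are all assumptions standing in for the real RAG chain.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def answer_question(question):
    # Placeholder for the real RAG chain (retrieve context from
    # Pinecone, then query the LLM via Groq). Hypothetical logic.
    return f"You asked: {question}"

@app.route("/get", methods=["POST"])
def chat():
    # "msg" as the form field name is an assumption, not confirmed
    # by the repository.
    question = request.form.get("msg", "")
    return jsonify({"answer": answer_question(question)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

The HTML/CSS front end would POST the user's message to this endpoint and render the JSON answer in the chat window.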