MedLLM

Publication: https://app.readytensor.ai/publications/medical-rag-bot-42AUw4scRGDl

MedLLM is a medical chatbot that answers questions based on the PDFs in the data folder. It provides a chat interface built with HTML, CSS, and Flask, uses Pinecone for vector storage, and calls the Groq API for LLM inference.

Quick Start

Step 1: Clone the Repository

Clone this repository to your local machine:

git clone https://github.com/shrey802/medLLM.git
cd medLLM

Step 2: Set Up the Conda Environment

conda create -n medibot python=3.10 -y
conda init
conda activate medibot

Step 3: Install Dependencies

pip install -r requirements.txt

Step 4: Set Up the Embedding Model

Download the all-MiniLM-L6-v2 embedding model from Hugging Face: https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2

Step 5: Set Up the Pinecone and Groq API Keys

Store your Pinecone and Groq credentials (for example, in a .env file in the project root):

GROQ_API_KEY = "xxxxxx"
PINECONE_API_KEY = "xxxxxx"
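A minimal way to read these keys at runtime (a standard-library-only sketch; the variable names match the keys above):

```python
import os

def load_api_keys():
    """Read the required API keys from environment variables."""
    keys = {}
    for name in ("GROQ_API_KEY", "PINECONE_API_KEY"):
        value = os.environ.get(name)
        if not value:
            raise RuntimeError(f"Missing environment variable: {name}")
        keys[name] = value
    return keys
```

Failing fast here gives a clear error instead of an opaque authentication failure deeper in the pipeline.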

Step 6: Run store_index.py to Build the Vector Index

python store_index.py
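store_index.py ingests the PDFs, splits them into overlapping chunks, embeds each chunk, and upserts the vectors to Pinecone. The chunking step can be sketched as follows (sizes are illustrative, not the repository's actual settings):

```python
def split_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into fixed-size chunks that overlap, so content cut
    at a chunk boundary still appears whole in the next chunk."""
    chunks = []
    start = 0
    while start < len(text):
        end = start + chunk_size
        chunks.append(text[start:end])
        if end >= len(text):
            break
        start = end - overlap
    return chunks
```

Overlap matters for retrieval quality: without it, an answer that straddles a boundary might never appear intact in any single chunk.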

Step 7: Run the Application

python app.py

Features

PDF Ingestion: Answer queries based on content from PDFs in the data folder.

Embedding & Retrieval: Utilizes the all-MiniLM-L6-v2 embedding model and Pinecone for vector storage.

Interactive UI: Chat interface built with HTML, CSS, and Flask.
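At query time, retrieval ranks the stored chunk vectors by similarity to the query embedding. Conceptually (a toy sketch with plain Python lists; Pinecone performs this search server-side):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query: list[float], docs: list[list[float]], k: int = 2) -> list[int]:
    """Indices of the k document vectors most similar to the query."""
    ranked = sorted(range(len(docs)),
                    key=lambda i: cosine_similarity(query, docs[i]),
                    reverse=True)
    return ranked[:k]
```

The retrieved chunks are then passed to the Groq-hosted LLM as context for answer generation.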
