# PDF-RAG-AI: AI-Powered Document Question Answering

A modern web application that enables intelligent document analysis and question answering for PDF documents using RAG (Retrieval-Augmented Generation) technology. End-to-end encrypted, so no one can intercept your documents (not even the maintainer).
## Features

- PDF Upload & Processing: Upload PDFs for AI analysis and indexing
- Intelligent Question Answering: Ask natural language questions about your documents
- Context-Aware Responses: AI generates answers based specifically on your document content
- Secure Authentication: User authentication with Clerk
- Simple UI: Clean, responsive design with dark mode support
## Architecture

```
┌─────────────┐     ┌─────────────┐     ┌─────────────┐
│             │     │             │     │             │
│   Client    │────▶│   Server    │────▶│   Worker    │
│  (Next.js)  │     │  (Express)  │     │ (Background)│
│             │     │             │     │             │
└─────────────┘     └─────────────┘     └─────────────┘
       │                   │                   │
       │                   │                   │
       │                   ▼                   │
       │            ┌─────────────┐            │
       │            │             │            │
       └───────────▶│   Valkey    │◀───────────┘
                    │   (Redis)   │
                    │             │
                    └─────────────┘
                           │
                           │
                           ▼
                    ┌─────────────┐
                    │             │
                    │   Qdrant    │
                    │ (Vector DB) │
                    │             │
                    └─────────────┘
```
- Client: Next.js React application for the user interface
- Server: Express.js backend API handling PDF uploads and queries
- Worker: Background processing for PDF ingestion and vector embedding
- Valkey: Redis-compatible message broker for job queue
- Qdrant: Vector database for storing document embeddings
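The server-to-worker handoff above is a classic producer/consumer pattern over a job queue. The sketch below shows what the producer side might look like; the queue name `process-pdf`, the payload fields, and the `buildPdfJob` helper are illustrative assumptions, not the project's actual contract:

```typescript
// Hypothetical job payload the server could enqueue for the worker.
// Field names are assumptions for illustration only.
interface PdfJobPayload {
  userId: string;      // Clerk user id, so embeddings stay scoped per user
  filePath: string;    // where the uploaded file was stored (e.g. by Multer)
  fileName: string;    // original file name, useful when citing sources
  uploadedAt: string;  // ISO timestamp for ordering and debugging
}

// Pure helper that builds the payload. A BullMQ producer would then
// enqueue it against the Valkey instance, e.g.
// `await queue.add("process-pdf", payload)`.
function buildPdfJob(userId: string, filePath: string, fileName: string): PdfJobPayload {
  return { userId, filePath, fileName, uploadedAt: new Date().toISOString() };
}

const job = buildPdfJob("user_123", "/uploads/abc.pdf", "report.pdf");
console.log(job.fileName); // "report.pdf"
```

Keeping the payload a plain serializable object matters here: BullMQ stores jobs in Redis/Valkey as JSON, so anything non-serializable would be lost between the server and the worker.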
## Tech Stack

### Frontend

- Next.js 15.3 with App Router
- React 19
- TypeScript
- TailwindCSS 4
- ShadCN UI components
- Clerk Authentication
- React Toastify for notifications
- Embla Carousel for UI components
### Backend

- Express.js 5
- LangChain for LLM operations
- Mistral AI for embeddings and chat completions
- BullMQ for background processing
- Multer for file uploads
- PM2 for process management
### Infrastructure

- Qdrant vector database for semantic search
- Valkey (Redis-compatible) message broker
- Docker containerization
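As a rough picture of how these pieces wire together, a Compose file for this stack might look like the fragment below. Service names, images, and port mappings are assumptions for illustration; the repository's own `docker-compose.yml` is authoritative:

```yaml
# Hypothetical sketch only — see the repository's docker-compose.yml
services:
  valkey:
    image: valkey/valkey:latest
    ports:
      - "6379:6379"
  qdrant:
    image: qdrant/qdrant:latest
    ports:
      - "6333:6333"
  server:
    build: ./server
    ports:
      - "8000:8000"
    depends_on:
      - valkey
      - qdrant
  client:
    build: ./client
    ports:
      - "3000:3000"
    depends_on:
      - server
```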
## Getting Started

### Prerequisites

- Node.js 20+
- Docker and Docker Compose
### Setup

1. Clone the repository:

   ```bash
   git clone https://github.com/Sumit-Ks1/PDF-RAG-AI.git
   cd PDF-RAG-AI
   ```

2. Set up environment variables:

   - Create `server/.env` with your Mistral API key and connection details
   - Create `client/.env.local` with the backend URL and Clerk keys

3. Start the stack with Docker Compose:

   ```bash
   docker-compose up --build
   ```

4. Access the application:

   - Frontend: http://localhost:3000
   - Backend API: http://localhost:8000
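For orientation, the environment files might contain entries along these lines. The Clerk key names follow Clerk's standard Next.js convention, but `NEXT_PUBLIC_API_URL` and the server-side variable names are guesses; check the READMEs in `client/` and `server/` for the real keys:

```
# server/.env (variable names are illustrative, not the project's actual keys)
MISTRAL_API_KEY=your-mistral-api-key
QDRANT_URL=http://localhost:6333
REDIS_HOST=localhost
REDIS_PORT=6379

# client/.env.local
NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=pk_test_...
CLERK_SECRET_KEY=sk_test_...
NEXT_PUBLIC_API_URL=http://localhost:8000
```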
## Project Structure

```
├── client/              # Next.js frontend application
├── server/              # Express backend application
└── docker-compose.yml   # Docker Compose configuration
```
See individual README files in client/ and server/ directories for more details.
## How It Works

1. Authentication: Users sign in through Clerk
2. PDF Upload: Documents are uploaded to the server
3. Processing: The worker parses PDFs and creates vector embeddings
4. Storage: Embeddings are stored in the Qdrant vector database
5. Query: Users ask questions about their documents
6. Retrieval: The system retrieves the most relevant document sections
7. Response: The AI generates an answer grounded in the retrieved content
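The retrieval step can be illustrated with a tiny self-contained sketch: rank stored chunks by cosine similarity to the question's embedding and keep the top k. In the real application Qdrant performs this search server-side over Mistral embeddings; this toy version with two-dimensional vectors just shows the idea:

```typescript
// A stored document chunk: its text plus its embedding vector.
interface Chunk {
  text: string;
  embedding: number[];
}

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k chunks most similar to the query embedding.
function retrieve(query: number[], chunks: Chunk[], k: number): Chunk[] {
  return [...chunks]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, k);
}

const chunks: Chunk[] = [
  { text: "Invoices are due in 30 days.", embedding: [1, 0] },
  { text: "The contract renews yearly.", embedding: [0, 1] },
];
console.log(retrieve([0.9, 0.1], chunks, 1)[0].text);
// "Invoices are due in 30 days."
```

The retrieved chunks are then placed into the LLM prompt as context, which is what keeps the generated answer grounded in the uploaded document rather than the model's general knowledge.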
## License

This project is licensed under the ISC License.
## Acknowledgments

- Mistral AI for the LLM and embedding models
- LangChain for the RAG framework
- Qdrant for vector search capabilities
- Clerk for authentication