🚀 A Powerful Retrieval-Augmented Generation (RAG) API using LLama3, LangChain, Ollama, and ChromaDB
Doc_Query_Genie is a cutting-edge RAG-based Python solution that enhances information retrieval and generation, served through a Flask API. Powered by LLama3, LangChain, Ollama, and ChromaDB, this tool provides a seamless experience for querying both general knowledge and custom document uploads.
✅ AI-Powered Chat – Use it like OpenAI's ChatGPT to ask any question.
✅ PDF Intelligence – Upload a PDF and ask context-specific questions.
✅ Source Referencing – Get precise answers with citations from the document (paragraph/line references).
✅ Fast & Efficient – Optimized for quick and reliable response generation.
✅ Easy Integration – Simple API setup to integrate with other applications.
This project brings the best of AI-driven retrieval and context-aware generation, making it a versatile tool for researchers, students, and professionals.
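The retrieval-then-generation flow behind these features can be sketched in plain Python. This is an illustrative sketch, not the project's actual code: the naive keyword-overlap scoring stands in for ChromaDB's vector search, the `ref` fields mimic the paragraph/line citations described above, and `ollama_generate` assumes a local Ollama server on its default port exposing the standard `/api/generate` endpoint.

```python
import json
import urllib.request

def retrieve(question, chunks, k=2):
    """Naive keyword-overlap retrieval; a stand-in for ChromaDB vector search."""
    q_words = set(question.lower().split())
    scored = sorted(
        chunks,
        key=lambda c: len(q_words & set(c["text"].lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question, hits):
    """Assemble a context-grounded prompt, tagging each chunk with its source ref."""
    context = "\n".join(f"[{h['ref']}] {h['text']}" for h in hits)
    return (
        "Answer using only this context, citing the [ref] of each source used:\n"
        f"{context}\n\nQuestion: {question}"
    )

def ollama_generate(prompt, model="llama3",
                    url="http://localhost:11434/api/generate"):
    """Send the prompt to a locally running Ollama server (assumed default port)."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Toy document chunks with paragraph/line references for citations.
    chunks = [
        {"ref": "p1/l3", "text": "RAG combines retrieval with generation."},
        {"ref": "p2/l7", "text": "Flask exposes the API endpoints."},
    ]
    hits = retrieve("What is RAG retrieval", chunks)
    print(ollama_generate(build_prompt("What is RAG retrieval", hits)))
```

In the real pipeline, ChromaDB replaces `retrieve` with embedding-based similarity search and LangChain orchestrates the prompt assembly, but the overall shape — retrieve, ground, generate — is the same.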
- Clone the repository
  ```bash
  git clone https://github.com/yourusername/Doc_Query_Genie.git
  cd Doc_Query_Genie
  ```
- Install dependencies
  ```bash
  pip install -r requirements.txt
  ```
- Run the application
  ```bash
  python app.py
  ```
- Use the API
  - Access it at http://127.0.0.1:5000/
  - Upload PDFs and start querying
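Once the server is running, it can be queried from any HTTP client. The route name and JSON shape below are illustrative assumptions rather than the project's documented API; check `app.py` for the actual endpoints.

```python
import json
import urllib.request

API = "http://127.0.0.1:5000"

def make_payload(question):
    """Encode a question as a JSON request body (field name is an assumption)."""
    return json.dumps({"question": question}).encode("utf-8")

def ask(question, path="/query"):
    """POST a question to the Flask API; '/query' is a hypothetical route."""
    req = urllib.request.Request(
        API + path,
        data=make_payload(question),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    print(ask("Summarize the uploaded PDF"))
```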
We welcome contributions! Feel free to submit issues or pull requests.
🌟 If you find this project helpful, please consider giving it a star!