---
title: Chat with DuckDuckGo Agent
emoji: 🦆
colorFrom: yellow
colorTo: red
sdk: gradio
sdk_version: 4.36.1
app_file: app.py
pinned: true
license: apache-2.0
header: mini
fullWidth: true
short_description: Chat llama-cpp-agent that can search the web.
---
# Chat with DuckDuckGo Agent

A Gradio-based web interface for chatting with an AI agent that can search the web using DuckDuckGo. The agent uses llama.cpp to run various open-source language models locally.

## Features
- Web search capabilities using DuckDuckGo
- Support for multiple LLM models:
  - Mistral 7B Instruct v0.3
  - Mixtral 8x7B Instruct v0.1
  - Meta Llama 3 8B Instruct
- Real-time chat interface
- Source citation for responses
- Customizable model parameters
- Dark mode support
## Technical Details

- Framework: Gradio 4.36.1
- Models: Uses GGUF quantized models
- Agent: Powered by llama-cpp-agent
- Web Scraping: Uses trafilatura for content extraction
- Context Window:
  - 32k tokens for Mistral and Mixtral
  - 8k tokens for Meta Llama 3
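The context limits above matter because scraped page content must fit into the prompt alongside the system message and chat history. A minimal sketch of such budgeting, assuming a crude ~4-characters-per-token estimate (the function name and heuristic are illustrative, not the app's actual logic, which would count tokens with the model's tokenizer):

```python
# Rough sketch (not the repository's code) of trimming scraped web
# content to fit a model's context window. Uses a crude estimate of
# ~4 characters per token.
def truncate_to_budget(text: str, context_tokens: int,
                       reserved_tokens: int = 1024) -> str:
    """Keep roughly (context_tokens - reserved_tokens) tokens of text,
    reserving room for the system prompt and the model's answer."""
    budget_chars = max(context_tokens - reserved_tokens, 0) * 4
    return text[:budget_chars]
```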
## Parameters

The chat interface provides several customizable parameters:
- Model selection
- System message
- Max tokens (1-4096)
- Temperature (0.1-1.0)
- Top-p (0.1-1.0)
- Top-k (0-100)
- Repetition penalty (0.0-2.0)
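A hypothetical validation helper sketching how user-supplied values could be kept within the slider ranges listed above (the `PARAM_RANGES` dict and `clamp_params` function are illustrative names, not taken from the repository):

```python
# Slider ranges from the parameter list above.
PARAM_RANGES = {
    "max_tokens": (1, 4096),
    "temperature": (0.1, 1.0),
    "top_p": (0.1, 1.0),
    "top_k": (0, 100),
    "repetition_penalty": (0.0, 2.0),
}

def clamp_params(params: dict) -> dict:
    """Clamp each known sampling parameter into its allowed range;
    unknown parameters pass through unchanged."""
    clamped = {}
    for name, value in params.items():
        lo, hi = PARAM_RANGES.get(name, (value, value))
        clamped[name] = min(max(value, lo), hi)
    return clamped
```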
## Example Queries

- "latest news about Yann LeCun"
- "Latest news site:github.blog"
- "Where can I find the best hotel in Galapagos, Ecuador intitle:hotel"
- "filetype:pdf intitle:python"
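The queries above mix free text with DuckDuckGo search operators (`site:`, `intitle:`, `filetype:`). An illustrative helper (not from the repository) showing how such query strings compose:

```python
from typing import Optional

# Illustrative builder for DuckDuckGo query strings using the search
# operators shown in the examples above.
def build_query(text: str = "", site: Optional[str] = None,
                intitle: Optional[str] = None,
                filetype: Optional[str] = None) -> str:
    parts = [text] if text else []
    if site:
        parts.append(f"site:{site}")
    if intitle:
        parts.append(f"intitle:{intitle}")
    if filetype:
        parts.append(f"filetype:{filetype}")
    return " ".join(parts)
```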
## Installation

1. Clone the repository:

   ```shell
   git clone https://github.com/pabl-o-ce/ddg-web-search-chat
   cd ddg-web-search-chat
   ```

2. Install dependencies:

   ```shell
   pip install -r requirements.txt
   ```

3. Download the required model files. The application automatically downloads the following models on first run:

   - Mistral-7B-Instruct-v0.3-Q6_K.gguf
   - Meta-Llama-3-8B-Instruct-Q6_K.gguf
   - mixtral-8x7b-instruct-v0.1.Q5_K_M.gguf

4. Run the application:

   ```shell
   python app.py
   ```
## Configuration

The application can be configured through various parameters in `settings.py`:

- Model context limits
- Message formatter types for different models
- System prompts and templates
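A sketch of the kind of entries `settings.py` might contain; all names and values here are assumptions for illustration, not the repository's actual code:

```python
# Illustrative per-model settings: context limit and llama-cpp-agent
# message formatter type, referenced by name (assumed structure).
MODEL_SETTINGS = {
    "Mistral-7B-Instruct-v0.3-Q6_K.gguf": {
        "context_limit": 32768,
        "formatter": "MISTRAL",
    },
    "mixtral-8x7b-instruct-v0.1.Q5_K_M.gguf": {
        "context_limit": 32768,
        "formatter": "MIXTRAL",
    },
    "Meta-Llama-3-8B-Instruct-Q6_K.gguf": {
        "context_limit": 8192,
        "formatter": "LLAMA_3",
    },
}

# A generic system prompt template (illustrative wording).
SYSTEM_PROMPT = (
    "You are a helpful assistant. Answer using the provided DuckDuckGo "
    "search results and cite your sources."
)
```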
## Theming

The interface uses a custom theme with:
- Orange primary color
- Amber secondary color
- Dark mode support
- Custom CSS for message bubbles and layout
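A hedged sketch of a Gradio theme along these lines, assuming the built-in `orange` and `amber` hue names; the actual theme object, CSS, and selectors live in the repository:

```python
import gradio as gr

# Illustrative theme matching the colors described above.
theme = gr.themes.Base(
    primary_hue="orange",
    secondary_hue="amber",
)

# Hypothetical custom CSS for message bubbles (selector is illustrative).
css = """
.message { border-radius: 12px; padding: 8px 12px; }
"""

# The theme and CSS would then be passed to the interface, e.g.:
# demo = gr.ChatInterface(fn=respond, theme=theme, css=css)
```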
## License

Apache 2.0
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
Note: This project is powered by llama-cpp-agent and uses Hugging Face Spaces for deployment.