
---
title: Chat with DuckDuckGo Agent
emoji: 🦆
colorFrom: yellow
colorTo: red
sdk: gradio
sdk_version: 4.36.1
app_file: app.py
pinned: true
license: apache-2.0
header: mini
fullWidth: true
short_description: Chat llama-cpp-agent that can search the web.
models:
  - mistralai/Mistral-7B-Instruct-v0.3
  - meta-llama/Meta-Llama-3-8B-Instruct
---

Chat with DuckDuckGo Agent 🦆


A Gradio-based web interface that allows users to chat with an AI agent capable of searching the web using DuckDuckGo. The agent uses llama.cpp to run various open-source language models locally.

Features

  • Web search capabilities using DuckDuckGo
  • Support for multiple LLM models:
    • Mistral 7B Instruct v0.3
    • Mixtral 8x7B Instruct v0.1
    • Meta Llama 3 8B Instruct
  • Real-time chat interface
  • Source citation for responses
  • Customizable model parameters
  • Dark mode support

Technical Details

  • Framework: Gradio 4.36.1
  • Models: Uses GGUF quantized models
  • Agent: Powered by llama-cpp-agent
  • Web Scraping: Uses trafilatura for content extraction
  • Context Window:
    • 32k tokens for Mistral and Mixtral
    • 8k tokens for Meta Llama 3
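The per-model context windows above could be represented with a small lookup table; this is an illustrative sketch (the mapping names and default are assumptions, not the project's actual code):

```python
# Context windows per model, as listed above (illustrative names).
MODEL_CONTEXT_LIMITS = {
    "Mistral-7B-Instruct-v0.3-Q6_K.gguf": 32768,      # 32k tokens
    "mixtral-8x7b-instruct-v0.1.Q5_K_M.gguf": 32768,  # 32k tokens
    "Meta-Llama-3-8B-Instruct-Q6_K.gguf": 8192,       # 8k tokens
}

def context_limit(model_file):
    """Return the context window for a model, defaulting conservatively to 8k."""
    return MODEL_CONTEXT_LIMITS.get(model_file, 8192)
```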

Usage

The chat interface provides several customizable parameters:

  • Model selection
  • System message
  • Max tokens (1-4096)
  • Temperature (0.1-1.0)
  • Top-p (0.1-1.0)
  • Top-k (0-100)
  • Repetition penalty (0.0-2.0)
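The ranges above can be enforced with a small validation helper; a minimal sketch (the actual app likely enforces these via Gradio slider bounds instead):

```python
# Allowed (min, max) ranges for each generation parameter, per the UI above.
PARAM_RANGES = {
    "max_tokens": (1, 4096),
    "temperature": (0.1, 1.0),
    "top_p": (0.1, 1.0),
    "top_k": (0, 100),
    "repetition_penalty": (0.0, 2.0),
}

def clamp_params(params):
    """Clamp each known generation parameter into its allowed range."""
    out = dict(params)
    for name, (lo, hi) in PARAM_RANGES.items():
        if name in out:
            out[name] = min(max(out[name], lo), hi)
    return out
```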

Example Queries

- "latest news about Yann LeCun"
- "Latest news site:github.blog"
- "Where can I find the best hotel in Galapagos, Ecuador intitle:hotel"
- "filetype:pdf intitle:python"
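Queries like the ones above combine free text with DuckDuckGo search operators (`site:`, `intitle:`, `filetype:`). They could be composed programmatically with a hypothetical helper such as:

```python
def build_query(terms, site=None, intitle=None, filetype=None):
    """Compose a DuckDuckGo query string with optional search operators."""
    parts = [terms] if terms else []
    if site:
        parts.append(f"site:{site}")
    if intitle:
        parts.append(f"intitle:{intitle}")
    if filetype:
        parts.append(f"filetype:{filetype}")
    return " ".join(parts)
```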

Installation

1. Clone the repository:

   ```shell
   git clone https://github.com/pabl-o-ce/ddg-web-search-chat
   cd ddg-web-search-chat
   ```

2. Install dependencies:

   ```shell
   pip install -r requirements.txt
   ```

3. Download the required model files. The application will automatically download the following models on first run:

   - Mistral-7B-Instruct-v0.3-Q6_K.gguf
   - Meta-Llama-3-8B-Instruct-Q6_K.gguf
   - mixtral-8x7b-instruct-v0.1.Q5_K_M.gguf

4. Run the application:

   ```shell
   python app.py
   ```

Configuration

The application can be configured through various parameters in `settings.py`:

  • Model context limits
  • Message formatter types for different models
  • System prompts and templates
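A rough sketch of what such a settings layout might look like; all names and values here are assumptions for illustration, not the project's actual `settings.py` (formatter values are plain strings standing in for llama-cpp-agent's formatter types):

```python
# Hypothetical system prompt template (illustrative, not the project's actual prompt).
SYSTEM_PROMPT_TEMPLATE = (
    "You are a helpful assistant with access to DuckDuckGo web search. "
    "Cite the sources you used in your answer."
)

# Message formatter type per model family; each model family needs a
# matching chat template when formatting messages.
MODEL_FORMATTERS = {
    "mistral": "MISTRAL",
    "mixtral": "MISTRAL",
    "llama-3": "LLAMA_3",
}

def formatter_for(model_file):
    """Pick a formatter by substring match on the model filename."""
    name = model_file.lower()
    for key, fmt in MODEL_FORMATTERS.items():
        if key in name:
            return fmt
    return "CHATML"  # generic fallback
```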

UI Customization

The interface uses a custom theme with:

  • Orange primary color
  • Amber secondary color
  • Dark mode support
  • Custom CSS for message bubbles and layout

License

Apache 2.0

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.


Note: This project is powered by llama-cpp-agent and uses Hugging Face Spaces for deployment.