A powerful local AI agent using DeepSeek-R1:1.5B via Ollama and the Vercel AI SDK
- 🧠 Advanced Reasoning: Powered by the DeepSeek-R1:1.5B reasoning model
- 🛠️ Tool Calling: Calculator, file operations, web search simulation, code generation
- 💬 Interactive Chat: Streaming responses with conversation history
- 🎨 Beautiful CLI: Rich formatting with colors, gradients, and animations
- 📝 Memory: Persistent conversation history within sessions
- 🚀 Real-time: Live streaming responses as the AI thinks
- 🎯 Local: Runs completely offline after initial setup
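Under the hood, the agent streams completions from the locally served model through the Vercel AI SDK. Here is a minimal sketch of that loop, assuming the community ollama-ai-provider package used with `ai-agent.js`; exact import paths and call shapes can vary slightly between AI SDK versions:

```javascript
import { streamText } from 'ai';
import { ollama } from 'ollama-ai-provider';

// Stream a completion from the locally served DeepSeek-R1 model
// and print tokens as they arrive.
const result = await streamText({
  model: ollama('deepseek-r1:1.5b'),
  prompt: 'Explain binary search in two sentences.',
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```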
- Node.js 18+: download from https://nodejs.org
- Ollama: install locally:

  ```bash
  # macOS/Linux
  curl -fsSL https://ollama.ai/install.sh | sh

  # Windows: download from https://ollama.ai/download
  ```

- Clone and set up the project:

  ```bash
  git clone <your-repo-url>
  cd deepseek-ai-agent
  npm install
  ```

- Install the DeepSeek-R1:1.5B model:

  ```bash
  # Pull the model (this will download ~1.1GB)
  ollama pull deepseek-r1:1.5b

  # Or use the npm script
  npm run install-model
  ```

- Start the Ollama service (if not already running):

  ```bash
  ollama serve
  ```

- Run the AI agent:

  ```bash
  npm start
  # or
  node ai-agent.js
  ```
Once started, you'll see a beautiful banner and command prompt. Here are the available commands:
| Command | Description | Example |
|---|---|---|
| `help` | Show all available commands | `help` |
| `chat` | Start interactive chat mode | `chat` |
| `calc <expression>` | Quick mathematical calculation | `calc 15 * 42 + 8` |
| `files` | List files in current directory | `files` |
| `think <question>` | Deep reasoning mode | `think Why is the sky blue?` |
| `code <lang> <task>` | Generate code | `code python fibonacci sequence` |
| `history` | Show conversation history | `history` |
| `clear` | Clear conversation history | `clear` |
| `exit` | Exit the agent | `exit` |
In chat mode, the AI can:
- Answer questions with advanced reasoning
- Use tools automatically (calculator, file operations, etc.)
- Generate code in any programming language
- Maintain conversation context
- Stream responses in real-time
Example chat session:
🧑 You: Can you help me calculate the area of a circle with radius 5 and then write Python code to do this calculation?

🤖 AI: I'll help you calculate the area of a circle with radius 5 and then provide Python code for this calculation.

First, let me calculate the area using the formula A = πr²:

🔧 Tool Results:
Tool: calculator
Result: {
  "result": "78.53975",
  "expression": "3.14159 * 5 * 5"
}

The area of a circle with radius 5 is approximately 78.54 square units.

Now, here's Python code to calculate the area of a circle:
```python
import math

def circle_area(radius):
    """Calculate the area of a circle given its radius."""
    return math.pi * radius ** 2

# Example usage
radius = 5
area = circle_area(radius)
print(f"The area of a circle with radius {radius} is {area:.2f} square units")
```
The AI agent has access to several built-in tools:
Calculator: perform mathematical calculations safely:
```
> calc 15 * 42 + sqrt(16)
✅ 15 * 42 + sqrt(16) = 634
```
File operations: read, write, and list files:
```
> files
📁 Files in .:
  • ai-agent.js
  • package.json
  • README.md
```
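For reference, a file-listing tool of this kind can be as small as the sketch below. It assumes Node's built-in `fs/promises` API and the zod parameter style shown later in this README; the names and the actual implementation in `ai-agent.js` may differ.

```javascript
import { readdir } from 'node:fs/promises';
import { z } from 'zod';

// Hypothetical shape of a file-listing tool; names are illustrative.
const listFilesTool = {
  description: 'List files in a directory (defaults to the current directory)',
  parameters: z.object({
    dir: z.string().default('.').describe('Directory to list'),
  }),
  execute: async ({ dir }) => {
    const entries = await readdir(dir, { withFileTypes: true });
    return { files: entries.filter((e) => e.isFile()).map((e) => e.name) };
  },
};
```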
Web search (simulated): returns simulated search results; a sketch of a real implementation follows the example below:
```javascript
// In chat mode, ask: "Search for information about Node.js"
// The AI will use the webSearch tool automatically
```
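If you want to replace the simulation with real search, one possible approach is an HTTP call inside the tool's execute function. The endpoint, environment variable, and response shape below are hypothetical placeholders for whichever search provider you choose:

```javascript
import { z } from 'zod';

const webSearchTool = {
  description: 'Search the web and return the top results',
  parameters: z.object({
    query: z.string().describe('Search query'),
  }),
  execute: async ({ query }) => {
    // Hypothetical endpoint and API key; replace with your provider's real API.
    const url = `https://example-search-api.invalid/search?q=${encodeURIComponent(query)}`;
    const res = await fetch(url, {
      headers: { Authorization: `Bearer ${process.env.SEARCH_API_KEY}` },
    });
    if (!res.ok) throw new Error(`Search failed with status ${res.status}`);
    const data = await res.json();
    return { results: (data.results ?? []).slice(0, 5) };
  },
};
```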
Code generation: generate code in any programming language:
```
> code javascript "function to reverse a string"
💻 Generated JavaScript Code:

function reverseString(str) {
  return str.split('').reverse().join('');
}
```
DeepSeek-R1 features advanced reasoning capabilities. Use the `think` command for complex problems:
```
> think How would you design a scalable microservices architecture for an e-commerce platform?

🧠 Deep Reasoning Result:

[Detailed reasoning about microservices architecture, including:
 - Service decomposition strategies
 - Data management patterns
 - Communication protocols
 - Scaling considerations
 - Security implications]
```
You can modify the model settings in `ai-agent.js`:

```javascript
const model = ollama('deepseek-r1:1.5b', {
  temperature: 0.7,         // Creativity level (0.0-1.0)
  simulateStreaming: true,  // Enable streaming responses
  // Add more Ollama-specific options
});
```
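Switching to another locally pulled model is a one-line change. For example, to try the larger distilled variant (assuming you have run `ollama pull deepseek-r1:7b` first):

```javascript
const model = ollama('deepseek-r1:7b', {
  temperature: 0.3, // lower temperature for more deterministic answers
});
```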
Add your own tools by extending the `tools` object:

```javascript
const tools = {
  // ... existing tools
  customTool: {
    description: 'Your custom tool description',
    parameters: z.object({
      param: z.string().describe('Parameter description'),
    }),
    execute: async ({ param }) => {
      // Your tool logic here
      return { result: 'Tool result' };
    },
  },
};
```
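As a concrete (and purely illustrative) example, a small currentTime tool could follow that template like this; the tool name and fields are hypothetical:

```javascript
import { z } from 'zod';

const tools = {
  // ... existing tools
  currentTime: {
    description: 'Get the current date and time on this machine',
    parameters: z.object({
      locale: z.string().default('en-US').describe('BCP 47 locale for formatting'),
    }),
    execute: async ({ locale }) => {
      const now = new Date();
      return { iso: now.toISOString(), formatted: now.toLocaleString(locale) };
    },
  },
};
```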
The agent supports rich customization:
- Colors: Modify gradients and chalk colors
- Styling: Change boxen styles and borders
- Commands: Add new commands to the `commands` object (see the sketch after this list)
- Tools: Extend functionality with custom tools
- Model: Switch to different Ollama models
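For the Commands item above, here is a sketch of what registering a new command might look like. The shape of the `commands` object and the handler signature are assumptions, so adapt it to the actual structure in `ai-agent.js`:

```javascript
import { generateText } from 'ai';

// Hypothetical command registration; `commands` and `model` are assumed to be
// the objects defined in ai-agent.js.
commands.joke = {
  description: 'Tell a short programming joke',
  handler: async () => {
    const { text } = await generateText({
      model,
      prompt: 'Tell one short, clean programming joke.',
    });
    console.log(text);
  },
};
```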
DeepSeek-R1:1.5B at a glance:

| Attribute | Value |
|---|---|
| Model Size | 1.5B parameters |
| Download Size | ~1.1GB |
| RAM Usage | ~2-4GB during inference |
| Architecture | Qwen2-based reasoning model |
| Context Length | 131,072 tokens |

Suggested system requirements:

| Component | Minimum | Recommended |
|---|---|---|
| RAM | 8GB | 16GB+ |
| Storage | 2GB free | 5GB+ free |
| CPU | 4 cores | 8 cores+ |
| GPU | None (CPU only) | NVIDIA GPU for faster inference |
- Model not found:

  ```bash
  Error: Model deepseek-r1:1.5b not found

  # Solution: Pull the model
  ollama pull deepseek-r1:1.5b
  ```

- Ollama not running:

  ```bash
  Error: Connection refused

  # Solution: Start Ollama service
  ollama serve
  ```

- Permission errors:

  ```bash
  # On macOS/Linux, you might need:
  chmod +x ai-agent.js
  ```

- Out of memory:
  - Close other applications
  - Use a smaller model variant
  - Add more RAM to your system
Enable debug logging by setting the environment variable:

```bash
DEBUG=* node ai-agent.js
```
- Local execution: All data stays on your machine
- Safe evaluation: Calculator uses the Function constructor with restrictions (see the sketch after this list)
- File access: Limited to current directory by default
- No external calls: Web search is simulated (safe for local use)
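To illustrate the kind of restriction described above, here is one possible sketch of a whitelisted evaluator built on the Function constructor. The actual checks in `ai-agent.js` may differ:

```javascript
// Evaluate a math expression with only arithmetic tokens and Math members allowed.
function safeCalculate(expression) {
  // Whitelist digits, whitespace, arithmetic operators, and common Math names.
  const allowed = /^(\d|\s|[+\-*\/%().,]|sqrt|abs|pow|min|max|floor|ceil|round|sin|cos|tan|log|PI|E)+$/;
  if (!allowed.test(expression)) {
    throw new Error('Expression contains disallowed tokens');
  }

  // Expose Math's members as the only names in scope, nothing else.
  const names = Object.getOwnPropertyNames(Math);
  const evaluate = new Function(...names, `"use strict"; return (${expression});`);
  return evaluate(...names.map((name) => Math[name]));
}

// Example: matches the calculator output shown earlier in this README.
console.log(safeCalculate('15 * 42 + sqrt(16)')); // 634
```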
- Plugin system for custom tools
- Multiple model support (switch between models)
- Conversation export (JSON, Markdown)
- Real web search integration
- Voice input/output support
- GUI version using Electron
- Docker containerization
- Multi-language support
Contributions are welcome! Please feel free to submit a Pull Request. Areas where help is needed:
- Additional tool implementations
- UI/UX improvements
- Performance optimizations
- Documentation enhancements
- Bug fixes and testing
This project is licensed under the MIT License - see the LICENSE file for details.
- DeepSeek for the amazing R1 reasoning model
- Ollama for local model serving
- Vercel for the excellent AI SDK
- Community for tool inspirations and feedback
If you encounter any issues or have questions:
- Check the troubleshooting section
- Search existing issues
- Create a new issue with detailed information
Made with ❤️ and 🤖 by the AI community

⭐ Star this repo if you found it helpful!