# DeepSeek MCP Server with Terminal Chat & Command Flow
A fully functional MCP (Model Context Protocol) server that provides:
- 🤖 Terminal Chat Interface - Direct conversation with DeepSeek 7B model
- ⚡ AI-Powered Command Flow - Execute system commands through natural language
- 🔒 Safe Command Execution - Protected command execution with confirmations
- 🖥️ Windows 11 Integration - Optimized for Windows with conda environment support
- 🔌 MCP Server - Compatible with Claude Desktop and other MCP clients
## 🚀 Quick Start

### Prerequisites

- Windows 11
- Conda environment named `cyber_llm` (already set up)
- DeepSeek 7B model file at `models/deepseek-llm-7b-chat-Q6_K.gguf`
### Launch Options

**Option 1: Using Batch File (Recommended)**

```bash
# Double-click deepseek.bat or run from command prompt
deepseek.bat
```

**Option 2: Direct Python Execution**

```bash
# Activate conda environment first
conda activate cyber_llm

# Choose your interface:
python terminal_chat.py   # Enhanced chat with command flow
python chat.py            # Basic terminal chat
python start_server.py    # MCP server for Claude Desktop
```
## 🎮 Features Overview

### 1. Enhanced Terminal Chat (`terminal_chat.py`)

The most powerful interface, combining chat and command execution:

- 💬 Natural Conversation: Chat directly with DeepSeek 7B
- ⚡ Command Flow: Ask for system operations in natural language
- 🔒 Safe Execution: Automatic safety checks for commands
- 📚 Smart Help: Context-aware assistance
- 📜 History Tracking: Conversation and command history

**Example Usage:**

```
🤖 You: list all Python files in the current directory
⚡ DeepSeek: Analyzing command request...
⚡ DeepSeek Command Analysis:
COMMAND: dir *.py /b
EXPLANATION: Lists all Python files in current directory
WARNINGS: Safe read-only operation

🎯 Suggested Command: dir *.py /b
🔒 Safety Level: ✅ Safe
⚡ Executing: dir *.py /b
```
### 2. Basic Terminal Chat (`chat.py`)
Simple chat interface for conversations only:
- Direct model interaction
- Conversation history
- Configuration controls
- Lightweight and fast
### 3. MCP Server (`start_server.py`)
Standard MCP server for integration with Claude Desktop:
- MCP protocol compliance
- Tool-based interactions
- Resource management
- Logging and monitoring
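For orientation, here is a minimal sketch of what exposing a chat tool over MCP can look like, assuming the official `mcp` Python SDK (FastMCP) and a hypothetical `generate_reply()` helper; the actual `mcp_interface.py` and `start_server.py` may be structured differently:

```python
# Hypothetical sketch using the official MCP Python SDK (FastMCP);
# the tool name and generate_reply() helper are illustrative only.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("deepseek-mcp-server")

def generate_reply(prompt: str, max_tokens: int, temperature: float) -> str:
    # Placeholder for the llama.cpp call the real server would make.
    raise NotImplementedError

@mcp.tool()
def chat(prompt: str, max_tokens: int = 512, temperature: float = 0.7) -> str:
    """Generate a reply from the local DeepSeek model."""
    return generate_reply(prompt, max_tokens, temperature)

if __name__ == "__main__":
    mcp.run()  # defaults to the stdio transport used by Claude Desktop
```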
## 🛠️ Configuration
### Environment Variables
Customize behavior through environment variables:
```bash
# Model Configuration
set MCP_MODEL_PATH=path\to\your\model.gguf
set MCP_CONTEXT_SIZE=4096
set MCP_GPU_LAYERS=35
set MCP_THREADS=8
# Generation Settings
set MCP_DEFAULT_MAX_TOKENS=512
set MCP_DEFAULT_TEMPERATURE=0.7
set MCP_DEFAULT_TOP_P=0.9
# Server Settings
set MCP_SERVER_NAME=deepseek-mcp-server
set MCP_LOG_LEVEL=INFO
```
### Configuration File (`config.py`)

The `config.py` file handles all settings with sensible defaults:

- Model path detection
- Performance optimization
- Logging configuration
- Safety settings
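The environment variables above resolve to module-level defaults. As a rough illustration (not the project's actual code), `config.py` could read them like this:

```python
# Hypothetical sketch of reading the documented MCP_* variables with
# sensible fallbacks; the real config.py may differ.
import os
from pathlib import Path

MODEL_PATH = Path(os.getenv("MCP_MODEL_PATH", "models/deepseek-llm-7b-chat-Q6_K.gguf"))
CONTEXT_SIZE = int(os.getenv("MCP_CONTEXT_SIZE", "4096"))
GPU_LAYERS = int(os.getenv("MCP_GPU_LAYERS", "35"))
THREADS = int(os.getenv("MCP_THREADS", "8"))

DEFAULT_MAX_TOKENS = int(os.getenv("MCP_DEFAULT_MAX_TOKENS", "512"))
DEFAULT_TEMPERATURE = float(os.getenv("MCP_DEFAULT_TEMPERATURE", "0.7"))
DEFAULT_TOP_P = float(os.getenv("MCP_DEFAULT_TOP_P", "0.9"))

SERVER_NAME = os.getenv("MCP_SERVER_NAME", "deepseek-mcp-server")
LOG_LEVEL = os.getenv("MCP_LOG_LEVEL", "INFO")

if not MODEL_PATH.exists():
    print(f"WARNING: Model file not found at {MODEL_PATH}")
```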
## 📋 Command Reference

### Enhanced Terminal Chat Commands

| Command | Description |
|---|---|
| `/help` | Show detailed help and usage |
| `/cmd` | Toggle command execution mode |
| `/safe` | Show safety information |
| `/history` | Display conversation history |
| `/commands` | Show command execution history |
| `/clear` | Clear all history |
| `/config` | Display current configuration |
| `/temp <n>` | Set temperature (0.1-2.0) |
| `/tokens <n>` | Set max tokens (50-2048) |
| `/quit` | Exit the chat |
### Safe Commands (Auto-Execute)

These commands execute automatically without confirmation:

`dir`, `ls`, `pwd`, `cd`, `echo`, `type`, `cat`, `find`, `grep`, `python`, `pip`, `conda`, `git`, `node`, `npm`, `help`, `where`, `which`, `whoami`, `date`, `time`, `systeminfo`, `tasklist`
### Command Flow Examples

**File Operations:**

- "show me all files in this directory"
- "create a new Python file called test.py"
- "find all .txt files in subdirectories"

**System Information:**

- "check system information"
- "show running processes"
- "what's my current directory?"

**Development Tasks:**

- "check git status"
- "install numpy using pip"
- "run my Python script"
- "activate conda environment"
## 🔒 Security Features

### Command Safety
- Safe Commands: Execute automatically (read-only operations)
- Risky Commands: Require user confirmation
- Timeout Protection: 30-second execution limit
- Command Logging: All executions are logged
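A minimal sketch of this kind of safety gate, assuming Python's standard `subprocess` module (function names and the allowlist wiring are illustrative, not the actual `terminal_chat.py` code):

```python
import shlex
import subprocess

# Illustrative allowlist drawn from the "Safe Commands" list above.
SAFE_COMMANDS = {"dir", "ls", "pwd", "cd", "echo", "type", "cat", "find",
                 "grep", "python", "pip", "conda", "git", "node", "npm",
                 "help", "where", "which", "whoami", "date", "time",
                 "systeminfo", "tasklist"}

def run_with_safety(command: str) -> None:
    """Auto-run safe (read-only) commands; ask before anything else."""
    executable = shlex.split(command)[0].lower()
    if executable not in SAFE_COMMANDS:
        answer = input(f"⚠️ '{command}' may modify your system. Execute? (y/N): ")
        if answer.strip().lower() != "y":
            print("Skipped.")
            return
    try:
        # 30-second timeout matches the documented limit.
        result = subprocess.run(command, shell=True, capture_output=True,
                                text=True, timeout=30)
        print(result.stdout or result.stderr)
    except subprocess.TimeoutExpired:
        print("Command timed out (30s limit)")
```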
### Model Safety
- Temperature Control: Configurable response randomness
- Token Limits: Prevent excessive generation
- Context Management: Automatic history trimming
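History trimming can be as simple as keeping only the most recent exchanges so the prompt fits within the context window. A hypothetical illustration (not the project's actual logic):

```python
# Keep only the last MAX_TURNS user/assistant exchanges; the limit is illustrative.
MAX_TURNS = 10

def trim_history(history: list[dict]) -> list[dict]:
    """history holds {"role": ..., "content": ...} messages, oldest first."""
    return history[-2 * MAX_TURNS:]  # two messages (user + assistant) per turn
```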
## 🏗️ Architecture

```
┌─────────────────────┐     ┌─────────────────────┐
│  terminal_chat.py   │     │       chat.py       │
│  (Enhanced Chat)    │     │    (Basic Chat)     │
└──────────┬──────────┘     └──────────┬──────────┘
           │                           │
           └─────────────┬─────────────┘
                         │
              ┌──────────▼──────────┐
              │  mcp_interface.py   │
              │  (Core MCP Logic)   │
              └──────────┬──────────┘
                         │
              ┌──────────▼──────────┐
              │      config.py      │
              │   (Configuration)   │
              └─────────────────────┘
```
## 📁 Project Structure

```
mcp_llm_server/
├── 🚀 deepseek.bat                 # Windows launcher
├── 🤖 terminal_chat.py             # Enhanced chat with commands
├── 💬 chat.py                      # Basic terminal chat
├── 🔌 mcp_interface.py             # Core MCP server logic
├── ⚙️ config.py                    # Configuration management
├── 🏃 start_server.py              # MCP server starter
├── 🧪 test_server.py               # Server testing
├── ⚡ quick_chat.py                # Quick chat utility
├── 📋 requirements.txt             # Python dependencies
├── 📚 README.md                    # This documentation
├── 🔧 claude_desktop_config.json   # Claude Desktop config
└── 📁 models/
    └── deepseek-llm-7b-chat-Q6_K.gguf   # Model file (5.6GB)
```
## 🔧 Installation & Setup

### 1. Dependencies

```bash
# Activate your conda environment
conda activate cyber_llm

# Install required packages
pip install -r requirements.txt
```

### 2. Model Setup

Ensure your model file is located at `models/deepseek-llm-7b-chat-Q6_K.gguf`.

### 3. Test Installation

```bash
# Test basic functionality
python test_server.py

# Test chat interface
python chat.py
```
## 🐛 Troubleshooting

### Common Issues

**Model Not Found:**

```
WARNING: Model file not found at models\deepseek-llm-7b-chat-Q6_K.gguf
```

- Ensure the model file is in the correct location
- Check file permissions
- Verify the file isn't corrupted

**Conda Environment Issues:**

```
ERROR: Failed to activate conda environment 'cyber_llm'
```

- Verify the environment exists: `conda env list`
- Recreate it if needed: `conda create -n cyber_llm python=3.11`

**Memory Issues:**

```
Error loading model: Out of memory
```

- Reduce `n_gpu_layers` in the config
- Set `MCP_LOW_VRAM=true`
- Close other applications

**Command Execution Issues:**

```
Command timed out (30s limit)
```

- Commands have a 30-second timeout
- Use `/cmd` to toggle command mode
- Check the command syntax

### Performance Optimization

**For Better Speed:**

- Increase `n_gpu_layers` (if you have a GPU)
- Reduce `n_ctx` for faster responses
- Use a lower temperature for more consistent output

**For Lower Memory Usage:**

- Set `MCP_LOW_VRAM=true`
- Reduce `n_ctx` to 2048 or lower
- Decrease `n_gpu_layers`
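These knobs map onto llama.cpp loader parameters. Assuming the server uses `llama-cpp-python` to load the GGUF model (the usual binding for this format), they would be passed roughly as follows; the exact wiring in `mcp_interface.py` may differ:

```python
from llama_cpp import Llama  # pip install llama-cpp-python

# Illustrative loader call; values mirror the documented defaults.
llm = Llama(
    model_path="models/deepseek-llm-7b-chat-Q6_K.gguf",
    n_ctx=4096,        # context window; lower it (e.g. 2048) to save memory
    n_gpu_layers=35,   # layers offloaded to the GPU; 0 for CPU-only
    n_threads=8,       # CPU threads used for inference
)

reply = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=512,
    temperature=0.7,
    top_p=0.9,
)
print(reply["choices"][0]["message"]["content"])
```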
## 📈 Usage Examples

### Example 1: Development Workflow

```
🤖 You: check what Python files are in this project
⚡ DeepSeek: [Suggests: dir *.py /b]
✅ Command completed successfully!

🤖 You: show me the git status
⚡ DeepSeek: [Suggests: git status]
✅ Command completed successfully!

🤖 You: what does the config.py file do?
🤖 DeepSeek: The config.py file manages configuration settings...
```

### Example 2: System Administration

```
🤖 You: show me system information
⚡ DeepSeek: [Suggests: systeminfo | findstr /C:"OS Name" /C:"Total Physical Memory"]
✅ Command completed successfully!

🤖 You: what processes are using the most memory?
⚡ DeepSeek: [Suggests: tasklist /fo table | sort /r /+5]
⚠️ This command may modify your system. Execute? (y/N):
```
## 🤝 Contributing

This is a complete, working MCP server. To extend functionality:

- Add New Tools: Modify `mcp_interface.py`
- Enhance Commands: Update `terminal_chat.py`
- Improve Safety: Extend the safe command list
- Add Features: Create new interface files
## 📄 License

Open source project for educational and development purposes.

## 🆘 Support

For issues or questions:

- Check the troubleshooting section above
- Review log files in the project directory
- Test with the basic chat first (`chat.py`)
- Verify the conda environment and dependencies
🎉 Enjoy your AI-powered terminal experience with DeepSeek!