# MCP-CLI: Model Context Protocol Command-Line Interface Tool

A powerful Model Context Protocol (MCP) command-line interface tool.

## Project Overview
MCP-CLI is a powerful, feature-rich command-line interface tool specifically designed for interacting with Model Context Protocol (MCP) servers. By integrating the CHUK-MCP protocol library, this tool provides users with seamless communication capabilities with Large Language Models (LLMs).
The client offers tool usage, conversation management, and multiple operation modes. The core protocol implementation has been migrated to a separate package, allowing the CLI to focus on providing a rich user experience while the protocol library handles the communication layer.
## Core Features

### 🔄 Multiple Operation Modes
- Chat Mode: A conversational interface supporting direct LLM interaction and automated tool usage.
- Interactive Mode: A command-driven interface for direct server operations.
- Command Mode: A Unix-friendly mode supporting scripted automation and piping operations.
- Direct Commands: Run single commands without entering interactive mode.
### 🌐 Multi-Provider Support

- OpenAI Integration: Supports models like `gpt-4o-mini`, `gpt-4o`, `gpt-4-turbo`, etc.
- Ollama Integration: Supports local models like `llama3.2`, `qwen2.5-coder`, etc.
- Extensible Architecture: Supports adding other providers.
### 🛠️ Powerful Tool System
- Automatic discovery of server-provided tools.
- Server-aware tool execution.
- Tool call history tracking and analysis.
- Supports complex multi-step tool chains.
### 💬 Advanced Conversation Management
- Full conversation history tracking.
- Supports filtering and viewing specific message ranges.
- JSON export functionality for debugging or analysis.
- Conversation compression feature to reduce token usage.
### 🎨 Rich User Experience
- Context-aware command auto-completion.
- Color-formatted console output.
- Progress indicators for long-running operations.
- Detailed help and documentation.
### 🔧 Reliable Resource Management
- Proper asynchronous I/O resource cleanup.
- Graceful error handling.
- Clean terminal restoration.
- Supports multiple simultaneous server connections.
## System Requirements

- Python 3.11 or higher
- For OpenAI: A valid API key must be set in the `OPENAI_API_KEY` environment variable.
- For Ollama: A local Ollama installation is required.
- Server configuration file (default: `server_config.json`)
- CHUK-MCP protocol library
## Installation Methods

### Standard Installation

```bash
# Clone the repository
git clone https://github.com/chrishayuk/mcp-cli
cd mcp-cli

# Install the package and development dependencies
pip install -e ".[cli,dev]"

# Run the CLI
mcp-cli --help
```
### Using UV for Dependency Management

```bash
# Install UV (if not already installed)
pip install uv

# Install dependencies
uv sync --reinstall

# Run with UV
uv run mcp-cli --help
```
## Usage Guide

### Global Options

All commands support the following global options:

- `--server`: Specify the server(s) to connect to (multiple servers separated by commas).
- `--config-file`: Path to the server configuration file (default: `server_config.json`).
- `--provider`: LLM provider to use (`openai` or `ollama`; default: `openai`).
- `--model`: Specific model to use (defaults to the provider's default model).
- `--disable-filesystem`: Disable file system access (default: true).
### Chat Mode

Chat mode provides a conversational interface with the LLM, automatically using available tools when needed:

```bash
# Basic chat mode
mcp-cli chat --server sqlite

# Specify provider and model
mcp-cli chat --server sqlite --provider openai --model gpt-4o
mcp-cli chat --server sqlite --provider ollama --model llama3.2
```
#### Chat Mode Slash Commands

In chat mode, the following slash commands can be used:

**Help Commands:**

- `/help`: Displays available commands.
- `/help <command>`: Displays detailed help for a specific command.
- `/quickhelp` or `/qh`: Displays a quick reference for common commands.

**Tool Related:**

- `/tools`: Displays all available tools and their server information.
- `/tools --all`: Displays detailed tool information including parameters.
- `/tools --raw`: Displays raw tool definitions.
- `/toolhistory` or `/th`: Displays tool call history in the current session.

**Conversation Management:**

- `/conversation` or `/ch`: Displays conversation history.
- `/save <filename>`: Saves conversation history to a JSON file.
- `/compact`: Compresses conversation history into a summary.

**Interface Control:**

- `/cls`: Clears the screen but retains conversation history.
- `/clear`: Clears both the screen and conversation history.
- `/verbose` or `/v`: Toggles between verbose and concise tool display modes.
### Interactive Mode

Interactive mode provides a command-line interface for direct server interaction using slash commands:

```bash
mcp-cli interactive --server sqlite
```
#### Interactive Mode Commands

- `/ping`: Checks if the server is responsive.
- `/prompts`: Lists available prompts.
- `/tools`: Lists available tools.
- `/resources`: Lists available resources.
- `/chat`: Enters chat mode.
- `/exit` or `/quit`: Exits the program.
### Command Mode

Command mode provides a Unix-friendly interface for automation and pipeline integration:

```bash
mcp-cli cmd --server sqlite [options]
```
#### Command Mode Options

- `--input`: Input file path (use `-` for stdin).
- `--output`: Output file path (use `-` for stdout, the default).
- `--prompt`: Prompt template (use `{{input}}` as the input placeholder).
- `--raw`: Output raw text without formatting.
- `--tool`: Directly call a specific tool.
- `--tool-args`: JSON arguments for the tool call.
- `--system-prompt`: Custom system prompt.
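The `{{input}}` placeholder behaves like plain string substitution. As an illustrative sketch only (`render_prompt` is a hypothetical helper, not mcp-cli's actual implementation), filling the template amounts to:

```python
# Illustrative sketch of {{input}} substitution; mcp-cli's real template
# handling may differ in details such as escaping.
def render_prompt(template: str, input_text: str) -> str:
    """Replace every {{input}} placeholder with the input text."""
    return template.replace("{{input}}", input_text)

prompt = render_prompt("Summarize this: {{input}}", "MCP is an open protocol.")
print(prompt)  # Summarize this: MCP is an open protocol.
```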
#### Command Mode Examples

```bash
# Summarize a document
mcp-cli cmd --server sqlite --input document.md --prompt "Summarize this: {{input}}" --output summary.md

# Process stdin and output to stdout
cat document.md | mcp-cli cmd --server sqlite --input - --prompt "Extract key points: {{input}}"

# Directly call a tool
mcp-cli cmd --server sqlite --tool list_tables --raw
mcp-cli cmd --server sqlite --tool read_query --tool-args '{"query": "SELECT COUNT(*) FROM users"}'

# Batch processing
ls *.md | parallel mcp-cli cmd --server sqlite --input {} --output {}.summary.md --prompt "Summarize: {{input}}"
```
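Hand-quoting JSON for `--tool-args` inside a shell command is error-prone. One option, sketched here under the assumption that you are generating commands from a script, is to serialize and quote the arguments in Python:

```python
import json
import shlex

# Build the --tool-args JSON programmatically to avoid shell-quoting mistakes.
tool_args = {"query": "SELECT COUNT(*) FROM users"}
command = (
    "mcp-cli cmd --server sqlite --tool read_query "
    f"--tool-args {shlex.quote(json.dumps(tool_args))}"
)
print(command)
```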
### Direct Commands

Run single commands without entering interactive mode:

```bash
# List available tools
mcp-cli tools list --server sqlite

# Call a specific tool
mcp-cli tools call --server sqlite

# List available prompts
mcp-cli prompts list --server sqlite

# Check server connection
mcp-cli ping --server sqlite

# List available resources
mcp-cli resources list --server sqlite
```
## Configuration File

Create a `server_config.json` file to configure servers:

```json
{
  "mcpServers": {
    "sqlite": {
      "command": "python",
      "args": ["-m", "mcp_server.sqlite_server"],
      "env": {
        "DATABASE_PATH": "your_database.db"
      }
    },
    "another-server": {
      "command": "python",
      "args": ["-m", "another_server_module"],
      "env": {}
    }
  }
}
```
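As a sketch of how this structure is consumed (illustrative only; mcp-cli's own loader in `src/mcp_cli/config.py` may differ), each entry under `mcpServers` needs at least a `command` used to launch the server process:

```python
import json

SAMPLE_CONFIG = """
{
  "mcpServers": {
    "sqlite": {
      "command": "python",
      "args": ["-m", "mcp_server.sqlite_server"],
      "env": {"DATABASE_PATH": "your_database.db"}
    }
  }
}
"""

def parse_server_config(text: str) -> dict:
    """Return the mcpServers mapping, checking each entry has a command."""
    servers = json.loads(text).get("mcpServers", {})
    for name, spec in servers.items():
        if "command" not in spec:
            raise ValueError(f"server {name!r} is missing 'command'")
    return servers

servers = parse_server_config(SAMPLE_CONFIG)
print(sorted(servers))  # ['sqlite']
```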
## Project Structure

```
src/
├── mcp_cli/
│   ├── chat/                     # Chat mode implementation
│   │   ├── commands/             # Chat slash commands
│   │   │   ├── __init__.py       # Command registration system
│   │   │   ├── conversation.py   # Conversation management
│   │   │   ├── help.py           # Help commands
│   │   │   ├── tools.py          # Tool commands
│   │   │   └── ...
│   │   ├── chat_context.py       # Chat session state management
│   │   ├── chat_handler.py       # Main chat loop handler
│   │   ├── command_completer.py  # Command auto-completion
│   │   └── ui_manager.py         # User interface
│   ├── commands/                 # CLI commands
│   │   ├── chat.py               # Chat command
│   │   ├── cmd.py                # Command mode
│   │   ├── interactive.py        # Interactive mode
│   │   └── ...
│   ├── llm/                      # LLM client implementation
│   │   ├── providers/            # Provider-specific clients
│   │   │   ├── base.py           # Base LLM client
│   │   │   └── openai_client.py  # OpenAI implementation
│   │   └── llm_client.py         # Client factory
│   ├── ui/                       # User interface components
│   │   ├── colors.py             # Color definitions
│   │   └── ui_helpers.py         # UI utilities
│   ├── main.py                   # Main entry point
│   └── config.py                 # Configuration loader
```
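The `chat/commands/__init__.py` entry above is described as a command registration system. A minimal sketch of what such a registry might look like (the names `COMMANDS` and `register` are hypothetical, not mcp-cli's actual API):

```python
# Hypothetical slash-command registry; mcp-cli's real registration API
# in chat/commands/__init__.py may look different.
COMMANDS: dict = {}

def register(name: str, *aliases: str):
    """Decorator mapping a command name and its aliases to one handler."""
    def wrap(handler):
        for key in (name, *aliases):
            COMMANDS[key] = handler
        return handler
    return wrap

@register("/toolhistory", "/th")
def tool_history(context) -> str:
    return "tool call history"

# Both the full name and the alias resolve to the same handler.
print(COMMANDS["/th"] is COMMANDS["/toolhistory"])  # True
```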
## Usage Examples

### Automated Tool Execution

In chat mode, MCP CLI can automatically execute tools provided by the server:

```
You: What tables are available in the database?
Assistant: Let me check for you.
[Tool Call: list_tables]
I found the following tables in the database:
- users
- products
- orders
- categories

You: How many users do we have?
Assistant: I'll query the database for that information.
[Tool Call: read_query]
There are 873 users in the database.
```
### Automation Scripting

Command mode supports powerful automation scripts:

```bash
#!/bin/bash
# Example script for analyzing multiple documents

# Process all markdown files in the current directory
for file in *.md; do
  echo "Processing $file..."

  # Generate a summary
  mcp-cli cmd --server sqlite --input "$file" \
    --prompt "Summarize this document: {{input}}" \
    --output "${file%.md}.summary.md"

  # Extract entities
  mcp-cli cmd --server sqlite --input "$file" \
    --prompt "Extract all company names, people, and locations from this text: {{input}}" \
    --output "${file%.md}.entities.txt" --raw
done

# Create a comprehensive report
echo "Creating final report..."
cat *.entities.txt | mcp-cli cmd --server sqlite --input - \
  --prompt "Analyze these entities and identify the most frequently mentioned:" \
  --output report.md
```
### Conversation History Management

Track and manage conversation history:

```
> /conversation

Conversation History (12 messages)
#  | Role      | Content
1  | system    | You are an intelligent assistant capable of using t...
2  | user      | What tables are available in the database?
3  | assistant | Let me check for you.
4  | assistant | [Tool call: list_tables]
...

> /conversation 4

Message #4 (Role: assistant)
[Tool call: list_tables]
Tool Calls:
  1. ID: call_list_tables_12345678, Type: function, Name: list_tables
     Arguments: {}

> /save conversation.json
Conversation saved to conversation.json

> /compact
Conversation history compacted with summary.

Summary:
The user asked about database tables, and I listed the available tables (users, products, orders, categories). The user then asked about the number of users, and I queried the database to find there are 873 users.
```
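Because `/save` writes plain JSON, the export lends itself to downstream analysis with standard tooling. A hedged sketch (the exact field layout written by `/save` is an assumption here, modeled on the role/content shape common to chat histories):

```python
import json

# Hypothetical structure of an exported conversation; the actual fields
# written by /save may differ.
history = [
    {"role": "system", "content": "You are an intelligent assistant."},
    {"role": "user", "content": "What tables are available in the database?"},
    {"role": "assistant", "content": "Let me check for you."},
]

# Count messages per role, e.g. to gauge how chatty a session was.
counts = {}
for message in history:
    counts[message["role"]] = counts.get(message["role"], 0) + 1
print(json.dumps(counts, sort_keys=True))  # {"assistant": 1, "system": 1, "user": 1}
```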
## Dependency Management

The CLI uses optional dependency groups for organization:

- `cli`: Rich terminal UI, command auto-completion, and provider integration.
- `dev`: Development and testing tools.
- `wasm`: Reserved for future WebAssembly support.
- `chuk-mcp`: Protocol implementation library (core dependency).

Install specific extras:

```bash
pip install "mcp-cli[cli]"      # Basic CLI features
pip install "mcp-cli[cli,dev]"  # CLI + development tools
```
## Contribution Guide

Contributions are welcome! Please follow these steps:

1. Fork the repository.
2. Create your feature branch (`git checkout -b feature/amazing-feature`).
3. Commit your changes (`git commit -m 'Add some amazing feature'`).
4. Push to the branch (`git push origin feature/amazing-feature`).
5. Open a Pull Request.
## About the Model Context Protocol (MCP)
MCP is an open protocol that standardizes how applications provide context to LLMs. MCP can be thought of as the USB-C port for AI applications. Just as USB-C provides a standardized way to connect devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to various data sources and tools.