
Streamlit Chat App with MCP Integration

A powerful chat application built with Streamlit that integrates with OpenAI's API and Model Context Protocol (MCP) for enhanced tool capabilities.

Features

  • 💬 Interactive chat interface with Streamlit
  • 🧠 OpenAI API integration with model selection
  • 🛠️ MCP server management and tool integration
  • ⚡ Both streaming and non-streaming response modes
  • 🔄 Automatic tool discovery and invocation

Installation

  1. Clone the repository:
git clone https://git.bhakat.dev/abhishekbhakat/mcpapp.git
cd mcpapp
  2. Create and activate a virtual environment:
python -m venv venv
source venv/bin/activate  # On Windows use `venv\Scripts\activate`
  3. Install dependencies:
pip install -e .

Configuration

  1. Copy the sample config files:
cp config/sample_config.ini config/config.ini
cp config/sample_mcp_config.json config/mcp_config.json
  2. Edit config/config.ini with your OpenAI API key and preferences:
[openai]
api_key = your_api_key_here
base_url = https://api.openai.com/v1
model = gpt-3.5-turbo

[dolphin-mcp]
servers_json = config/mcp_config.json
  3. Configure MCP servers in config/mcp_config.json:
{
    "mcpServers": {
        "example-server": {
            "command": "uvx",
            "args": ["mcp-server-example"],
            "env": {
                "API_KEY": "your-api-key"
            }
        }
    }
}
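The INI settings above can be read at startup with Python's standard configparser. The sketch below is illustrative, not the app's exact code; the helper name load_settings is ours, but the section and key names match the sample config shown:

```python
import configparser

def load_settings(ini_text: str) -> dict:
    """Parse the [openai] and [dolphin-mcp] sections from config.ini text."""
    cfg = configparser.ConfigParser()
    cfg.read_string(ini_text)
    return {
        "api_key": cfg["openai"]["api_key"],
        "base_url": cfg["openai"]["base_url"],
        "model": cfg["openai"]["model"],
        # Path to the JSON file listing MCP servers
        "servers_json": cfg["dolphin-mcp"]["servers_json"],
    }

sample = """
[openai]
api_key = your_api_key_here
base_url = https://api.openai.com/v1
model = gpt-3.5-turbo

[dolphin-mcp]
servers_json = config/mcp_config.json
"""

settings = load_settings(sample)
print(settings["model"])  # gpt-3.5-turbo
```

In a real run you would call cfg.read("config/config.ini") instead of read_string, then load the MCP server definitions from the path in servers_json with json.load.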

Usage

Start the application:

streamlit run src/app.py

The app will be available at http://localhost:8501

Architecture

Key components:

  • src/app.py: Main Streamlit application
  • src/openai_client.py: OpenAI API client with MCP integration
  • src/mcp_manager.py: Synchronous wrapper for MCP server management
  • src/custom_mcp_client/: Custom MCP client implementation

Development

Running Tests

pytest

Code Formatting

ruff check . --fix

Building

python -m build

License

MIT License - See LICENSE for details.
