# Streamlit Chat App with MCP Integration
A chat application built with Streamlit that integrates the OpenAI API and the Model Context Protocol (MCP) for enhanced tool capabilities.
## Features
- 💬 Interactive chat interface with Streamlit
- 🧠 OpenAI API integration with model selection
- 🛠️ MCP server management and tool integration
- ⚡ Both streaming and non-streaming response modes (see the sketch after this list)
- 🔄 Automatic tool discovery and invocation
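The streaming and non-streaming modes correspond to the two ways the OpenAI chat completions API can be consumed. Below is a minimal sketch of how each mode can be rendered in Streamlit; the client setup and message handling here are illustrative, not this app's actual code.
```python
# Minimal sketch: streaming vs. non-streaming chat completion rendering in
# Streamlit. Names and message handling are illustrative only.
import streamlit as st
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment by default
messages = [{"role": "user", "content": "Hello!"}]

if st.toggle("Stream response", value=True):
    # Streaming: pass text deltas to st.write_stream as they arrive.
    stream = client.chat.completions.create(
        model="gpt-3.5-turbo", messages=messages, stream=True
    )
    st.write_stream(chunk.choices[0].delta.content or "" for chunk in stream)
else:
    # Non-streaming: wait for the full completion, then render it once.
    response = client.chat.completions.create(
        model="gpt-3.5-turbo", messages=messages
    )
    st.write(response.choices[0].message.content)
```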
## Installation
1. Clone the repository:
```bash
git clone https://git.bhakat.dev/abhishekbhakat/mcpapp.git
cd mcpapp
```
2. Create and activate a virtual environment:
```bash
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
```
3. Install dependencies:
```bash
pip install -e .
```
## Configuration
1. Copy the sample config files:
```bash
cp config/sample_config.ini config/config.ini
cp config/sample_mcp_config.json config/mcp_config.json
```
2. Edit `config/config.ini` with your OpenAI API key and preferences:
```ini
[openai]
api_key = your_api_key_here
base_url = https://api.openai.com/v1
model = gpt-3.5-turbo

[dolphin-mcp]
servers_json = config/mcp_config.json
```
3. Configure MCP servers in `config/mcp_config.json`:
```json
{
  "mcpServers": {
    "example-server": {
      "command": "uvx",
      "args": ["mcp-server-example"],
      "env": {
        "API_KEY": "your-api-key"
      }
    }
  }
}
```
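The `example-server` entry above is a placeholder: `command` and `args` define how each MCP server process is launched (here via `uvx`, uv's tool runner), and `env` sets environment variables for that process. Below is a minimal sketch of how these settings might be loaded, assuming plain `configparser`/`json` usage; the app's actual loading code may differ.
```python
# Minimal sketch of loading config.ini and mcp_config.json.
# Illustrative only; the app's actual loading code may differ.
import configparser
import json

config = configparser.ConfigParser()
config.read("config/config.ini")

api_key = config["openai"]["api_key"]
base_url = config["openai"]["base_url"]
model = config["openai"]["model"]

# Resolve the MCP server definitions referenced by the ini file.
with open(config["dolphin-mcp"]["servers_json"]) as f:
    mcp_servers = json.load(f)["mcpServers"]
```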
## Usage
Start the application:
```bash
streamlit run src/app.py
```
The app will be available at `http://localhost:8501`.
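Streamlit listens on port 8501 by default; if that port is already in use, pass a different one:
```bash
streamlit run src/app.py --server.port 8502
```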
## Architecture
Key components:
- `src/app.py`: Main Streamlit application
- `src/openai_client.py`: OpenAI API client with MCP integration
- `src/mcp_manager.py`: Synchronous wrapper for MCP server management (see the sketch below)
- `src/custom_mcp_client/`: Custom MCP client implementation
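Streamlit callbacks are synchronous while MCP clients are typically async, so the manager bridges the two. Below is a minimal sketch of one way such a wrapper can work (an event loop running on a background thread); the class and method names are illustrative and the real `src/mcp_manager.py` may be structured differently.
```python
# Minimal sketch of a synchronous wrapper around an async MCP client.
# Illustrative only; not the actual implementation in src/mcp_manager.py.
import asyncio
import threading


class SyncMCPManager:
    """Runs an asyncio event loop on a background thread so synchronous
    Streamlit code can call async MCP operations and block for results."""

    def __init__(self):
        self._loop = asyncio.new_event_loop()
        self._thread = threading.Thread(target=self._loop.run_forever, daemon=True)
        self._thread.start()

    def _run(self, coro):
        # Submit a coroutine to the background loop and wait for its result.
        return asyncio.run_coroutine_threadsafe(coro, self._loop).result()

    def list_tools(self, client):
        # `client` is assumed to expose an async `list_tools()` coroutine.
        return self._run(client.list_tools())

    def call_tool(self, client, name, arguments):
        # Likewise, an async `call_tool(name, arguments)` coroutine.
        return self._run(client.call_tool(name, arguments))
```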
## Development
### Running Tests
```bash
pytest
```
### Linting
```bash
ruff check . --fix
```
### Building
```bash
python -m build
```
## License
MIT License - See [LICENSE](LICENSE) for details.