feat: Implement async utilities for MCP server management and JSON-RPC communication

- Added `process.py` for managing MCP server subprocesses with async capabilities.
- Introduced `protocol.py` for handling JSON-RPC communication over streams (a rough sketch of this subprocess/JSON-RPC flow follows the list below).
- Created `llm_client.py` to support chat completion requests to various LLM providers, integrating with MCP tools (see the tool-calling sketch below).
- Defined model configurations in `llm_models.py` for different LLM providers.
- Removed the synchronous `mcp_manager.py` in favor of a more modular approach.
- Established a provider framework in the `providers` directory with a base class and specific implementations.
- Implemented `OpenAIProvider` for interacting with OpenAI's API, including streaming support and tool call handling (see the provider sketch below).
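
Since the commit message only names the new modules, here is a minimal sketch of how an async subprocess manager speaking newline-delimited JSON-RPC over stdio could look. The class `MCPServerProcess` and its methods are illustrative assumptions, not the actual `process.py`/`protocol.py` API.

```python
# Hypothetical sketch; class and method names are illustrative, not the real process.py API.
import asyncio
import itertools
import json


class MCPServerProcess:
    """Spawns an MCP server as a subprocess and speaks JSON-RPC over its stdio pipes."""

    def __init__(self, command: str, args: list[str]):
        self.command = command
        self.args = args
        self.proc: asyncio.subprocess.Process | None = None
        self._ids = itertools.count(1)

    async def start(self) -> None:
        # Launch the server with pipes so JSON-RPC messages can be read and written.
        self.proc = await asyncio.create_subprocess_exec(
            self.command, *self.args,
            stdin=asyncio.subprocess.PIPE,
            stdout=asyncio.subprocess.PIPE,
        )

    async def request(self, method: str, params: dict) -> dict:
        # Send one JSON-RPC 2.0 request and wait for the next response line.
        msg = {"jsonrpc": "2.0", "id": next(self._ids), "method": method, "params": params}
        self.proc.stdin.write((json.dumps(msg) + "\n").encode())
        await self.proc.stdin.drain()
        line = await self.proc.stdout.readline()
        return json.loads(line)

    async def stop(self) -> None:
        self.proc.terminate()
        await self.proc.wait()
```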
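
Similarly, a hedged sketch of the chat/tool loop that `llm_client.py` likely implements: request a completion with the MCP tool schemas attached, execute any requested tool calls against the MCP server, and feed the results back until the model returns a final answer. Every name here (`chat_with_tools`, `mcp_tools.schemas()`, `reply.tool_calls`) is hypothetical.

```python
# Illustrative only; names and object shapes are assumptions, not the real llm_client.py API.
async def chat_with_tools(provider, messages: list[dict], mcp_tools) -> str:
    """Run one conversation turn, letting the model call MCP tools until it answers."""
    while True:
        reply = await provider.chat(messages=messages, tools=mcp_tools.schemas())
        if not reply.tool_calls:
            return reply.content  # no tool use requested: final answer
        messages.append(reply.as_message())  # record the assistant turn with its tool calls
        for call in reply.tool_calls:
            # Dispatch each requested call to the MCP server and feed the result back.
            result = await mcp_tools.call(call.name, call.arguments)
            messages.append({"role": "tool", "tool_call_id": call.id, "content": result})
```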
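
Finally, a sketch of the provider abstraction, assuming a `BaseProvider` abstract class and the official `openai` async client; only the streaming path is shown, and tool-call accumulation is left as a comment. Method names are assumptions, not the actual `providers` API.

```python
# Sketch of the provider abstraction; the real base class and method names may differ.
from abc import ABC, abstractmethod
from typing import AsyncIterator

from openai import AsyncOpenAI


class BaseProvider(ABC):
    @abstractmethod
    async def stream_chat(self, model: str, messages: list[dict], tools: list[dict]) -> AsyncIterator[str]:
        """Yield text deltas for a chat completion, handling provider-specific tool-call plumbing."""


class OpenAIProvider(BaseProvider):
    def __init__(self, api_key: str):
        self.client = AsyncOpenAI(api_key=api_key)

    async def stream_chat(self, model, messages, tools):
        stream = await self.client.chat.completions.create(
            model=model, messages=messages, tools=tools, stream=True
        )
        async for chunk in stream:
            delta = chunk.choices[0].delta
            if delta.content:
                yield delta.content
            # Tool-call deltas would be accumulated here and dispatched to MCP once complete.
```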
2025-03-26 11:00:20 +00:00
parent a7d5a4cb33
commit 80ba05338f
14 changed files with 1749 additions and 273 deletions


@@ -1,12 +1,12 @@
 {
   "mcpServers": {
-    "dolphin-demo-database-sqlite": {
+    "mcp-server-sqlite": {
       "command": "uvx",
       "args": [
         "mcp-server-sqlite",
         "--db-path",
-        "~/.dolphin/dolphin.db"
+        "~/.mcpapp/mcpapp.db"
       ]
     }
   }
 }