- Added `process.py` for managing MCP server subprocesses with async capabilities.
- Introduced `protocol.py` for handling JSON-RPC communication over streams.
- Created `llm_client.py` to support chat completion requests to various LLM providers, integrating with MCP tools.
- Defined model configurations in `llm_models.py` for different LLM providers.
- Removed the synchronous `mcp_manager.py` in favor of a more modular approach.
- Established a provider framework in the `providers` directory with a base class and specific implementations.
- Implemented `OpenAIProvider` for interacting with OpenAI's API, including streaming support and tool-call handling.
Sample MCP server configuration:

```json
{
  "mcpServers": {
    "mcp-server-sqlite": {
      "command": "uvx",
      "args": [
        "mcp-server-sqlite",
        "--db-path",
        "~/.mcpapp/mcpapp.db"
      ]
    }
  }
}
```
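A configuration like the one above could drive the async subprocess management described for `process.py`. The following is a hypothetical sketch (the function names are illustrative, not this PR's API), split so the config parsing is separable from the actual spawning:

```python
import asyncio

def parse_server_specs(config: dict) -> dict[str, list[str]]:
    """Turn an mcpServers mapping into argv lists, one per server."""
    return {
        name: [spec["command"], *spec.get("args", [])]
        for name, spec in config["mcpServers"].items()
    }

async def spawn_servers(config: dict) -> dict[str, asyncio.subprocess.Process]:
    """Launch each configured server with stdio pipes attached for JSON-RPC."""
    procs = {}
    for name, argv in parse_server_specs(config).items():
        procs[name] = await asyncio.create_subprocess_exec(
            *argv,
            stdin=asyncio.subprocess.PIPE,
            stdout=asyncio.subprocess.PIPE,
        )
    return procs
```

Keeping `parse_server_specs` pure makes the config handling testable without actually launching `uvx`.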