feat: Implement async utilities for MCP server management and JSON-RPC communication
- Added `process.py` for managing MCP server subprocesses with async capabilities.
- Introduced `protocol.py` for handling JSON-RPC communication over streams.
- Created `llm_client.py` to support chat completion requests to various LLM providers, integrating with MCP tools.
- Defined model configurations in `llm_models.py` for the different LLM providers.
- Removed the synchronous `mcp_manager.py` in favor of a more modular approach.
- Established a provider framework in the `providers` directory, with a base class and specific implementations.
- Implemented `OpenAIProvider` for interacting with OpenAI's API, including streaming support and tool-call handling.
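The streaming tool-call handling mentioned for `OpenAIProvider` typically has to reassemble complete tool calls from incremental deltas, since OpenAI's streaming API emits the function name and argument string in fragments keyed by `index`. A minimal sketch of that accumulation step, assuming deltas shaped like OpenAI's chunk format (the helper name is ours, not the commit's):

```python
def merge_tool_call_deltas(deltas: list[dict]) -> list[dict]:
    """Fold streamed tool-call fragments into complete calls.

    Each delta resembles an entry from an OpenAI streaming chunk:
    {"index": 0, "id": ..., "function": {"name": ..., "arguments": "..."}}.
    Fragments sharing an index are concatenated in arrival order.
    """
    calls: dict[int, dict] = {}
    for d in deltas:
        slot = calls.setdefault(d["index"], {"id": None, "name": "", "arguments": ""})
        if d.get("id"):
            slot["id"] = d["id"]
        fn = d.get("function") or {}
        slot["name"] += fn.get("name") or ""
        slot["arguments"] += fn.get("arguments") or ""
    return [calls[i] for i in sorted(calls)]
```

Only once the stream finishes (or a chunk reports `finish_reason == "tool_calls"`) is each accumulated `arguments` string complete enough to parse as JSON and dispatch to an MCP tool.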
```diff
@@ -61,6 +61,7 @@ lint.select = [
     "T10", # flake8-debugger
     "A", # flake8-builtins
     "UP", # pyupgrade
+    "TID", # flake8-tidy-imports
 ]

 lint.ignore = [
@@ -81,7 +82,7 @@ skip-magic-trailing-comma = false
 combine-as-imports = true

 [tool.ruff.lint.mccabe]
-max-complexity = 12
+max-complexity = 16

 [tool.ruff.lint.flake8-tidy-imports]
 # Disallow all relative imports.
```
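The diff is truncated after the `flake8-tidy-imports` section header. That comment is usually paired with ruff's `ban-relative-imports` option; a sketch of what the full section likely looks like (the setting line is an assumption, not visible in the diff):

```toml
[tool.ruff.lint.flake8-tidy-imports]
# Disallow all relative imports.
ban-relative-imports = "all"  # assumed value; the diff cuts off before it
```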