Commit Graph

22 Commits

SHA1 Message Date
51e3058961 fix: update temperature parameter to 0.6 across multiple providers and add debugging output 2025-03-27 19:02:52 +00:00
ccf750fed4 fix: correct logging error message for Google Generative AI SDK 2025-03-27 15:22:19 +00:00
2fb6c5af3c refactor: remove OpenAIClient implementation to streamline codebase 2025-03-27 11:13:32 +00:00
6b390a35f8 feat: Implement GoogleProvider for Google Generative AI integration
- Added GoogleProvider class to handle chat completions with Google Gemini API.
- Implemented client initialization and response handling for streaming and non-streaming responses.
- Created utility functions for tool conversion, response parsing, and content extraction.
- Removed legacy tool conversion utilities from the tools module.
- Enhanced logging for better traceability of API interactions and error handling.
2025-03-27 11:11:56 +00:00
678f395649 feat: implement OpenAIProvider with client initialization, message handling, and utility functions 2025-03-26 19:59:01 +00:00
bae517a322 refactor: move convert_to_anthropic_tools function to tools.py for better organization 2025-03-26 19:06:21 +00:00
ab8d5fe074 feat: implement AnthropicProvider with client initialization, message handling, and utility functions 2025-03-26 19:02:26 +00:00
246d921743 feat: add GoogleProvider implementation and update conversion utilities for Google tools 2025-03-26 18:18:10 +00:00
15ecb9fc48 feat: enhance token usage tracking and context management for LLM providers 2025-03-26 17:27:41 +00:00
49aebc12d5 refactor: update application name and enhance header display in Streamlit app 2025-03-26 12:27:00 +00:00
bd56cc839d Refactor code structure for improved readability and maintainability 2025-03-26 12:14:58 +00:00
a4683023ad feat: add support for Anthropic provider, including configuration and conversion utilities 2025-03-26 11:57:52 +00:00
b4986e0eb9 refactor: remove custom MCP client implementation files 2025-03-26 11:00:43 +00:00
80ba05338f feat: Implement async utilities for MCP server management and JSON-RPC communication
- Added `process.py` for managing MCP server subprocesses with async capabilities.
- Introduced `protocol.py` for handling JSON-RPC communication over streams.
- Created `llm_client.py` to support chat completion requests to various LLM providers, integrating with MCP tools.
- Defined model configurations in `llm_models.py` for different LLM providers.
- Removed the synchronous `mcp_manager.py` in favor of a more modular approach.
- Established a provider framework in `providers` directory with a base class and specific implementations.
- Implemented `OpenAIProvider` for interacting with OpenAI's API, including streaming support and tool call handling.
2025-03-26 11:00:20 +00:00
a7d5a4cb33 fix: improve handling of OpenAI responses and simplify MCP response wrapping 2025-03-26 08:11:51 +00:00
845f2e77dd feat: update MCP client and manager to include API key and base URL in query processing 2025-03-26 08:09:16 +00:00
ec39844bf1 feat: enhance logging in MCP client and manager for better debugging and error tracking 2025-03-26 06:55:52 +00:00
ccd0a1e45b fix: update section name and correct path for MCP server configuration 2025-03-26 06:16:35 +00:00
314b488bf9 feat: implement custom MCP client and integrate with OpenAI API for enhanced chat functionality 2025-03-25 19:00:00 +00:00
f8dec1951f fix: improve error handling and logging in OpenAI client and chat message processing 2025-03-25 17:32:23 +00:00
bef445baf4 fix: enhance OpenAI client for OpenRouter compatibility while maintaining configuration 2025-03-25 17:21:06 +00:00
d1ef966e65 Add initial project structure with .gitignore, pyproject.toml, and main application files 2025-03-25 17:12:45 +00:00