omnillm-mcp
OmniLLM: A Model Context Protocol (MCP) server that enables Claude to access and integrate responses from multiple LLMs, including ChatGPT, Azure OpenAI, and Google Gemini, creating a unified AI knowledge hub.
Author: sabpap
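The listing itself contains no code, but the general shape of such a server is straightforward: an MCP server registers tools that an MCP client such as Claude Desktop can invoke, and each tool forwards the incoming prompt to another provider's API and returns the reply. The sketch below is illustrative only, assuming the official MCP Python SDK (FastMCP) and the OpenAI client; the server name and the ask_chatgpt tool are hypothetical and not taken from the omnillm-mcp repository.

```python
# Hypothetical sketch: exposing another LLM to Claude as an MCP tool.
# Assumes the official MCP Python SDK (FastMCP) and the OpenAI client;
# names are illustrative, not taken from the omnillm-mcp repository.
import os

from mcp.server.fastmcp import FastMCP
from openai import OpenAI

server = FastMCP("omnillm")  # hypothetical server name
openai_client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])


@server.tool()
def ask_chatgpt(prompt: str) -> str:
    """Forward a prompt to ChatGPT and return its reply to the MCP client."""
    response = openai_client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    # Claude Desktop (or another MCP client) launches the server over stdio.
    server.run(transport="stdio")
```

A real multi-LLM hub would register one such tool per backend (for example Azure OpenAI and Google Gemini alongside ChatGPT), each reading its own API key from the environment.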
Package Information
No package information available for omnillm-mcp.
GitHub Statistics
Stars: 1
Watchers: 1
Forks: 1
Releases: 0
Repository created: March 23, 2025
Last updated: March 29, 2025
Stats refreshed: May 18, 2025