MCP-ollama_server
Extends the Model Context Protocol (MCP) to local LLMs via Ollama, enabling Claude-like tool use (files, web, email, GitHub, AI images) while keeping data private. Modular Python servers for on-prem AI. #LocalAI #MCP #Ollama
Author: Sethuram2003
Primary language: Python
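The description above outlines modular Python MCP servers that expose local tools (files, web, email, GitHub, image generation) to Ollama-hosted models. As a rough illustration only, and not the repository's actual code, the sketch below shows what one such small tool server could look like, assuming the official MCP Python SDK (`mcp` package) and its FastMCP helper; the server name and the `read_text_file` tool are hypothetical.

```python
from pathlib import Path

from mcp.server.fastmcp import FastMCP

# Each capability (files, web, email, GitHub, images) would live in its own
# small server; this hypothetical one exposes only local file reading.
mcp = FastMCP("file-server")


@mcp.tool()
def read_text_file(path: str) -> str:
    """Return the contents of a local text file so the model can use it."""
    return Path(path).read_text(encoding="utf-8")


if __name__ == "__main__":
    # Serve over stdio so a local MCP client can launch and call this process.
    mcp.run()
```

An MCP-capable client backed by a local Ollama model would launch this process and call the tool over stdio, so file contents never leave the machine.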
Package Information
No package information available for MCP-ollama_server.
GitHub Statistics
Stars: 1
Watchers: 1
Forks: 0
Releases: 0
Repository created: April 29, 2025
Last updated: May 4, 2025
Stats refreshed: May 18, 2025