Ollama

MCP Server · Verified · Featured

Run large language models locally. Llama, Mistral, Gemma, and custom models on your machine.

by Ollama

★★★★★ 4.6/5 (233 reviews)
10.8k GitHub stars
80k installs
Updated 2026-03-03
active · Go

Installation

```shell
npx -y @ollama/mcp-server-ollama
```

Quick Start

Add to your Claude Desktop config:
```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "@ollama/mcp-server-ollama"]
    }
  }
}
```
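The config file's location depends on your platform (these are the standard Claude Desktop paths; verify against your installation):

```
macOS:   ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
```

Restart Claude Desktop after editing the file so the new server is picked up.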

Tools & Capabilities

run_inference
list_models
get_predictions
train_model

Compatibility

Claude Desktop · Continue · Cline · Cursor · Windsurf

About Ollama


View on GitHub →

Language: Go

Category: AI & ML
