
Ollama Cloud provider support #209

Open
kwanLeeFrmVi wants to merge 2 commits into decolua:master from kwanLeeFrmVi:feat/ollama-cloud-restore

Conversation

@kwanLeeFrmVi (Contributor) commented Feb 26, 2026

Summary

Restore support for Ollama Cloud, an API-key provider that exposes hosted Ollama models with managed infrastructure and usage tracking.

What is Ollama Cloud?

Ollama Cloud offers a hosted version of the popular Ollama runtime with:

  • Managed deployment of community and proprietary Ollama models
  • API-key access secured via https://ollama.com
  • OpenAI-compatible endpoints for chat, tags, and model metadata
  • Built-in usage tracking and rate limiting

Changes

  • Add ollama provider metadata (ID: ollama, alias: ollama) with icon asset and UI definition
  • Wire Ollama endpoints into shared config and /api/providers/[id]/models so model lists can be fetched
  • Extend provider validation + connection test harness to ping https://ollama.com/api/tags with bearer auth
  • Cherry-pick upstream commit 0b3e190f5cac6ec9d7e97f81e2c8e2405ab7073b to remain in sync with original repo

Usage

  • Endpoint: https://ollama.com/api/chat
  • API Key: from Ollama Cloud dashboard
  • Model IDs: use the names returned by GET https://ollama.com/api/tags
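Turning the `/api/tags` response into a model list can be sketched as below. The `{ models: [{ name }] }` shape matches the open-source Ollama API's tag listing; the helper name is an assumption, not code from this PR.

```typescript
// Minimal shape of the GET https://ollama.com/api/tags response body.
interface TagsResponse {
  models: { name: string }[];
}

// Extract the model IDs to use in chat requests (e.g. "gpt-oss:120b").
function listModelIds(tags: TagsResponse): string[] {
  return tags.models.map((m) => m.name);
}
```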

Testing

Tested models:

  • gpt-oss:120b
  • deepseek-v3.2
  • glm-5
  • minimax-m2.5
  • kimi-k2.5

kwanLeeFrmVi changed the title from "Restore Ollama Cloud provider support" to "Ollama Cloud provider support" on Feb 26, 2026
Commit 1 (title truncated: "…ntication"):

  • Add ollama to PROVIDERS config with https://ollama.com/api/chat endpoint
  • Add ollama to PROVIDER_MODELS with gpt-oss:120b model
  • Add ollama provider logo (ollama.png)
  • Add ollama to APIKEY_PROVIDERS with cloud icon and white color
  • Add ollama to API key validation and model fetching endpoints
  • Use /api/tags endpoint for ollama model listing and connection testing
  • Fix color opacity handling for providers with full

Commit 2 (title truncated: "…arrays to OpenAI format"):

  • Add Ollama non-streaming response normalization in translateNonStreamingResponse to handle the { message: { role, content, tool_calls }, done } format
  • Convert Ollama tool_calls to OpenAI format with proper id generation and argument serialization
  • Map Ollama token usage fields (prompt_eval_count, eval_count) to the OpenAI usage format
  • Add a normalizeContentToString helper to convert content arrays to strings for providers that don't
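The normalization in the second commit can be sketched as a pure mapping from Ollama's reply shape onto the OpenAI chat-completion shape. The Ollama field names (message, done, prompt_eval_count, eval_count) follow its public API; the function name, generated call ids, and output typing here are illustrative assumptions, not the PR's exact code.

```typescript
// Assumed shape of an Ollama non-streaming chat reply.
interface OllamaToolCall {
  function: { name: string; arguments: Record<string, unknown> };
}

interface OllamaResponse {
  model: string;
  message: { role: string; content: string; tool_calls?: OllamaToolCall[] };
  done: boolean;
  prompt_eval_count?: number; // input tokens
  eval_count?: number; // output tokens
}

// Map the Ollama reply onto the OpenAI chat-completion shape.
function toOpenAIResponse(res: OllamaResponse) {
  const promptTokens = res.prompt_eval_count ?? 0;
  const completionTokens = res.eval_count ?? 0;
  return {
    object: "chat.completion",
    model: res.model,
    choices: [
      {
        index: 0,
        message: {
          role: res.message.role,
          content: res.message.content,
          // OpenAI expects string ids and JSON-serialized arguments;
          // Ollama sends arguments as a plain object.
          tool_calls: res.message.tool_calls?.map((tc, i) => ({
            id: `call_${i}`, // hypothetical id scheme
            type: "function",
            function: {
              name: tc.function.name,
              arguments: JSON.stringify(tc.function.arguments),
            },
          })),
        },
        finish_reason: res.message.tool_calls?.length ? "tool_calls" : "stop",
      },
    ],
    usage: {
      prompt_tokens: promptTokens,
      completion_tokens: completionTokens,
      total_tokens: promptTokens + completionTokens,
    },
  };
}
```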
kwanLeeFrmVi force-pushed the feat/ollama-cloud-restore branch from 808bbf3 to 91f1c18 on March 2, 2026 at 10:54
