Ollama API Proxy

An API that fetches, parses, and caches data from ollama.com.

Getting Started

# Explore the API documentation
$ open http://localhost:5115/docs

Example Endpoints

GET /library?o=popular

Get popular models from the official library.

GET /jmorganca/llama3

Get details for a specific user-published model.

GET /search?q=mistral

Search across all models.

GET /.../blobs/model

Access raw model artifacts.
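As a quick illustration of how these endpoints are addressed, the sketch below builds request URLs for the library and search routes. It is a hypothetical helper, not part of the API itself; the base URL comes from the Getting Started section, and the function names are assumptions for the example.

```python
from urllib.parse import quote

# Base URL of the proxy, per the Getting Started section.
BASE = "http://localhost:5115"

def library_url(order="popular"):
    # GET /library?o=<order> -- popular models from the official library.
    return f"{BASE}/library?o={order}"

def search_url(query):
    # GET /search?q=<query> -- global model search; the query is URL-encoded.
    return f"{BASE}/search?q={quote(query)}"
```

For example, `search_url("mistral")` yields the search endpoint shown above; the result can be fetched with any HTTP client.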

System Status

Cache Status

Upstream responses are cached for 6 hours.

Uptime

99.9% high-availability service.
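The 6-hour caching behavior can be sketched as a simple time-to-live (TTL) cache. This is a minimal illustration of the idea, not the proxy's actual implementation; the `fetch` callable and function names are assumptions for the example.

```python
import time

TTL_SECONDS = 6 * 60 * 60  # 6-hour cache window, per the status section
_cache = {}  # path -> (fetched_at, payload)

def get_cached(path, fetch):
    """Return the cached payload for `path`, refetching once the TTL expires.

    `fetch` is a hypothetical callable that retrieves fresh data from
    ollama.com for the given path.
    """
    entry = _cache.get(path)
    if entry is not None and time.time() - entry[0] < TTL_SECONDS:
        return entry[1]  # still fresh: serve from cache
    payload = fetch(path)
    _cache[path] = (time.time(), payload)
    return payload
```

Within the 6-hour window, repeated requests for the same path are served from the cache without touching ollama.com.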
