Ollama

Ollama runs LLMs locally on your VPS. Pull and serve Llama 3, Mistral, Phi, Gemma, and Qwen with a single command. Self-hosted AI with no API key required, and a REST API compatible with OpenAI clients. The simplest path to private inference, used by 200,000+ developers.
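As a quick sketch of the workflow described above (assuming Ollama is installed and its daemon is running on the default port 11434):

```shell
# Pull a model and run a one-off prompt from the CLI
ollama pull llama3
ollama run llama3 "Explain the benefits of local inference in one sentence."

# The same model is reachable over the OpenAI-compatible REST API,
# so existing OpenAI client code can point at this endpoint instead
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama3",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```

The model name and prompt here are illustrative; any model from the Ollama library can be substituted.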

Version

Latest

Operating System

Ubuntu Server 24.04 LTS

Min. RAM

8 GB

IP Types

IPv4, IPv6

Deploy Ollama now. From $2.48/mo.