🦙

Ollama

LLMs

Simple local LLM runtime with a clean developer experience.

105k stars · 8k forks · Go

About

Ollama lets you run, version, and distribute LLMs locally through a simple CLI and a Modelfile format, abstracting away hardware acceleration and per-model configuration.
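
As a rough sketch of the Modelfile format, the snippet below layers a sampling parameter and a system prompt on top of a base model; the base model name and the specific values are illustrative, not prescriptive.

    FROM llama3
    PARAMETER temperature 0.7
    SYSTEM You are a concise assistant for code questions.

A model defined this way can typically be built with ollama create -f Modelfile and then run locally with ollama run, which is what makes the version-and-distribute workflow above possible.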

Key Features

  • Local models
  • REST API (see the sketch after this list)
  • Modelfiles
  • GPU support
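
To illustrate the REST API feature, here is a minimal Go sketch that posts a prompt to the /api/generate endpoint on the default local port 11434. It assumes an Ollama server is already running and that the named model has been pulled; the model name, prompt, and single non-streaming response are illustrative.

    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "net/http"
    )

    // Request/response shapes for a non-streaming /api/generate call.
    type generateRequest struct {
        Model  string `json:"model"`
        Prompt string `json:"prompt"`
        Stream bool   `json:"stream"`
    }

    type generateResponse struct {
        Response string `json:"response"`
    }

    func main() {
        // Assumes a local Ollama server on the default port 11434
        // and that "llama3" (illustrative) has already been pulled.
        body, err := json.Marshal(generateRequest{
            Model:  "llama3",
            Prompt: "Why is the sky blue?",
            Stream: false,
        })
        if err != nil {
            panic(err)
        }

        resp, err := http.Post("http://localhost:11434/api/generate",
            "application/json", bytes.NewReader(body))
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()

        var out generateResponse
        if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
            panic(err)
        }
        fmt.Println(out.Response)
    }

With streaming left on (the API's default), the server instead returns a sequence of JSON objects, so a client would read the body line by line rather than decoding a single response.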

Tags

Local LLM · Inference · CLI · Self-hosted
