News

Fully-featured web interface for Ollama LLMs. Get up and running with Large Language Models quickly, locally, and even offline. This project aims to be the easiest way for you to get started with LLMs.
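
For context, the web interface talks to a locally running Ollama server over its REST API. Below is a minimal sketch of that kind of request, assuming Ollama is listening on its default port (11434) and that a model such as llama2 has already been pulled; it is an illustration, not part of this project's code.

```typescript
// Minimal sketch: ask a locally running Ollama server for a completion.
// Assumes Ollama is on the default port 11434 and "llama2" has been pulled.
async function generate(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // stream: false returns one JSON object instead of a stream of chunks
    body: JSON.stringify({ model: "llama2", prompt, stream: false }),
  });
  const data = await res.json();
  return data.response;
}

generate("Why is the sky blue?").then(console.log);
```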