Local-first LLM inference toolkit: run models on your own hardware with an OpenAI-compatible API. No cloud, no API keys, no data leaving your machine.
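Because the API is OpenAI-compatible, existing client code can target the local server by pointing it at a different base URL. A minimal sketch of building such a request (the port, path, and model name here are assumptions for illustration, not taken from the project's docs):

```python
import json

# Hypothetical local endpoint; the real host/port depend on how the
# toolkit is configured (assumption, not from the project docs).
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-compatible /chat/completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("Hello from my own hardware!")
# An HTTP client would POST this JSON to f"{BASE_URL}/chat/completions";
# here we just print the request that would be sent.
print(f"POST {BASE_URL}/chat/completions")
print(json.dumps(payload, indent=2))
```

Since the request shape matches OpenAI's chat completions schema, off-the-shelf SDKs that let you override the base URL should work against the local server unchanged.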