Gitstar Ranking
veggiemonk / yzma
Fetched on 2026/03/14 07:14
yzma lets you use Go to perform local inference with Vision Language Models (VLMs) and Large Language Models (LLMs) via llama.cpp, without CGo.
Stars: 0
Rank: 13933569