Gitstar Ranking
Fetched on 2025/11/05 01:59
veggiemonk / yzma
yzma lets you use Go to perform local inference with Vision Language Models (VLMs) and Large Language Models (LLMs) using llama.cpp without CGo.
Stars: 0
Rank: 12940334
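
The description says yzma drives llama.cpp from Go without CGo. A minimal sketch of that general approach, assuming the ebitengine/purego library and a prebuilt libllama shared library; the symbol names and code below are illustrative of the technique only and are not taken from yzma's actual API:

```go
// Hypothetical illustration (not yzma's API): calling into a llama.cpp
// shared library from Go without CGo, using ebitengine/purego.
package main

import (
	"fmt"
	"log"

	"github.com/ebitengine/purego"
)

func main() {
	// Path to a prebuilt llama.cpp shared library; adjust for your system
	// (e.g. libllama.dylib on macOS). purego.Dlopen is available on
	// Unix-like platforms; Windows needs a different loading path.
	lib, err := purego.Dlopen("libllama.so", purego.RTLD_NOW|purego.RTLD_GLOBAL)
	if err != nil {
		log.Fatalf("loading libllama: %v", err)
	}

	// Bind exported C functions by symbol name. llama_backend_init and
	// llama_backend_free are part of llama.cpp's C API at the time of
	// writing, but the exact symbols depend on the library version built.
	var backendInit func()
	var backendFree func()
	purego.RegisterLibFunc(&backendInit, lib, "llama_backend_init")
	purego.RegisterLibFunc(&backendFree, lib, "llama_backend_free")

	backendInit()
	defer backendFree()

	fmt.Println("llama.cpp backend initialized via purego (no CGo)")
}
```

Because the library is loaded and bound at runtime rather than linked through CGo, the Go toolchain can cross-compile the binary normally; only the llama.cpp shared library has to be built for the target platform.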