yzma lets you use Go to perform local inference with Vision Language Models (VLMs) and Large Language Models (LLMs) using llama.cpp, without CGo.