MiniLLM is a minimal system for running modern LLMs on consumer-grade GPUs.