Gitstar Ranking
woct0rdho / transformers-qwen3-moe-fused
Fetched on 2026/03/14 09:49
Fused Qwen3 MoE layer for faster training, compatible with Transformers, LoRA, bnb 4-bit quantization, and Unsloth. It is also possible to train LoRA over GGUF.
View it on GitHub
Stars: 241
Rank: 150326
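For context, here is a minimal sketch of the kind of workflow the description refers to: LoRA fine-tuning of a Qwen3 MoE model in bnb 4-bit, using the standard Transformers + PEFT stack. The fused-layer patch from this repository is shown only as a hypothetical import (its real entry point is not confirmed here), and the checkpoint name is just an example Qwen3 MoE model.

```python
# Sketch: LoRA over a 4-bit Qwen3 MoE model via Transformers + PEFT.
# The repo-specific fused-kernel patch below is hypothetical.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

# Hypothetical entry point for this repo's fused MoE layer (name assumed):
# from qwen3_moe_fused import patch_qwen3_moe
# patch_qwen3_moe()

# bnb 4-bit quantization config (standard bitsandbytes integration).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_quant_type="nf4",
)

# Example Qwen3 MoE checkpoint; substitute the model you actually train.
model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen3-30B-A3B",
    quantization_config=bnb_config,
    device_map="auto",
)

# Attach LoRA adapters to the attention projections.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```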