Gitstar Ranking
Fetched on 2026/05/08 12:52
woct0rdho / transformers-qwen3-moe-fused
Fused Qwen3 MoE layer for faster training, compatible with Transformers, LoRA, bnb 4-bit quantization, and Unsloth. Also supports training LoRA over GGUF.
Stars: 254
Rank: 145449