Train a Transformer-based Large Language Model (LLM) from scratch using PyTorch. This project covers the full pipeline: data preprocessing, tokenizer creation, model architecture (a GPT-style decoder-only transformer), training loop, evaluation, and inference. Inspired by nanoGPT, OpenBA, and rasbt's LLMs-from-scratch.
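As a flavor of the "tokenizer creation" step in the pipeline, here is a minimal character-level tokenizer sketch, a common starting point in from-scratch LLM projects. The class and method names (`CharTokenizer`, `encode`, `decode`) are illustrative assumptions, not this repository's actual API.

```python
class CharTokenizer:
    """Illustrative character-level tokenizer (not the repo's actual API)."""

    def __init__(self, text):
        # Build the vocabulary from the unique characters in the corpus.
        chars = sorted(set(text))
        self.stoi = {ch: i for i, ch in enumerate(chars)}
        self.itos = {i: ch for ch, i in self.stoi.items()}
        self.vocab_size = len(chars)

    def encode(self, s):
        # Map each character to its integer token id.
        return [self.stoi[ch] for ch in s]

    def decode(self, ids):
        # Map token ids back to characters and join them.
        return "".join(self.itos[i] for i in ids)

tok = CharTokenizer("hello world")
ids = tok.encode("hello")
print(ids)            # integer token ids for "hello"
print(tok.decode(ids))  # round-trips back to "hello"
```

Real projects typically graduate from character-level tokenization to subword schemes such as BPE, but the encode/decode round-trip contract stays the same.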