The $175 Experiment: Training GPUburnout-1B on a Single GPU
The short version

I trained a 1-billion-parameter model from scratch. It took 90,000 steps, 11.8 billion tokens, one A100 GPU, and $175. The model went from generating random Unicode soup to writing paragraphs about single-cell RNA sequencing with confidently hallucinated journal citations. (They look real. They are not.) This is the full story: every phase, every dollar, and every moment I stared at a loss curve instead of sleeping like a normal person. ...