<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>Gpuburnout-3b on GPUburnout | Jun Park</title>
    <link>https://gpuburnout.com/tags/gpuburnout-3b/</link>
    <description>Recent content in Gpuburnout-3b on GPUburnout | Jun Park</description>
    <image>
      <title>GPUburnout | Jun Park</title>
      <url>https://gpuburnout.com/images/og-default.png</url>
      <link>https://gpuburnout.com/images/og-default.png</link>
    </image>
    <generator>Hugo -- 0.155.2</generator>
    <language>en-us</language>
    <lastBuildDate>Sun, 19 Apr 2026 00:00:00 +0000</lastBuildDate>
    <atom:link href="https://gpuburnout.com/tags/gpuburnout-3b/index.xml" rel="self" type="application/rss+xml"/>
    <item>
      <title>Nothing Happened for 75,000 Steps and It Was Glorious</title>
      <link>https://gpuburnout.com/posts/s5-ch3-nothing-happened/</link>
      <pubDate>Sun, 19 Apr 2026 00:00:00 +0000</pubDate>
      <guid>https://gpuburnout.com/posts/s5-ch3-nothing-happened/</guid>
      <description>Five days of training. One small architectural fix. Zero crises. The loss curve I had wanted for two seasons.</description>
    </item>
    <item>
      <title>My Code Agent Said It Was a Moose. I Said No. It Was a Moose.</title>
      <link>https://gpuburnout.com/posts/s5-ch2-moose/</link>
      <pubDate>Sun, 12 Apr 2026 00:00:00 +0000</pubDate>
      <guid>https://gpuburnout.com/posts/s5-ch2-moose/</guid>
      <description>Twelve hours of debugging, five wrong suspects, one network filesystem, and a lesson in trusting the diagnostic.</description>
    </item>
    <item>
      <title>I Have an A100. I Have 528 Shards of Data. I Cannot Combine Them.</title>
      <link>https://gpuburnout.com/posts/s5-ch1-528-shards/</link>
      <pubDate>Tue, 07 Apr 2026 00:00:00 +0000</pubDate>
      <guid>https://gpuburnout.com/posts/s5-ch1-528-shards/</guid>
      <description>Three days. Four GPUs. Three datacenters. Zero training tokens.</description>
    </item>
  </channel>
</rss>