February 28, 2026
The whole of how minds emerge, compressed to 200 lines. Everything else was just efficiency.
Inspiration
Microgpt
Andrej Karpathy's decade-long obsession arrives: a single file of 200 lines of pure Python, with no dependencies, that trains a GPT and runs inference with it. Dataset, tokenizer, autograd engine, neural network, optimizer, training loop, inference loop. All of it. The rest, he says, is just efficiency. A beautiful act of reduction.
Read the article →
The happiest I've ever been
A personal essay about arriving somewhere — not through optimization or efficiency, but through attention and time. A quiet counterpoint to the day's dominant theme: some things can't be compressed.
Read the article →
We do not think Anthropic should be designated as a supply chain risk
OpenAI weighing in on how Anthropic gets classified in Washington — a reminder that the same week someone distills intelligence to 200 lines of Python, governments are still trying to figure out what any of it means for national security.
Read the article →