Hello, World!
A Brief History of Programming in 90 Languages
What was your first language?
Pick a language to see Hello, World!
90 languages.
76 years.
The stories behind the code.
From Plankalkül in 1948 to Gleam in 2024, every programming language has its own story—a moment of invention, a problem it was born to solve, a generation of programmers it shaped. Hello, World! tells all 90 of them, one "Hello, World!" at a time.
Releases May 1, 2026 · Hardcover & Paperback pre-orders opening soon
// TODO: get this book
Leave your email and we'll ping you on launch day. One email. That's the whole program.
28 Stories. 76 Years.
The human drama behind the code
The Day Hello World Was Born
In 1972, Brian Kernighan wrote a Bell Labs tutorial and needed a throwaway example — something simple enough to prove the toolchain worked. He typed two words: hello, world. He wasn't starting a tradition. But every programmer who has written their first line of code since has repeated him, and "Time to Hello World" is now an industry metric for measuring how quickly you can get a new language running.
Read in the book →

The Dropout and the Priesthood
John Backus was expelled from college for skipping class, had a metal plate surgically installed in his skull, and wandered into IBM's computing center on a whim. At the time, programming meant hand-coding raw machine instructions, and the "priesthood" of programmers who could do it insisted no machine could ever match their speed. Backus assembled ten people — including a cryptographer, a chess champion, and a crystallographer — and proved them wrong. Fortran is still running climate models sixty-eight years later.
Read in the book →

The Notation That Came Alive
John McCarthy never intended LISP to run on a computer. It was supposed to be mathematical notation — elegant, precise, and safely confined to the page. But in 1958, while doing artificial intelligence research at MIT, he asked a deceptively simple question: if you wanted to build a machine that could think, what language would it need? The answer turned out to be stunningly minimal — a language that could express any computation from almost nothing. Then someone implemented it, and it has never stopped running.
Read in the book →

The Woman Who Made Computers Speak English
In the 1950s, Grace Hopper built a working compiler — the first program ever written to translate human-readable instructions into machine code — and nobody believed her. She spent years on what she called a "selling job," proving that computers could do more than arithmetic. The "temporary" language she helped create, COBOL, now processes 95% of the world's ATM transactions. The thing she was told couldn't exist has never stopped running.
Read in the book →

4 AM in College Hall
In 1964, two Dartmouth professors made a radical bet: that ordinary people — not just trained mathematicians — could learn to program. They called their language BASIC, recruited a dozen undergraduates to build it, and stayed up all night to launch it. At 4 AM on May 1st, two people typed RUN simultaneously and both got the right answer. A decade later, it was the first language Bill Gates ever sold — and the spark that ignited the personal computer revolution.
Read in the book →

The Exile’s Classroom
Seymour Papert grew up in apartheid South Africa, where he became an anti-apartheid activist before leaving for Europe to study mathematics. As a child, he’d learned more about math from mechanical gears and his brother’s compass than from any classroom — and that insight became a philosophy. In 1967, he created Logo, a programming language designed to teach children to think by letting them play. The turtle on screen didn’t just draw shapes. It taught a generation that you learn by building, not by being told.
Read in the book →

The Quiet Craftsman
At Harvard, Dennis Ritchie studied physics and applied mathematics. By grad school, he’d concluded he wasn’t smart enough to be a physicist, and that computers were “quite neat.” At Bell Labs, he created a language so small and so close to the machine that it could build an entire operating system. That language was C, and that operating system was Unix. Nearly every device you’ve touched today — your phone, your router, your car’s firmware — runs on something written in the language Ritchie built in 1972.
Read in the book →

The Last Thing They Did Together
Don Chamberlin and Raymond Boyce set out to build a language that would let ordinary people query databases without understanding the math underneath. They called it SEQUEL, and in May 1974 they published their first paper. Less than a month later, Boyce collapsed in a cafeteria — a brain aneurysm, no warning. He was twenty-seven, with an infant daughter at home. That paper became the last thing they ever did together. SQL now runs virtually every database on earth.
Read in the book →

“Everything Is an Object”
In 1968, Alan Kay sketched something that wouldn't exist for twenty years: a portable personal computer for children. At Xerox PARC, his team invented Smalltalk, the graphical user interface, the mouse-driven desktop, and networked personal computing. Steve Jobs negotiated a tour in 1979, saw it all in one afternoon, and built Apple's future on it. Xerox's inventions generated trillions of dollars of value — almost none of it for Xerox.
Read in the book →

The Thesis That Wouldn’t Compile
In 1978, a Danish PhD student at Cambridge used Simula for his thesis and loved the design but couldn't stand the speed. He rewrote everything in BCPL and got the speed but hated the code. Most students would have finished the thesis and moved on. Bjarne Stroustrup finished the thesis and decided no programmer should ever have to make that choice again. He built C++ — and spent forty-five years trying to give the world both elegance and speed without charging for either.
Read in the book →

Nine Nines
Telephone switches needed to be reachable essentially forever. Ericsson’s requirement was 99.9999999% uptime — nine nines — meaning less than 31 milliseconds of downtime per year. Not per server. Per system. Joe Armstrong, Robert Virding, and Mike Williams shared an impossible problem: how do you build software that absolutely cannot fail? Their answer was Erlang, a language built on a radical philosophy — instead of preventing crashes, let processes fail and recover instantly. The philosophy was so effective that decades later, WhatsApp used it to serve 900 million users with a team of fifty engineers.
Read in the book →

The Impossible Program
In 1998, Ben Olmstead designed a programming language specifically to be unusable — and named it after the eighth circle of Hell. He couldn't write Hello World in it himself. Two years passed before anyone could, and when it finally happened, a computer had to solve what no human brain could. A small community has been picking the lock ever since, finding "islands of sanity" in self-modifying chaos.
Read in the book →

The Christmas Project
Over Christmas 1989, Guido van Rossum was bored. The office was closed, so he started writing a programming language — keeping the best ideas from a failed predecessor, throwing out everything rigid, and naming it after Monty Python. For two decades it grew quietly. Then the AI revolution arrived, and TensorFlow and PyTorch both chose Python as their interface. The holiday project of a bored Dutch programmer became the language in which the future is being written.
Read in the book →

The Linguist’s Language
Larry Wall trained as a missionary linguist, planning to create writing systems for unwritten languages. Health problems ended that plan. Instead, he designed Perl the way a linguist designs a language — messy, context-dependent, flexible, with more than one way to say anything. His motto: TMTOWTDI — "There's More Than One Way To Do It." By the mid-1990s, Perl was powering Amazon, Craigslist, Slashdot, and the Human Genome Project. Someone called it "the duct tape that holds the internet together."
Read in the book →

The Supercooled Solution
By the mid-1980s, more than a dozen functional programming languages existed, all built on the same radical ideas: lazy evaluation, pure functions, no side effects. The researchers behind them kept meeting at the same conferences, presenting similar papers, and the joke wrote itself — more languages than users. In 1987, a committee decided to unify. Simon Peyton Jones, a young researcher without a PhD, became one of its driving forces. The result was Haskell, a language so principled it influenced nearly every modern programming language without most programmers ever writing a line of it.
Read in the book →

The Dropout and the Demo
Alan Cooper stood in front of a Xerox Star workstation and had an epiphany: the graphical interface was fluid and intuitive, and it made Windows look like something designed with a stick and a rock. Someone needed to give ordinary developers a way to create graphical applications as easily as the Star made it look. Cooper, a counterculture kid turned software entrepreneur, built a visual programming prototype and sold it to Microsoft. It became Visual Basic — the tool that made Windows programmable and taught millions of people that they could build software too.
Read in the book →

The Language That Wished You Well
Before writing a single line of code, Yukihiro Matsumoto and a friend sat down to name their programming language. They knew they wanted the name of a precious jewel. They considered “Coral” — elegant, oceanic. Matsumoto chose “Ruby” instead: shorter, lovelier, and a whisper of the language it would eventually challenge. No functions existed yet. No class definitions. They decided who it would be before it was born — and when Matsumoto finally built it, the design principle was simple: optimize for the programmer’s happiness, not the machine’s.
Read in the book →

The Set-Top Box That Ate the World
James Gosling grew up in a suburb of Calgary, a self-described “ridiculously geeky kid” who discovered computers at thirteen by dumpster-diving for discarded punch cards at the University of Calgary. He taught himself to program by reassembling what other people had thrown away. At Sun Microsystems, he built a language for set-top boxes — interactive television that never quite took off. The language survived the product’s failure, found the internet, and “Write Once, Run Anywhere” made Java the default language of enterprise computing.
Read in the book →

JavaScript in 10 Days
In May 1995, Brendan Eich was given ten days to build a programming language for Netscape's browser. It had to look like Java, behave like Scheme, and ship immediately. The deadline was non-negotiable and the result was imperfect, bizarre, and completely inescapable. JavaScript shipped in the browser, and the browser was everywhere. Three decades later, it runs on every device with a screen — and the ten-day deadline is still the most famous origin story in programming.
Read in the book →

The Language Born from a Lawsuit
Microsoft paid Anders Hejlsberg a million dollars to leave Borland, then hired thirty-four of Borland's top engineers. When a judge killed Microsoft's Java clone with a court order in San Jose, Hejlsberg built C# from the ashes under the code name "COOL." Nobody at Microsoft planned for what happened next: Unity adopted C# as its sole language, and the tool Microsoft built for enterprise Windows development became how an entire generation learned to code — by building video games.
Read in the book →

The Graphics Card That Ate the World
NVIDIA built CUDA to let scientists run calculations on graphics cards. Everyone assumed it was a niche research tool. Then a University of Toronto team used GPU-accelerated neural networks to win ImageNet by a historic margin, and the deep learning revolution began. Every major AI breakthrough since — GPT, DALL-E, self-driving cars, protein folding — has run on NVIDIA hardware. The company that rendered video game explosions became the foundation of artificial intelligence, and a $7 billion company became worth $4.5 trillion.
Read in the book →

The Waiting Room
In Cambridge, Don Syme added a quiet feature to F#: wrap any block of code in `async { }`, and the compiler would rewrite clean, readable source into the complex state-machine logic that asynchronous execution actually requires. He announced it in a blog post in October 2007. Almost nobody noticed. Then C# adopted `async/await`. Then Python. Then JavaScript. Then Rust, Kotlin, Swift, Dart, and TypeScript. The feature Syme invented in a research language at Microsoft’s Cambridge lab now runs in nearly every modern programming language on earth.
Read in the book →

Which Part of Windows Is Confusing You?
Jeffrey Snover wrote a manifesto arguing that Windows needed a real command line. His colleagues were blunt: Bill Gates had declared cmd.exe the final word. That was settled. Done. Snover pushed the idea anyway and paid for it — sidelined repeatedly for refusing to let it go. He kept building. PowerShell shipped quietly, then became essential. Microsoft eventually promoted Snover to Technical Fellow — one of the highest technical positions in the company — for the same idea they’d tried to bury.
Read in the book →

The Savings Account
Rich Hickey spent two decades watching every large system he built accumulate the same category of bugs — race conditions, data races, concurrency failures that emerged only under load, when real money was moving. The problem was always mutation: multiple threads reaching for the same piece of changeable state at the same time. So Hickey saved enough money to live on for two years, quit his job, and built Clojure — a language where data never changes. He bet his livelihood on the idea that the best way to fix concurrency was to remove the thing that made it dangerous.
Read in the book →

Twenty-One Floors
In 2006, Graydon Hoare came home and found his apartment building's elevator broken — again. He lived on the twenty-first floor. As he climbed the stairs, he started designing a programming language that would make the kind of bugs that crash elevators structurally impossible. He worked alone for three years before Mozilla noticed. The White House would eventually tell the rest of the software industry to follow his lead.
Read in the book →

From Swift to Mojo: The Architect of Everything
In 2000, a grad student named Chris Lattner started a compiler project called LLVM. It became the infrastructure behind Swift, Rust, Julia, and dozens of other modern languages. At Apple, he pitched a new language to replace Objective-C — Swift launched in 2014 to a standing ovation and displaced the incumbent within two years. Then he left to tackle AI's hardest problem: scientists prototype in Python because it's easy, then rewrite everything in C++ because Python is too slow. His answer is Mojo.
Read in the book →

The Donation Button
Andrew Kelley walked away from a six-figure salary to work on a programming language full-time — funded not by venture capital or a startup, but by donations. Strangers’ goodwill. He’d been building Zig on nights and weekends, and some weekends he had only enough time to merge pull requests with none left for the actual roadmap. The language he’s building is a direct challenge to C — the decades-old foundation everything runs on — and his bet is that a community of volunteers and donors can build something that trillion-dollar companies couldn’t.
Read in the book →

Call of Duty: COBOL
In April 2020, New Jersey's unemployment system collapsed under pandemic claims, and the governor held a press conference asking for emergency help from anyone who knew the language. He called it "Cobalt." He meant COBOL — written in 1959, quietly processing trillions of dollars in transactions every day, and suddenly the most urgently in-demand programming skill in America. The volunteers who answered the call were mostly over sixty, making them especially vulnerable to the very crisis they'd been summoned to fix.
Read in the book →