
It was 1969. Humans had just walked on the moon. But back on Earth, the world of computing was fractured.

LOCKBOT: In the 1960s, every program was built for one machine.
WRENCHBOT: Move to a new computer? Start over from scratch.

Every operating system was hand-stitched for one machine, like a suit that only fits one person. Write a program on an IBM mainframe, and it could never run on a DEC minicomputer. Write it on a PDP-7, and it was trapped there forever.

Three computers, each in a locked cage with their programs trapped inside

Software was powerful — but it was chained to the hardware it was born on.

That was about to change. In a quiet lab in New Jersey, two programmers were about to build something small, elegant, and radical. Not a bigger machine. Not a faster processor. An idea — a philosophy that would reshape all of computing.

NARRATOR: They called it Unix.

Before Unix, software was trapped. One program, one machine. The story of the 1970s and 1980s is the story of setting software — and people — free.

Where did this revolution begin? Not in a Silicon Valley garage. In a telephone company's research lab — the most productive square footage in the history of science.
Bell Telephone Laboratories, Murray Hill, New Jersey
LABCOAT: Nine Nobel Prizes. The transistor. Information theory.
THINKERBOT: All from one building. What was in the water?

Bell Telephone Laboratories was the research arm of AT&T, the American telephone monopoly. It operated from 1925 onward, and its output was staggering.

Bell Labs hallway — physicists and mathematicians cross-pollinating ideas
Bell Labs cafeteria — napkin sketches that changed the world

The transistor? Bell Labs, 1947. Information theory — the mathematical foundation of all digital communication? Claude Shannon at Bell Labs, 1948. The laser? Bell Labs. Satellite communications? Bell Labs. Radio astronomy? Bell Labs.

AT&T was a regulated monopoly. It could afford to fund pure research with no expectation of immediate profit. Bell Labs management had a radical philosophy: hire brilliant people, give them freedom, and trust that useful things will emerge.

BELLBOT: Brilliant people, given freedom. That was the secret.

In the late 1960s, a corner of this extraordinary environment — Department 1127, the Computing Techniques Research group led by Doug McIlroy — became the birthplace of Unix. Ken Thompson and Dennis Ritchie were there. So was Brian Kernighan. So was Lorinda Cherry, a mathematician and programmer who developed tools like eqn — software that let scientists typeset mathematical equations directly from a terminal.

Lorinda Cherry's eqn tool — mathematical typesetting from a terminal

But the Bell Labs of the 1960s and 1970s was overwhelmingly white and male. This was not accidental — it reflected structural barriers. Most top CS programs admitted few women and fewer people of color. Bell Labs recruited almost exclusively from these programs. The contributions that did come from people outside that narrow slice — like Cherry's — only make the question of what was lost sharper.

Bell Labs proved something profound: when you give brilliant people freedom and surround them with other brilliant people, the results can change the world. Not in spite of having no immediate commercial goal — because of it.

GATEKEEPER: Imagine what was lost. Who was locked out.
Two of those brilliant people were about to do something nobody asked them to do. One of them had just gotten three weeks of uninterrupted time — because his wife took the baby to visit her parents.
STORYTELLER: Meet two people whose work runs in every device you own.
Ken Thompson — quiet, unassuming hacker
Dennis Ritchie — the deliberate designer

Ken Thompson (born 1943, New Orleans) joined Bell Labs in 1966. He was a quiet, unassuming hacker — the kind of programmer who would think about a problem for days, then write the solution in a single explosive burst of coding.

Dennis Ritchie (1941–2011, Bronxville, New York) joined Bell Labs in 1967 with degrees in physics and applied mathematics from Harvard. Where Thompson was the rapid-fire builder, Ritchie was the deliberate designer — modest, self-effacing, someone who expressed himself more clearly in writing than in speech.

WHISPERBOT: You have probably never heard of them.

Both had worked on Multics, an ambitious time-sharing OS built by Bell Labs, MIT, and GE. But Multics was over-engineered and behind schedule. Bell Labs pulled out in early 1969. Thompson was frustrated — he liked the ideas, but thought they were buried under too much complexity.

Summer 1969 — Thompson alone with a PDP-7 and three weeks of uninterrupted time

In the summer of 1969, Thompson's wife Bonnie took their infant son to California. Suddenly, Thompson had uninterrupted time and a barely-used PDP-7. In roughly three weeks, he wrote a working prototype: an OS kernel, a shell, an editor, and an assembler. "One week, one week, one week."

But this initial burst was just the beginning. From 1970 onward, Ritchie was deeply involved. McIlroy contributed the pipe mechanism. Kernighan named it and helped document it. Joe Ossanna built text-processing tools. Lorinda Cherry developed mathematical typesetting. The name "Unix" was coined by Brian Kernighan as a pun on Multics: uniplexed versus multiplexed. Unix was Thompson's spark — but it was Bell Labs' fire.

From Multics to Unix: 1964, Multics begins → early 1969, Bell Labs exits → summer 1969, Thompson's three-week prototype on the PDP-7 → 1970–73, Ritchie joins and the C rewrite begins → 1973, Unix rewritten in C

From Multics to Unix: the timeline of a revolution

Think About It: Thompson stripped Multics down to its essentials. Multics tried to do everything; Unix did a few things, simply and well. Have you ever noticed that the most useful tools in your life are often the simplest ones?

Thompson had built a working prototype. But it was written in assembly language — locked to one machine, just like everything else. To truly set software free, they needed a new kind of language. And they needed a philosophy.
Doug McIlroy at the whiteboard — the garden hose analogy for Unix pipes

Doug McIlroy, the head of the department where Unix was born, had been championing an idea since the early 1960s. In a 1964 memo, he wrote: "We should have some ways of coupling programs like garden hoses — screw in another segment when it becomes necessary to massage data in another way."

FOREMAN: One sorts. One deduplicates. One counts.
CONDUCTOR: Connect them and that is a system.

In 1973, Thompson implemented it. The pipe — represented by the | symbol — allowed the output of one program to flow directly into the input of another. Before pipes, you had to save intermediate files at every stage. Pipes eliminated the friction and made composing small programs feel natural.

sort (alphabetize) | uniq (deduplicate) | wc -l (count lines) → 42
sort names.txt | uniq | wc -l → "How many unique names?"

Three tiny tools, one useful answer — the Unix pipe in action

CHROMEBOT: That conveyor belt? It is a Unix pipe.

McIlroy articulated the Unix philosophy: "Write programs that do one thing and do it well. Write programs to work together. Write programs to handle text streams, because that is a universal interface."

Old Way vs. Unix Way
OLD WAY: Prog A → save file1.txt → Prog B → save file2.txt → Prog C → result
UNIX WAY: Prog A | Prog B | Prog C → result

No intermediate files. No friction. Just flow.

Don't build a Swiss Army knife. Build a knife, a screwdriver, and a can opener — and make them work together. The power is in the composition, not in any single tool. Remember this idea. It will come back 55 years later, in a very surprising way.

Unix had a philosophy. It had pipes. But it was still written in assembly — still chained to one machine. To break that chain, Dennis Ritchie was building something: a new programming language named after a letter of the alphabet.
Dennis Ritchie designing the C language — BCPL → B → C
PUNCHCARD: Before C, each OS spoke only one machine's language.
PEARLBOT: C was the Rosetta Stone. Write once, compile anywhere.

Thompson's B language (1969–70), derived from Martin Richards's BCPL (Cambridge, 1966), worked on the word-addressed PDP-7. But when Unix moved to the byte-addressed PDP-11, B's lack of a type system became a problem. Between 1971 and 1973, Dennis Ritchie transformed B into something new: C — named simply because it came after B.

C occupied a unique position. High-level enough that humans could read it. Low-level enough to replace assembly language. It gave you pointers — direct addresses into memory, like GPS coordinates to specific locations in the machine. Powerful, yes. But dangerous: point to the wrong address and your program crashes, or worse.

CBOT: Powerful and dangerous. That was the tradeoff.

In 1973, Thompson and Ritchie rewrote Unix in C. For the first time in history, a production operating system was written in a high-level language instead of assembly. Unix was no longer locked to the PDP-11.

One source, many machines — if you have a compiler:
C Source Code → PDP-11 Compiler → PDP-11 Machine Code
C Source Code → VAX Compiler → VAX Machine Code
C Source Code → Intel Compiler → Intel Machine Code
"One recipe, many kitchens — but each kitchen needs a chef who speaks C"

The C portability model — write once, compile for each target

Portability did not mean "press a button." Writing a C compiler was hard work — weeks or months of effort. But the savings compounded: every new program written in C could run on every platform with a C compiler. Write once, compile anywhere — if you had a compiler.

A legal quirk accelerated Unix's spread. AT&T was barred from selling software under a 1956 consent decree, so Unix was distributed to universities with source code included. At UC Berkeley, Bill Joy created the Berkeley Software Distribution (BSD), an influential variant that would shape networking, free software, and eventually macOS.

hello, world — the most famous first program in history, from K&R 1978

In 1978, Brian Kernighan and Ritchie published The C Programming Language ("K&R"), establishing the tradition of starting every programming tutorial with a simple "hello, world" program. Kernighan said the phrase came from a cartoon of a chick hatching from an egg.

C's gift was portability. Before C, every operating system was a custom suit for one body. After C, an operating system was a sewing pattern — but adapting it for each new fabric still required a skilled tailor (the compiler writer). The revolution was not that porting was easy. It was that porting was possible.

HELLOBOT: Hello, world. The most famous first program ever.

Think About It: C trusts the programmer completely — it will let you do anything, including things that crash the machine. Is it better for a tool to be safe or powerful? What are the tradeoffs?

Unix and C had set software free from specific hardware. But computing itself was still locked away in universities and corporations. To reach ordinary people, computers needed to escape the lab — and fit on a desk.
Popular Electronics, January 1975 — the Altair 8800 that ignited a revolution
PIXELBOT: In 1974, computers filled rooms.
ROCKETBOT: By 1981, one fit on your desk.

In January 1975, the MITS Altair 8800 — a kit computer based on the Intel 8080, sold for $439 — had no keyboard, no screen, and was programmed by flipping toggle switches. It was, by any practical measure, nearly useless. It changed everything.

Bill Gates and Paul Allen writing Altair BASIC on a Harvard PDP-10

Bill Gates and Paul Allen wrote a BASIC interpreter for the Altair on a Harvard PDP-10, using a homemade emulator — they had never touched an Altair. When Allen demonstrated it, the software ran on real hardware for the first time. It worked. They founded Microsoft in April 1975.

In February 1976, Gates published an angry "Open Letter to Hobbyists," arguing that copying software without paying was theft: "Who can afford to do professional work for nothing?" The hobbyist community was furious — many saw software as something to be freely shared. This letter is often cited as the opening shot in the free vs. commercial software debate.

Steve Wozniak at the Homebrew Computer Club with the Apple I circuit board

Steve Wozniak designed the Apple I single-handedly and gave away the schematics. His friend Steve Jobs convinced him the designs could be sold. They founded Apple Computer on April 1, 1976. The Apple II (1977) — with color graphics and eventually the VisiCalc spreadsheet (1979), the first "killer app" — made personal computing real for businesses.

The IBM PC, August 1981 — computing goes corporate

On August 12, 1981, the IBM PC launched with an open architecture. Microsoft supplied MS-DOS, acquired for around $50,000–$75,000 — and crucially retained the right to sell DOS to other manufacturers. This single licensing decision would make Microsoft one of the most valuable companies on Earth.

The personal computer did not just shrink computing. It changed who could compute. Power shifted from institutions to individuals. From priests of the mainframe to anyone with a desk and a dream.

DESKBOT: Power shifted from institutions to individuals.
Computers were personal now. But they were islands — each one isolated, unable to talk to any other. Quietly, in the background, a network had been growing since 1969. And its first message was an accident.
October 29, 1969 — Charley Kline types 'LO' before the system crashes
MINTBOT: The first Internet message? Supposed to be LOGIN.
SUNNYBOT: It crashed after LO. Lo and behold!

The idea began with J.C.R. Licklider, a psychologist and computer scientist at ARPA, who imagined an "Intergalactic Computer Network" where researchers everywhere could share data freely. Larry Roberts designed ARPANET's architecture. Leonard Kleinrock at UCLA developed the mathematical theory underpinning packet networks. The team at BBN built the routing machines.

On October 29, 1969, UCLA graduate student Charley Kline tried to send "LOGIN" to Stanford Research Institute, 350 miles away. He typed L. Confirmed. O. Confirmed. G. The system crashed. The first message sent over ARPANET was "LO."

MAILBOT: Split the message into packets. Send separately.

Myth: ARPANET was built to survive nuclear war. Reality: Completely false. ARPANET was built to let researchers at different universities share expensive computing resources. The confusion comes from Paul Baran's separate 1964 RAND Corporation research on survivable networks.

Packet Switching: The Postal System of the Internet
H E L L O (original message) → split into packets → different routes through the network ("LO" and "HEL" travel separately) → H E L L O (reassembled!)

The Internet is a postal system, not a phone call

Vint Cerf and Bob Kahn — the architects of TCP/IP

Vint Cerf (born 1943, hearing-impaired since childhood) and Bob Kahn (born 1938) published the design for TCP/IP in May 1974 — a universal set of rules for breaking messages into packets, addressing them, routing them, and reassembling them.

On January 1, 1983 — "flag day" — every computer on ARPANET switched to TCP/IP simultaneously. Many historians call this the true birth of the Internet: the moment when different networks could, for the first time, speak the same language.

FLAGBOT: One protocol. Every network, same language.

ARPANET did not just connect computers. It established a principle: build a universal protocol, and any network can join. TCP/IP did not care what kind of computer you had. It only cared that you spoke the same language. That openness is why the Internet scaled to billions of devices.

Software was portable. Computers were personal. Networks were growing. But a new threat was emerging: software was becoming a product — locked behind licenses, hidden from the people who used it. One programmer at MIT decided this was a moral crisis.
The printer jam that sparked a revolution in software freedom
JUDGEBOT: Should you have the right to look inside your tools?
FREEDOMBOT: Stallman said yes — and dedicated his life to it.

Richard Stallman (born 1953, New York) had been a programmer at MIT's AI Laboratory since 1971. In the AI Lab of the 1970s, software was freely shared. If a program had a bug, anyone could fix it. The code was a commons.

Stallman confronts proprietary software — ACCESS DENIED

Stallman had modified a Xerox printer's software to notify everyone when a jam occurred. When MIT received a newer printer, he asked for its source code. Xerox refused — the code was now proprietary. "It was my first direct encounter with a nondisclosure agreement," Stallman later said, "and it taught me that nondisclosure agreements have victims."

On September 27, 1983, Stallman announced the GNU Project (GNU's Not Unix) — a complete, free operating system compatible with Unix. In 1985, he published the GNU Manifesto and founded the Free Software Foundation.

The Four Freedoms of Software
0. Run: use the program for any purpose
1. Study & Modify: read and change the source code
2. Redistribute: share copies with anyone
3. Improve & Share: distribute your modified versions
"Free as in freedom, not free as in beer." You CAN sell free software — the GPL permits it. What you CANNOT do is hide the source code. In 1989, the GNU General Public License (GPL) introduced copyleft: using copyright law to keep software free forever.

The Four Freedoms — the foundation of the free software movement

By the early 1990s, the GNU project had produced essential tools — the GCC compiler, the Emacs editor, the bash shell — but lacked a kernel. That missing piece would arrive in 1991, from a Finnish college student named Linus Torvalds. But that is a story for Issue 5.

Stallman's insight was this: software freedom is not a technical issue. It is a moral issue. If you cannot read, modify, and share the tools you depend on, you are not truly free. This idea would reshape the entire software industry.

FREEBOT: Cannot read the code? Not truly free.
Software was being set free. But there was another kind of freedom still missing. For most people, using a computer still meant memorizing obscure commands. What if, instead, you could just point at what you wanted?
The Xerox Alto (1973) — windows, icons, menus, and a mouse

Xerox PARC (Palo Alto Research Center), established in 1970, was building the future. By the mid-1970s, PARC had invented the graphical user interface, Ethernet, laser printing, WYSIWYG editing, and object-oriented programming. Their Alto (1973) integrated all of it — windows, icons, menus, and a mouse.

ARTISTBOT: Before the GUI, computers meant memorizing commands.
RAINBOWBOT: The GUI meant point at things instead of typing.
Steve Jobs at Xerox PARC, December 1979 — 'Why aren't you DOING anything with this?!'

In December 1979, Steve Jobs visited PARC. Researcher Larry Tesler demonstrated the Alto. Jobs was electrified: "Why aren't you doing anything with this? This is the greatest thing! This is revolutionary!" Not everyone agreed — Adele Goldberg, a key developer of Smalltalk, objected to giving away PARC's crown jewels. She was overruled by management.

The Apple Macintosh, January 24, 1984
Windows 1.0, November 1985 — slow, limited, but the beginning

The Apple Macintosh launched January 24, 1984 at $2,495, with the legendary "1984" commercial by Ridley Scott. Apple didn't just copy PARC — they added the menu bar, drag-and-drop, and a simplified single-button mouse. Microsoft followed with Windows 1.0 in November 1985 — clunky and limited. It wasn't until Windows 3.0 (1990) that their GUI succeeded commercially.

Engelbart 1968 → Xerox Alto 1973 → Xerox Star ($16k) 1981 → Apple Lisa ($10k) 1983 → Macintosh 1984 → Windows 1.0 1985 → Windows 3.0 1990. Engelbart → PARC → Apple → Microsoft: each built on what came before.

The GUI lineage — no single entity invented the graphical interface

The GUI did not make computers more powerful. It made computers more accessible. And accessibility is a kind of power — the power to bring millions of new minds into the computing revolution.

Think About It: Every interface is a tradeoff. Command lines are fast and precise for experts. GUIs are intuitive for everyone. Today we are asking the same question again: is talking to an AI assistant the next interface revolution? What will that make possible?

MOUSEBOT: Click or type? Every interface is a tradeoff.
Step back and look at where we are. In just two decades, software became portable. Computers became personal. Networks began connecting them. Interfaces became visual. And a movement declared that software should be free. But there is one enormous problem still unsolved...
The late 1980s — brilliant, powerful, personal computers, all isolated from each other
REFLECTBOT: Unix gave us a philosophy: small tools, composed together.

Unix and C had proven that software did not have to be trapped on one machine. The Unix philosophy — small tools, piped together, doing one thing well — had shown that the most powerful systems are built from simple, composable parts. (Hold on to that idea. It will matter again, profoundly, in our final chapters.)

TRAVELBOT: C, portability. PCs, independence. GUIs, accessibility.

Personal computers had moved computing from the institution to the individual. The GUI opened the door for everyone. Richard Stallman and the free software movement declared that code should belong to everyone. And ARPANET had quietly evolved into something with the potential to connect all of it.

GLASSBOT: All these machines are islands. Brilliant, lonely islands.

But in 1989, the vast majority of personal computers were still isolated. You could write a document, play a game, run a spreadsheet — but sharing anything with another human being meant copying it to a floppy disk and physically carrying it across the room.

The State of Computing, ~1989
Unix/C: portable, composable software
PCs: computing for individuals
TCP/IP: universal network language
The GUI: accessible interfaces
Free Software: code that belongs to everyone
The missing piece: the connection that ties it all together → Issue 5

Five pillars of the computing landscape — but the connection between them is still missing

Humans had built brilliant, powerful personal machines. They had built a network that could, in theory, connect them all. But the two revolutions had not yet merged. It would take a physicist at a European research lab, trying to solve a very mundane problem — sharing documents with his colleagues — to connect the final wire.

The 1970s and 1980s were an era of liberation. Software was freed from hardware. Computing was freed from institutions. Users were freed from expertise. Code was freed from ownership. But the final liberation — connecting every machine and every person — was still waiting. That story begins next.

ISLANDBOT: All the pieces exist. They just need connecting.
Next issue: "The Connected World" — A CERN physicist invents the Web, a Finnish student gives away an operating system, and suddenly all the world's knowledge — and code — is connected. Everything changes. Again.