From Turing to LLMs and Beyond · Issue 4 of 10
Issue 4 · 1970s–1980s

Small Tools, Big Ideas

← Previous Issue: Teaching Machines to Understand Us

Every Program Was Trapped on One Machine
In the late 1960s, every program was written for one specific machine. Move to a new computer? Start over from scratch. Software was a prisoner of its own hardware. Tera

It was 1969. Humans had just walked on the moon. But back on Earth, the world of computing was fractured.

Every operating system was hand-stitched for one machine, like a suit that only fits one person. Write a program on an IBM mainframe, and it could never run on a DEC minicomputer. Write it on a PDP-7, and it was trapped there forever.

Software was powerful — but it was chained to the hardware it was born on.

That was about to change. In a quiet lab in New Jersey, two programmers were about to build something small, elegant, and radical. An idea — a philosophy that would reshape all of computing. They called it Unix.

[Diagram: IBM mainframe (IBM programs only) · DEC PDP-7 (DEC programs only) · Honeywell H-200 (Honeywell programs only). Every program was trapped. No sharing. No portability.]
Before Unix, software was trapped. One program, one machine. The story of the 1970s and 1980s is the story of setting software — and people — free.

Where did this revolution begin? Not in a Silicon Valley garage. In a telephone company’s research lab — the most productive square footage in the history of science.

Bell Labs — The Most Productive Research Lab in History
Nine Nobel Prizes. The transistor. Information theory. The laser. Unix. The C programming language. All from one building. What was in the water at this place? Tera

Bell Telephone Laboratories was the research arm of AT&T, the American telephone monopoly. It operated from 1925 onward, and its output was staggering.

The transistor? Bell Labs, 1947. Information theory — the mathematical foundation of all digital communication? Claude Shannon at Bell Labs, 1948. The laser? Bell Labs. Satellite communications? Bell Labs.

AT&T was a regulated monopoly. It made so much money from telephone service that it could afford to fund pure research with no expectation of immediate profit. Bell Labs management had a radical philosophy: hire brilliant people, give them freedom, and trust that useful things will emerge.

Lorinda Cherry, a programmer in the mathematics group, built critical text-processing tools (including deroff and parts of eqn) that helped Unix spread through Bell Labs’ patent department — one of the earliest examples of Unix proving its value in real work.

In the late 1960s, a corner of this extraordinary environment — Department 1127, the Computing Techniques Research group led by Doug McIlroy — became the birthplace of Unix.

[Infographic: Bell Labs by the Numbers. 9 Nobel Prizes · the transistor (1947) · information theory (Shannon, 1948) · the laser (1958) · Unix (1969) · the C language (1971–73) · C++ (1979) · and more: satellites, radio. Murray Hill, NJ — “the most productive square footage in the history of science.”]
Bell Labs proved something profound: when you give brilliant people freedom and surround them with other brilliant people, the results can change the world. Not in spite of having no immediate commercial goal — because of it.

Two of those brilliant people were about to do something nobody asked them to do. One of them had just gotten three weeks of uninterrupted time — because his wife took the baby to visit her parents.

Ken Thompson, Dennis Ritchie, and the Birth of Unix
Meet the two people whose work runs inside nearly every device you have ever touched. You have probably never heard of them. Tera

Ken Thompson (born 1943, New Orleans) joined Bell Labs in 1966 with degrees from UC Berkeley. He was a quiet, unassuming hacker — the kind of programmer who would think about a problem for days, then write the solution in a single explosive burst of coding. He was also a passionate chess player who later built a computer that achieved Master-level rating.

Dennis Ritchie (1941–2011, Bronxville, New York) joined Bell Labs in 1967 with a degree in physics and applied mathematics from Harvard. Where Thompson was the rapid-fire builder, Ritchie was the deliberate designer — modest, self-effacing, and someone who expressed himself more clearly in writing than in speech.

Both had worked on Multics, an ambitious time-sharing operating system built jointly by Bell Labs, MIT, and General Electric. But Multics was over-engineered and behind schedule. Bell Labs pulled out in early 1969.

Then fate intervened. In the summer of 1969, Thompson’s wife Bonnie took their infant son to visit her parents in California. Suddenly, Thompson had uninterrupted time and a barely-used PDP-7 minicomputer sitting in the lab.

In roughly three weeks, he wrote an operating system kernel, a shell, an editor, and an assembler. He later described the allocation as “one week, one week, one week.” The system — originally called Unics (a pun on Multics, widely credited to Brian Kernighan) and later respelled Unix — would go on to underpin essentially all of modern computing.

[Timeline: 1964, Multics begins · 1969, Bell Labs pulls out · Summer ’69, Unix written on the PDP-7 in ~3 weeks · 1970, Unix moves to the PDP-11.]
Think About It: Thompson stripped Multics down to its essentials. Multics tried to do everything; Unix did a few things, simply and well. Have you ever noticed that the most useful tools in your life are often the simplest ones?

Thompson had built a working system. But it was written in assembly language — locked to one machine. To truly set software free, they needed a new kind of language. And they needed a philosophy.

The Unix Philosophy — Do One Thing Well, Pipe It Together
Imagine you have three workers. One sorts cards. One removes duplicates. One counts what is left. Alone, each does one small job. But connect them with a conveyor belt, and you have a system. That conveyor belt is a Unix pipe. Tera

Doug McIlroy, the head of the department where Unix was born, had been thinking about an idea since the early 1960s. In a 1964 memo, he wrote: “We should have some ways of coupling programs like garden hose — screw in another segment when it becomes necessary to massage data in another way.”

For nearly a decade, the idea sat waiting. Then in 1973, Thompson implemented it. The pipe — represented by the | symbol — allowed the output of one program to flow directly into the input of another.

The Unix philosophy crystallized around this mechanism. McIlroy articulated it most famously:

“Write programs that do one thing and do it well. Write programs to work together. Write programs to handle text streams, because that is a universal interface.”
sort names.txt | uniq | wc -l

Don’t build a Swiss Army knife. Build a knife, a screwdriver, and a can opener — and make them work together. The power is in the composition, not in any single tool. Remember this idea. It will come back 55 years later, in a very surprising way.

Unix had a philosophy. It had pipes. But it was still written in assembly — still chained to one machine. To break that chain, Dennis Ritchie was building something: a new programming language named after a letter of the alphabet.

The C Language — Portable, Powerful, Dangerous
Before C, writing an operating system was like writing a novel in a language only one person speaks. C was the Rosetta Stone — write it once, and any machine can read it. Tera

Thompson’s B language (1969–70), derived from Martin Richards’s BCPL (Cambridge, 1966), worked on the word-addressed PDP-7. But when Unix moved to the byte-addressed PDP-11, B’s lack of a type system became a problem.

Between 1971 and 1973, Dennis Ritchie transformed B into something new. He added a type system, structures, and direct compilation to machine code. The result was C — named simply because it came after B.

C occupied a unique position. It was high-level enough that humans could read it. But low-level enough to replace assembly language. It gave you pointers — direct addresses into memory, like GPS coordinates to specific locations in the machine. Powerful, yes. But also dangerous.

Then came the breakthrough. In 1973, Thompson and Ritchie rewrote Unix in C. For the first time in history, a production operating system was written in a high-level language. Portability: write once, compile anywhere.

main() { printf("hello, world\n"); }
The most famous first program in history. From K&R, 1978.
[Diagram: one C source file compiles to PDP-11 machine code, VAX machine code, and Intel machine code. One recipe, many kitchens.]
C’s gift was portability. Before C, every operating system was a custom suit for one body. After C, an operating system was a sewing pattern — take it to any fabric store, cut and stitch, and it fits. This single idea made the software explosion possible.
Think About It: C trusts the programmer completely — it will let you do anything, including things that crash the machine. Is it better for a tool to be safe or powerful? What are the tradeoffs?

Unix and C had set software free from specific hardware. But computing itself was still locked away in universities and corporations. To reach ordinary people, computers needed to escape the lab — and fit on a desk.

The Personal Computer — From Altair to Apple to IBM PC
In 1974, computers filled rooms and cost millions. By 1981, you could buy one for your desk. In just seven years, computing went from ‘institution’ to ‘individual.’ One of the fastest revolutions in human history. Tera

The January 1975 cover of Popular Electronics featured the MITS Altair 8800 — a kit computer based on the Intel 8080 processor, sold for $439. It had no keyboard, no screen, and was programmed by flipping toggle switches. It was nearly useless.

It changed everything.

The Altair ignited a hobbyist firestorm. At the Homebrew Computer Club in Menlo Park, California, engineers and enthusiasts gathered to share designs and swap ideas.

Bill Gates and Paul Allen wrote a BASIC interpreter for the Altair on a Harvard PDP-10, using a homemade emulator — they had never actually touched an Altair. When Allen flew to Albuquerque to demonstrate it, the software ran on real hardware for the first time. It worked. They founded Microsoft in April 1975.

Steve Wozniak designed the Apple I single-handedly and gave away the schematics. His friend Steve Jobs convinced him the designs could be sold. They founded Apple Computer on April 1, 1976.

The Apple II (1977) — with the VisiCalc spreadsheet (1979), the first “killer app” — made personal computing real for businesses.

In February 1976, Gates wrote his famous “Open Letter to Hobbyists,” complaining that hobbyists were copying Altair BASIC without paying. He argued that software was intellectual property deserving compensation. The debate between free sharing and paid software had begun — and it is still going today.

Then IBM arrived. On August 12, 1981, the IBM PC launched with an open architecture and off-the-shelf components. Microsoft supplied MS-DOS — which it had not written itself, but purchased from a small company called Seattle Computer Products and adapted. Crucially, Microsoft retained the right to sell DOS to other manufacturers. This single licensing decision would make Microsoft one of the most valuable companies on Earth.

[Timeline: 1975, Altair 8800 · 1976, Apple I · 1977, Apple II · 1979, VisiCalc · 1981, IBM PC.]
The personal computer did not just shrink computing. It changed who could compute. Power shifted from institutions to individuals. From priests of the mainframe to anyone with a desk and a dream.

Computers were personal now. But they were islands — each one isolated, unable to talk to any other. Quietly, in the background, a network had been growing since 1969. And its first message was an accident.

ARPANET and TCP/IP — The Internet’s Quiet Beginning
The first message ever sent over the network that became the Internet was supposed to be ‘LOGIN.’ The system crashed after two letters. So the first Internet message was ‘LO’ — as in, lo and behold. You cannot make this stuff up. Tera

On October 29, 1969, a UCLA graduate student named Charley Kline sat at a terminal and tried to send the word “LOGIN” to a computer at Stanford Research Institute, 350 miles away. He typed “L.” Confirmed. “O.” Confirmed. “G.” The system crashed. The first message sent over ARPANET was “LO.”

A common myth says ARPANET was built to survive nuclear war. It was not. It was built to let researchers share expensive computing resources remotely. But its underlying technology — packet switching, which breaks messages into small pieces that travel independently and reassemble at their destination — did give it inherent resilience.

Vint Cerf (hearing-impaired since childhood, which fueled his early interest in text-based electronic communication) and Bob Kahn solved the problem of incompatible networks. In May 1974, they published the design for TCP/IP: a universal set of rules for breaking messages into packets, addressing them, routing them through any network, and reassembling them. Meanwhile, Elizabeth “Jake” Feinler ran the Network Information Center at SRI, managing ARPANET’s first directory and host naming system, the essential infrastructure that made the growing network navigable.

On January 1, 1983 — “flag day” — every computer on ARPANET switched to TCP/IP simultaneously. Many historians call this the true birth of the Internet.

[Diagram: Packet Switching: The Internet Is a Postal System. “HELLO” is split into packets; each takes a different route through the network and arrives out of order (L, O, H, E, L), then is reassembled into “HELLO”.]
ARPANET did not just connect computers. It established a principle: build a universal protocol, and any network can join. TCP/IP did not care what kind of computer you had. It only cared that you spoke the same language. That openness is why the Internet scaled to billions of devices.

Software was portable. Computers were personal. Networks were growing. But a new threat was emerging: software was becoming a product — locked behind licenses, hidden from the people who used it. One programmer at MIT decided this was a moral crisis.

Richard Stallman and Free Software — “Free as in Freedom”
Here is a question that still divides people today: if someone gives you a tool, should you have the right to look inside it, understand how it works, and improve it? Richard Stallman said yes — and he dedicated his life to that answer. Tera

Richard Stallman (born 1953, New York) had been a programmer at MIT’s Artificial Intelligence Laboratory since 1971. In the AI Lab of the 1970s, software was freely shared. If a program had a bug, anyone could fix it.

Then, around 1980, that culture began to erode. Stallman experienced this through a specific, infuriating incident: a Xerox printer that jammed constantly. He had modified the old printer’s software to notify users. When a newer printer arrived, Xerox refused to share the source code. A professor at Carnegie Mellon who had the code also refused — he had signed a non-disclosure agreement. “It was my first direct encounter with a nondisclosure agreement,” Stallman later said, “and it taught me that nondisclosure agreements have victims.” It was not just a corporation being proprietary. It was the collapse of a community.

On September 27, 1983, Stallman announced the GNU Project (GNU’s Not Unix). His goal: build a complete, free operating system compatible with Unix.

In 1985, he published the GNU Manifesto and founded the Free Software Foundation. He defined software freedom through four freedoms:

The Four Freedoms of Free Software:
0. Run the program for any purpose (use it however you want)
1. Study and modify the source code (look under the hood)
2. Redistribute copies (share with friends)
3. Distribute your modified versions (share improvements)
“Free as in freedom, not free as in beer.”

In 1989, the GNU General Public License (GPL) introduced copyleft — a legal mechanism that requires any modified version of free software to remain free. It used copyright law to achieve the opposite of copyright’s typical purpose.

By the early 1990s, the GNU project had produced essential tools — the GCC compiler, the Emacs editor, the bash shell — but lacked a kernel. That missing piece would arrive in 1991, from a Finnish college student named Linus Torvalds. But that is a story for Issue 5.

Stallman’s insight was this: software freedom is not a technical issue. It is a moral issue. If you cannot read, modify, and share the tools you depend on, you are not truly free. This idea would reshape the entire software industry.

Software was being set free. But there was another kind of freedom still missing. For most people, using a computer still meant memorizing obscure commands. What if, instead, you could just point at what you wanted?

The GUI Revolution — Xerox PARC, Mac, Windows
Before the GUI, using a computer was like texting a very literal friend — you had to type exact commands and remember precise syntax. The graphical user interface changed computing from ‘type commands’ to ‘point at things.’ It brought computers to everyone. Tera

Xerox PARC (Palo Alto Research Center), established in 1970, was building the future and barely knew it. By the mid-1970s, PARC researchers had invented the graphical user interface, the desktop metaphor, Ethernet, laser printing, and WYSIWYG text editing. Their Alto computer (1973) integrated all of these innovations.

Xerox tried to commercialize its own research with the Xerox Star (1981), priced at $16,595 — roughly $55,000 in today’s money. It was a commercial failure.

In December 1979, Steve Jobs visited Xerox PARC. Adele Goldberg, a key developer of the Smalltalk programming environment at PARC, was among those present; she reportedly opposed showing Apple the technology. PARC researcher Larry Tesler demonstrated the Alto’s graphical interface. Jobs was electrified. Tesler later recalled that Jobs kept asking why Xerox was not doing anything with such revolutionary technology.

The result was the Apple Macintosh, launched on January 24, 1984, accompanied by the legendary “1984” television commercial directed by Ridley Scott. But Apple did not merely copy PARC — the Mac team added the menu bar, drag-and-drop, and a single-button mouse designed for first-time users.

Microsoft followed with Windows 1.0 in November 1985 — slow and limited. It was not until Windows 3.0 (1990) that Microsoft’s GUI became commercially successful.

The lineage is clear: Engelbart → Xerox PARC → Apple → Microsoft. No single entity “invented” the GUI. Each built on what came before.

[Timeline: 1968, Engelbart’s demo · 1973, Xerox Alto · 1981, Xerox Star · 1984, Apple Macintosh · 1985, Windows 1.0 · 1990, Windows 3.0.]
The GUI did not make computers more powerful. It made computers more accessible. And accessibility is a kind of power — the power to bring millions of new minds into the computing revolution.
Think About It: Every interface is a tradeoff. Command lines are fast and precise for experts. GUIs are intuitive for everyone. Today we are asking the same question again: is talking to an AI assistant the next interface revolution? What will that make possible?

Step back and look at where we are. In just two decades, software became portable. Computers became personal. Networks began connecting them. Interfaces became visual. And a movement declared that software should be free. But there is one enormous problem still unsolved...

Software Is Free. Machines Are Personal. But They’re Still Isolated...
We have traveled so far in this issue. Unix gave us a philosophy: small tools, composed together. C gave us portability. Personal computers gave us independence. The GUI gave us accessibility. Stallman gave us freedom. But look at all these machines. They are islands. Millions of brilliant, powerful, lonely islands. Tera

By the end of the 1980s, the computing landscape had been utterly transformed.

Unix and C had proven that software did not have to be trapped on one machine. The Unix philosophy — small tools, piped together, doing one thing well — had shown that the most powerful systems are built from simple, composable parts.

Personal computers had moved computing from the institution to the individual.

The GUI had opened the door for everyone, not just programmers and engineers.

Richard Stallman and the free software movement had declared that the code running on those machines should belong to everyone.

And in the background, ARPANET had quietly evolved into something with the potential to connect all of it.

But in 1989, the vast majority of personal computers were still isolated. Sharing anything meant copying it to a floppy disk and physically carrying it across the room.

It would take a physicist at a European research lab, trying to solve a very mundane problem — sharing documents with his colleagues — to connect the final wire.

[Diagram: The State of Computing, ~1989. Unix/C: portable, composable software · PCs: computing for individuals · TCP/IP: a universal network language · the GUI: accessible interfaces · free software: code that belongs to everyone. The missing piece: the connection that ties it all together. That story begins in Issue 5...]
The 1970s and 1980s were an era of liberation. Software was freed from hardware. Computing was freed from institutions. Users were freed from expertise. Code was freed from ownership. But the final liberation — connecting every machine and every person — was still waiting. That story begins next.
Next Issue: A CERN physicist invents the Web, a Finnish student gives away an operating system, and suddenly all the world’s knowledge — and code — is connected. Everything changes. Again.
Issue 5: “The Connected World” →

References & Further Reading