By the early 1950s, the machines from Issue 2 existed: rooms full of humming metal and blinking lights, capable of thousands of calculations per second.
There was just one problem: talking to them was a nightmare.
These machines spoke only one language: machine code, raw sequences of binary digits. To tell the machine "add these two numbers," you wrote something like 00010110 01010000 01100001. And you had to get every single bit right.
Programming was translation work. Humans think in words and concepts. Machines think in electrical signals: on or off, 1 or 0. Somebody had to bridge that gap.
Just how bad was programming in binary? Let's look at what it actually took to tell an early computer to do one simple thing.
Talking in Ones and Zeros
WHAT YOU WANT:
"Add the values in slots 5 and 6, store in slot 7."
IN MACHINE CODE:
LOAD slot 5 → register A: 0001 0000 0101
LOAD slot 6 → register B: 0001 0001 0110
ADD register A + B: 0010 0000 0001
STORE register A → slot 7: 0011 0000 0111
WHAT THE PROGRAMMER TYPED:
000100000101
000100010110
001000000001
001100000111
→ One wrong digit = wrong instruction.
→ No labels, no names, no hints.
→ Every address tracked by hand, on paper.
Programmers spent up to 90% of their time finding and fixing errors, not writing new logic. The machine was fast, but the humans were the bottleneck.
Machine code is the language computers actually speak: raw binary instructions. It is precise, unforgiving, and almost impossible for humans to read. The entire history of programming languages is the story of making machines meet us halfway.
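To make the binary panel above concrete, here is a minimal Python sketch of decoding those bit patterns, assuming a hypothetical 12-bit layout (4-bit opcode, 4-bit register, 4-bit operand); the opcode values are illustrative, not any real machine's:

```python
# Decode the toy 12-bit instructions from the panel above.
# Assumed layout (illustrative): 4-bit opcode | 4-bit register | 4-bit operand.
OPCODES = {0b0001: "LOAD", 0b0010: "ADD", 0b0011: "STORE"}

def decode(bits):
    """Turn a 12-character binary string into a readable instruction."""
    word = int(bits, 2)
    op = OPCODES[(word >> 8) & 0xF]   # top 4 bits name the operation
    reg = (word >> 4) & 0xF           # middle 4 bits: a register
    operand = word & 0xF              # bottom 4 bits: address or register
    if op == "ADD":                   # for ADD, the operand is a second register
        return f"ADD R{reg}, R{operand}"
    return f"{op} R{reg}, [{operand}]"

for line in ["000100000101", "000100010110", "001000000001", "001100000111"]:
    print(decode(line))
```

Notice that one flipped bit silently changes the opcode, register, or address, which is exactly why hand-written binary was so error-prone.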
Think About It: Think about how you give directions to a friend versus to a GPS. You'd say "turn left at the coffee shop." The GPS needs exact coordinates. Machine code is like giving directions in latitude and longitude. What would you invent to make it easier?
A Thin Veneer of Readability
By the late 1940s, programmers created shorthand. Instead of binary codes, they assigned short mnemonics: tiny abbreviations a human could remember. ADD instead of 0010. LOAD instead of 0001.
This was assembly language. A simple program called an assembler translated these mnemonics back into binary. One line of assembly = one machine instruction.
THE SAME TASK AT THREE LEVELS:
ENGLISH:
"Add the values in slots 5 and 6, store in slot 7."
ASSEMBLY LANGUAGE:
LOAD R0, [5] ; Load value from slot 5
LOAD R1, [6] ; Load value from slot 6
ADD R0, R1 ; Add them together
STORE R0, [7] ; Store result in slot 7
MACHINE CODE:
000100000101
000100010110
001000000001
001100000111
→ Assembly is MORE READABLE, but still one
assembly line per machine instruction.
But assembly had serious limitations: it was machine-specific (code for one computer was gibberish to another), and still tedious; even simple tasks required hundreds of instructions.
Assembly was a Band-Aid on a broken arm. The real fix would require a much bolder idea.
Assembly language was the first abstraction over machine code, replacing binary codes with human-readable mnemonics. It introduced a crucial idea: a program (the assembler) that translates human-friendly notation into machine-friendly binary.
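The translation an assembler performs is mechanical enough to fit in a few lines. A minimal Python sketch, reusing the toy 12-bit format from the panels above (the opcode values and syntax are illustrative):

```python
# A toy assembler: mnemonics in, binary out.
# Assumed layout (illustrative): 4-bit opcode | 4-bit register | 4-bit operand.
MNEMONICS = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}

def assemble(line):
    """'LOAD R0, [5]' -> '000100000101' (one line = one instruction)."""
    op, rest = line.split(maxsplit=1)
    reg, operand = (field.strip(" R[]") for field in rest.split(","))
    word = (MNEMONICS[op] << 8) | (int(reg) << 4) | int(operand)
    return format(word, "012b")

for line in ["LOAD R0, [5]", "LOAD R1, [6]", "ADD R0, R1", "STORE R0, [7]"]:
    print(assemble(line))
```

One line of source becomes exactly one machine word, which is why assembly was more readable but no less tedious.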
What if a computer could understand something closer to actual English? One woman thought it could. Almost nobody believed her.
The Woman Who Taught Machines to Read
Grace Brewster Murray Hopper (1906–1992)
Education: PhD in Mathematics, Yale University (1934)
Rank: Retired as Rear Admiral, US Navy
Known for: Pioneer of compilers; coined the term "compiler"; wrote A-0 (1952) and FLOW-MATIC; major influence on COBOL; popularized "debugging"
Famous quote: "It's easier to ask forgiveness than it is to get permission."
In 1952, Hopper wrote what is widely regarded as the first compiler: a program called A-0 that translated mathematical notation into machine code. Instead of a human translating each instruction by hand, a program would do it automatically.
"I had a running compiler," Hopper later recalled, "and nobody would touch it. They told me a computer could only do arithmetic."
Hopper kept building. A-0 led to A-2, then to B-0 (FLOW-MATIC), which could understand commands like: MULTIPLY PRICE BY QUANTITY GIVING TOTAL.
Hopper was also famous for handing out 11.8-inch pieces of wire during lectures (the distance light travels in one nanosecond) to make abstract computing concepts tangible.
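The length of that wire is easy to check. A back-of-envelope calculation in Python, using the vacuum speed of light:

```python
# How far does light travel in one nanosecond?
C = 299_792_458           # speed of light, metres per second
metres = C * 1e-9         # distance covered in one billionth of a second
inches = metres / 0.0254  # 1 inch = 2.54 cm
print(f"{inches:.1f} inches")  # ~11.8
```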
A compiler is a program that translates human-readable code into machine code. Hopper's insight was profound: if a computer can manipulate symbols, it can manipulate words just as easily as numbers. The compiler is the bridge between human thought and machine execution, and it's still how most software is built today.
The Language of Science
John Warner Backus (1924–2007)
Role: Lead designer of FORTRAN at IBM
Known for: FORTRAN (1957), Backus-Naur Form (BNF) for describing language grammars
Fun fact: Backus flunked out of the University of Virginia, found programming almost by accident, and won the Turing Award in 1977.
THE ABSTRACTION LEAP:
THE MATH: y = ax + b
FORTRAN (1957): Y = A*X + B
ASSEMBLY (IBM 704):
CLA A ; Clear and load A
FMP X ; Floating-point multiply by X
FAD B ; Floating-point add B
STO Y ; Store result in Y
MACHINE CODE:
0101 0000 0000 1010 0110 ...
FORTRAN: 1 line
Assembly: 4 lines
Machine code: dozens of bits per instruction
That's the power of a HIGH-LEVEL LANGUAGE.
John Backus hated the tedium of programming. In 1953, he proposed to IBM: build a language where scientists could write formulas that looked like actual math. FORTRAN (FORmula TRANslation) was released in April 1957.
Skeptics said no automatic translator could match hand-optimized code. Backus's team proved them wrong: FORTRAN programs ran within 10-20% of hand-written assembly speed. Within a year of release, over half of all new IBM code was in FORTRAN.
FORTRAN proved that raising the level of abstraction doesn't have to mean sacrificing performance. One line of FORTRAN could generate dozens of machine instructions, and the compiler made them nearly as fast as hand-written code. FORTRAN is still in use today, over 65 years later, in scientific computing and weather modeling.
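A real FORTRAN compiler handled arbitrary formulas; the core idea can still be sketched. A deliberately tiny Python translator, hard-wired to the single statement shape `Y = A*X + B` and emitting the IBM 704-style sequence from the panel above (everything here is illustrative, not the real compiler's method):

```python
import re

def compile_stmt(stmt):
    """Translate 'dest = a*x + b' into a toy 704-style instruction list."""
    m = re.fullmatch(r"(\w+)\s*=\s*(\w+)\s*\*\s*(\w+)\s*\+\s*(\w+)", stmt.strip())
    dest, a, x, b = m.groups()
    return [
        f"CLA {a}",    # clear accumulator and load a
        f"FMP {x}",    # floating-point multiply by x
        f"FAD {b}",    # floating-point add b
        f"STO {dest}", # store the result
    ]

for instr in compile_stmt("Y = A*X + B"):
    print(instr)
```

The hard part Backus's team solved was not this translation but making the generated code nearly as fast as a human's.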
Programming in Almost-English
In May 1959, the CODASYL committee at the Pentagon (organized by Mary Hawes and including key contributors like Jean Sammet of IBM) designed a single language for business computing. The result was COBOL, the COmmon Business-Oriented Language. Hopper's FLOW-MATIC was a primary influence.
COBOL was designed so that programs would be readable by managers, not just programmers.
Critics called COBOL verbose and inelegant. But that verbosity was the point: COBOL was designed to be clear, not clever.
A critical factor in COBOL's rise: the U.S. Department of Defense told computer manufacturers that to sell machines to the government (the world's largest computer buyer) they had to provide a COBOL compiler. This mandate drove COBOL from specification to dominance virtually overnight.
By the mid-1960s, COBOL was the most widely used language in the world. By some estimates, there are still over 200 billion lines of COBOL in active production, processing an estimated 95% of ATM transactions.
COBOL proved that programming languages don't have to look like math โ they can look like English. By making code readable to non-specialists, COBOL opened computing to an entirely new audience: business professionals.
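The English-like style is not magic; a program just parses the sentence and acts on it. A toy Python sketch that executes one FLOW-MATIC-style sentence (the sentence shape and sample values are illustrative assumptions, not real COBOL grammar):

```python
# Execute a sentence of the form: MULTIPLY <a> BY <b> GIVING <result>.
def run_sentence(sentence, env):
    verb, a, _by, b, _giving, dest = sentence.rstrip(".").split()
    ops = {"MULTIPLY": lambda x, y: x * y}  # illustrative; real COBOL has many verbs
    env[dest] = ops[verb](env[a], env[b])

env = {"PRICE": 5, "QUANTITY": 3}
run_sentence("MULTIPLY PRICE BY QUANTITY GIVING TOTAL", env)
print(env["TOTAL"])  # 15
```

A manager can read the sentence; the machine still gets an exact instruction. That was the whole bet.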
Think About It: COBOL was designed to be readable, FORTRAN to be mathematical. Today we have hundreds of languages, each with a different personality. Why haven't we settled on just one? What's the advantage of having many languages, each for different purposes?
The Abstraction Ladder
Abstraction means hiding complexity behind a simpler interface. You don't need to understand how an engine works to drive a car. Each layer of programming technology abstracts away the details beneath it.
The essential insight: no rung replaces the ones below it. Your browser still runs on machine code. The abstraction ladder is not a replacement chain; it's a stack.
Abstraction is the superpower of computing. Each layer hides complexity and lets humans think at a higher level. But no layer works alone โ they all rest on the layers beneath. The entire history of programming is building this ladder, one rung at a time.
The Language That Dreams of Thinking
John McCarthy (1927–2011)
Role: Mathematician, computer scientist, founder of AI as a field
Known for: Coined "artificial intelligence" (1955), created LISP (1958), Dartmouth Conference (1956), Turing Award (1971)
Fun fact: McCarthy also invented the concept of computer time-sharing, anticipating cloud computing by decades.
In the summer of 1956, McCarthy organized a workshop at Dartmouth College: the Dartmouth Summer Research Project on Artificial Intelligence. The gathering was informal; researchers came and went over the summer, with no single eureka moment. But it gave the field its name and brought together the people who would define it.
In 1958, McCarthy unveiled LISP (LISt Processing). Where FORTRAN dealt with numbers and COBOL dealt with records, LISP dealt with symbols and lists. It could manipulate code as data, and it introduced garbage collection, dynamic typing, and recursive functions: ideas decades ahead of their time.
Here's a remarkable twist: when McCarthy published LISP's theoretical definition, he intended it as pure mathematics, not something to actually build. His graduate student Steve Russell looked at it and said, "Why don't I just program this?" McCarthy was skeptical. Russell built it anyway. Theory became working software.
; LISP: "Add 2 and 3"
(+ 2 3) ; → 5
; Define a function to square a number
(defun square (x)
  (* x x))
; Square the number 7
(square 7) ; → 49
; Apply 'square' to every item in a list
(mapcar #'square '(1 2 3 4 5))
; → (1 4 9 16 25)
; THE LAMBDA CALCULUS CONNECTION:
; Church (1936): λx. x * x
; McCarthy (1958): (lambda (x) (* x x))
; Theory became practice. Math became software.
LISP was the first language designed for reasoning, not calculation. It treated code as data, introduced ideas decades ahead of its time, and drew directly from Church's lambda calculus. LISP proved that a programming language could be designed around what we want computers to learn to do, not just what they already do.
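Those ideas survive almost unchanged in today's languages. A rough Python translation of the LISP panel above:

```python
# (lambda (x) (* x x)) in Python:
square = lambda x: x * x

print(2 + 3)                               # (+ 2 3)      -> 5
print(square(7))                           # (square 7)   -> 49
print(list(map(square, [1, 2, 3, 4, 5])))  # like mapcar  -> [1, 4, 9, 16, 25]
```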
The First of Her Kind
Sister Mary Kenneth Keller, BVM (1913–1985)
Education: PhD in Computer Science, University of Wisconsin–Madison (1965)
Known for: One of the first two people to earn a CS PhD in the US, contributed to BASIC, founded and chaired the CS department at Clarke College for twenty years
Lifelong mission: Making computing accessible to everyone
Keller worked at Dartmouth College, which at the time did not normally admit women; an exception was made for her in recognition of her talent. There she contributed to the development of BASIC (Beginner's All-purpose Symbolic Instruction Code), created by Kemeny and Kurtz in 1964. BASIC was designed to give students who were not science majors a way to use computers.
10 PRINT "WHAT IS YOUR NAME?"
20 INPUT N$
30 PRINT "HELLO, "; N$; "!"
40 PRINT "LET'S DO SOME MATH."
50 PRINT "WHAT IS 7 + 5?"
60 INPUT A
70 IF A = 12 THEN PRINT "CORRECT!"
80 END
BASIC: designed so a college freshman with
NO technical background could write a program
on their first day. The language that brought
computing to the people.
BASIC was the language that would later power the personal computer revolution of the 1970s and 1980s, putting programming within reach of millions.
Keller once said: "We're having an information explosion, and it's certainly obvious that information is of no use unless it's available."
Computing has always been shaped by people who believed it should be for everyone. Hopper wanted programmers who weren't mathematicians. COBOL was for business managers. BASIC was for students. Keller dedicated her career to this principle: the power of computers means nothing if it's locked away from the people who need it.
Think About It: In the 1960s, most programmers were men with advanced degrees. Keller believed computing should be accessible to all. Today there are hundreds of "learn to code" platforms. Is programming a basic skill like reading, or a specialized trade? What changes when MORE people can talk to machines?
One Machine Is Not Enough
By the late 1960s, scientists wrote in FORTRAN, businesses ran on COBOL, AI researchers dreamed in LISP, students learned in BASIC. Meanwhile, an international committee designed ALGOL, a language that was never commercially dominant but whose innovations (block structure, formal grammar, lexical scoping) would influence virtually every programming language that followed, including C, Java, and Python.
But every program ran on one machine. Code for an IBM was gibberish to a UNIVAC. Two revolutions were coming: an operating system called Unix, built on the philosophy that small tools should chain together for complex tasks, and a programming language called C that would make software portable for the first time.
The 1950s and 1960s gave us the fundamental insight that still drives all of software: human thought and machine execution are different, and the gap between them can be bridged by layers of translation. Compilers, high-level languages, and abstraction turned an impossible task (writing in binary) into something anyone can learn.
Think About It: Every language in this issue was born from frustration. Machine code was too hard → assembly. Assembly too tedious → FORTRAN. FORTRAN didn't speak English → COBOL. None could reason → LISP. What frustration will drive the NEXT generation of programming? What's still too hard about telling computers what to do?
Next: two engineers at Bell Labs build an operating system in a closet-sized room, and create a philosophy that still drives software fifty-five years later. Issue 4: "Small Tools, Big Ideas."