Picture the following age-old scene: a writer sitting at a kitchen table, pretending to work. Set it 40 years ago. The Reagan/Thatcher Conservatives are in power and everything is broken, but our subject is the writer’s stuff. On the table is a typewriter; to one side is a radio, to another is a phone; also in the room are a fridge, an oven, a hot plate, a toaster, a set of car keys and a vacuum cleaner. Now fast-forward to the same scene 40 years later. The Conservatives are in power again and everything is broken again; the room (and perhaps the writer) is a little shinier, but the stuff in the room is more or less the same. At least, it serves the same functions, if you swap laptop for typewriter, mobile for landline, Dyson for Hoover.
One big thing, however, is different. In 1983, that kitchen contained just a handful of transistors, all of which lived in the – there’s a clue in the name – transistor radio. In 2023, every item on that list of domestic objects uses microchips, each made up of thousands, millions or billions of transistors. Ovens, fridges, vacuums, car keys, radios, speakers: all of them now contain microchips. An ordinary car contains dozens of them. A posh car contains a thousand. And those are just the standard consumer items of the mid-20th century. As for the things we think of as being this century’s new technology, they are some of the most complicated and beautiful artefacts humanity has ever made, mainly because of the chips they contain.
This writer’s phone is an iPhone 12, which uses a chip for the modem, a chip to control Bluetooth, a chip to detect motion and orientation, a chip for image sensing, chips for wireless charging and battery management and audio, and a couple of memory chips. All of these are bought by Apple from other companies, and all are simple beasts compared to the principal logic chip in that phone, Apple’s own-designed A14, which contains 11,800,000,000 transistors.
This writer’s laptop, a MacBook Air, uses another “system on a chip,” Apple’s M2. That single chip contains 20,000,000,000 transistors. The laptop contains so many transistors that if the writer travelled back in time to 1983, he could give every single person on the planet a transistor radio and still have a billion of them left over.
If you want a guide to how we got here, you won’t do better than Chris Miller’s comprehensive, eye-opening new book ‘Chip War.’ Insofar as we work, live and think differently from 40 years ago, we do so thanks to the revolutions in economics and communication whose enabling technology is the microchip, which has been both the necessary and the proximate cause of humanity’s pivot to the digital. This process began with the vacuum tube, a lightbulb-like metal filament enclosed in glass. The electric current running through the tube could be switched on and off, performing a function not unlike an abacus bead moving back and forth across a wooden rod. A tube switched on was coded as a 1; a tube switched off was a 0. These two digits can represent any number in a system of binary counting – and could therefore, in theory, carry out many types of computation.
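The on/off encoding described here is simply positional arithmetic in base 2. A minimal sketch (my illustration, not Miller’s) of how a row of tubes encodes a number:

```python
# Each "tube" is one binary digit: on = 1, off = 0.
# A row of tubes, read most significant digit first, encodes a number in base 2.

def tubes_to_number(tubes):
    """Read a row of on/off states as an integer."""
    value = 0
    for state in tubes:          # state is 1 (on) or 0 (off)
        value = value * 2 + state
    return value

def number_to_tubes(n, width):
    """Encode an integer as a fixed-width row of on/off states."""
    return [(n >> i) & 1 for i in range(width - 1, -1, -1)]

print(tubes_to_number([1, 0, 1, 1]))   # 8 + 0 + 2 + 1 = 11
print(number_to_tubes(11, 4))          # [1, 0, 1, 1]
```

Machines like ENIAC did exactly this, only in hardware: each tube held one digit’s state.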
Vacuum tubes could allow systems to be reprogrammed; they could be used repeatedly and flexibly. The tubes made complex computations possible, but they were unwieldy, both prone to breakage and laborious to repair. ENIAC, the US army’s world-leading computer, introduced in 1946, used 18,000 vacuum tubes to calculate artillery trajectories faster and more accurately than any human. That made it revolutionary, but its utility was limited by the fact that it was the size of a room, and that whenever a single tube failed, which happened on average every two days, the whole machine broke down.
The man who improved on the vacuum tube was the London-born American physicist William Shockley. After the war, Shockley was employed at Bell Labs, the research branch of the US telephone monopoly, AT&T. He realized that certain chemical elements could perform the same function of encoding and transmitting 1s and 0s. Conducting materials conduct electricity; non-conducting materials don’t; semiconductors can be made to do either, and that ability to switch between two states makes binary computation possible. Shockley first worked out the theory of semiconduction, then set his colleagues John Bardeen and Walter Brattain to work on a practical device for manipulating electrical current in a semiconductor. On 23 December 1947 they demonstrated the first working transistor. The invention won the three men the Nobel Prize for physics in 1956.
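The claim that two switchable states make all of binary computation possible can be made concrete. Model a transistor, abstractly, as a switch; two switches in series give a NAND gate, and every other logic function – and hence arithmetic – can be built from NAND alone. A sketch (my illustration, not from the book):

```python
# A transistor, abstractly, is a switch: an input decides whether current flows.
# Two such switches in series form a NAND gate, which is functionally complete.

def nand(a, b):
    """Two switches in series: output goes low only when both inputs are on."""
    return 0 if (a and b) else 1

# Every other gate, built from NAND alone:
def not_(a):     return nand(a, a)
def and_(a, b):  return nand(nand(a, b), nand(a, b))
def or_(a, b):   return nand(not_(a), not_(b))
def xor(a, b):   return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    """The first step from switches to arithmetic: add two bits."""
    return xor(a, b), and_(a, b)   # (sum bit, carry bit)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

Chain enough of these and you have an adder, then a processor – which is, at bottom, what the billions of transistors on a modern chip are doing.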
Shockley seems to have been peeved that it was Bardeen and Brattain who created that first circuit. Because Shockley ran the lab, he was able gradually to stop them working on transistors. Bardeen left for the University of Illinois, where he went on to do foundational work on superconductivity, becoming the only person ever to win a second Nobel Prize for physics. Shockley set out to be rich. He quit Bell Labs with his Nobel in his pocket and set off to found a new company, Shockley Semiconductor.
And this is where his mother comes into it. May Bradford Shockley, who grew up in back-country Missouri, was the daughter of mining engineers; in 1904 she had become the only female deputy surveyor of minerals in the US. Her affection for Palo Alto – she had gone to university at Stanford – led her to retire there. That fact in turn led Shockley in 1956 to found his company down the road in Mountain View, now better known as the home of Google. In those days that part of the world was called the Santa Clara Valley. It goes by a different name today. May Bradford Shockley, who spent the latter part of her life as a rather good painter and who died in 1977 at the age of 97, is the reason Silicon Valley is where it is.
There’s no way around the fact that the founder of Silicon Valley was an outstandingly horrible human being. Shockley was a terrible manager and a passionate racist, who devoted his post-Nobel decades to publicizing home-brewed theories about “dysgenics,” or genetic degradation, and about racial differences being a form of natural “color-coding” that warns of low intelligence. It is striking that the National Academy of Sciences’ official biographical memoir of him, by his old friend John Moll, contains not a single example of kindness or charm or goodwill, nor indeed any anecdote that reflects any human credit on its subject. Instead Moll observes that Shockley’s “technical insights were counterbalanced by his lack of insight into human relations.” That had consequences.
(London Review of Books)