If you live in a first world country, you use software every day. You might even work at a company that makes software (like me). Companies like mine build web applications on top of an ecosystem of amazing technology that took generations to build. The product that the end user interacts with has a clean, easy-to-use interface that can be accessed any time, anywhere in the world using the internet. Knowledge of the underlying components is unnecessary for the customer.
50 years ago this process would have been considered magic. In many ways, it still is. You’re reading this article on a site that takes advantage of those same technologies. Anyone with internet access and a device can read it, whenever they want, and as many times as they want. To achieve the same effect 5 decades ago, I’d have to be published in a newspaper that had unlimited distribution throughout the world. It wouldn’t have been possible and I would have needed my editor’s permission to write this.
The Best Technology Is Hidden In Plain Sight
We tend to forget about the breakthroughs that make innovation possible. We quickly get used to new inventions, and we’re no longer surprised by them. To us, this is called ‘technology’ – but hardly anyone spends time thinking about how ‘mind blowing’ a website or basic app is in 2020. It’s simply part of the world we live in. Just like cell phones, cars, and modern plumbing, we take it for granted. I don’t hear many people referring to toilets as technology – they’ve been around for a while and we know they’ll work. Danny Hillis said it best:
“Technology is everything that doesn’t work yet.”
— Danny Hillis
People tend to acknowledge ‘progress’ made by society only in the moments when grand inventions or engineering feats happen. Alfred North Whitehead, a famous mathematician, disagrees:
“Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle — they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.”
— Alfred North Whitehead
The marker of technological advancement is how many things we can do without having to consider the underlying machinery. We all have limited mental capacity. Attention is finite. The less we need to think about how our tools work, the more time we can spend using them to solve new problems.
Particle accelerators and sequencing the human genome are both marvelous examples of our ability to push the laws of physics to their limits. There is no doubt that these engineering miracles have created wealth and knowledge for the world. However, these are not the sort of advancement that Whitehead is talking about. Most people don’t reap the benefits of genome sequencing or particle accelerators in their daily lives. They might eventually – when those engineering feats translate into actionable insights and end products that we can use. But this may take years, decades even.
Computers Give Us Superpowers
On the other hand, the fact that you will routinely get on a video call this week is a marker of human progress. Why? Because, without thinking about it, you can work together with people through face to face interaction without being in the same room. You can simultaneously edit a document, view the same page in real time, or give a presentation to hundreds of people using the same technology. You can do this with anyone in the world, provided they have hardware built within the last ~10 years and a Wi-Fi connection. The process is no different whether the other person is 3 or 3,000 miles away – except for the change in time zones.
You can essentially teleport, and the technical work is something that you don’t need to think about. Zoom (and every other company that makes Zoom possible) takes care of it for you.
There are many innovations that drive the computing stack: the internet, cloud data storage, software design, hardware development, and countless more. However, the most fundamental innovation that has been driving this whole process is one that you hardly ever see. The semiconductor.
You may have also heard a semiconductor referred to as a chip or integrated circuit. Semiconductor companies produce various types of chips – memory chips, analog chips, CPUs, GPUs, systems on a chip (SoCs), and more. This is the most essential hardware in your computer. Think of it like the brain of the device.
Innovation in the development of chips has accelerated so quickly that it boggles the mind. It might be the perfect example of the sort of technological improvement that Alfred North Whitehead talked about. For decades, the semiconductor industry has enabled the revolution in computing. The progress that’s taken place in the development of computer hardware is arguably our greatest ever technological achievement – rivaling putting a man on the moon.
Semiconductors Make Computing Possible
Although we’ve been referring to semiconductors as a computer component, it’s also a term used in physics to describe a material that is neither a conductor nor an insulator. In other words, semiconductors are materials that neither perfectly conduct electricity nor completely stop it from passing through. For example, most metals are conductors, while rubber is an insulator.
Semiconductors can be made conductive through a process called ‘doping,’ which introduces impurities into the material. This ability to let current pass through under some circumstances, and stop it from passing through in others, allows us to use these materials to create transistors. If you’re into understanding how this works from a technical perspective, this guy dressed like an atom can teach you far better than I can:
But why are transistors important for computers?
A transistor is an electrical component that can act as either a switch or an amplifier. It can turn current on and off, or simply strengthen the current. Computers, at their most basic level, take in binary inputs and give binary outputs – 1s and 0s. To perform these operations, they need switches and electric current. Binary code is a language in and of itself, and it’s how we tell computers what to do. If you’ve ever tried to use Morse code, it’s similar to that: with just 2 digits, you can combine them in endless ways to communicate. Here’s an example of English translated into binary:
English: “hello world”
Binary: “01101000 01100101 01101100 01101100 01101111 00100000 01110111 01101111 01110010 01101100 01100100”
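You can reproduce this translation yourself in a couple of lines of Python (my choice of language for illustration, not anything specific to how computers actually encode text internally):

```python
# Convert each character of the string to its 8-bit ASCII/Unicode code point.
text = "hello world"
binary = " ".join(format(ord(char), "08b") for char in text)
print(binary)  # the first byte, 01101000, is the letter 'h'
```

Each `format(ord(char), "08b")` call takes one character's numeric code and writes it out as an 8-digit string of 1s and 0s, matching the bytes shown above.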
This is why we’ve been cramming more and more transistors onto chips for decades. In 1971, a breakthrough chip from Intel had 2,300 transistors. In 2020, Apple released a Mac with its new M1 chip that has 16 billion transistors and uses 5nm process technology (and there are more advanced implementations possible). The phrase ‘5nm process technology’ is partially marketing jargon, but to put things in perspective, one nanometer is one billionth of a meter. The diameter of a human hair is 100,000 nanometers – 20,000 times as large.
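Using just the two data points above – Intel's 1971 chip (the 4004) at 2,300 transistors and Apple's 2020 M1 at 16 billion – a quick back-of-the-envelope calculation shows how closely the industry tracked a doubling every two years. This is a rough sketch from those two quoted numbers, not a rigorous dataset:

```python
import math

# Transistor counts quoted above: Intel's 1971 chip vs. Apple's 2020 M1.
start_count, end_count = 2_300, 16_000_000_000
years = 2020 - 1971

# How many doublings separate the two counts, and how long each one took.
doublings = math.log2(end_count / start_count)  # ~22.7 doublings
doubling_time = years / doublings               # ~2.2 years per doubling
print(f"{doubling_time:.2f} years per doubling")
```

Nearly half a century of manufacturing progress, and the observed doubling time still lands within a fraction of a year of the two-year prediction discussed below.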
The manufacturing process for modern semiconductors is extreme. It involves the coordination of many companies throughout the world and bleeding edge science. The entire production chain is beyond the scope of this post, but one process that blew me away was photolithography.
To create these atomic scale innovations, photolithography is used to print the complex graph of features and transistors onto the chip itself. Manufacturers use specially made lasers that produce extremely short wavelengths of light, projecting the layout onto a thin silicon wafer to create pathways just nanometers wide.
Recently, to keep up with Moore’s law, the photolithography technology had to be rethought from scratch to create a wavelength small enough to keep up with 5nm chip designs. Normal ultraviolet wavelengths are 193nm, but this had to be cut drastically to build new chips. It’s been compared to the process of being asked to write your name in smaller and smaller handwriting on a piece of paper. Eventually, you’ll need a different pen to get the job done.
To create a wavelength over 10x smaller than the normal UV wavelength used in photolithography lasers, a process called EUV (extreme ultraviolet) lithography was developed by a Dutch company – ASML. The machine they built has 100,000 parts, costs $120M, and takes multiple 747 cargo planes and dozens of freight containers to ship. It pushes the laws of physics to the limit. If you’re interested in diving deeper, this video from Seeker is fantastic:
If you watched even a second of that video and marveled at the complexity of extreme ultraviolet lithography, keep in mind that it is only one part of the process. The manufacturing (also called ‘fabrication’) of semiconductors is now so advanced that only a handful of companies are capable of keeping up. Many have become specialized in just one part of the process, and allow other companies to handle what they’re not as good at. Taiwan Semiconductor Manufacturing (TSMC), for example, has become the world leader in semiconductor fabrication, but doesn’t design any of its own chips. Other companies like Nvidia, AMD, and Apple design their own semiconductors and outsource the production. Intel remains the only integrated player in the space, doing both design and fabrication in house. However, they’ve been in the news recently for falling behind TSMC and others in production capability.
In 1965, Gordon Moore, one of the founders of Intel, began making observations about the pace of development in semiconductors. He saw that there would be massive opportunities to continuously add more and more components to chips – making them faster in the process. He predicted that the number of transistors on a chip would double every 2 years for the foreseeable future. As the director of R&D at Intel, he saw how much further the capabilities of integrated circuits could be pushed, and his prediction became a self-fulfilling prophecy. For decades, Intel drove Moore’s law through the company’s integrated microprocessor design and manufacturing efforts.
The interesting part about Moore’s law is that it isn’t a ‘law’ at all. There is no principle in the laws of physics that states that processors will get exponentially faster. It’s been driven by multiple generations of ambitious engineers who choose to keep pushing the boundaries. They expect themselves to exponentially increase the amount of components we can fit on a chip, and create vast technological improvements along the way. It’s a testament to the power of the human mind and capacity to cooperate in service of a common goal.
However, Moore’s law has limits. Even Gordon Moore himself has acknowledged them:
“The fact that something’s gone on for 50 years is truly amazing, but no exponential can go on forever.”

— Gordon Moore
You can only shrink these components by so much, and we’re already pushing this incredibly far. The next paradigm of advancement in computing power might be centered around software and unique application specific hardware created for particular use cases. For example, AI or VR focused chips might create breakthroughs in those industries. But, as the MIT Technology Review noted, the end of Moore’s law will likely spell the end of exponential increases in general purpose computing power. Innovations like systems on a chip (SoC) and quantum computing may allow Moore’s law to continue for longer, but the days of general exponential improvements across all devices may come to an end.
Chips Have Made Our World More Abundant
A relatively small proportion of people who use computers are even aware that this world exists. We’ve come to just expect faster devices and new technologies because ‘they’re getting better and better at making them.’ There is no ‘they.’ There are only engineers, operators, and innovators working all around the world to push the laws of physics.
They’re the people who keep your iPhone buzzing faster in your pocket. Their work brings us advancements in computer graphics, VR, AI, IoT, and everything else you hear tech industry futurists talking about. This mind-numbing amount of effort by millions of people and thousands of companies is driving the world to become a more abundant & opportunity rich place – at least within the world of bits.
It’s my opinion that people who work with software regularly should expose themselves to the underlying technology that makes it possible for us to use the internet and do our jobs every day. We’re able to work from home, connect with friends on social media, and create products with unlimited distribution because of innovations in hardware above all else.
To those who work in computer hardware or the semiconductor space – you have my respect.