From a simple switch to a swath of cat videos...
Simple units, multiplied by the millions, lead to unpredictable emergent properties.
Nothing exemplifies this more than the humble Central Processing Unit, or CPU.
Since its origins as a simple logic switch in 1903 (1), the CPU has inspired countless digital inventions, switched on the Internet, and sparked a technological revolution.
Let’s take a tour of the life of the CPU, from infancy to ubiquity.
Origins of the CPU
In the mid-1890s, prolific inventor Nikola Tesla was tinkering with ways to selectively control one of several wireless devices. To do so, he began working on what we now call electronic switches.
This work eventually led to a patent, realized in 1903.
From this auspicious start, the field ran dry for decades.
After all, it’s one thing to file a patent, and quite another to conceive and construct useful applications for the real world. Remember that at that time, the primary use of electricity was for household and street lighting.
The fundamental design of electronic switches really got sparked up with the invention of the first transistor. Before it, electronic switching relied on vacuum tubes: bulky glass bulbs, enclosing a vacuum, that resembled the classic lightbulb. When the first transistors appeared in the late 1940s (2), they replaced those fragile tubes with solid-state switches.
Although these early transistors were still large--about the size of a mobile phone today--they constituted the germinal form of everything that would later be miniaturized and embedded within today’s iPhone.
What Exactly is a Transistor?
Essentially, a transistor is an electronic switch that is turned on or off by applying a small voltage. These two states are represented by the familiar symbols I and O, meaning “on” and “off.”
The power of these comes in numbers...
While a single transistor has just two possible states, a combination of two transistors yields 2², or 4, possible states. Following this pattern of exponential growth, a humble 100 transistors generate 2¹⁰⁰ states--an impressive array of over 10³⁰, or over one thousand million billion trillion!
As they say in Latin, vires in numeris--power in numbers!
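If you’d like to check that arithmetic yourself, here’s a minimal Python sketch of the counting rule above. Nothing here is specific to real chips; it’s just 2 raised to the number of switches:

```python
# Each transistor is a two-state switch, so n of them can represent
# 2**n distinct combinations of on/off.
for n in [1, 2, 100]:
    print(f"{n} transistors -> {2 ** n:,} possible states")

# 1 transistors -> 2 possible states
# 2 transistors -> 4 possible states
# 100 transistors -> 1,267,650,600,228,229,401,496,703,205,376 possible states
```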
Wiring several transistors together on a single substrate leads to what is called an “integrated circuit.” While early experiments used various substrates, including germanium, the specific properties of silicon eventually made it the semiconducting material of choice.
From Transistors to CPUs
Much like the human body has many essential organs but one governing brain, a computer has many components, all of which depend on the Central Processing Unit to carry out the basic arithmetic, logic, control, and input/output (I/O) operations of computation.
A CPU is able to process several operations simultaneously.
But what does that actually mean for my laptop?
Simply that you can download that file, stream music, and surf the web all at the same time. And the more powerful your CPU (and the more RAM you have), the more tabs you can have open at once.
But as our technology has evolved, so have our demands of our hardware.
We always want it to be faster, smarter, and more capable of multitasking. Fortunately, the sophistication of CPUs has made all this possible, and in doing so, powered the advancement of many industries, from transport to telecommunications.
Part of the solution has been to keep increasing the density of transistors on chips.
In fact, the density began by doubling each year in the 1960s (3).
More recently it has slowed to a doubling roughly every 18 months, a trend which has been termed Moore’s Law (3). With the constraints of nanotechnology, this is expected to keep slowing.
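Before we get to those physical constraints, it’s worth pausing on what “doubling every 18 months” compounds to. Here’s a quick back-of-the-envelope sketch in Python (the function name and the round numbers are purely illustrative):

```python
# A rough sketch of Moore's Law as described above: transistor
# density doubling about every 18 months.
def growth_factor(years, doubling_period_months=18):
    doublings = years * 12 / doubling_period_months
    return 2 ** doublings

for years in (3, 10, 20):
    print(f"After {years:2d} years: ~{growth_factor(years):,.0f}x the transistors")

# After  3 years: ~4x the transistors
# After 10 years: ~102x the transistors
# After 20 years: ~10,321x the transistors
```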
Specifically, the limitation is the spacing between features on a silicon chip, technically called the “pitch.” Although we keep inventing new ways to shrink the pitch, we are approaching a fundamental limit of physics: when the insulating regions of a chip become thin enough, electrons can literally jump, or “tunnel,” straight through them.
A Little “Bit” of History
If you’ve ever shopped for a new computer, you’ll no doubt have been dazzled by an array of terminology--64-bit, 32-bit, RAM and ROM. But what exactly are these “bits”?
Here’s a little bit of terminology.
A “bit” is a single unit of electronic data that can be represented by one of two numerical values: a 1 or a 0. Long sequences of these zeroes and ones together carry the data for everything that moves through your computer.
Every email, spreadsheet, and even every cat video is nothing more than a long stream of 1s and 0s.
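You can peek at that stream yourself. Here’s a tiny Python example that prints an appropriately themed word as the bits underneath it, using 8 bits per character (standard ASCII encoding):

```python
# The word "cat" as the raw bits your computer actually stores.
message = "cat"
bits = " ".join(f"{ord(ch):08b}" for ch in message)
print(bits)  # 01100011 01100001 01110100
```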
Each CPU can only hold and process a limited number of bits at a time. Because a group of 4 bits can form 2⁴, or 16, distinct patterns, a 4-bit CPU handles data in 4-bit chunks, each chunk representing one of 16 possible states.
Early CPUs in the 1970s started with 4-bit designs (4). The 1980s saw expansion into 16-bit and 32-bit processors, the latter becoming the norm in the 1990s. Not until the 2000s were 64-bit processors manufactured en masse.
If this sounds like only incremental progress, remember that exponential growth applies. While a 4-bit processor of the 1970s held a humble 16 states, a 64-bit CPU can represent over 18 quintillion (2⁶⁴) states at a time. The numbers can be deceptive, but the results are powerful!
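The same counting rule from earlier makes those jumps easy to verify:

```python
# How many distinct values can a CPU word of a given width represent?
for width in (4, 16, 32, 64):
    print(f"{width:2d}-bit word: {2 ** width:,} possible states")

#  4-bit word: 16 possible states
# 16-bit word: 65,536 possible states
# 32-bit word: 4,294,967,296 possible states
# 64-bit word: 18,446,744,073,709,551,616 possible states
```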
Currently, we’re still using 64-bit CPUs.
128-bit processors have yet to make it into mainstream usage. However, history would suggest that this new generation is right around the corner, with all the new computing power it will bring.
The Price of Progress
We typically pay for things by quantity, so one would think that a CPU twice as powerful would cost twice the price. Why is it, then, that while computers have become more powerful year over year, prices have remained counter-intuitively steady?
As technology has improved, the cost of manufacturing has decreased.
The net effect is that every year or two, when we upgrade our hardware, we pay about the same for about double the performance. While some people might prefer the same performance for half the price, this still seems like a good deal!
Let’s look at some numbers from Intel, one of the largest chip manufacturers in the world, second only to Samsung.
The first microprocessor, the Intel 4004, was introduced in 1971 at a price point of $200 (2). With 2,300 transistors, this unit was capable of three-quarters of a million computing cycles per second (740 kHz).
Twenty years later, Intel released the 486SX chip. Impressively, this ran over twenty times faster (16 MHz) while keeping the price to a comparable $258 (2).
While some Intel models, such as the Pentium processor released in 1993, have sold at higher price points for more power--$878 for a 60 MHz chip incorporating over 3 million transistors--these have been rare (2).
Instead, Intel has continued to improve the processing speed and density of transistors on CPUs while only incrementally raising the price tag.
From 2006 to 2013, processing speed almost doubled, from 2.4 GHz in the popular Core 2 Duo E6600 to 4.4 GHz in the Core i7-4790K, and transistor density tripled, all the while prices moved merely nominally, from $316 to $339 (5).
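To put those figures side by side, here’s a quick sketch using only the numbers quoted above. Clock speed is a crude stand-in for performance, but it makes the point:

```python
# A rough "dollars per MHz" comparison from the figures in this article.
chips = [
    ("Intel 4004 (1971)",    0.74,  200),  # speed in MHz, price in USD
    ("Intel 486SX (1991)",  16.0,   258),
    ("Pentium 60 (1993)",   60.0,   878),
    ("Core i7-4790K",     4400.0,   339),
]
for name, mhz, price in chips:
    print(f"{name}: ${price / mhz:,.2f} per MHz")

# Intel 4004 (1971): $270.27 per MHz
# Intel 486SX (1991): $16.12 per MHz
# Pentium 60 (1993): $14.63 per MHz
# Core i7-4790K: $0.08 per MHz
```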
How CPUs have Shaped the World
So many inventions and companies have ridden the coattails of CPU advancement.
Among these is the largest company in the world--namely, Apple. Its famous Macintosh computer was powered by the Motorola 68000. This CPU--Motorola’s first 16/32-bit processor--was an order of magnitude more powerful than incumbent chips (6).
The 68000 had already been integral to Apple’s earlier model, the Lisa. With the Macintosh, Apple brought the retail cost down from an extraordinary $9,995 to a comfortable $1,995, while keeping the powerful performance delivered by the 68000 (7).
The Motorola 68000 was the cutting edge of computing in 1979, and it’s still used today! (8)
When Apple looked to upgrade from the Motorola 68000, the PowerPC 601 was born.
Seeing that Microsoft and Intel were dominating the market, Apple, IBM, and Motorola got together and collaborated to create the PowerPC. These CPUs were wildly successful, also appearing in the Xbox 360 and the Nintendo Wii.
The winning streak dwindled in 2005, however, when Apple announced that it was moving future Macs from PowerPC to Intel processors.
But what is the most successful processor manufacturer?
Acorn Computers was a manufacturer of clunky desktop computers in the 1980s and would seem an unlikely candidate. However, by adapting to the rapidly changing landscape of computing demands, they managed to come out on top.
Acorn’s processor division evolved into Advanced RISC Machines Ltd., a name later shortened to the catchier ARM.
Today ARM is the designer and licensor of the world's favorite processors, which appear in 95% of smartphones as well as in smart TVs, tablets and all kinds of portable devices. Impressively, they have shipped over 100 billion units already and are poised to surpass 200 billion units in 2021 (9).
The Future of CPUs
While CPUs are likely to continue their steady march of incremental improvement, several developments offer new perspectives on the industry as a whole.
Let’s briefly examine three of these, namely virtual CPUs, quantum computing, and blockchain-based computation.
Cloud computing has advanced rapidly and lowered the barrier to entry for offering digital services. Where previously a software company had to maintain a suite of state-of-the-art servers to host and run its operations, these can now be provided by the “cloud”.
What exactly is the cloud? Sounds kind of fuzzy...
Practically speaking, there is no abstract cloud, but rather a handful of major companies that fractionally rent out space on their massive computer servers. The largest of these is, of course, Amazon Web Services (AWS).
These hosts offer the equivalent of a physical machine, subdividing their CPUs’ capacity into “virtual central processing units”, or vCPUs.
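One practical consequence: to the software you run, a vCPU is indistinguishable from a physical core. A Python one-liner shows how many your machine (or rented cloud instance) exposes:

```python
# vCPUs show up to the operating system just like physical cores.
# On a cloud VM, this prints the number of vCPUs your plan includes.
import os

print(f"This machine exposes {os.cpu_count()} logical CPUs")
```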
Spoken of with intrigue and mystique, quantum computing is widely considered the next frontier of computing. While Moore’s Law continues to advance CPU power in a slow and steady march, quantum computing has the potential to disrupt this with, quite literally, a “quantum leap.”
Get ready to bend your mind into the quantum world!
Actually, it all comes down to simple math.
While bits are binary, offering a choice between on and off--zero or one--the quantum analog is quite different. A quantum unit, termed a “qubit,” can exist in either of these states, as well as in a superposition of both at the same time.
Is your head exploding? That’s okay--after all, we’re talking quantum mechanics here. All you really need to know is that a qubit-based processor could out-compute a standard binary CPU by orders of magnitude on certain problems. This could have massive consequences for the entire semiconductor industry, not to mention computer security.
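For the mathematically curious, here’s a minimal numpy sketch of the idea. To be clear, this simulates the bookkeeping on an ordinary CPU; it is emphatically not a quantum computer:

```python
import numpy as np

# A single qubit is described by two complex "amplitudes", one for |0>
# and one for |1>. An equal superposition is measured as 0 or 1 with
# equal probability (the Born rule: probability = |amplitude|**2).
state = np.array([1, 1]) / np.sqrt(2)
print(np.abs(state) ** 2)  # [0.5 0.5]

# The punchline is scale: describing n qubits takes 2**n amplitudes,
# which is why classical machines struggle to simulate them.
n = 50
print(f"{n} qubits -> {2 ** n:,} amplitudes to track")
```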
No need for concern yet though.
For now, quantum computers remain confined to research laboratories and still have less power than an ordinary pocket calculator.
We’ve discussed central processing units and virtual central processing units, and we’re now beginning to see the dawn of a new generation of decentralized virtual processing units. Let’s break it down.
We’ll start by looking at this new movement towards decentralization.
While cloud computing offers access to virtual CPUs, these are centralized, existing in a large server farm. Today, we’re beginning to see the advent of decentralized computations running upon what have become known as blockchains.
Blockchains are a lot more than just Bitcoin!
Sure, Bitcoin is the longest running and most valuable blockchain, but it simply decentralizes money. Innovators around the world are now experimenting to see what else can be decentralized and ‘put on the blockchain’.
The largest and most talked-about project is the Ethereum Virtual Machine, or EVM. This virtual CPU operates across a network of thousands of independently owned and operated computers worldwide. All of these perform the same computations in parallel and coordinate with each other to share results and ensure consensus by communicating across the Ethereum blockchain.
I know, it’s almost as hard to get your head around as qubits!
It’s like a CPU wrapped around the planet, double- and triple-checking itself. The result is a slow, inefficient, and expensive virtual CPU. Hang on, that sounds terrible--why would anyone want to use that over the fast and affordable AWS?
For one reason only: to ensure that data is recorded and computed in an immutable, uncorrupted way that is open to anyone and cannot be censored. Think about it: if Amazon wanted to, or was compelled to, take down the data or operations running on its servers, there would be nothing we could do to stop it.
The EVM, on the other hand, operates independently of any corporation or government, freely processing all computations submitted to it. Unfortunately, however, much like quantum computing, it still has a long way to go in improving performance.
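To make this a little more concrete, here’s a hedged sketch of reading the EVM’s public state from Python, assuming a recent version of the web3.py library. The node URL is a placeholder, not a real endpoint--you’d substitute your own Ethereum node or a provider’s RPC address:

```python
# A sketch of querying the Ethereum blockchain's shared state.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://your-ethereum-node.example"))  # placeholder URL

if w3.is_connected():
    block = w3.eth.get_block("latest")
    print(f"Latest block: {block.number}")
    print(f"Transactions in it: {len(block.transactions)}")
```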
It’s amazing how far CPUs have come in just over a hundred years--from Tesla’s tinkering to the supercomputers in each of our pockets today.
It used to take an entire room of computers to perform the calculations of a simple pocket calculator. Today’s smartphones, with roughly the dimensions of that pocket calculator, can stream video calls, edit photos, and communicate with the globally decentralized Ethereum Virtual Machine.
This is all thanks to advances in the CPU.
Central Processing Units remain the centrepiece of computers today, just as the brain and spinal cord form the central nervous system that coordinates the body.
As of 2019, the 8-core, 3rd-gen Ryzen 3000-series processor from AMD is leading the way in CPU design and even challenging Intel in terms of performance. However, fabricated on a 7 nm process, it’s hitting the ceiling of miniaturization (10).
Quite strikingly, the Ryzen is priced at just $329 (10). That’s an increase of only 65% since the first microprocessor in 1971, while performance has improved by leaps and bounds.
It’s not clear what the future holds for CPUs. But if history is anything to go by, they will be smaller, faster, and more than a little bit unexpected!