Intel released the first commercially available single-chip CPU, or central processing unit, in 1971. The Intel 4004 was the first step on Intel’s path to dominance in the PC world.
It wasn’t expected to be as important as it became, though. Intel designed it under contract for a Japanese calculator maker while the company focused on its main product, memory chips.
But here we are, nearly 50 years later, with modern CPUs in our pockets, on our wrists, and in virtually every part of our life. But what is a CPU? Let’s take a look.
What Is a CPU?
CPU stands for central processing unit. It’s the “brain” of a computer.
When you load an application like MS Word, run your favorite game, or browse the internet, the CPU in a computer does all the calculations and runs all the code that makes those applications function. It works alongside other components in the computer, like the RAM and graphics card, to make the computer do its thing.
The 3 Stages of Processing
CPUs do their thing through a 3-stage process:
The first step is to fetch information from the computer’s RAM. That could be the next instruction an application needs to run, or a piece of data, like part of a document you’re working on.
Once the CPU has fetched an instruction, it decodes it, translating the binary “machine code” into the internal signals that tell the chip which operation to perform.
Finally, once the information gets decoded, the CPU executes it. This is what makes your applications function.
These three steps happen over and over, billions of cycles per second on a modern chip. We’ll get into the clock speed of CPUs shortly.
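The fetch-decode-execute cycle described above can be sketched in a few lines of code. This is a toy illustration, not a real instruction set; the opcodes and the one-register design are invented for this sketch.

```python
# A toy fetch-decode-execute loop. "Memory" holds made-up instructions,
# and a single accumulator register stands in for the CPU's registers.
memory = [("LOAD", 5), ("ADD", 3), ("STORE", 0)]
accumulator = 0
data = [0]

for pc in range(len(memory)):
    instruction = memory[pc]       # fetch the next instruction from "RAM"
    opcode, operand = instruction  # decode: work out what to do
    if opcode == "LOAD":           # execute the decoded operation
        accumulator = operand
    elif opcode == "ADD":
        accumulator += operand
    elif opcode == "STORE":
        data[operand] = accumulator

print(data[0])  # 8
```

A real CPU does the same three steps in hardware, with the program counter (`pc` here) advancing through machine-code instructions instead of Python tuples.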
For the first 30 years that CPUs existed, they contained a single processing core. The processing speed got consistently faster from one year to the next but those processors could still only process one thing at a time.
That changed in 2001 when IBM unveiled the POWER4, the first commercial multi-core processor. That chip had two cores, which meant it could process two different things at the same time.
A multi-core CPU is essentially two CPUs on a single chip. Each of the two cores can function independently but they can share other resources on the chip, such as cache memory and the data bus. This makes them more cost- and power-efficient than running two completely separate CPU chips.
In the nearly two decades since the first multi-core processor, chip designs have allowed more cores to be added. Modern CPUs can have as many as 28 cores on a single processor.
Multiple cores don’t always mean a faster computer, mind you. Software needs to be written to use multiple cores. If it isn’t coded to do so, it won’t run any faster than it would on a single-core chip. Processor-intensive apps like the latest 3D games get the biggest benefit from the extra cores.
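The point that software must be written for multiple cores can be made concrete with a small sketch. This is one hypothetical way to structure such code in Python, using the standard `concurrent.futures` module: the work is split into chunks so separate cores can each take one.

```python
# A minimal sketch of code written to use multiple cores: the input is
# split into chunks, and a process pool runs the chunks in parallel.
from concurrent.futures import ProcessPoolExecutor

def sum_of_squares(numbers):
    # CPU-bound work; on its own this only ever uses one core
    return sum(n * n for n in numbers)

def parallel_sum_of_squares(numbers, workers=4):
    # Split the input into one chunk per worker so each chunk can run
    # on a different core at the same time.
    chunks = [numbers[i::workers] for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum_of_squares, chunks))

if __name__ == "__main__":
    data = list(range(1000))
    assert parallel_sum_of_squares(data) == sum_of_squares(data)
```

If the program only ever called `sum_of_squares` directly, adding more cores would not make it any faster; the chunking and pooling is the part the programmer has to write.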
Mobile vs Desktop CPUs
Another factor in CPU design is mobile vs desktop chips. Mobile chips are made for laptops and other portable devices that need to run on battery power.
They’re typically not quite as fast and have fewer cores than the latest desktop chips. Slower clock speeds and fewer components mean they use less power, which extends the battery life.
Desktop CPUs generally aren’t limited by power demands so they’re free to run as fast as possible. In a desktop computer, cooling is typically the most important factor.
Large towers with plenty of space for fans or liquid cooling systems can support fast chips that draw a lot of power. But smaller designs like Apple’s iMac have limited airflow so they may not offer the absolute fastest CPU available.
In some cases, those slim desktops use mobile processors since the lower clock speeds create less heat.
32-Bit vs 64-Bit Processors
Another change that has happened over the last 20 years is the move from 32-bit processors to 64-bit.
This refers to the size of the “words” they can work with. A word is the fixed-size chunk of data a processor handles in a single operation.
As you can probably guess, a 64-bit CPU can handle larger chunks of information than a 32-bit CPU. That not only means it can chew through information faster, it also means it can address far more RAM: a 32-bit CPU tops out at 4GB of addressable memory, while a 64-bit CPU’s limit is, for practical purposes, unreachable.
Most modern CPUs have 64-bit designs but there are a few exceptions. Most of those exceptions are low-cost chips meant to be used for specialized applications. You would be hard-pressed to find a 32-bit computer if you wanted to buy one for personal use.
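The difference in addressable memory comes straight from the word size, and you can check it with quick arithmetic:

```python
# An n-bit address can point at 2**n distinct bytes of memory.
addressable_32 = 2 ** 32   # bytes a 32-bit address can reach
addressable_64 = 2 ** 64   # bytes a 64-bit address can reach

print(addressable_32 // 2 ** 30)  # 4   -> 4 GB limit for 32-bit
print(addressable_64 // 2 ** 60)  # 16  -> 16 exabytes for 64-bit
```

That 4GB ceiling is why 32-bit operating systems could never use all the RAM in a machine with more than 4GB installed.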
CPU Clock Speed
The clock speed of a CPU refers to the frequency it runs at, measured in hertz (Hz), or cycles per second.
The first CPUs were measured in kilohertz (kHz). The Intel 4004 ran at 740kHz which meant it could process up to 92,600 instructions per second.
As CPU technology improved, they jumped to megahertz (1000 times faster than kHz) and then gigahertz (1000 times faster than MHz). Modern CPUs run as fast as 4.8 GHz, nearly 6,500 times faster than the first CPU.
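The numbers above can be verified with simple arithmetic, assuming the commonly cited figure of 8 clock cycles per instruction for the 4004:

```python
# The 4004's 740 kHz clock, divided by its cycles per instruction,
# gives the instructions-per-second figure quoted for the chip.
clock_4004 = 740_000           # Hz
cycles_per_instruction = 8     # commonly cited figure for the 4004
instructions_per_second = clock_4004 / cycles_per_instruction
print(instructions_per_second)           # 92500.0, roughly the ~92,600 quoted

modern_clock = 4.8e9           # a 4.8 GHz modern CPU
print(round(modern_clock / clock_4004))  # 6486, i.e. nearly 6,500x faster
```

Note this compares raw clock rates only; a modern CPU also finishes far more work per clock cycle than the 4004 did, so the real performance gap is much larger.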
Moore’s Law and CPU Speed
In 1965, Gordon Moore, who went on to co-found Intel, observed that the number of transistors that could be packed into a set amount of space was doubling roughly every year, a rate he later revised to every two years, while the cost per transistor fell. This observation became known as Moore’s Law.
For the first 30 years or so of the personal computer industry, Moore’s Law held up surprisingly well. Over the last decade or so, CPUs have started to reach the limits of physics so the rate of change has started to slow down.
CPU Cache Memory
Computer RAM is fast and the CPU can fetch information from it millions of times per second. Even so, that information has to travel from the RAM to the CPU, which adds tiny fractions of a second of delay to each operation.
Cache memory is built into the CPU itself and offers even higher performance than system RAM. Because it’s on the same chip, the “travel time” from the cache memory to the processor is even shorter.
The catch is that a CPU can only hold a small amount of cache, relative to the amount of RAM in a typical computer.
With such a small amount, this memory is used to hold information the CPU uses over and over again. This has a bigger impact on the speed since the CPU can pull that information from the faster type of memory when it needs it, instead of getting it from RAM over and over again.
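A rough software analogy for what the cache does: keep recently used values in a small, fast store so repeat requests skip the slow trip to main memory. The names below are invented for the sketch, and Python's `lru_cache` stands in for the on-chip cache.

```python
# Simulating a CPU cache: repeated reads of the same "address" hit the
# small fast cache instead of going back out to slow "RAM" every time.
from functools import lru_cache

slow_memory_reads = 0

def read_from_ram(address):
    # Stand-in for a slow trip out to main memory
    global slow_memory_reads
    slow_memory_reads += 1
    return address * 2  # pretend this is the value stored there

@lru_cache(maxsize=64)  # a deliberately small cache, like on-chip SRAM
def read(address):
    return read_from_ram(address)

for _ in range(1000):
    read(42)            # the same hot address, over and over

print(slow_memory_reads)  # 1 -- only the first read went to "RAM"
```

Real CPU caches work on the same principle, just in hardware and with eviction policies tuned for typical memory-access patterns.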
What’s the Difference Between a GPU and a CPU?
In the mid-1990s a new class of graphics card showed up in the PC world. These cards used a dedicated processor to perform calculations for the latest 3D graphics.
These specialized processors are known as graphics processing units, or GPUs. Offloading graphics processing to a dedicated chip frees up processing cycles on the computer’s CPU so it can focus on other calculations.
Modern GPUs look a lot like CPUs. They have multiple cores, large cache memory, and can handle billions of operations per second. They’re more specialized than a CPU, though, and can’t handle general-purpose tasks like running applications or the operating system.
Which Is More Important for Gaming?
One of the biggest questions when you’re considering a computer for gaming is whether you’re better off with a more powerful CPU or a more powerful GPU.
There isn’t a single answer to that question. It depends on the types of games you want to play and how fast each of the components is. When you get beyond a certain performance level, you’ll reach a point of diminishing returns.
If you’re a “casual” gamer and don’t play any games that have high-resolution 3D graphics, a fast CPU is the most important component. It will offer plenty of performance to run the game and display the graphics.
But if you play the latest 3D games and want to play them at the best possible resolutions, you need a powerful GPU to get the best experience. In this case, both the CPU and GPU play a big part in the overall speed so get the best of each that you can reasonably afford.
Choosing the Right CPU
Knowing the answer to the question, “What is a CPU?” is helpful when you’re looking at a new computer, but choosing the right CPU for your needs matters even more.
How will you use your computer? Do you need a laptop that can run all day on battery power? Or do you need a powerful processor to work with video files or run the latest games?
Decide what you’re going to use your computer for first and then figure out which CPU will meet those needs. That’s the best CPU choice for you.
Be sure to check out the rest of our blog for more helpful articles about technology and other gadgets.