
Remington Steel

macrumors 6502
Original poster
Dec 26, 2016
Ok...I never claimed to be a guru or an expert by any means. Could someone please explain in dummies' terms what the significance of GHz is? Obviously I know the higher the number, the "better". Is that correct? Thanks in advance.
 
Generally speaking a CPU's frequency, measured in Hz, is the number of clock cycles a CPU executes in one second. So a 2.6 GHz processor goes through 2.6 billion cycles per second. At face value this doesn't mean much though, because what a CPU can do in a cycle depends on a lot of different variables (the instruction set, how many instructions can be executed per cycle, pipelining, etc.). So it's only really meaningful to compare CPUs based on clock speed (GHz) if they are in the same or very similar families.
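To put some made-up numbers on that: the sketch below (Python, purely illustrative) multiplies clock frequency by instructions per cycle, and the chip with the lower GHz figure comes out ahead. The IPC values are hypothetical, just to show why GHz alone only compares well within the same family.

```python
# Rough back-of-the-envelope sketch (illustrative numbers, not real chips):
# peak throughput ~= clock frequency x instructions completed per cycle (IPC).

def peak_instructions_per_second(clock_ghz, ipc):
    """Very simplified: ignores memory stalls, branches, turbo boost, etc."""
    return clock_ghz * 1e9 * ipc

# A hypothetical older core at 3.0 GHz averaging 1 instruction per cycle...
old_core = peak_instructions_per_second(3.0, ipc=1.0)
# ...versus a hypothetical newer core at 2.6 GHz averaging 4 per cycle.
new_core = peak_instructions_per_second(2.6, ipc=4.0)

print(f"3.0 GHz, IPC 1: {old_core:.2e} instructions/s")  # 3.00e+09
print(f"2.6 GHz, IPC 4: {new_core:.2e} instructions/s")  # 1.04e+10
```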
 
GHz, on a computer, is usually the clock speed of the processor (the CPU).
Higher number means the clock speed (the operating speed) of the CPU is faster.
That doesn't necessarily result in "better", just faster. Speed is not the final answer.
So, no, not technically correct.
 

Hmmm....ok
 
Not sure how much constructive information I can add to this thread, but to echo others' posts, GHz is simply a unit used to measure the speed of a processor (or whatever else you're measuring -- generally computers).

Giga means billion. Mega means million. A gigabyte is one billion bytes. Et cetera.

Generally speaking a processor with a higher clock speed is going to be faster than a processor with a lower clock speed (say 3.0 GHz vs 2.0 GHz), but as others have noted the clock speed isn't always the end-all. Think about cars (this analogy may not make sense if you're not into cars) -- a 5.0 Liter engine from Ford may not necessarily be as fast as a 5.0 Liter engine from Chevrolet (or vice versa). It depends on the engineering and other variables.
 

Ok....I can understand your analogy. Thanks. Just needed someone to put it in perspective like that. lol
 
Generally speaking a CPU's frequency, measured in Hz, is the number of clock cycles a CPU executes in one second. So a 2.6 GHz processor goes through 2.6 billion cycles per second. At face value this doesn't mean much though, because what a CPU can do in a cycle depends on a lot of different variables (the instruction set, how many instructions can be executed per cycle, pipelining, etc.). So it's only really meaningful to compare CPUs based on clock speed (GHz) if they are in the same or very similar families.

I just had to say that was a very short, well written, concise answer. I really enjoyed reading it :)
 
Ok...I never claimed to be a guru or an expert by any means. Could someone please explain in dummies' terms what the significance of GHz is? Obviously I know the higher the number, the "better". Is that correct? Thanks in advance.

Hertz is a measure of frequency. 1 hertz is one cycle per second. If you had a fan in your house or apartment and it rotated at 1 hertz (or 1 Hz), it would rotate once per second. Giga is 10^9, or 1 billion. A gigahertz (GHz) is 1 billion cycles per second. A computer with a 1 GHz CPU is able to perform about 1 billion instructions in 1 second. A computer with a 2 GHz CPU is able to perform about 2 billion instructions in 1 second, and so on. Since a computer program is basically a series of instructions that a computer follows, or executes, the faster the CPU, the faster the computer is able to run software, perform tasks/operations, etc.
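Just to make the "cycles per second" arithmetic concrete, here's a tiny sketch (plain unit conversion, nothing CPU-specific):

```python
# Frequency (cycles per second) and period (seconds per cycle) are reciprocals.
def cycle_time_ns(clock_ghz):
    hz = clock_ghz * 1e9   # GHz -> cycles per second
    return 1e9 / hz        # seconds per cycle, expressed in nanoseconds

print(cycle_time_ns(1.0))  # 1.0 -> each cycle takes 1 nanosecond at 1 GHz
print(cycle_time_ns(2.0))  # 0.5 -> each cycle takes half a nanosecond at 2 GHz
```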
 
Think about cars (this analogy may not make sense if you're not into cars) -- a 5.0 Liter engine from Ford may not necessarily be as fast as a 5.0 Liter engine from Chevrolet (or vice versa). It depends on the engineering and other variables.
In terms of the car analogy, I'd compare Hz to rpm.
6000 rpm on both engines is 6000 rpm, but the power output need not be the same.
Important thing here: both Hz and rpm include the notion of time. Both show how many times a phenomenon occurs in one unit of time.
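For what it's worth, converting between the two is just arithmetic (60 seconds in a minute):

```python
def rpm_to_hz(rpm):
    return rpm / 60.0  # revolutions per minute -> revolutions per second

print(rpm_to_hz(6000))  # 100.0 -- 6000 rpm is 100 revolutions per second
```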
 
Ok...I never claimed to be a guru or an expert by any means. Could someone please explain in dummies' terms what the significance of GHz is? Obviously I know the higher the number, the "better". Is that correct? Thanks in advance.
Hz, or Hertz means "per second". Giga means billion. So a Giga-Hertz is a billion "something"s per second.

The GHz rating on a processor is the clock rate (ticks per second) but, ignoring all the complicated details, the best way to think of it is that the "something" is a block of work. So a 1GHz processor does a billion blocks of work a second.

What's important to understand is that the block of work done is different for different processor designs. Some processors do more work in a block than others. In fact, one of the easiest ways to speed up the clock in a chip is to reduce the size of a work block. If you can do a certain amount of work per block at 1GHz and you choose to do half as much work per block then you can do it at 2GHz.

Another way to do more work in less time is to do two things at the same time-- so two parts of the chip are working on different blocks at the same time so you get 2 blocks done per clock. Now you're doing twice as much with the same 1GHz clock.

Increasing the number of ticks per second has a few other effects, most importantly it affects power. More GHz means more power consumption and more power consumption means more heat. Heat is all kinds of bad. Heat is probably the single most important factor limiting how fast we can compute.


So, in the end, nothing is as simple as looking at one number. What we're finding is that from generation to generation of processor chip, the clock rate doesn't seem to be increasing much-- we've been in the 2-3GHz range for quite some time-- instead we're doing bigger blocks of work per tick and doing more blocks in parallel. Within a generation though, the faster clock rate means doing more work in the same time because the rest of the chip design is mostly the same-- the same size block of work, so the fast clock means more blocks per second.
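Here's a toy calculation of that "block of work" framing, with completely made-up units, just to show how different designs can land on the same throughput at different clock rates:

```python
# Toy model (made-up units, not any real CPU):
# useful work per second = ticks per second x blocks finished per tick x work per block.

def work_per_second(clock_ghz, blocks_per_tick, work_per_block):
    return clock_ghz * 1e9 * blocks_per_tick * work_per_block

# Halve the work per block and double the clock: same total work per second.
a = work_per_second(1.0, blocks_per_tick=1, work_per_block=2.0)
b = work_per_second(2.0, blocks_per_tick=1, work_per_block=1.0)

# Or keep the 1 GHz clock and finish two blocks per tick in parallel.
c = work_per_second(1.0, blocks_per_tick=2, work_per_block=1.0)

print(a, b, c)  # all three print 2000000000.0 -- different designs, same throughput
```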
 
In terms of the car analogy, I'd compare Hz to rpm.
6000 rpm on both engines is 6000 rpm, but the power output need not be the same.
Important thing here: both Hz and rpm include the notion of time. Both show how many times a phenomenon occurs in one unit of time.
Ahhh. Awesome analogy. Well said. The power output of a Mustang GT at 6000 RPM isn't as high as that of a Camaro SS at 6000 RPM. That new Camaro is a monster. ;)

Though I still prefer the looks of the new Mustang any day.
 
If you want a bit more technical explanation: CPUs essentially operate in steps. During every step, a simple operation can be done (such as add two numbers, or move some data from here to there). However, because CPUs are very complicated, one needs some sort of signal to synchronise the execution of these steps across a CPU's components. Imagine it as a group of soldiers marching in step. That's where the clock generator comes into play -- it essentially fulfils the role of the drummer. It is basically a very fast metronome that goes TICK TICK TICK TICK, giving the tempo with which the CPU can do its job. One GHz means one billion hertz, or one billion TICKs per second.

To make things more complicated, as others have pointed out already, the CPU contains multiple units that can do various operations at the same time. If the stars align correctly, a modern CPU can, for example, multiply and add 32 standard-type numbers in one single TICK.
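In code, that kind of "many numbers per tick" operation looks like the element-wise math below. NumPy will usually hand this to vectorized (SIMD) CPU instructions under the hood, though exactly which instructions, and whether it really completes in one cycle, depends on the chip -- treat it as an illustration, not a guarantee:

```python
import numpy as np

# 32 single-precision numbers combined with one element-wise multiply-add.
a = np.arange(32, dtype=np.float32)      # 0, 1, 2, ... 31
b = np.full(32, 2.0, dtype=np.float32)   # all twos
c = np.ones(32, dtype=np.float32)        # all ones

result = a * b + c   # element-wise: result[i] = a[i] * b[i] + c[i]
print(result[:5])    # [1. 3. 5. 7. 9.]
```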
 
Back in the day, the MHz rating meant that the CPU would execute a single computer instruction x million times a second; a 16 MHz CPU would execute 16 million instructions per second.

Nowadays we're in the GHz range (billions), and thanks to the complexity of modern CPU designs the clock is more of a heartbeat. That is, the computer does many things 2.2 billion times a second (for a 2.2 GHz CPU).

It used to be that we'd see a linear increase in performance when we increased the clock rate, but now, thanks to the complexities of modern-day CPUs, increasing the GHz has a smaller effect, and it also increases heat, so there's a balancing act required.
 
GHz, on a computer, is usually the clock speed of the processor (the CPU).
Higher number means the clock speed (the operating speed) of the CPU is faster.
That doesn't necessarily result in "better", just faster. Speed is not the final answer.
So, no, not technically correct.

To expand on what DeltaMac said (and others have mentioned), there are a lot of variables that factor into "better". A faster clock speed does not necessarily equate to a better overall computing experience. The CPU will ultimately need to access memory, the hard drive, etc., and the speed at which these operate will factor into the overall experience. I remember in the early days of personal computers, when CPU speeds were climbing quickly, it was commonly said that "these higher clock speeds just mean the CPUs are waiting really fast".
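If you want to see that "waiting on memory" effect for yourself, a rough (and very machine-dependent) experiment is to sum the same data twice, once streaming through it in order and once jumping around at random; the arithmetic is identical, but the access pattern is not:

```python
import time
import numpy as np

n = 20_000_000
data = np.arange(n, dtype=np.float64)
random_order = np.random.permutation(n)   # a shuffled list of all the indices

t0 = time.perf_counter()
data.sum()                   # streams through memory in order
t1 = time.perf_counter()
data[random_order].sum()     # gathers the same elements in random order
t2 = time.perf_counter()

print(f"sequential: {t1 - t0:.3f}s   random-order: {t2 - t1:.3f}s")
# On most machines the second number is several times larger, even though the
# CPU clock is the same for both -- the extra time is mostly spent on memory.
```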
 
GHz is how fast the processor is. Another factor is the number of cores it has, usually 2 or 4 for mainstream computers. Some programs can only use 1 core, some can use 4. The latter is roughly 4x as fast (not quite, but let's say roughly for simplicity's sake).

But GHz and the number of cores are not the end-all markers for performance. There's also the architecture, or generation, of the processor. Imagine current V8 engines vs V8 engines from decades ago. Both have 8 cylinders, but the newer ones are stronger and more efficient.

This is a super simplified explanation and probably enough for 99% of the population.
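To put a number on the "roughly 4x, not quite" caveat: the usual back-of-the-envelope tool is Amdahl's law (not something from this thread, just the standard formula), which says the serial part of a program caps the speedup you can get from more cores:

```python
# Amdahl's law: speedup = 1 / (serial_fraction + parallel_fraction / cores)

def speedup(cores, parallel_fraction):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

print(round(speedup(4, 0.95), 2))  # 3.48 -- 4 cores, if 95% of the work parallelises
print(round(speedup(4, 1.00), 2))  # 4.0  -- only in the ideal, fully parallel case
```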
 