Please explain "GHz".

Discussion in 'MacBook Pro' started by Remington Steel, Feb 14, 2017.

  1. Remington Steel macrumors member

    Joined:
    Dec 26, 2016
    #1
    Ok...I've never claimed to be a guru or an expert by any means. Could someone please explain, in dummies' terms, what the significance of GHz is? Obviously I know the higher the number, the "better". Is that correct? Thanks in advance.
     
  2. estabya macrumors 6502

    Joined:
    Jun 28, 2014
    #2
    Generally speaking, a CPU's frequency, measured in Hz, is the number of clock cycles the CPU executes in one second. So a 2.6 GHz processor goes through 2.6 billion cycles per second. At face value this doesn't mean much, though, because what a CPU can do in a cycle depends on a lot of different variables (the instruction set and types of instructions, how many instructions can be executed per cycle, pipelining, etc...). So it's only really meaningful to compare CPUs based on clock speed (GHz) if they are in the same or very similar families.
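
    To put some rough numbers on that, here's a quick Python sketch (the instructions-per-cycle figures below are invented purely for illustration, not real chip specs):

        # Why GHz alone doesn't tell the whole story:
        # rough throughput = clock rate * instructions per cycle (IPC).
        def instructions_per_second(clock_hz, ipc):
            return clock_hz * ipc

        cpu_a = instructions_per_second(3.0e9, 1.0)  # 3.0 GHz, 1 instruction per cycle
        cpu_b = instructions_per_second(2.6e9, 2.0)  # 2.6 GHz, 2 instructions per cycle

        print(f"CPU A: {cpu_a:.1e} instructions/second")  # 3.0e+09
        print(f"CPU B: {cpu_b:.1e} instructions/second")  # 5.2e+09

    The 2.6 GHz chip "wins" here despite the lower clock, purely because it gets more done per cycle.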
     
  3. DeltaMac macrumors 604

    Joined:
    Jul 30, 2003
    Location:
    Delaware
    #3
    GHz, on a computer, is usually the clock speed of the processor (the CPU).
    Higher number means the clock speed (the operating speed) of the CPU is faster.
    That doesn't necessarily result in "better", just faster. Speed is not the final answer.
    So, no, not technically correct.
     
  4. Remington Steel thread starter macrumors member

    Joined:
    Dec 26, 2016
    #4
    Hmmm....ok
     
  5. v0lume4 macrumors 68000

    Joined:
    Jul 28, 2012
    #5
    Not sure how much constructive information I can add to this thread, but to echo others' posts, GHz is simply a unit of measure for the speed of a processor (or whatever else you're measuring -- with computers, it's generally the processor).

    Giga means billion. Mega means million. A gigabyte is one billion bytes. Et cetera.

    Generally speaking, a processor with a higher clock speed is going to be faster than a processor with a lower clock speed (say 3.0 GHz vs 2.0 GHz), but as others have noted, the processor speed isn't always the end-all. Think about cars (this analogy may not make sense if you're not into cars) -- a 5.0 liter engine from Ford may not necessarily be as fast as a 5.0 liter engine from Chevrolet (or vice versa). It depends on the engineering and other variables.
     
  6. Remington Steel thread starter macrumors member

    Joined:
    Dec 26, 2016
    #6
    Ok....I can understand your analogy. Thanks. Just needed someone to put it in perspective like that. lol
     
  7. Mindinversion macrumors 6502

    Joined:
    Oct 9, 2008
    #7
    I just had to say that was a very short, well written, concise answer. I really enjoyed reading it :)
     
  8. bizack, Feb 14, 2017
    Last edited: Feb 15, 2017

    bizack macrumors 6502

    Joined:
    Apr 21, 2009
    #8
    Hertz is a measure of frequency. 1 hertz is one cycle per second. If you had a fan in your house or apartment and it rotated at 1 hertz (or 1 Hz), it would rotate once per second. Giga is 10^9, or 1 billion, so gigahertz (GHz) is 1 billion cycles per second. A computer with a 1 GHz CPU can perform roughly 1 billion instructions in 1 second. A computer with a 2 GHz CPU can perform roughly 2 billion instructions in 1 second, and so on. Since a computer program is basically a series of instructions that a computer follows, or executes, the faster the CPU, the faster the computer is able to run software, perform tasks/operations, etc.
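
    If it helps to see that arithmetic spelled out, here's a tiny Python sketch of the unit conversion (nothing CPU-specific, just the math):

        # Hertz is "cycles per second", so the time for one cycle is simply 1/frequency.
        fan_hz = 1       # the fan example: 1 rotation per second
        cpu_hz = 2.6e9   # a 2.6 GHz CPU: 2.6 billion cycles per second

        print(1 / fan_hz, "seconds per rotation")  # 1.0
        print(1 / cpu_hz, "seconds per cycle")     # ~3.85e-10, about 0.385 nanoseconds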
     
  9. priitv8 macrumors 68020

    Joined:
    Jan 13, 2011
    Location:
    Estonia
    #9
    In terms of the car analogy, I'd compare Hz to rpm.
    6000 rpm on both engines is 6000 rpm, but the power output need not be the same.
    The important thing here: both Hz and rpm include the notion of time. Both show how many times a phenomenon occurs in one unit of time.
     
  10. Analog Kid macrumors 601

    Joined:
    Mar 4, 2003
    #10
    Hz, or hertz, means "per second". Giga means billion. So a gigahertz is a billion "something"s per second.

    The GHz rating on a processor is the clock rate (ticks per second) but, ignoring all the complicated details, the best way to think of it is that the "something" is a block of work. So a 1GHz processor does a billion blocks of work a second.

    What's important to understand is that the block of work done is different for different processor designs. Some processors do more work in a block than others. In fact, one of the easiest ways to speed up the clock in a chip is to reduce the size of a work block. If you can do a certain amount of work per block at 1GHz and you choose to do half as much work per block then you can do it at 2GHz.

    Another way to do more work in less time is to do two things at the same time-- so two parts of the chip are working on different blocks at the same time so you get 2 blocks done per clock. Now you're doing twice as much with the same 1GHz clock.

    Increasing the number of ticks per second has a few other effects -- most importantly, it affects power. More GHz means more power consumption, and more power consumption means more heat. Heat is all kinds of bad. Heat is probably the single most important factor limiting how fast we can compute.


    So, in the end, nothing is as simple as looking at one number. What we're finding is that from generation to generation of processor chip, the clock rate doesn't seem to be increasing much-- we've been in the 2-3 GHz range for quite some time-- instead we're doing bigger blocks of work per tick and doing more blocks in parallel. Within a generation, though, a faster clock rate means doing more work in the same time, because the rest of the chip design is mostly the same-- the same size block of work, so the faster clock means more blocks per second.
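
    To put that "blocks of work" idea into numbers, here's a little Python sketch (the work-per-tick figures are arbitrary, just to show the trade-off):

        # Throughput = clock rate * work done per tick.
        # Halving the work per tick lets you double the clock, but total work stays the same.
        work_at_1ghz = 1.0e9 * 2.0  # 1 GHz, "2 units" of work per tick
        work_at_2ghz = 2.0e9 * 1.0  # 2 GHz, half as much work per tick
        print(work_at_1ghz == work_at_2ghz)  # True -- higher GHz, same real throughput

        # Doing two blocks at once at the same clock genuinely doubles throughput.
        work_parallel = 1.0e9 * 2.0 * 2  # 1 GHz, 2 units per tick, 2 blocks in parallel
        print(work_parallel / work_at_1ghz)  # 2.0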
     
  11. v0lume4 macrumors 68000

    Joined:
    Jul 28, 2012
    #11
    Ahhh. Awesome analogy. Well said. The power output of a Mustang GT at 6000 RPM isn't as high as that of a Camaro SS at 6000 RPM. That new Camaro is a monster. ;)

    Though I still prefer the looks of the new Mustang any day.
     
  12. leman, Feb 14, 2017
    Last edited: Feb 15, 2017

    leman macrumors 604

    Joined:
    Oct 14, 2008
    #12
    If you want a bit more technical explanation: CPUs essentially operate in steps. During every step, a simple operation can be done (such as adding two numbers, or moving some data from here to there). However, because CPUs are very complicated, one needs some sort of signal to synchronise the execution of these steps across their components. Imagine it as a group of soldiers marching in step. That's where the clock generator comes into play — it essentially fulfils the role of the drummer. It is basically a very fast metronome that goes TICK TICK TICK TICK, giving the tempo at which the CPU can do its job. One GHz means one billion hertz, or one billion TICKS per second.

    To make things more complicated, as others have pointed out already, the CPU contains multiple units that can do various operations at the same time. If the stars align correctly, a modern CPU can, for example, multiply and add 32 standard-type numbers in one single TICK.
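
    If a toy example helps, here's that "metronome" idea as a few lines of Python (purely an illustration of the stepping, nothing like how a real CPU is built):

        # Each pass through the loop plays the role of one clock TICK.
        # On every tick, the machine performs exactly one simple step of its program.
        program = [("add", 2, 3), ("add", 10, 20), ("mul", 4, 5)]

        for tick, (op, a, b) in enumerate(program, start=1):
            result = a + b if op == "add" else a * b
            print(f"TICK {tick}: {op}({a}, {b}) = {result}")

        # A 1 GHz clock delivers a billion of these ticks every second,
        # and a modern CPU can finish several operations inside a single tick.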
     
  13. Remington Steel thread starter macrumors member

    Joined:
    Dec 26, 2016
    #13
    Now all of you all are going way over my head....lol. Thanks though.
     
  14. maflynn Moderator

    Staff Member

    Joined:
    May 3, 2009
    Location:
    Boston
    #14
    Back in the day, the MHz rating meant that the CPU would execute a single computer instruction x million times a second. A 16 MHz CPU would execute computer instructions 16 million times a second.

    Nowadays we're in the GHz range (billions), and thanks to how modern CPUs handle instructions, the rating is more of a heartbeat. That is, the computer does many things 2.2 billion times a second (for a 2.2 GHz CPU).

    It used to be that we'd see a linear increase in performance when we increased the cycle rate, but now, thanks to the complexities of modern-day CPUs, increasing the GHz has a smaller effect, and it also increases heat, so there's a balancing act required.
     
  15. Benchobi1 macrumors newbie

    Joined:
    Dec 10, 2016
    #15
    To expand on what DeltaMac said (and others have mentioned), there are a lot of variables that factor into "better". A faster clock speed does not necessarily equate to a better overall computing experience. The CPU will ultimately need to access memory, the hard drive, etc., and the speed at which these operate also factors into the overall experience. I remember, in the early days of personal computers when CPU speeds were increasing, it was commonly said that "these higher clock speeds just mean the CPUs are waiting really fast".
     
  16. RichardC300 macrumors 65816

    Joined:
    Sep 27, 2012
    Location:
    Chapel Hill, NC
    #16
    GHz is how fast the processor is. Another factor is the number of cores it has, usually 2 or 4 for mainstream computers. Some programs can only use 1 core, while some can use 4, and the latter are roughly 4x as fast (not quite, but let's say roughly for simplicity's sake).
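
    If you're curious why it's "not quite" 4x: any part of the program that can only use one core drags the average down. A quick Python sketch (the 90% figure is just an assumption for the example):

        # Crude speedup estimate: the part that can't be split up doesn't get faster with more cores.
        cores = 4
        parallel_fraction = 0.9  # assume 90% of the work can use all the cores

        serial_fraction = 1 - parallel_fraction
        speedup = 1 / (serial_fraction + parallel_fraction / cores)
        print(round(speedup, 2))  # ~3.08x, not a full 4x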

    But GHz and the number of cores are not the end-all markers of performance. There's also the architecture, or generation, of the processor. Imagine current V8 engines vs V8 engines from decades ago: both have 8 cylinders, but the newer ones are stronger and more efficient.

    This is a super simplified explanation and probably enough for 99% of the population.
     
