Ok... I never claimed to be a guru or an expert by any means. Could someone please explain in dummies' terms what the significance of GHz is? Obviously I know the higher the number, the "better". Is that correct? Thanks in advance.
GHz, on a computer, is usually the clock speed of the processor (the CPU).
Higher number means the clock speed (the operating speed) of the CPU is faster.
That doesn't necessarily result in "better", just faster. Speed is not the final answer.
So, no, not technically correct.
Not sure how much constructive information I can add to this thread, but to echo others' posts, GHz is simply a unit used to measure the speed of a processor (or whatever else you're measuring -- generally in computing).
Giga means billion. Mega means million. A gigabyte is one billion bytes. Et cetera.
Generally speaking, a processor with a higher clock speed is going to be faster than one with a lower clock speed (say 3.0GHz vs 2.0GHz), but as others have noted, clock speed isn't always the end-all. Think about cars (this analogy may not make sense if you're not into cars) -- a 5.0 liter engine from Ford may not necessarily be as fast as a 5.0 liter engine from Chevrolet (or vice versa). It depends on the engineering and other variables.
Generally speaking, a CPU's frequency, measured in Hz, is the number of clock cycles the CPU executes in one second. So a 2.6GHz processor goes through 2.6 billion cycles per second. At face value this doesn't mean much though, because what a CPU can do in a cycle depends on a lot of different variables (instruction set/type of instruction set, how many instructions can be executed per cycle, pipelining, etc.). So it's only really meaningful to compare CPUs based on clock speed (GHz) if they are in the same or very similar families.
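To make that concrete, here's a rough back-of-the-envelope sketch in Python. The workload size and the instructions-per-cycle (IPC) figures below are made up purely for illustration -- real CPUs vary a lot -- but it shows how a lower-clocked chip can still finish a job sooner if it does more per cycle.

```python
# Simplified model: time to run a workload = instructions / (IPC * clock frequency).
# All numbers here are hypothetical, for illustration only.

def execution_time(instructions, ipc, clock_hz):
    """Seconds to finish: instructions / (instructions-per-cycle * cycles-per-second)."""
    return instructions / (ipc * clock_hz)

workload = 10_000_000_000  # a hypothetical 10-billion-instruction job

# CPU A: 2.6 GHz, but averages 4 instructions per cycle
time_a = execution_time(workload, ipc=4, clock_hz=2.6e9)

# CPU B: 3.4 GHz, but averages only 2 instructions per cycle
time_b = execution_time(workload, ipc=2, clock_hz=3.4e9)

print(f"CPU A (2.6 GHz, IPC 4): {time_a:.2f} s")  # ~0.96 s
print(f"CPU B (3.4 GHz, IPC 2): {time_b:.2f} s")  # ~1.47 s
```

In this made-up case the "slower" 2.6 GHz chip wins, which is exactly why comparing GHz only really works within the same processor family.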
In terms of the car analogy, I'd compare Hz to RPM.
Hz, or Hertz, means "per second". Giga means billion. So a gigahertz is a billion "something"s per second.
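To put a number on that (2.6 GHz here is just an example figure, not any particular chip): a billion-somethings-per-second clock means each tick is a tiny fraction of a second.

```python
# How long is one clock cycle? Just the reciprocal of the frequency.
# 2.6 GHz is an arbitrary example value.
clock_hz = 2.6e9                    # 2.6 GHz = 2.6 billion cycles per second
cycle_time_s = 1 / clock_hz         # seconds per cycle
print(f"{cycle_time_s * 1e9:.3f} ns per cycle")  # ~0.385 ns
```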
Ahhh. Awesome analogy. Well said. A Mustang GT at 6000 RPM isn't putting out as much power as a Camaro SS at 6000 RPM. That new Camaro is a monster.
6000 RPM on both engines is still 6000 RPM, but the power output isn't necessarily the same.
Important thing here - both Hz and RPM include the notion of time. Both show how many times a phenomenon occurs in one unit of time.
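Since both are "events per unit of time", you can convert between them -- RPM counts per minute, Hz counts per second. A quick sketch, using the 6000 RPM figure from the posts above as the example:

```python
# RPM and Hz measure the same kind of thing: how often something happens per unit of time.
rpm = 6000            # revolutions per minute (example number from above)
hz = rpm / 60         # revolutions per second
print(f"{rpm} RPM = {hz:.0f} Hz")  # 6000 RPM = 100 Hz
```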