GigaFlops a viable measure?

Discussion in 'General Mac Discussion' started by Stelliform, Nov 6, 2002.

  1. Stelliform macrumors 68000


    Oct 21, 2002
    OK, I am new to the Mac game. SuperPower laptop on order.... :D :D :D

    Anyway, I notice that the 1 GHz ranks at 7.5 gigaflops. This sounds to me like a good way to compare processor speeds to Intel. However, when I start poking around, I cannot find the gigaflops rating of any x86 processors. (All I can find are unofficial rumors, and 3rd-party benchmarks by non-reputable people.)

    We all know that MHz is a poor benchmark. Why shouldn't Apple start marketing Macs with gigaflops? For example, 1 GHz = 7.5 GFLOPS. It is still truthful advertising, and the average Joe will think 7.5 GHz. (At least for a second.)

    I guess my question to the forum is, how reliable is gigaflops as a benchmark? Can it be legitimately blasted by the critics? Does it reflect the performance of a processor despite the design and OS differences?

    (OK, I guess there are three questions.) :)
  2. alex_ant macrumors 68020


    Feb 5, 2002
    All up in your bidness
    Re: GigaFlops a viable measure?

    Gigaflops is a mostly meaningless benchmark that, in Apple's case with the PowerPC, only tells you theoretical peak single-precision vectorized floating-point performance. (In other words, any code that uses double precision, or is not hand-optimized for AltiVec, is excluded.) I say theoretical because real performance is essentially never anywhere near that high, except in the most specialized benchmarks and applications, like RC5 and a few Photoshop filters, where it can come close. No other CPU companies release gigaflops numbers because they are about as meaningful as MIPS numbers (i.e. not meaningful at all). If Apple were interested in proving their CPUs' superiority, they would be using SPEC, but as it turns out they are the only computer company not to use SPEC, because their CPUs perform so dreadfully in it.
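    To see why the number is "theoretical peak," here is a rough sketch of how a figure like Apple's 7.5 GFLOPS can be derived on paper. The specific numbers below are assumptions for illustration: a 1 GHz clock, an AltiVec unit that processes 4 single-precision floats per cycle, and a fused multiply-add counted as 2 floating-point operations per lane.

    ```python
    # Sketch: deriving a theoretical peak GFLOPS figure from datasheet numbers.
    # All three inputs are assumed values for a 1 GHz G4-class chip, not
    # measured results.

    clock_hz = 1.0e9       # assumed 1 GHz clock
    vector_lanes = 4       # 128-bit AltiVec vector = 4 x 32-bit floats
    flops_per_lane = 2     # a fused multiply-add counts as 2 FLOPs per lane

    peak_gflops = clock_hz * vector_lanes * flops_per_lane / 1e9
    print(peak_gflops)     # 8.0 -- the same ballpark as the 7.5 marketing figure
    ```

    The point of the exercise: every factor in that product assumes the vector unit issues a fused multiply-add on every single cycle with no stalls, which real code (especially scalar or double-precision code) never achieves.
    
    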
  3. LethalWolfe macrumors G3


    Jan 11, 2002
    Los Angeles
    All benchmarks and speed measurements are meaningless unless they test the software you actually use. For example, a gamer wouldn't care about tests using Photoshop, and a graphic artist wouldn't care about tests using Unreal Tourney.

    For "day to day" computing, any computer today is more than fast enough. If you are going to be doing specialized or hardware-intensive tasks (like 3D animation or DVD encoding), then look at the programs you are going to be using and see which hardware they perform best on.

    Sorry for the rant, but the desire for a "one size fits all" test/rating to determine the fastest hardware gets on my nerves.
