
MacBandit

macrumors 604
Originally posted by Falleron
Sorry, I don't agree at all! All of the articles I read say that the old 1GHz DP Power Mac is quicker than the current dual 1GHz systems. The reason: the extra cache in the old systems. Sorry to all those people with the new DDRAM system! :)



Sorry, but there are 8 or 9 sites posting benchmarks of the new dual 1GHz DDR, and they all show the new DDR machine faster on about 70% of the tasks given to it. All this for $1,000 less than the old dual cost new.

This ArsTechnica thread is one place where you can see that the new machine is actually quicker. Also, from my personal experience, when you start multitasking several high-load apps at once, the new dual leaves the old one in the dust.
 

Falleron

macrumors 68000
Nov 22, 2001
1,609
0
UK
Here is an extract from that page.

(Dual 1GHz - DDRAM)

SMP Results:
# Threads   Time    Charged Time   Score
2           56.9s   82.9s          163.5%


(Dual 1GHz - non-DDRAM)

SMP Results:
# Threads   Time    Charged Time   Score
2           50.6s   97.7s          184.0%


OK, I could well be interpreting the results wrong, but the final score is higher for the non-DDRAM machine (184% vs. 163%).
 

Falleron

macrumors 68000
Nov 22, 2001
1,609
0
UK
Here is my output, so if someone with a DDRAM system wants to post theirs:

Time on my system = 40.4s (quicker than the previously stated system's 56.9s).

I know the systems are configured differently, so it's not really a fair test.
 

Attachments

  • tx-results.txt
    2.5 KB · Views: 149

barkmonster

macrumors 68020
Dec 3, 2001
2,134
15
Lancashire
I can put money on it that my 933 would smoke a 300MHz G3 in anything, including non-AltiVec tasks. For one, it is a later-generation chip, meaning it has more technological advances... it has an L3 cache... and none of this even mentions the fact that the MHz rating is over 300% of the 300MHz G3 (100% denoting equivalency)... if my 933 is only 273% faster than a 300MHz, I'm very disappointed.

I meant in terms of how much faster. By pure clock speed, a 900MHz CPU is 200% faster than a 300MHz CPU, assuming all other factors are equal.

I'm just saying they're kind of meaningless, because there are a lot of benchmarks for the audio app I use, and all of them put the 933MHz G4 and similar models at over 200% faster than my 300MHz G3. This goes without saying: the hard drive controller is faster, the bus speed is twice as fast, the CPU has a far higher clock speed, and the PPC7455 used in the G4 has a faster, larger cache than the one in mine. Not to mention the PPC74xx has significantly faster floating-point performance than the PPC750 as well. I'm just saying that in real-world terms, if a 300MHz G3 had a score of 100%, a 933MHz G4 would be more like 310%.
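The two senses of "X% faster" being juggled above can be separated with a quick sketch. This is pure clock-ratio arithmetic only, an illustration rather than a real benchmark; real G3-vs-G4 gaps depend on the workload:

```python
# Two common ways to express relative CPU speed by clock alone.
# Pure clock scaling is an assumption for illustration; it ignores
# cache, bus, and architectural differences discussed in this thread.

def percent_faster(new_mhz: float, old_mhz: float) -> float:
    """Percentage gain over the old chip ("200% faster")."""
    return (new_mhz / old_mhz - 1) * 100

def relative_score(new_mhz: float, old_mhz: float) -> float:
    """Score relative to the old chip, where the old chip = 100%."""
    return new_mhz / old_mhz * 100

print(percent_faster(900, 300))   # -> 200.0, i.e. "200% faster"
print(relative_score(933, 300))   # -> 311.0, close to the "310%" figure
```

The two numbers differ by exactly 100 points, which is where "200% faster" and "300% of the speed" talk past each other.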

The same goes for the PPC7410 vs. the PPC7450: on the same motherboard, a lot of tests on Barefeats came out with the 733MHz G4 barely scraping past, or even being beaten by, the 533MHz G4.

Here's a link to a table of results for the audio benchmark I'm referring to.

Protools LE 'Dave C Test' Results

This mainly stresses the CPU, but a SCSI card that handles the data throughput with its own controller chip, combined with a SCSI drive, adds a lot to the performance. So do lots of RAM and extension tweaks. Seeing as Pro Tools runs under OS 9 right now, the OS doesn't have much overhead on the performance, but people with dual CPUs don't get any extra performance out of the second one, and nothing in Pro Tools is AltiVec-enhanced either.

The sad thing is, we've got all the cool plug-ins and nice, easy MIDI and patch-name management on the Mac, not to mention a great OS (9) and a downright fantastic OS (OS X), but an Athlon XP 1700+ with only ATA drives and 256MB of RAM is about 50% faster than the fastest tested Mac. In this test at least, the 2GHz+ Northwood Pentium 4s and the more recent Athlon XP 2200+ based PCs are so sickeningly fast compared with the Mac it's not even funny.
 

MacBandit

macrumors 604
Originally posted by Falleron
Here is my output, so if someone with a DDRAM system wants to post theirs:

Time on my system = 40.4s (quicker than the previously stated system's 56.9s).

I know the systems are configured differently, so it's not really a fair test.

Here are my results.

Time on my system = 40.3s
 

Attachments

  • dual/ghz/ddr-results.txt
    2.5 KB · Views: 158

madamimadam

macrumors 65816
Jan 3, 2002
1,281
0
Originally posted by cr2sh


I don't trust these guys' data - some of it just seems contradictory, and while they know little about CPUs, I think they know even less about statistical errors. I love the idea of "factoring out the errors in life." The random errors I think you're talking about (while you can't discount them) should be very minimal in a test like this. Random errors arise from human inability to reproduce results, among other things; the distribution will be strictly normal, and an equally weighted least-squares adjustment of the data will yield a simple average. My point, though, is that there shouldn't be random errors. These are calculations performed on code that doesn't change. The same machine should produce the same results in the same test under the same conditions. If it doesn't, then there's a flaw in the design of the experiment (hard drive being accessed, other applications, or other complications). There will always be outliers, I understand that, but this should be really freaking consistent.
You also mention user bias, but if these guys' data is worth ANYTHING at all, then they should have checked that **** at the door.
As far as systematic errors go, sure, the chip frequency could be bumped up a little, or the chip could be running a little hot. There are discrepancies in chip manufacturing (each chip is different), but under the same test a chip should produce the same result. These can be modeled, though, by taking a larger sample of the same machines; I have to believe the distribution of errors for these machines is going to be minimal. That's an entirely different test, though - we wouldn't even be measuring the speed of the computers at that point, only the distribution of errors in the speed of the same machine. In effect, measuring Apple's ability to produce the same machine.
I don't like these guys; their experiment designs don't seem fair to all the machines. But that's just my opinion. Anyone else?

On a side note, I went out with this amazing girl last night and I'm just waiting on myself to screw it up... I really like her but jesus, I know I'm going to **** it up. :)


You are VERY incorrect... you have to remember that the machines could have been configured differently and, as I pointed out, machines slow down over time if they are not maintained.

It is therefore HIGHLY possible that the DP 1GHz DDRs in this test performed better than the DP 1GHz non-DDRs because the DDRs are newer and not yet heavily influenced by the factors that slow machines down over time.

Also, as I pointed out, you have to think about the types of people who would be submitting results. Only a certain type of user posts results, which gives a biased view; and then you have to remember that the typical DP 1GHz user could be VERY different from the iMac user and could, therefore, have their machine configured and optimised VERY differently, changing the distance between the machines.

And so, as I mentioned in my earlier post, these results are not set in stone; they are a guide and should be used in conjunction with tests from other sources.
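The repeat-run check cr2sh describes (same machine, same code, same conditions should give the same time) can be sketched in a few lines. The timings below are invented for illustration, not real measurements:

```python
# Minimal sketch of the error analysis discussed above: repeat the same
# benchmark on the same machine and check the spread. Timings are
# hypothetical illustration values, not real results.
from statistics import mean, stdev

runs = [40.3, 40.5, 40.2, 40.4, 40.3]  # repeated timings in seconds

avg = mean(runs)       # equally weighted least squares reduces to the mean
spread = stdev(runs)   # sample standard deviation across runs

# If the spread is more than a small fraction of the mean, something other
# than the CPU (disk access, background apps) is contaminating the test.
print(f"mean = {avg:.2f}s, stdev = {spread:.2f}s")
if spread / avg > 0.02:
    print("run-to-run noise too high; experiment design is suspect")
```

With a spread this small relative to the mean, the run passes the consistency check; a single submitted number with no repeats gives you no way to make that call.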
 

cr2sh

macrumors 68030
May 28, 2002
2,554
3
downtown
You are correct that differently configured systems will perform differently, and that a fragmented hard drive will operate more slowly and affect overall system performance (when was that in question?), but what is the point of benchmarking systems then? I have no interest in knowing what Joe Schmoe managed to tweak out of his DP 1GHz (without at least knowing exactly what he did!). If you're going to run serious benchmarking, try to make ANY sense of the results, and have them actually mean something, use stock configurations, or use identical configurations where the only difference is integral system hardware (CPU type, FSB, mobo specs).

I have NO doubt that if you tweak a single 933 to the gills and freshly defragment the hard drive, you could smoke a dual-CPU 1.25 with DDR whose hard drive has been heavily written to with no defragmentation and which has some stupid resource configurations. But what is the point!?? That in no way accurately shows the ability of a system; it's just smoke and mirrors and complete nonsense.

If my idea of tweaking a dual 1.25 DDR system was dropping it down the stairs, my 9600/300 could beat it hands down every time... but what's the point? ;)

But I agree with you: these guys have flaws in their results. There is, I'm sure, a bias in the results, and I'm sure even a few submissions were doctored, so the results should be used only as a guide. I'm not even sure they're worth that much as a guide, all together; I think it's a lot of bunk.
 

madamimadam

macrumors 65816
Jan 3, 2002
1,281
0
Originally posted by cr2sh
You are correct that differently configured systems will perform differently, and that a fragmented hard drive will operate more slowly and affect overall system performance (when was that in question?), but what is the point of benchmarking systems then? I have no interest in knowing what Joe Schmoe managed to tweak out of his DP 1GHz (without at least knowing exactly what he did!). If you're going to run serious benchmarking, try to make ANY sense of the results, and have them actually mean something, use stock configurations, or use identical configurations where the only difference is integral system hardware (CPU type, FSB, mobo specs).

I have NO doubt that if you tweak a single 933 to the gills and freshly defragment the hard drive, you could smoke a dual-CPU 1.25 with DDR whose hard drive has been heavily written to with no defragmentation and which has some stupid resource configurations. But what is the point!?? That in no way accurately shows the ability of a system; it's just smoke and mirrors and complete nonsense.

If my idea of tweaking a dual 1.25 DDR system was dropping it down the stairs, my 9600/300 could beat it hands down every time... but what's the point? ;)

But I agree with you: these guys have flaws in their results. There is, I'm sure, a bias in the results, and I'm sure even a few submissions were doctored, so the results should be used only as a guide. I'm not even sure they're worth that much as a guide, all together; I think it's a lot of bunk.

I guess we all just want a rough guide from somewhere, since we have no hope in hell of Apple ever producing true results.
 
Originally posted by gopher
snip... Law of averages and law of means. If you are going to rely on reports, verify that the source ran multiple tests to confirm results, for more believable statistics. snip...
Uh. The law of averages and the law of means are the same thing. :p

Averages (more scientifically proper: means) by themselves are worth almost nothing in real-life statistics, except as a point here and there; IMHO the median is more significant than the mean. I have yet to see a normal distribution, a linear regression, a standardized distribution, or anything other than just the mean. If everyone used SPEC [http://www.specbench.org/] to benchmark computers, it would make benchmark statistics much simpler. Now, to find out from a pretty much standard benchmarking suite [SPEC] what σ [the standard deviation, in this case] is for all the computers measured, then figure out what percentile the Power Macs fall in and compare them to the rest of the computers tested: that would be a lot more significant.
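The mean-vs-median point above is easy to demonstrate: one doctored or outlier submission drags the mean around, while the median barely moves and a large σ flags the inconsistency. The scores here are invented illustration values:

```python
# Why robust statistics matter for user-submitted benchmark scores.
# Scores are hypothetical: five ordinary machines plus one outlier
# (a doctored or misconfigured submission).
from statistics import mean, median, stdev

scores = [100, 102, 98, 101, 99, 250]

print(mean(scores))     # dragged well above the typical machine
print(median(scores))   # robust: stays near the typical machine
print(stdev(scores))    # a large sigma flags the inconsistent sample
```

Reporting only the mean of such a sample would overstate the typical machine's score by roughly a quarter; the median plus a spread figure tells the real story.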
 

gopher

macrumors 65816
Original poster
Mar 31, 2002
1,475
0
Maryland, USA
Originally posted by MacCoaster

Uh. The law of averages and the law of means are the same thing. :p

Averages (more scientifically proper: means) by themselves are worth almost nothing in real-life statistics, except as a point here and there; IMHO the median is more significant than the mean. I have yet to see a normal distribution, a linear regression, a standardized distribution, or anything other than just the mean. If everyone used SPEC [http://www.specbench.org/] to benchmark computers, it would make benchmark statistics much simpler. Now, to find out from a pretty much standard benchmarking suite [SPEC] what σ [the standard deviation, in this case] is for all the computers measured, then figure out what percentile the Power Macs fall in and compare them to the rest of the computers tested: that would be a lot more significant.

Yes, I know about the redundancy of means and averages, but I've also heard them spoken of together in that manner before. SPEC benchmarks, though, are based on tests that assume processes with zero errors, a non-realistic assumption, and thus don't show what can really happen to code on a Pentium vs. a G4. They unfairly favor the Pentium, because its very weakness is its error checking, which gets stuck in its more numerous pipeline stages. Almost all code you run across has some minor errors, and it is up to the processor to decide whether they are significant enough to terminate the program. I wouldn't trust a benchmark unless it used realistic code.
 

MrMacMan

macrumors 604
Jul 4, 2001
7,002
11
1 Block away from NYC.
Re: More evidence

Originally posted by gopher
http://macspeedzone.com/html/reviews/machines/desktop/towers/aug-02/ghz-vs-ghz.shtml

It seems that the DDR 1GHz machine won the majority of tests. Barefeats caused unnecessary concern.
Woo, we can now stop worrying. :rolleyes:
This seems almost as laughable as when the testing agency removed all the tests that the Athlon won over the P4, so the P4 kicked arse that way.
Why would Apple make a system that isn't faster? ;)
They only did that ONCE in the past.
 