
MacRumors

macrumors bot
Original poster
Tom's Hardware has an article explaining AMD's new plan in its battle against the megahertz myth. It seems that the new Athlons (based on the Palomino core) will be specified by MODEL rather than MHz. Model A1600, for instance, is a 1.4 GHz Athlon, but AMD views it as equivalent in speed to at least a P4 1.6 GHz. They demand that no motherboard/BIOS maker ever reveal the actual clock speed of the chip, and even go as far as to prevent the printing of the CPU's clock speed in the motherboard manual. Interesting concept. Reminds me of Cyrix's model naming back in the early-to-mid 90s. I think having AMD around is definitely a good thing, but this tactic seems a bit, well, untoward.
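Side note: the article gives only one data point for the scheme (1.4 GHz = Model 1600), and AMD hasn't published the formula behind it, so here's a purely hypothetical Python sketch of how a linear clock-to-model mapping could look. The 1.5x-minus-500 relation below is my guess, not AMD's method:

```python
# Hypothetical illustration only: AMD's actual rating formula isn't in the article.
# This linear mapping is chosen just so that 1400 MHz lands on "Model 1600".
def model_number(clock_mhz):
    return int(clock_mhz * 1.5 - 500)

for mhz in (1200, 1400, 1600):
    print(f"{mhz} MHz Athlon -> Model {model_number(mhz)}")
```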
 
Nice one

Well, Intel markets higher-MHz chips as being faster than AMD's offerings. It's fair game, I think!
 
better idea

I like the idea of battling the MHz myth, but not this way. Model numbers will confuse people more, and they're based off of Intel's chips! I think a new rating should be established, like floating-point operations per second, or IPC (instructions per clock cycle; someone already posted about this). Something more tightly tied to the hardware than a model number, like a car's horsepower. I think it will annoy the techies who know that a 1.4 GHz Athlon = P4 1.6.
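To make the instructions-per-clock idea concrete, here's a toy Python sketch of "performance = IPC x clock speed". The IPC figures are placeholders I made up for illustration, not measured values:

```python
# Toy model: effective throughput ~ IPC x clock. The IPC values are invented
# purely to show how a lower-clocked chip can match a higher-clocked one.
chips = {
    "Athlon 1.4 GHz": {"clock_ghz": 1.4, "ipc": 1.14},  # assumed ~14% more work/cycle
    "P4 1.6 GHz":     {"clock_ghz": 1.6, "ipc": 1.00},  # baseline
}

for name, c in chips.items():
    relative = c["clock_ghz"] * c["ipc"]
    print(f"{name}: relative throughput {relative:.2f}")
```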
 
I am a firm believer that education is the best way to get the word out, and it is ultimately the right thing to do. This viewpoint goes against all things corporate, though, and it may eventually backfire for AMD and others who try it. For instance, if Intel had done that earlier on in its heyday, then the average Joe might understand now that clock speed means about as much as having a big bowl of tapioca pudding in a drive bay to use as a CD player, and everyone would be buying AMD.

The government and corporate America (this may sound like a conspiracy theory) like to keep us all as dumb as possible because, eventually, a dumb consumer is a good consumer, just as a dumb voter is a good voter. This is why I use Apple computers. I understand the insides of a computer fairly well, I can weigh the pros and cons of the Mac vs. Wintel debate better than the average Joe, and I use the Mac. Most people don't get it, and think that fatal error messages and a computer that crashes once a day are just a way of life. They buy an HP or a Compaq. I try to spread the knowledge and educate all, but unfortunately people LIKE to stay dumb because it is easier to stay that way. I will forever fight this, but unfortunately for Apple, I feel this might be the biggest losing battle in history. Time and good marketing will tell...

-Pete

P.S. Good to see you again, blakespot. Haven't been posting for a while?
 
Oh now it's a good idea.

When Apple said that megahertz wasn't the most important part of the processor, everyone laughed. But I'm sure when CNET and TechTV and all the other technology sites report on this, they'll all start running pieces on how real the Megahertz Myth is and how great it is that AMD had the balls to come out and be the first to prove Intel wrong.
 
No way

I want to deal with the information, not be manipulated by hidden numbers. I agree education is the way, or some other kind of rating, but not "hiding".
I remember the times when you'd see an ad for the new Macintosh models (LC II) and you couldn't tell much about the technology; MHz was an unknown concept, like video RAM. You could hardly get a 50 MB hard drive, and a double-sided floppy drive was the new stuff. Now people are learning about what is important about a computer and what is not; let people keep learning...

Let's throw the hammer at the screen again!!!
 
Apple & AMD team up to defeat the "MHz Myth"?

Perhaps it is now time for Apple and AMD to team up and break Intel's grip on the MHz Myth by rating the speed of their processors in a different manner. Two against one might just work!
 
Some good... Lots of bad.

I can certainly understand the reasoning, but this is going to backfire for several reasons... First, political.

This is the equivalent of saying, "I told the police everything I know about Chandra Levy. I have been cooperating. But I'm not going to tell you."

There will be far fewer people willing to buy a computer of unknown MHz than there will be willing to buy a computer of less MHz and a good reputation.

Second, technical. The Athlon is the chip of choice for high-end users who know what they are doing. How can you custom-build a box if you don't know how fast to set the clock... rely on auto-detect BIOS? How can you overclock it by, say, 5%, if you don't know what 5% is?
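To illustrate that overclocking point with a quick Python sketch (the FSB and multiplier below are made-up assumptions, not the Palomino's actual settings):

```python
# You can only compute a "+5%" target if you know the stock clock to begin with.
fsb_mhz = 133.0        # assumed front-side bus
multiplier = 10.5      # assumed CPU multiplier
stock_mhz = fsb_mhz * multiplier     # ~1400 MHz, only knowable if AMD discloses it
target_mhz = stock_mhz * 1.05        # the 5% overclock you actually want
print(f"stock ~{stock_mhz:.0f} MHz, 5% overclock target ~{target_mhz:.0f} MHz")
```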

Man, Athlon was winning a lot of respect... PC World tested four Athlon 1.4 based systems against two P4 2.0 GHz based systems... The result: The fastest P4 was slower than the worst of the Athlon systems.

That is the kind of info sales people can show the average consumer. But, it helps not to have a cloud of mystery surrounding the whole thing!
 
Originally posted by ptrauber
The government and corporate America (this may sound like a conspiracy theory) like to keep us all as dumb as possible because, eventually, a dumb consumer is a good consumer, just as a dumb voter is a good voter.

I think you have watched one too many Oliver Stone movies. Somehow I don't think the so-called "Megahertz Myth" has anything to do with government or corporate mind control.

"I voted for George W. Bush because Intel processors run 600 Megahertz faster then AMD processors"
 
the MEGA-hurts myth...

This is so funny... I haven't laughed this hard in a while. I mean, really now... PC users are so obsessed with the MHz rating, who are they going to sell to when the first question out of people's mouths is "how fast is it?"... Come on...

Why don't they instead just put it in fine print and advertise, in dumbo-ear-sized letters on the computer, how many MIPS or MFLOPS it does instead of the clock speed. MIPS and MFLOPS are a more real-world performance indication, with a high ooooooohhh-ahhhh factor to boot, which will give those poor PC'ers the rosy glow of high speed numbers that they need to sleep at night. lol

 
MIPS, FLOPS and others

I don't think MIPS and FLOPS and other indicators will help, because the P4 will win that battle.
The companies (everyone: Apple, Intel and AMD) would print theoretical limits that never occur. Because of the P4's high clock speed and its (too) long pipeline, it will win that contest (which doesn't mean it is faster when you are actually working with it).
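Here's a rough Python sketch of why printed peaks mislead: the paper number is just clock times operations per cycle, while real throughput depends on how full the pipeline actually stays. All figures are placeholders for the sake of the argument, not benchmarks of any real chip:

```python
def peak_mflops(clock_mhz, ops_per_cycle):
    # the "spec sheet" number: assumes every cycle does useful work
    return clock_mhz * ops_per_cycle

def sustained_mflops(clock_mhz, ops_per_cycle, efficiency):
    # 'efficiency' lumps together stalls, mispredicted branches, memory waits, etc.
    return peak_mflops(clock_mhz, ops_per_cycle) * efficiency

# Hypothetical high-clock/long-pipeline chip vs. lower-clock/shorter-pipeline chip
for name, mhz, eff in (("high clock", 2000, 0.30), ("low clock", 1400, 0.55)):
    print(name, "peak:", peak_mflops(mhz, 4), "sustained:", sustained_mflops(mhz, 4, eff))
```

With these made-up efficiencies, the high-clock chip wins the peak number (8000 vs 5600) but loses the sustained one (2400 vs 3080), which is the poster's point.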
 
The Myth is true, HOWEVER.....



I couldn't agree MORE with the Megahertz Myth. An 867 MHz G4 is quicker than a PIII running at the same clock speed, or even one running 200-400 MHz higher. But is the latest G4 quicker than a 2 GHz P4 in anything but the "SuperLatestEditionAltiVecOptimizedBastard - what Apple uses in its expos - PHOTOSHOP"?
I DON'T THINK SOOOO.....

Mother Apple, try some FPS tests pleazeeee....

Let me tell you something that is more fair. Do not compare MHz in terms of real performance, but what MONEY can buy in terms of real performance. And I think here Intel and AMD have a great advantage over Apple.
I really don't want to be negative. I am still a P2 user, but I am waiting and hoping to buy a G5. I think this CPU will change things for Apple, and it should eventually lead Apple to tell FEWER LIES to people in order to support its overpriced (though high-quality, I must admit) products.
 
I thought FPS depended more on the graphics card than on the processor's raw speed. A game optimised for, say, a GF3 would produce higher framerates than the same system with an optimised ATI Rage 128?

 
MHZ, VIDEO CARDS AND BEYOND....



I would have to be crazy to ask for systems with different video cards to be compared. I am talking about systems with identical components (where this is possible) being tested.


And since we are talking about video cards, Apple shouldn't trash Intel, because if it weren't for Intel, Apple would still have to use the ISA bus for its video cards, as both PCI and AGP are Intel's creations. Apple innovates, that's for sure. BUT other companies also do THE SAME!!!
MAC fans, open your eyes... don't believe Apple or Intel. The truth is always somewhere in the middle!!!
 
get over fps

FPS reflects the graphics card and how well the drivers and engines are optimized for the platform. It's not an accurate measure of overall speed. This is why my Athlon 700 with a Voodoo 3 16 MB performed better than my roommate's Athlon 800 with a Matrox G400 32 MB. Even if you could get P4 and G4 systems specced out exactly the same, the graphics drivers for the Mac are not as optimized or advanced for the platform yet. Thus the Mac will lose, even though it may be the faster overall system. Photoshop was used because it's optimized for the P4 AND the G4. Photoshop can max out a system's power and resources very easily. In my opinion, it's a better measure of speed than FPS, because it really uses all of the system's components, including the graphics card.

[Edited by ThlayliTheFierce on 08-30-2001 at 01:10 PM]
 


Can't you people see that Apple is ALWAYS using Photoshop for its tests? You can't really figure out which CPU is better by using only one program for tests.
Let's say Apple had better results in FPS than the PC and worse results in Photoshop; would it still use Photoshop to demonstrate the G4??? Apple cleverly uses the program that gives its product a relative advantage over Intel. But since it's only one program, it can't give an OVERALL performance result for each CPU.
In simple English, what I am trying to say is that the Apple slogan "Up to 45% faster than a Pentium 4" can very easily be converted to "Up to 50% slower than a Pentium 4" using some other program.
Nothing more, nothing less. And as Metallica say: "You know it's Sad but True"
 
a few points

Ok, so they use Photoshop because it's faster, yeah. But that's not the only reason. They use it because:

1. Most people who use Macs use Photoshop.
2. It's easy to script a test in.
3. It's optimized for both platforms, so it is more impartial than other tests might be.
4. Like I said before, it's very system-intensive, and can easily tax all resources.

And Apple does NOT only use Photoshop. I recall a QuickTime encoding test as well. Granted, a DVD-encoding test may be more equitable, but the point is Photoshop isn't their only demonstration. Besides, what other test are they going to use? It's hard to compare systems that are so different. The Photoshop test brings them closer to a level playing field than other tests would.
 
AMD and Apple

If AMD is going to use model numbers instead of clock speeds, why is an Athlon 1.4 GHz chip being called a Model 1600? Sounds like direct aiming at Intel's Pentium 1.6 GHz chip. Benchmarks are a nice way to gauge the performance of a certain chip. Maybe later in the future Intel and AMD will get together and decide on a way to gauge their chips that makes it easier for the consumer to understand. If consumers don't understand what they are buying, they may not waste the money. That's OK. They could always wise up and get a Mac.
 
It *IS* aimed directly at the P4 1600... says so in the article. AMD and Intel get together? That's like holding a picnic for the Black Panthers and the KKK! (OK, maybe not that bad, but you get the point.) If another speed measure does come out, it won't be because AMD and Intel buddied up and made it.

[Edited by ThlayliTheFierce on 08-30-2001 at 06:29 PM]
 
Apple fights against the BAD Apple, not Intel !!!



The conclusion is that no one cares about Apple. AMD has even named its mobile Athlon the Athlon 4 (??? where are the other two?). Guess why? Pentium 4....
Intel, on the other hand, increases the clock speed of an average chip (that's true) to differentiate itself from AMD.

Apple should do some things in order to increase its pathetic 5% market share (in the USA at least, because in my country it's 0.1%):
a) More programs!!!
b) The CPU is not the only speed characteristic of a system. Why does Apple stay with ATA/66? Why is the hard disk of its low-end, yet $1,700, system 5,200 RPM?(!) Why is the system bus still 133 MHz? Don't you know that the AGP 4x slot alone can use the whole bus???? (33 x 4) Why PC133 RAM? (Yes, because of the bus.) Anyway, these things are surely NOT the cutting edge of technology today! Why should a SUPERCOMPUTER (here is another marketing lie from the GOD of lies, MR JOBS) use outdated components?
c) A small price cut would be VERY helpful!!!
d) Stop the lies about the G4 being the quickest CPU.
It may be for some specific uses, but it isn't
the fastest overall.

Finally, something for ThlayliTheFierce. You say that Photoshop is chosen for blah-blah reasons. If I want a computer for recording, how the hell should I know whether "Cubase" and "Logic" run better on a G4 or a P4? By the Photoshop tests?? Please give me a break!!! And the QuickTime tests you mention: remember that it's APPLE QUICKTIME!
Yes, it's available for both platforms, but it's still APPLE!
There should be better ways to compare CPUs than ones chosen for PURE APPLE MARKETING REASONS!

Finally, I am REALLY SAD to say all these things as a future Apple customer. Someone should take notice and push things in the right direction!

 
what country would that be?

You who want a computer for recording should search around and check whether those programs have been optimized for the G4. If they have been, then Photoshop is a very good indicator of speed for you, because it is optimized as well. My bet is that they are optimized. And the supercomputer thing is not a lie, but it is a little outdated. The measure of a supercomputer used to be whether or not it could do a gigaflop. The G4 was the first chip to attain this, in theory and practice, so it is a supercomputer by that definition. Steve Jobs doesn't lie (Spikey, if you're reading this, don't think I'm saying he's a saint); he omits and directs your attention to the better points of his product, like any good CEO would do. Are you going to fault the man for trying to sell what his company makes?
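For what it's worth, the arithmetic behind the gigaflop claim looks something like this (based on my understanding of the AltiVec unit, so treat the ops-per-cycle figure as an assumption rather than gospel):

```python
# Back-of-envelope: AltiVec can (as I understand it) issue a 4-wide single-precision
# multiply-add each cycle, i.e. roughly 8 floating-point ops per cycle.
clock_hz = 500e6            # the original 500 MHz G4
flops_per_cycle = 8         # assumed: 4 lanes x (multiply + add)
peak_gflops = clock_hz * flops_per_cycle / 1e9
print(f"theoretical peak ~{peak_gflops:.0f} GFLOPS")  # comfortably over the 1 GFLOP bar
```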
 
Greece

The measure of a supercomputer used to be whether or not it could do a gigaflop. The G4 was the first chip to attain this, in theory and practice, so it is a supercomputer by that definition ------>>>> IS THE APPLE SITE YOUR ONLY SOURCE OF INFORMATION????
 
keep smiling!

I know that in reality the G4 isn't as fast as Apple or some of you are making it out to be, but some day there will be a G5 or something, and we will have a fast HDD, bus, memory and all the other stuff!
 