Re: AMD's numbering system..

Originally posted by jaykk
Why can't Apple follow AMD's path - calling the fastest chip 2200+ which operates at 1.73GHz!!! Why can't Apple get rid of the MHz myth and just give us the G5, whatever the chip may be :)

Apple could do the same thing as AMD (don't kill me). They could say something like 1200+ for the 1GHz, or whatever it is compared to a Pentium. That's how Apple should advertise. I mean, whoever that guy was thought the AMD 1800 was 1.8GHz. Apple could trick other people the same way.
 
I think a 1.4GHz G4 with DDR might just match an Athlon XP 2000+.

If you go to Barefeats and compare the single-CPU results for a dual 1GHz G4 vs. a 1.4GHz Athlon, you'll see that for the G4 to match the Athlon it would need to be either dead on 1.4GHz or about 60MHz over. We're talking about a CPU with DDR, no L3 cache, a huge 512K L2 and 128K L1, compared to a CPU with a 2MB L3, 256K L2, 64K L1 and plain old PC133 SDRAM.

If all we need at present is to increase the MHz to match their speed, then add a faster bus into the equation and I think come July we might see a Mac that can take on most of the Athlon XP CPUs and at least match them.

I know this new AMD CPU is coming out early next year, but that gives Motorola another 5 months or so to come up with either an even faster G4 or the G5. I'm just glad we're slowly catching up. The AMD chips are increasing in 66MHz increments, and at the rate Motorola is increasing the speed of the G4 we're jumping almost three times as much as AMD with every speed bump. A 200MHz increase on a 1GHz CPU is quite significant; 66MHz on a 1.6GHz CPU isn't exactly making people jump for their credit cards.

It was around the time of the G4's introduction that you'd see nitrogen-cooled 1GHz Athlons on the covers of PC magazines everywhere, with big "fastest PC in the world!" type phrases on the front. The potential speed of the G4 has more than doubled since then, yet the Athlons are barely two-thirds faster than they were in 1999.

If Motorola hadn't been stuck at 500MHz for so long, the G4 could have overtaken the Athlon already. As it stands, we're in a position where they can catch up a little and then blow everything out of the water when the G5s come out. I know a lot of this is wishful thinking on the part of a Mac "blowing away" a PC; these days it's enough to match a PC at everyday tasks, let alone beat it. The fact that an equally equipped PC would cost less than the Mac would still have little bearing on the true cost if user interface and future value were considerations when buying a new computer.
 
There is a 1.8GHz Athlon XP: the Athlon XP 2200+. It was finally released to the public just recently.

The '2100' is 1.73GHz. The '2000' is 1.67GHz, the '1900' is 1.6GHz, and the '1800' is 1.53GHz. You might want to get your facts straight before blasting someone. However, since the 1.8GHz Athlon XPs were only just publicly released, I think he is probably getting a 1.53GHz Athlon XP.

A simple math function to approximate the actual speed of an Athlon XP: take the 'Rating', subtract 1000, multiply by 0.667, and add 1000. Ex: ((2200 - 1000) * 0.667 ) + 1000 = 1800.4. (Yes, I was bored).
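For anyone who wants to check the numbers themselves, here's a minimal Python sketch of that rule of thumb. The function name is just something I made up for illustration; the 0.667 factor and the rating/clock figures come from the posts above.

```python
# Rule of thumb quoted above: clock ~= (rating - 1000) * 0.667 + 1000
# Illustrative sketch only; the function name is made up for this example.

def approx_clock_mhz(rating: int) -> float:
    """Approximate an Athlon XP's clock speed in MHz from its model rating."""
    return (rating - 1000) * 0.667 + 1000

if __name__ == "__main__":
    for rating in (1800, 1900, 2000, 2100, 2200):
        print(f"Athlon XP {rating}+ ~ {approx_clock_mhz(rating):.0f} MHz")
    # Prints roughly 1534, 1600, 1667, 1734 and 1800 MHz, which lines up with
    # the 1.53/1.6/1.67/1.73/1.8GHz figures quoted earlier in the thread.
```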
 
the last article i read in one of the pc related magazines was very impressed with the athlon's rating and called it modest if anything

i agree and it might not be a bad idea if apple does the same thing as amd...at first i did not think the amd approach would work but when one sees 2000+, they are going to think 2 ghz somewhere in their mind, even if it's not quite at 2 ghz in reality

amd was smart to play this game conservatively and name a chip 2000+ when it actually benched faster than the 2 ghz pentium 4 chip... the 2100+ and 2200+ must be monster chips and at amd's pricing convention, they beat the heck out of intel and their pentium 4 offering

if the new g5's are rated at 1.4 ghz but function as fast as a 2 ghz pentium 4 chip, why not call the chip "the G5 2000+"?
 
I think y'all are missing the big point here though...

A few people on this thread have touched on the issue lightly, but I don't think it's something that should be taken lightly when considering this... Sure, you need the power to run high-end professional audio software, understood. But no matter how powerful your hardware is, you need a good operating system to back it up. I started in the world of PCs before being "converted" to Mac, and although I may gripe about this and that, there's no real question in my mind about what to use. Great, so you've got the new AMD quadra-tetrahertz monster chip, but you're still running Windows on it, which almost guarantees data loss, frustration and consequently baldness from pulling your hair out trying to figure out why this conflicts with that conflicts with this... With a Mac, I know why my computer crashes, and it only happens when I'm running Microsoft products.

I like the idea of just soaking up a PC for its sheer processor power, but it's like using a flamethrower to peel paint: once you set your house on fire with it, you'll be wondering why you thought it was such a good idea.
 
1.8GHz Athlons

Actually, I believe it is indeed a 1.8GHz Athlon I'll be getting, since this is a used machine that was part of a rendering farm at a digital effects studio here in Southern CA. However, I got this info second-hand, so you guys may be right. But either way, I've got a fairly powerful machine coming to me soon. The comparison in user experience between my Mac and the new PC workstation is going to be interesting.
 
Right on arclyte!!!!

This is what I've been telling people for years. There is a HUGE reason why, in the music/film sound industry, almost all projects are created or finalized on Macs. They work!! And you're not pulling your hair out in the process.

It amazes me how many times you can actually break a PC machine and/or its software when trying to create a project.



But try explaining that to the pc users, who always remind me of the sad kid working at the ice cream shop.
 
i hope tibook goes to at least 900 mhz and ibook goes to 800 mhz

1 ghz on both laptops by around year's end/mwsf '03 would be sweet

am i dreaming? he he
 
Originally posted by jefhatfield
i hope tibook goes to at least 900 mhz and ibook goes to 800 mhz

1 ghz on both laptops by around year's end/mwsf '03 would be sweet

am i dreaming? he he

I'll give you three guesses...the first two don't count! :D

Originally posted by modul8tr
It amazes me how many times you can actually break a PC machine and/or its software when trying to create a project.

What amazes me even more is that all the peecee/pee4/craXP advertisements on TV happen to show some "fancy-shmancy" 2.0GHz system that can make a video in 30 seconds.

I'm wondering how long it took to upload the individual scenes onto the software so that it would (with luck) run on the day of the airing.

Then, again, those guys aren't throwing cameras off the stage! :D
 
Hey, guys
AMD's newest chip is called Opteron. Here's a quote from the mag, Maximum PC:

"Opteron- that will be the name of AMD's eighth generation 64 bit CPU... The Opteron will not replace the Athlon."

So HA, pcuser. Who's got the last laugh now?

*Sorry, no clock speed given :( *
 
Originally posted by King Cobra

I'll give you three guesses...the first two don't count! :D

What amazes me even more is that all the peecee/pee4/craXP advertisements on TV happen to show some "fancy-shmancy" 2.0GHz system that can make a video in 30 seconds.

I'm wondering how long it took to upload the individual scenes onto the software so that it would (with luck) run on the day of the airing.

Then, again, those guys aren't throwing cameras off the stage! :D

Let's just say, they spend a lot of time getting that PC video editing crapola to work. I have Final Cut Pro 2 and it rules!! I've used Windoze Movie Maker too. It pales in comparison to even iMovie (which I don't personally like), and there's nothing even close to iDVD (which I also don't like; I'm getting DVD Studio Pro soon :p ) except for DVDit (Sony uses it), but the lame LE version that they give you sucks. So, to sum it all up, Apple may be called "slower" than PCs, but good luck editing an hour's worth of quality DV in Windoze Movie Maker.
 
G5orbust, you misunderstand. The Athlon line and the Opteron line will both be x86-64 (32-bit/64-bit hybrids), it's just the Opteron will be aimed at servers, and the Athlon at home users.

The Athlon line will get a new core that is based on x86-64 (it's already on its fourth core... the slot-A Athlon, the Athlon T-bird, the Athlon XP, and the very recent Thoroughbred version of the Athlon XP each have a different core).

The Athlon line will be the single-chip lines (akin to the Athlon XP), and the Opteron will take the place of the dual chips (akin to the Athlon MP). (Also, the Opterons will be available in quad configs.)

Here's AMD's roadmap, demonstrating that BOTH the Opteron and Athlon will be x86-64:
http://anandtech.com/showimage.html...iews/cpu/amd/opteron/announcement/roadmap.gif

The ClawHammer ("Athlon") will have a single-channel DDR memory controller and the SledgeHammer ("Opteron") will have a dual-channel one:
http://anandtech.com/showdoc.html?i=1591&p=2

So, I guess I have the last laugh. :)


Also, I get Maximum PC, and the full quote is:
"Opteron--that will be the name of AMD's eighth-generation 64-bit CPU. Derived from the Latin word Optimus, or "best", Opteron is intended to suggest reliability and strength. The CPU will be used for the 2P and 4P versions of the Hammer CPU that will be introduced next. The Opteron name will not replace Athlon. AMD says it still has plans to introduce an x86-64 CPU that will use the Athlon name later this year."
 
info on ibm - may be of interest

INTERVIEW-IBM exec says tech changes mean moving on

By Caroline Humer

NEW YORK, June 25 (Reuters) - Sometimes, even the world's largest computer maker has to move on.

That's what International Business Machines Corp. (IBM) decided when some chip and storage technology lost its growth potential, Nick Donofrio, IBM's senior vice president for technology and manufacturing, said on Tuesday.

"Technologies come and go all the time," he said in an interview ahead of a speech at the TECHXNY, a trade show in New York. "As good as technology is and as fun as technology is it can also be very brutal. When its time is spent, its time is spent and it's time to move on."

Donofrio, a one-time engineer who has headed some of IBM's largest divisions -- including large corporate computers -- is now effectively the company's top technology officer. He is in charge of divisions ranging from IBM Research to manufacturing to strategy.

IBM last month said it would get out of the hard disk-drive business and it also announced plans to reorganize its microelectronics operations, phasing out certain older aluminum-based chip manufacturing capacity.

The Armonk, New York-based company has become increasingly critical of its money-losing business lines this year as corporations have cut tech spending amid the downturn.

As a result, new Chief Executive Officer Sam Palmisano is trying to find ways for the company to grow earnings and revenue again.

Donofrio said IBM is focusing on areas where the company sees the potential for growth to accelerate, such as its plan to offer semiconductor design services similar to its partnership with Sony Corp. (6758).

IBM is working closely with Sony to develop the chip that will be used in the company's next-generation gaming system, PlayStation 3. "It's a way more powerful thing than exists today," Donofrio said.

IBM is already building chips for its own computers and is thinking of adding perhaps two handfuls of customers in deals like Sony's to help give the IBM division size and scale.

"We have the experience to be able to do this for people faster, better, cheaper, which is what they want," he said.

While IBM will continue to make chips for other companies -- it currently makes the chip used in Apple Computer Corp.'s (AAPL) personal computers, for instance -- its emphasis will be on doing the more valuable design work, he said.

IBM will continue to focus on technologies that it says will dominate in the future, including self-healing computer systems that can automatically diagnose their own problems and fix them, Donofrio said.

In addition, IBM is increasingly emphasizing what the company calls utility computing, or computing on demand. That, Donofrio says, will enable companies to access computing resources as they need them.

"You have to have a long-term view to survive in this industry," Donofrio said.

IBM shares, which are trading at a five-year low, fell $1.13, or 1.6 percent, to close at $68.60 on the New York Stock Exchange.
 