Well, for you that's great. I would never game on a console because the controls are wonky. Try running a realism flight simulator on Xbox: you'll never duplicate the hundred key combos, and then you want to attach a Thrustmaster Cougar and program that as well! PC (Mac) gaming = flexibility and customization. Also, there is a lot of fun to be had participating in the modding community: taking your favorite game of the year and building new levels or whatever. I've been a Falcon 4.0 hacker for a while now, and I'm always looking for the best hardware to run it on. Different strokes for different folks! :apple:

Indeed, if you are truly that serious about your gaming you should just get a PC, period.

Why pay for 8 processors when 99.99% of the games out there are not even multi-threaded? Why pay 300% the price of an equivalent-speed PC (clock-speed-wise) when PCs reached that speed 18 months ago?

8 cores at 3.2GHz isn't going to make any difference when 2 cores at 3.2GHz with a faster GPU will toast your hide.

Macs are just not targeted towards games, period. If anything, the rapid hardware updates, wide driver availability, and vast upgrade options should have clued you in a bit.

I doubt your Thrustmaster Cougar even works under OS X! Oh, and playing under Boot Camp? *accidentally hits Alt key* Windows menu popup... Oops!
 
Don't like the case?

Need to perk up the rendering room or office?

How about custom chrome colors? Black, white, green, blue, red, gold, silver?

Killer REAL chrome spray paint

But be sure to take everything out of the case first!
 
Unless you really like having very expensive engineers twiddling their thumbs, why not use big iron instead of a Mac? An SGI Altix 4700 maxes out at 1024 cores, 128 TB of memory, and thousands of TB of disk. I bet it won't take 8 days.

I was just the Linux dork, I don't know what the other duties of those people were. I was there to figure out why the NIC was slow in Linux in their environment. :)
 
Today's Mac Pro is also only twice as fast as the PowerMac of 2 years ago, which was kind of my point: faster, but with no real apparent technology dividend.

Moore's law hard at work (effectively)

This was completely unexpected, but completely welcome. I almost pulled the trigger on one of these last night; I should be ordering in the next couple of days.
 
Nice

I'll probably buy one for the office - either that or a 24" iMac. Definitely makes me optimistic about what might be in store for next week at Macworld. :)
 
Not half..

The displays are 24-bit, not 48.
That makes it only half the bandwidth you stated. Plus there's the mentioned fact that a lot of the calculations occur within the graphics card itself. But sure, there would not be much space left on the bus for, say, RAID 5 plus Fibre Channel (not that the slots were available).

24bit is 2 to the 24th power.

48bit is 2 to the 48th power.

Isn't it?

So the difference isn't half, it is 2 to the 24th.
 
24bit is 2 to the 24th power.
48bit is 2 to the 48th power.
Isn't it?
So the difference isn't half, it is 2 to the 24th.

You're sort of correct, in that 48-bit color can represent 2**24 times as many colors (permutations) as 24-bit color. However, in the context mentioned, the data is simply additive: 48 bits are double the data size of 24 bits, so it takes twice as much time or twice as much bandwidth to transfer 48 bits compared to 24 bits.

1 bit color (0 or 1) can represent 2 colors and is 1 bit in size.
2 bit color (00 to 11) can represent 4 colors and is 2 bits in size.
3 bit color (000 to 111) can represent 8 colors and is 3 bits in size.

and so it goes...
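A quick way to convince yourself of this: the number of representable colors grows as 2**n, while the data size (and hence the bandwidth needed) grows only as n. A minimal Python sketch:

```python
def num_colors(bit_depth):
    """Number of distinct colors an n-bit pixel can represent."""
    return 2 ** bit_depth

# Colors grow exponentially with bit depth...
assert num_colors(48) == num_colors(24) * 2**24   # 2**24 times as many colors

# ...but the data (and hence bandwidth) grows only linearly.
assert 48 / 24 == 2.0   # a 48-bit pixel is just twice the size of a 24-bit one

print(num_colors(24))   # 16777216
print(num_colors(48))   # 281474976710656
```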
 
By the way, I have NEVER gotten 6-10 FPS in UT2004. Ever.

Gamers are like freaking junkies on crack. I swear. How about you spend some cash on college tuition instead of $6,000+ systems that are "obsolete" in 1 year (according to most gamers).

"Gamers" are also people who may be playing games which are more modern than UT2004 and consequently demand better performing hardware than you would prefer to see them purchasing. Perhaps this difference in expectations is the reason your perspective seems to be in such violent disagreement with these other people.

Respectfully, I'd suggest that your apparent frustration is misplaced. Why on earth would you care what someone else spends their money on, even if they are college aged and unable to afford their tuition (which is a curious presumption).

* Full disclosure: I just bought a tricked-out Mac Pro. It cost comfortably less than "$6,000+". I haven't set foot in a college classroom in about 20 years. I've never played UT2004. I like pie.
 
You're sort of correct, in that 48-bit color can represent 2**24 times as many colors (permutations) as 24-bit color. However, in the context mentioned, the data is simply additive: 48 bits are double the data size of 24 bits, so it takes twice as much time or twice as much bandwidth to transfer 48 bits compared to 24 bits.

1 bit color (0 or 1) can represent 2 colors and is 1 bit in size.
2 bit color (00 to 11) can represent 4 colors and is 2 bits in size.
3 bit color (000 to 111) can represent 8 colors and is 3 bits in size.

and so it goes...

RGB (16 bits per color channel).
 
Those prices are way too high for what you get. I left Apple behind and built a quad-core system with otherwise better specs 1 month ago for half the price of these new Macs and I see nothing here to cause me to regret my choice. The price premium on the Mac brand is absurd, and the lack of customization and expandability is a serious downside for users who want to apply the machine as a tool and not a conversation piece. Yes I miss the Mac operating system but to be honest my experience with XP so far has been much better than I expected. I don't feel at all guilty for abandoning the platform since Apple has basically abandoned meaningful computer design to cater to the lifestyle crowd.
 
Underwhelming, to say the least.

I'm guessing the happiest campers are the guys who couldn't wait in October-November and bought a Mac Pro then. They must be saying: "Good!!! We didn't miss much by not waiting!"

Also, I'm not too hopeful for Macworld: if Mac Pros take a back seat, I'm expecting the iPod, iPhone, iTunes Store, or some other non-Mac thing to be the Next Big Thing.

Apple Computer is no more; we're dealing with Apple now.

Nisaea

My thoughts exactly. After last year's disaster I'm not really looking forward to this year's keynote anymore. This Mac Pro announcement is a bad sign; this is exactly the kind of news that used to make up the MWSF keynotes. Especially since this is the first real update to the Mac Pro in about 2 years (no, I don't count the 8-core BTO option as an update). Important news like this not being kept for next week's keynote means the focus is not really on Macs again next week.

It's a shame, it is still the Macworld after all. Not the 'latestoverhypedgadgetworld'.

This is my fear too. When Apple dropped the "Computer" in their name I began to lose hope. It almost feels like Apple is just going to become this fancy gadget company. They make great computers and I think there will always be a need for powerful desktops. I hate to see Apple paying so little attention to the Mac Pros lately. Don't get me wrong, I'm glad for this update, but it's been what? 2 years? And they update the MB/MBP's and even the iMac much more regularly. I hope this update means much more regular updates for the Mac Pro.
 
Moore's law hard at work (effectively)
Define technology dividend, smart guy.
Moore's Law says the number of transistors in an integrated circuit will double every 18 months. My point is that we've met Moore's law by steadily reducing process geometries, but we haven't been rewarded with improved performance. Improved process technology used to yield three benefits: faster transistors, lower switching power, and reduced cost per transistor. We've really only seen a small power reduction. System cost per core has gone down, but not because less silicon is being used-- in fact more silicon is being used. Performance per silicon area has changed very little.

I simply find that interesting and have been thinking about where that's going to leave us over the next couple years.
Well hot damn, if you know so much, why don't you work for a CPU manufacturing company? Or make your own? I'm so sick of people acting like they know their **** on this forum.
You mean knowing **** like what people do for a living? :rolleyes:
 
Yes, I did say that. Let me say it again: after 2 years and two process shrinks, Intel still has to use twice as much silicon, 4 times as many transistors, and boost their clock 30% to double the speed of the G5. I'd say the G5 is holding its own.

One 970MP has the same performance as one Penryn die. The 8 core Mac Pro uses 4 dual core dice, just like the hypothetical nonexistent 8 core PowerMac G5 would have 2 years ago. The difference is that Intel had to go through two process shrinks to get here. Those 820 million transistors in a 4 core Harpertown could be rewired into 8 G5s with more than enough left to wire up an onchip memory controller and maybe a GPU to boot.

There's a lesson to be learned here-- if the only way to improve performance is by increasing the amount of silicon linearly, or the transistor count quadratically, we've hit the wall and someone has to find a new approach.
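As a rough sanity check on that claim, here is the arithmetic, taking the post's 820 million figure for Harpertown and assuming the commonly cited ~58 million transistors for a single-core PowerPC 970 (treat both numbers as approximations):

```python
# Back-of-the-envelope transistor budget for the "rewire into 8 G5s" claim.
harpertown_transistors = 820_000_000   # 4-core Harpertown, per the post above
g5_transistors = 58_000_000            # single PowerPC 970 (assumed figure)

budget_for_8_g5s = 8 * g5_transistors
leftover = harpertown_transistors - budget_for_8_g5s

print(budget_for_8_g5s)   # 464000000
print(leftover)           # 356000000 -- room for a memory controller and more
```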


I think you're missing my point, which was simply to clear all the transistor-count pixie dust out of my eyes, get beyond admiring the sheer colossal effort put forth, and look at what's actually been accomplished. I'm realizing that it's not a whole lot.


I'm sorry, I think you're missing something. All you prove here is that, clock for clock, the G5 is as fast as a Penryn. So what? AMD's Athlon was clock-for-clock faster than any P4 that came out, but it was still destroyed when S478 came out. I think the old Athlon was even faster than the Athlon64, clock for clock. Does that mean the original Athlon still "holds its own" today? :rolleyes:

Have you forgotten why the G5 failed? It didn't scale properly, remember, because it was way too hot (liquid cooling ring a bell?). Penryn scales fantastically, so it is definitely the better technology. Performance per watt: move over, G5 ;)
 
I used the Store chat just now and asked them about the 8800GT and PCI Express 2.0.

Here is a brief summary:

Hi, my name is Wanda. Welcome to Apple!

Wanda: Good afternoon.

You: Can the 8800GT be purchased for use with the previous Mac Pros? If so, how much is it to buy the 8800 on its own?

Wanda: I'm happy to assist you. The 8800GT works in the PCI Express 2.0 slot. The previous generation Mac Pros have the PCI 1.0 slot, so it will not work in that slot.

You: The 2.0 is not backwards compatible?

Wanda: 2.0 is backwards compatible, but 1.0 is not frontwards compatible. The previous Mac Pro has the 1.0 slot.

Kinda makes you wanda...:confused:
 
Wow! This Appears To Be Welcome News . . .

Finally. Surprised it was released so soon instead of next Tuesday.
 
Kinda makes you wanda...:confused:

Doesn't make a whole lot of sense, since the specs for the x2600 XT also state it is a PCI Express 2.0 card, yet you can configure the Mac Pro with 4 of them... when there are only 2 PCI Express 2.0 slots.

Other evidence elsewhere on the 'net seems to say otherwise as well.
 
I'm sorry, I think you're missing something. All you prove here is that, clock for clock, the G5 is as fast as a Penryn. So what? AMD's Athlon was clock-for-clock faster than any P4 that came out, but it was still destroyed when S478 came out. I think the old Athlon was even faster than the Athlon64, clock for clock. Does that mean the original Athlon still "holds its own" today? :rolleyes:

Have you forgotten why the G5 failed? It didn't scale properly, remember, because it was way too hot (liquid cooling ring a bell?). Penryn scales fantastically, so it is definitely the better technology. Performance per watt: move over, G5 ;)

I nearly wanted to disagree with Analog Kid too, but he actually made good points.

He wasn't arguing anything like you are arguing, nor about why the G5 actually failed. He was just pointing out that it took Intel a lot of work to truly beat it.

The G5 would have been a great chip if IBM had spent the money on making it better. However, they didn't, and Apple couldn't wait for them forever.
 
I nearly wanted to disagree with Analog Kid too, but he actually made good points.

He wasn't arguing anything like you are arguing, nor about why the G5 actually failed. He was just pointing out that it took Intel a lot of work to truly beat it.

The G5 would have been a great chip if IBM had spent the money on making it better. However, they didn't, and Apple couldn't wait for them forever.

Well yeah, it did take Intel a lot of time to truly beat it; I don't disagree there. But it took Intel a lot of time to beat AMD too, until June 2006 actually. Intel had a difficult time in 2005-2006, and Apple jumped on its wagon just in time, a very good decision indeed if you look at where AMD is now. And indeed, they could not wait for IBM indefinitely. The G5 was a good chip in its time, and whether or not IBM could make it compete today I honestly don't know.

The 8800GT not working in a PCI 1.x slot is totally untrue by the way, unless Apple made it so with some special hardware trickery.
 
Well yeah, it did take Intel a lot of time to truly beat it; I don't disagree there. But it took Intel a lot of time to beat AMD too, until June 2006 actually. Intel had a difficult time in 2005-2006, and Apple jumped on its wagon just in time, a very good decision indeed if you look at where AMD is now. And indeed, they could not wait for IBM indefinitely. The G5 was a good chip in its time, and whether or not IBM could make it compete today I honestly don't know.

The 8800GT not working in a PCI 1.x slot is totally untrue by the way, unless Apple made it so with some special hardware trickery.

I was going to argue that the die size of the Prescott was smaller than the Athlon (Venice core) at the time, so it was cheaper, but it turns out I was wrong.

Anyway, I prefer Intel on the PC side. I had two years of actually using a PC when I had my P4 with the 875 chipset, and then two years of debugging problems** when I upgraded to an AMD X2 with nForce. Honestly, I think AMD makes excellent processors; they are just let down by the chipsets.

**Which, unlike what most people like to think, were all hardware issues and had nothing to do with Windows. I don't think even OS X would have been stable on it unless, as you had to do to make the hardware and thus Windows stable, you didn't use any of the advanced features such as the built-in firewall and onboard IDE at full speed.
 
Can the new Mac Pro *really* support eight 30" Cinema Displays? The bandwidth for 8 screens would seem to be something like:
(2560*1600 pixels) × (48 bits/pixel color) × (8 screens) × (30 frames/sec) = 43.9 gigabits/sec = 5.49 gigabytes/sec

I don't know how much the graphics card takes over the screen drawing, etc etc, but this seems like it would be a significant chunk of the system bus' bandwidth.
There is plenty of bandwidth to support that many screens. A single card (well, the Quadro) can support two quad-HD screens all by itself.
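For what it's worth, the back-of-the-envelope numbers in the question do check out, assuming 16 bits per RGB channel and binary (base-2) gigabits and gigabytes:

```python
# Reproduce the 8-screen bandwidth estimate from the post above.
pixels_per_screen = 2560 * 1600   # 30" Cinema Display resolution
bits_per_pixel = 48               # 16 bits per RGB channel
screens = 8
frames_per_sec = 30

bits_per_sec = pixels_per_screen * bits_per_pixel * screens * frames_per_sec

print(round(bits_per_sec / 2**30, 1))      # 43.9 (gibibits/sec)
print(round(bits_per_sec / 8 / 2**30, 2))  # 5.49 (gibibytes/sec)
```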

  • Harpertown 3.2GHz is close to its performance limit too. Intel wouldn't ship it otherwise.
If it is anything at all like the desktop chips, then there is more room. They probably have another 800 MHz on air, another 1.4 GHz to go if they were to start using water cooling, and probably another 2 GHz or more if they were to go with phase-change cooling.
 
Thread too long...

But where the he11 is my MacBook Pro and Cinema Displays to go along with this beautiful machine? I would love to have that redesigned MacBook Pro sometime in the near future.
 