I'd personally like to see the 8800GTX or the 8800GT and nothing from ATI. Besides the fact that the HD 2900 XT is now being EOL'd and replaced by a newer 55nm part, it is loud and hot, and is outperformed by the 8800GTX.

Seeing that Nvidia can't write Mac OS X drivers worth even a tenth of a s**t, I'd like to see Apple kick Nvidia to the curb and stick with ATI, which has much more stable drivers on the OS X platform.

There has been NO END of problems with the Nvidia 8600 mobile chipsets in the MacBook Pros due to driver issues. People playing Second Life, WoW, and other games have had crashes, glitching, and so on, and none of these problems have been fully fixed yet.

Please, oh please let Apple stick with ATI cards. My X1900XT in my Mac Pro is *ROCK SOLID*, as is the X1600 chipset in my rev 1 MacBook Pro. There have been minor instability issues in the aluminum iMacs, but those have been fixed as far as I know. Nvidia has yet to fix its drivers.

-Z
 
(with fingers crossed) 8800GTX, 8800GTX, 8800GTX...

Read my response above. And yes, I feel really strongly about this. I know several Santa Rosa MBP owners who are suffering with the crappy Nvidia drivers all the time. They have to boot into WINDOWS to get decent GL performance.

Nvidia 8xxx on Apple is garbage. Check out the Apple support forums if you don't believe me.
 

Well, let's hope that if Apple does go with ATI, they use the new RV670 parts - the 3850 looks pretty good, with a single-slot cooler. Even if the performance is a little less, the 8800GT and 3850 are both good cards (an improvement over the X1900/8800GTS).
 
Intel themselves have estimated that the top-end quad core chips represent a 45% increase in performance over previous quads.

Where did you see those numbers? The best I could find was a 10 to 15% performance increase, but running much cooler and more energy-efficiently.

To be honest I doubt there will be a lot of real world performance difference in typical applications.

The GPU will probably make the biggest difference. I know MM raves about SS but he is the only one that I have heard who seems to think it will make a big difference.

Most of the excitement surrounding the new Penryn chips is about the energy efficiency and low-temperature operation.
 
 
"report" != "rumour"

It was reported on several websites three weeks ago that Apple was placing large orders for Intel Harpertown Xeons...

Report ;) ?

There was a rumour from one source that was copied on several Mac sites - nothing confirmed or even second-sourced.
 


Yeah, I wish Apple would let Nvidia write their own drivers for the Mac. The difference in performance and reliability between the Windows Nvidia drivers and the Apple drivers is like night and day. The Windows Nvidia drivers are great; the OS X Apple drivers suck.

It is really arrogant of Apple to think that they can write better drivers than the manufacturer.
 
When has Apple announced any new design from Intel even on the same day as the other manufacturers? Those days aren't "gone", they never existed except as a fanboy myth.

Who cares about when they announce something new, isn't the important part when they ship?

And didn't the first Intel Macs start shipping at the same time as, or sooner than, PCs with those same chips? I believe there may have been similar situations later on.

I don't get why some people get their panties in a bunch when a PC company announces hardware that is a month or more away from shipping, and insist that since Apple hasn't made an announcement yet, it must be "late". If there is a trend, it simply seems to be that Apple tends not to announce things until they are ready to ship.
 

Does Apple actually write the Nvidia drivers? I was under the impression that Nvidia writes them, and Apple incorporates them into the OS.

I think Apple would be foolish to write their own graphics drivers. Something that works on such a low level with the hardware should be written by the GPU manufacturer, not Apple.
 
ATI vs Nvidia

I think Nvidia has the edge right now on 3D, but then there are really no benchmarks for DirectX10. Crysis will be released next week. My understanding is that when true DirectX10 gaming begins, ATI's stream processor architecture will kick in and kick butt. Also, OpenGL is still out there. John Carmack, for one, will only use that API. Any shader coolness that is invented for DirectX10 will be represented in OpenGL. It's "open" after all.

I have always been an ATI brand loyalist (sounds better than fanboy) because I think ATI also excels in multimedia playback. I think ATI has way more features for advanced HD everything. I actually do watch DVD movies on my computer, so it's an issue for me.

On the other hand, today's news is that AMD/ATI is in pretty bad shape, with three quarters in a row of massive financial losses since the merger. The CEO of ATI has bailed out. I was afraid of that. So if I have to become an Nvidia loyalist, so be it.
 
Not necessarily - Apple had a jump on their competition during the last revision after receiving "preferential treatment" from Intel.

I assume that you're referring to the 3.0 GHz Clovertowns that showed up in the Mac Pro five months after HP/Dell/... were shipping quad-core Clovertowns.

  1. That wasn't a "revision", it was a "binning". Over time, silicon production lines improve and stabilize - resulting in a higher average speed of the chips. By May, Intel had enough 3.0 GHz Clovertowns coming off the production lines to sell them.
  2. It wasn't an Apple exclusive - several of the 2nd tier "white box" vendors were also shipping 3.0 GHz quads, and the chips were available on the grey market.
  3. Because it was a "factory overclocked" chip, it ran hot (150 watt TDP).
  4. The other manufacturers declined to use the 150 watt chips, because the newer, faster 120 watt "G0 stepping" was coming soon.

Apple has typically been slow in announcing new Intel designs, and now we have another case here with the Penryn Xeons.
 

My understanding was that Apple wrote their own drivers for the Nvidia cards, not Nvidia. As a result, there is a huge difference between the drivers available for the Mac and those available for Windows.
 
Bring 'em on, let xmas come early.

The bigger question is what price points these bad boys will come in at. If they're in the same case design, they'd best not raise prices, considering the chip costs are staying pretty much the same. Apple needs to stay competitive and not revisit the '90s pricing structures.
 

Yeah, these numbers sound much more reasonable. A 45% increase from the FSB bump? I don't think so. Energy efficiency is always good.

Has anyone actually reached a CPU limit on the 8-core Mac Pros? I am no computer engineer, but I would assume that there is some bottleneck somewhere else in the system. Is this CPU update a drop-in replacement, or should we expect improvements in the rest of the mobo?

thedudeabides
 
Penryn 45% Performance Increase over Previous Intel Quads

Here:

http://www.eweek.com/article2/0,1759,2114747,00.asp

At the April 17 conference in Beijing, company officials went a step further and disclosed new performance specifications. For example, a Penryn processor with a 1600MHz FSB (front side bus) in a workstation or Penryn processor with a 1300MHz FSB in a server will offer 45 percent better performance for bandwidth-intensive applications and 25 percent greater performance for a server using Java. In this scenario, Intel compared the two pre-production processors with a quad-core Xeon 5355 processor.

Intel also compared a Penryn processor with a clock speed of 3.3GHz, a 1333MHz FSB and 12MB of Level 2 cache with its quad-core Core 2 Extreme QX6800, which the company just released on April 9. The results showed a 25 percent greater performance with three-dimensional rendering, a 40 percent increase in gaming performance and a 40 percent increase in video encoding.

[I think that the massive 12 MB cache, and whatever access-sharing scheme Intel has implemented, is contributing to this performance, as it doesn't seem to be *so* dependent on the FSB speed.]
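One thing worth keeping in mind when reading those figures (my own back-of-the-envelope math, not from the eWeek article): the percentages are throughput gains, and throughput and runtime are reciprocals, so a 45% throughput gain does not mean you wait 45% less. A quick sketch:

```python
# Convert a relative throughput gain into the fraction of wall-clock
# time saved: new_runtime = old_runtime / (1 + gain).

def runtime_reduction(throughput_gain: float) -> float:
    """Fraction of runtime saved for a given relative throughput gain."""
    return 1.0 - 1.0 / (1.0 + throughput_gain)

# Intel's claimed gains over the Xeon 5355 / QX6800 baselines.
for label, gain in [("bandwidth-heavy (45%)", 0.45),
                    ("3D rendering (25%)", 0.25),
                    ("video encoding (40%)", 0.40)]:
    print(f"{label}: ~{runtime_reduction(gain) * 100:.0f}% less wall-clock time")
```

So even the headline 45% number works out to roughly 31% less time spent waiting on a bandwidth-bound job, which is still substantial but easy to overstate.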
 
First Intel Penryn Based PCs Arrive... Apple to Follow?

That headline isn't right... none of them have "arrived"; they have just been announced. With them shipping in December and January, it's entirely possible that Apple could ship these first (which would be "lead", not "follow").
 
Well, either these will wait for WWDC, Apple has a big surprise for us (iTablet), or we are going to have a very boring WWDC.

I wonder what will be in the keynote.
 

Thanks Aiden, your knowledge and insight are always appreciated. :cool:
 
New Mac Pros = New Cinema Displays?

So if, as seems likely based on today's news/rumor, the arrival of new Mac Pros is imminent, does that mean we can also finally expect new :apple: Cinema Displays to complement them?

Apparently it's fairly common for :apple: to release new ACDs with new pro machines. Any thoughts?
 
Competitors of Apple are shipping in January. I think this is because Apple got Intel to reserve the bulk of the chips for it, and everyone else has to wait. I read something about that not long ago, here I think.
 