RacerX said:
The only thing we should worry about is whether AMD will continue to support Macs. AMD has no history with Apple and may not consider ATI's Mac business worth the time or effort. The problem isn't either Apple or Intel, the problem is that AMD is an unknown with no history with the Mac community.

It really depends on how AMD approaches the takeover. When you take over a successful market leader with a strong brand, the best approach is usually to keep the acquired company's marketing and development functions intact and independent. I work for a large company that fairly recently bought a small company with a strong product in the Mac space. The parent company has no particular interest in Macs (and no other product for Macs), but it is keeping the Mac business intact because a. it pays for itself, and b. it is a free way to keep an eye on a niche market (i.e., Macs) that it wasn't tracking before.
 
I don't think there's any harm in a switch from ATI to nVidia. Indeed, if a selling point of the Mac is now, supposedly, that it runs more operating systems than anyone else (OK, I'm sceptical too that that's a major selling point, but I've heard it argued), then ATI is a liability, as their in-house drivers suck.

A more interesting question is whether this might become a non-issue in the future. Intel's GMA950 isn't fantastic, arguably it's sub-Radeon 7500, but it's slowly getting there. If the T&L stuff were implemented, and given the shouting from people unhappy about Apple putting GMA950s in the game-playing consumer machines and Radeons in the spreadsheeting/Photoshopping business machines, I wouldn't be surprised if there's quite some pressure on Intel at the moment to make the next generation a good low-end (rather than sub-low-end) graphics part.
 
peharri said:
A more interesting question is whether this might become a non-issue in the future. Intel's GMA950 isn't fantastic, arguably it's sub-Radeon 7500, but it's slowly getting there. If the T&L stuff were implemented, and given the shouting from people unhappy about Apple putting GMA950s in the game-playing consumer machines and Radeons in the spreadsheeting/Photoshopping business machines, I wouldn't be surprised if there's quite some pressure on Intel at the moment to make the next generation a good low-end (rather than sub-low-end) graphics part.

T&L (Transform and Lighting, or more generally vertex shaders) is getting less and less important, at least for high-end graphics. The real work is done in pixel shaders. Actually, a lot of the work is being moved from vertex shaders to pixel shaders, like per-pixel lighting calculations, to improve image quality. The next series of Macs will all have two CPUs, each capable of four single-precision floating-point operations per cycle; that is plenty to do vertex shading.
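To make that concrete, here is a minimal sketch of my own (nothing to do with any actual Apple, Intel, or driver code) of a single vertex transform done on the CPU with SSE intrinsics, which operate on four single-precision floats per instruction:

#include <stdio.h>
#include <xmmintrin.h>   /* SSE intrinsics: four single-precision floats per 128-bit register */

/* out = M * v, with the 4x4 matrix M stored column-major so each column
 * loads straight into one 128-bit register. */
static void transform_vertex(const float cols[4][4], const float v[4], float out[4])
{
    __m128 r =        _mm_mul_ps(_mm_loadu_ps(cols[0]), _mm_set1_ps(v[0]));
    r = _mm_add_ps(r, _mm_mul_ps(_mm_loadu_ps(cols[1]), _mm_set1_ps(v[1])));
    r = _mm_add_ps(r, _mm_mul_ps(_mm_loadu_ps(cols[2]), _mm_set1_ps(v[2])));
    r = _mm_add_ps(r, _mm_mul_ps(_mm_loadu_ps(cols[3]), _mm_set1_ps(v[3])));
    _mm_storeu_ps(out, r);
}

int main(void)
{
    /* A simple translation matrix (column-major) and one test vertex. */
    const float m[4][4] = {
        {1, 0, 0, 0},          /* column 0 */
        {0, 1, 0, 0},          /* column 1 */
        {0, 0, 1, 0},          /* column 2 */
        {5, 0, 0, 1}           /* column 3: translate x by 5 */
    };
    const float v[4] = {1.0f, 2.0f, 3.0f, 1.0f};
    float out[4];

    transform_vertex(m, v, out);
    printf("%.1f %.1f %.1f %.1f\n", out[0], out[1], out[2], out[3]); /* prints 6.0 2.0 3.0 1.0 */
    return 0;
}

It isn't tuned at all, but it shows why a couple of SSE-capable cores can keep up with fixed-function T&L for modest vertex counts.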
 
I don't know why Apple would drop an excellent chipset vendor like ATi for nVidia, considering that the new (as of 12JUL06) nVidia chipsets are only designed for Vista systems and are only optimised for DirectX.

I can't fault the ATi video cards in Apple's machines, and I think it would be rather rash to drop ATi for nVidia.
 
gnasher729 said:
Are you saying Apple shouldn't use ATI graphics cards because ATI doesn't have good Linux drivers? That doesn't make any sense whatsoever.

Sure it makes sense (to a point). There are many cases where people base their purchasing decisions on how a company "acts", and if you feel it is important to, for example, have open-source drivers or at LEAST half-decent closed drivers, then I feel that is perfectly sensible. I know that my next computer (unless things change by then) will have an Nvidia GPU (I have not decided on AMD vs. Intel yet), primarily because it will run anything BUT Windows (I don't like how that company "acts") and Nvidia does indeed have MUCH better support for Linux.

EDIT: Sorry, that argument was not complete... What I mean to say is that even though the OP will be purchasing an Apple product, he might care to some extent about who he is indirectly supporting... like ATi... Personally, I hope that Apple gives us choice... Heck, I want to see the ability to get "AMD inside" ;)
 
brepublican said:
I'm trying not to say anything about this one. Uh-uh. I've been wrong one time too many. I don't wanna say "ATI is awesome!" today and be asking myself why Apple didn't use Intel or NVIDIA graphics all along...

Quite. NVIDIA in Apple could pan out very well...
 
JFreak said:
Replace Nvidia with Intel and you could be talking about general-purpose processors as well ;) I would love it if Apple continued to sell PPC hardware along with the Intels, and also offered ATI and Nvidia GPUs alongside the integrated Intels for whoever needs something better.

It's sad that Apple looks to be denying us that choice.

An Intel graphics card would not be comparable to ATI or Nvidia. Apple having a choice between integrated Intel graphics and Nvidia is no choice at all. ATI no longer offering their GPUs would be a hard blow for Apple. I personally don't think ATI is going to go AMD-only, though.
 
Intel sources confirm that they will continue to include ATI chipsets on select Intel boards despite the buyout earlier in the week by AMD, as reported by Daily Tech here.
 
I heard Nvidia is developing integrated/dedicated hybrid graphics configurations for notebook computers to conserve battery power without sacrificing performance. I can't wait to see that in the MacBook line.
 
Thank you for the link.

topgunn said:
Intel sources confirm that they will continue to include ATI chipsets on select Intel boards despite the buyout earlier in the week by AMD, as reported by Daily Tech here.

This should put this silly rumor to rest once and for all. If Intel is still willing to MANUFACTURE boards that use an ATI integrated-video chipset after the buyout, even though Intel makes their own integrated-video chipsets, then there is no way Intel would even consider suggesting to Apple that they drop ATI, much less 'pressure' or 'force' them to do it...
 
ehurtley said:
Yet again, people, Intel can't force any vendor to do anything. Not even their own Motherboard department.

True. But they can certainly encourage vendors to do things using discounts with strings attached. You want this chip at this price? Don't use ATI graphics.
 
ehurtley said:
This should put this silly rumor to rest once and for all. If Intel is still willing to MANUFACTURE boards that use an ATI integrated-video chipset after the buyout, even though Intel makes their own integrated-video chipsets, then there is no way Intel would even consider suggesting to Apple that they drop ATI, much less 'pressure' or 'force' them to do it...


Actually, the boards in question are server boards, using server-class Northbridge and Southbridge chipsets from Intel. The ATI graphics come from a separate low-end ATI graphics chip; nothing special. Intel does not have an integrated graphics chip for servers, to which I would add: Intel doesn't modify their server lineup frequently, so that customers have consistent products for the enterprise.

Once again, these are not ATI northbridge chipsets; Intel does not use 3rd-party northbridge/southbridge chipsets on their boards, just 3rd-party accessory chips.
 
1) Intel really isn't in a position to dictate to Apple which graphics card vendors Apple can or can't use.

2) Apple (read: Steve Jobs) isn't the kind of company that allows itself to be pushed around. Usually Apple is the one doing the pushing, especially of late.

That said, NVidia will be under pressure now that AMD and ATI can share marketing expenses and bundle. I think if Apple drops ATI it will be because NVidia gave Apple a sweet deal.
 
indigo144 said:
Unlikely, but one can dream:
Down the road I'd really like to see not only ATI but also AMD technology in Apple products. In CPU-land, Intel is more like the fat guy in the PC box and AMD the young kid doing cool things with his Mac. Behind AMD's dominance in real multi-core architectures lies its HyperTransport technology, which will probably take graphics to the next level. This must be the rationale behind AMD's move.


This puts me in a precarious position because, while I've preferred AMD chips in the PCs I've built, I've never liked ATi, nor been impressed with their wares. The winning combo for me was always AMD+Nvidia (and before that, AMD and 3dfx).

Methinks Nvidia costs too much for AMD to realistically try to acquire/merge with them. There are probably also too many antitrust issues to allow an Intel+Nvidia merger, but the fanboy dream would be an Apple+Nvidia acquisition. Realistically, though, it'll probably be IBM Microelectronics acquiring Nvidia to win the lion's share of Sony PS3 chip contracts.
 
milo said:
True. But they can certainly encourage vendors to do things using discounts with strings attached. You want this chip at this price? Don't use ATI graphics.

They certainly can't. At least not without being dragged to the nearest court and being made to pay millions.
 
dguisinger said:
Actually, the boards in question are server boards, using server-class Northbridge and Southbridge chipsets from Intel. The ATI graphics come from a separate low-end ATI graphics chip; nothing special. Intel does not have an integrated graphics chip for servers, to which I would add: Intel doesn't modify their server lineup frequently, so that customers have consistent products for the enterprise.

Once again, these are not ATI northbridge chipsets; Intel does not use 3rd-party northbridge/southbridge chipsets on their boards, just 3rd-party accessory chips.

Intel makes server products with ATI graphics chips, yes.

But they also make a Desktop Board that uses the ATI Radeon Xpress 200 integrated-graphics Northbridge.
 
milo said:
True. But they can certainly encourage vendors to do things using discounts with strings attached. You want this chip at this price? Don't use ATI graphics.

Intel makes public the price they charge EVERY OEM. They charge every OEM the same price. They do not offer discounts of this sort. (They have gotten in trouble for it in the past.)

At BEST, they offer 'advertising money' for the OEM to produce an ad that includes the 'Intel Inside' logo at the end of the ad. And since Apple doesn't do those ads, they couldn't lose that 'bonus'.
 
Lynxpro said:
This puts me in a precarious position because, while I've preferred AMD chips in the PCs I've built, I've never liked ATi, nor been impressed with their wares. The winning combo for me was always AMD+Nvidia (and before that, AMD and 3dfx).

I agree with this. I used to build PCs as well, and my favourite old card was a Voodoo. That being said, being new to Mac here and loving it, I have a friend who still builds PCs and is strictly an AMD lover. He told me that years ago, when the AMD/Intel wars were really starting, Intel told one of the motherboard makers, not sure of whom but it might have been Asus, that if they built boards for AMD, Intel would no longer support them. He still has a couple of motherboards built by this manufacturer with no markings on them at all. No brand name. No serial numbers. Seeing as Intel has tried this once before, I can see them trying it again. The fact that it's with AMD, I think it's entirely possible. I would love to see Nvidia in some Macs. Would be very nice.
 
Nvidia is not that bad, though, and with their (semi-)monthly driver updates that improve overall GPU performance by around 25%, I don't see Nvidia as a bad option.
 
SiliconAddict said:
Heh. Whatever. If Apple drops ATI, it ain't going to be because of something Intel is suggesting. Jobs doesn't work that way. If they drop them, it's going to be for one of several reasons:

- ATI is now only offered on AMD chipsets.
- AMD jacks up the price of ATI's stuff but only offers deep discounts to those using AMD wares.
- Any discounts that ATI is giving Apple are discontinued.

Basically it boils down to AMD making changes to ATI's arrangements with Apple. I don't see this happening because at the end of the day there are WAY more Intel systems out there than AMD. I simply don't see AMD not wanting to make money on all those systems. The core reason AMD purchased ATI was its interconnect hardware. It's been a key item missing from AMD's hardware. No longer.

Those reasons would be totally illegal, well, at least in the EU and the USA. It comes under the rule that once a manufacturer supplies you with a product, they cannot turn around and refuse to sell you products they make. If AMD have a new product, Apple are entitled to purchase it.

It is a bit like Coca-Cola and Pepsi. Coca-Cola have refused to sell stores their products because the stores sell Pepsi. A lot of stores in the UK are currently taking Coca-Cola to court over this.

Another thing: you are not allowed to sell a product to one customer at a discount; if there is a discount, it has to be available to everyone. No price fixing, please!
 
howesey said:
....

It is a bit like Coca-Cola and Pepsi. Coca-Cola have refused to sell stores their products because the stores sell Pepsi. A lot of stores in the UK are currently taking Coca-Cola to court over this....

That is SO pathetic!!! I hate some of the business scams these large companies get up to! :mad:
 
indigo144 said:
Unlikely, but one can dream:

Down the road I'd really like to see not only ATI but also AMD technology in Apple products. In CPU-land, Intel is more like the fat guy in the PC box and AMD the young kid doing cool things with his Mac. Behind AMD's dominance in real multi-core architectures lies its HyperTransport technology, which will probably take graphics to the next level. This must be the rationale behind AMD's move.

That's not really a fair assessment. Sure, a year ago Intel was committed to a dead architecture. Now, with the Core 2 architecture, Intel is mopping the floor with AMD, and they're not slowing down. They're actually moving their roadmap up by several months because this new architecture is so incredibly versatile. Here's the bottom line: AMD doesn't want to make GPUs, they want the technology behind GPUs. A GPU is easily the fastest component in any computer system, and AMD would like to harness that power. Intel's integrated graphics can't compete with a discrete graphics adapter, but they are not meant to; integrated graphics are designed for bottom-shelf Dells and whatnot. And finally, GPUs are already quasi-multicore architectures, far more so than CPUs at least. So I think AMD is looking to GPUs and their parallel-processing technology to improve its own product line rather than the other way around.
 