Those talking about Intel graphics, have you heard of Larrabee? It promises a lot, it's true, but whether it will actually be good, who knows?

I did know about Larrabee, but I always thought it was in the far-off future. Whether it will be ready for a new generation of Macs is another question.
 
Larrabee is an ambitious idea that only Intel could pull off. The future of graphics (5+ years) could very well be advanced software rendering: photon mapping, ray tracing, voxel-based rendering. For Intel, getting their early Larrabee design into Macs would be a huge win. All they have to do is best the lowly 9400M or even the 9600M GT in performance. Larrabee would also be the perfect component for OpenCL and Grand Central. The big problem, I think, will be the thermal envelope. I don't think Larrabee will run cool enough to be placed in a laptop.
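
For context on the OpenCL angle, here is a minimal, hypothetical sketch (my own illustration, not anything Apple has shipped) of how an OpenCL host program enumerates compute devices. The point is that a Larrabee-style many-core chip would simply show up as another cl_device_id alongside a CPU or a GeForce, and the same kernels could be queued to it.

#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif
#include <stdio.h>

int main(void) {
    cl_platform_id platform;
    cl_device_id devices[8];
    cl_uint num_devices = 0;

    /* Grab the first OpenCL platform the driver exposes. */
    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) {
        fprintf(stderr, "No OpenCL platform found\n");
        return 1;
    }

    /* Ask for every device type: CPUs, GPUs, and accelerators alike. */
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_ALL, 8, devices, &num_devices);

    for (cl_uint i = 0; i < num_devices; i++) {
        char name[256];
        cl_uint units = 0;
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        clGetDeviceInfo(devices[i], CL_DEVICE_MAX_COMPUTE_UNITS,
                        sizeof(units), &units, NULL);
        printf("Device %u: %s (%u compute units)\n", i, name, units);
    }
    return 0;
}

On a Mac you'd link against the OpenCL framework (-framework OpenCL); elsewhere, against the vendor's OpenCL library.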
 
Maybe PowerVR will make a comeback in the desktop market. I don't think laptops were popular enough back when they were still making graphics cards, but I'm sure they could have developed something by now.

I remember my PowerVR card was pretty good, but then within a couple of generations ATI and nVidia seemed to kill them off :(
 
They had good hardware (considering the tiling) but the drivers were crap. Plus they couldn't seem to keep up with the optimizations that Nvidia and ATI were doing (still do).
 
I see this rumor being true. nVidia has not only dropped the ball with Apple many, many times, but they are in a big fight with Intel right now. Apple will choose Intel's side 100% of the time.
 
how about this instead...

Either have your engineers create your own chipsets again (let's face it, Apple... you've done it before and done it spectacularly well, you can do it again), or license NVIDIA's design and have it manufactured independently so you can monitor quality more carefully. Going back to the basic Intel chipset is such a step backward...
 
b) For servers, people want something that doesn't get in the way and doesn't require a lot of power.

HP and Dell use ATI discrete GPU chips in most of their servers. I don't believe that there are any multi-socket chipsets with integrated graphics.

Even most low-end servers use discrete GPUs (note that Dell says "integrated graphics" in some places for their low-end tower servers, but the spec sheet says ATI RN50).


Apple's PPC beat the Intel chips back in the day and we never thought we'd see them in a Mac.

Actually, it was Motorola's PPC - and "the day" didn't last very long. Except for some AltiVec-friendly applications, Intel chips held their own in most tests.
 
I didn't read all of the replies, but I'm just throwing my 2 cents' worth in the thread.

I would believe that they are dropping nVidia chipsets for upcoming models because of the licensing issues. I think that Intel can probably do it better anyway with their onboard memory controller (is this new? I don't even know) and the new advances in its upcoming platforms. Fortunately, the removal of the nVidia chipset makes more room for dedicated graphics chips across the board (we have two chips currently; with Intel's CPU/chipset combo we'd have one chip, with physical space for a second, i.e. a GPU).

Also, with the upcoming OpenCL that was so heavily pushed, I doubt that Apple would abandon that technology in roughly half of its machines.

So I think we will start seeing dedicated GPUs across the board, whether from nVidia or ATI.
 
Who cares, as long as they don't go back to Intel graphics.

I remember saying "Why? nVidia chipsets haven't been any good since nForce3" when the first rumor came out, though.
 
Really, now. These IGAs are getting really, really powerful. How much power do you need if you are in the market for a low-end computer? :eek:

The next-generation Intel IGA should be good enough for many tasks, especially in Snow Leopard. This will also keep costs lower (for Apple, and maybe for the customer if Apple decides to pass the savings on).

I think that Apple will probably end up with ATI discrete graphics for mid- and higher-end machines.
 
Could very well be true.

Apple is very frustrated about the defects in the 8600M GT. They decided to go with the MCP79 because the X3100/X4500 wasn't performing well under anything (OpenCL, OpenGL).

nVidia and Intel have a lawsuit going on over Nehalem chipset production. Since Nehalem has an integrated memory controller, Intel claims that this violates the current chipset manufacturing agreement and that a new agreement has to be established (that probably means more royalty payments from nVidia, which is why nVidia insists the current agreement should not be violated).

Intel is not happy that it lost Apple's chipset business to nVidia. Now that Larrabee is finishing its design phase, Intel is confident it'll take the integrated graphics performance crown. And with nVidia's situation around Nehalem chipsets, that'll put Intel back in the game.

Apple may very well ditch nVidia chipsets, and they may very well ditch the GeForce line of GPUs from all Macs until nVidia straightens out its manufacturing problems (the Quadro option would probably stay).

As long as we get more performance and high quality, I don't really care. I own a 13" MBP with the 9400M, and I'm very happy.
 
No professional is going to buy a laptop with Larrabee graphics inside. That's like asking for trouble. Who buys a first generation product and expects it to last?
 
Remember the ATI 9500/9700? They were first-generation DX9 cards that were gems. They lasted over three years, and nVidia had nothing to match them.

All I'm saying is that it won't surprise me if Apple kicks out nVidia as its mobile chipset choice. Considering that Intel will build Larrabee into its CPUs, that could very well be the next step.
 
The Radeon 9600/9700 (R300) brings back fond memories of a killer product. The 9600 gave you good midrange performance even into 2005.

nVidia's G80 comes to mind for more recent times: the first DirectX 10 cards out, and with killer performance as well. The 8800 GTS 320/640 is still quite a passable card even today.
 
PowerVR have developed something, actually: the GMA 500 is a licensed PowerVR core.

http://en.wikipedia.org/wiki/Intel_GMA#GMA_500
 
G80 was good, too, but R300's success was beyond anything I had seen. It reminded me of the Voodoo days. Good old 3dfx...
 
Getting dedicated graphics across the board like that would be near impossible without better cooling or underclocking the GPU. Today's graphics chips run hot, and getting enough cooling is the problem.

Heat (and voltage spikes) are the enemies of all electronics. Manufacturers specify a temperature range that parts are OK to operate within, and engineers design to keep every component inside that normal operating range. In a laptop it's necessary to balance heat dissipation, power usage and performance.

When Intel comes out with a 32nm chip it will use less power. Heat is just power that gets burnt off rather than used (technical term :D) due to inefficiencies. If you reduce the inefficiency, you reduce the heat at the same time, and that "would-have-been heat" becomes usable power, i.e. the ability to have faster chips under the same conditions. These chips will be a step up, possibly sooner than Moore's law predicts.
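
As a back-of-the-envelope illustration of that power/heat point, here is a tiny C sketch using the usual dynamic-power rule of thumb, P = C * V^2 * f. The capacitance and voltage figures are made up for the example (not Intel specs), and leakage power is ignored; the point is just that a small voltage drop from a process shrink either cuts heat at the same clock, or buys clock headroom within the same power envelope.

#include <stdio.h>

int main(void) {
    /* Illustrative numbers only -- not actual Intel specifications. */
    double c   = 1.2e-8;  /* effective switched capacitance (farads) */
    double f   = 2.4e9;   /* clock frequency: 2.4 GHz */
    double v45 = 1.10;    /* hypothetical 45nm core voltage (volts) */
    double v32 = 1.00;    /* hypothetical 32nm core voltage (volts) */

    /* Dynamic power: P = C * V^2 * f (leakage ignored). */
    double p45 = c * v45 * v45 * f;
    double p32 = c * v32 * v32 * f;

    printf("45nm: %.1f W   32nm: %.1f W   (%.0f%% less heat at the same clock)\n",
           p45, p32, 100.0 * (1.0 - p32 / p45));

    /* Or spend the savings on speed: same power budget, higher clock. */
    printf("Within the same %.1f W budget, 32nm allows roughly %.2f GHz\n",
           p45, f * (p45 / p32) / 1e9);
    return 0;
}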

jdechko - AMD was the first to use an on-chip memory controller (to my knowledge).
 
Dedicated GPUs across the board is actually what I was hoping for, except I gotta say, when I switch over to the onboard video on my Oct. 2008 MBP I get another hour or so of battery life.
 