You do realize that I ended up repeating myself the entire time, right? :eek:

Everything started in Post #213 and ended back again at the points that I made in that post. Wandering off to AMD/ATI didn't get anywhere.

Apple is in a very tough spot, since the MCP79A powers everything short of the Mac Pro. There's no successor for it on Nehalem/Westmere.

That's nice.

I've noticed that AMD isn't an option to you because you're being pessimistic. If your option might not work, go with the alternative. That's why Apple left PowerPC in the first place. If the Intel route proves too troublesome for the effort, go a different way.
 
Show me your AMD two chip solution then.

Regretfully it's still CPU + IGP/Northbridge + Southbridge I/O on the roadmaps. Llano is still too far out unless you want Apple to stay on Core 2 until 2011.

I'm not pessimistic. I'm just very well informed. Now that's nice. :D
 

And why does it have to be a two-chip solution???

Granted, nVidia seems to be the only company still interested in two-chip solutions; there were two-chip AMD solutions too, there just wasn't a need for them.
 
Why bring this up then?



Time for an Atom netbook from Apple? The MCP79A lifted the majority of Apple's hardware out of the Intel GMA slums, and now there's nowhere to go unless you want yet another round of Core 2. Where is Apple going to slap a discrete video solution into the new, thinner notebooks? :rolleyes:

It's all built around the MCP79A being IGP, northbridge, and southbridge. ION2 extends the life but paying for Core 2 in a $1,200 desktop is a complete joke when the Core i5 750 runs for $700 on the tower.

The same goes for the Studio 15 sporting a Clarksfield at $999.

I don't know... The successor to the 9400M combined with a Core 2 Duo would probably give the user more overall performance than going with the next-gen Intel CPU and the craptastic 4500.

And prices on Core 2 Duos should be dirt cheap at that point, because demand will have dried up as other manufacturers go with Intel's next-gen chip.
 
Please tell me you've looked at Arrandale and Clarksfield. :(

The GMA 4500MHD isn't anywhere to be found either. Core 2 is now for budget machines and low voltage.
 


Arrandale's integrated graphics are similar in power to a GMA 4500MHD.

The CPU is also not the limiting factor in the sort of operations we're worried about now that there won't be any two-chip solutions. A GPU that's twice as powerful will give users a bigger boost in their overall experience than a faster processor paired with a worse GPU.
 
I don't think you've seen the initial benchmarks on the IGP then. We covered a lot of this on the first few pages as well.

Where are they going to place the discrete graphics then in a Unibody notebook under 15"? They'd have to make it thicker like the older plastic MacBooks.
 

Apple will have to use its reality distortion field. That or the card is already in there on the lower models and is just disabled.
 
Disabling Intel's 45nm IGP does open up some options but the bigger problem is the board area and internal volume wasted by it.

No, I meant (talking about the MacBooks here): what if the dedicated card is already there and just disabled by the EFI? You know, one set of internals. So all of this is a moot point.

I don't actually know, so I don't have anything to compare.

Yay, cold hard speculation.
 
Please elaborate.

The 13" and 15" notebooks have different logic boards.
 

Psch, never mind. I was just going on about how the 9600M might already be on the board but disabled by the EFI on lower models, and that's only applicable to the 15" series.

The only feasible thing I see is for AMD or Intel to make them a special chipset, or for Apple to make the MacBook/mini/iMac slightly thicker...
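As an aside, one way to check which GPUs a given Mac actually reports (and so whether a discrete chip like the 9600M is present alongside the 9400M) is to look at the `Chipset Model` entries from `system_profiler SPDisplaysDataType`. A minimal Python sketch; the sample report below is hypothetical text standing in for real command output, which you would normally capture with `subprocess`:

```python
# Sketch: list the GPUs a Mac reports, to see whether a discrete chip
# (e.g. a 9600M) is present alongside the integrated 9400M.

def list_gpus(report: str) -> list[str]:
    """Pull every 'Chipset Model:' entry out of a system_profiler dump."""
    gpus = []
    for line in report.splitlines():
        line = line.strip()
        if line.startswith("Chipset Model:"):
            gpus.append(line.split(":", 1)[1].strip())
    return gpus

# Hypothetical output resembling `system_profiler SPDisplaysDataType`
# on a 15" unibody MacBook Pro (not captured from a real machine):
sample = """\
Graphics/Displays:

    NVIDIA GeForce 9400M:

      Chipset Model: NVIDIA GeForce 9400M
      Bus: PCI

    NVIDIA GeForce 9600M GT:

      Chipset Model: NVIDIA GeForce 9600M GT
      Bus: PCIe
"""

print(list_gpus(sample))
# → ['NVIDIA GeForce 9400M', 'NVIDIA GeForce 9600M GT']
```

A 13" MacBook would list only the 9400M, so a check like this would settle the "disabled card" speculation on any given machine.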
 
Either way, they won't add discrete components and not use them. It's a waste of funds.

The closest it would come to is an empty spot on the board for said component. But I don't think this will be the case either, as it's wasted real estate (really important in laptops/portable devices), and they'd wait for an actual solution before designing a board around it and manufacturing it. Maybe a prototype for an anticipated part or new design, but not a finished board currently in production.
 

Depending on manufacturing costs it would be cheaper, as you don't have to pay more people, etc. Even leaving space means that another unique board has to be designed and made, adding more cost.
 
You've lost me on this one (designers?). Would you explain this in some detail?

By leaving the empty space on the board (it's not a small chip), it increases the cost of the PCB, and likely other aspects of the system (the case, for example, as it has to be made to accommodate a cooler under such a scheme).

The added cost of designing a new board would be planned in product development anyway. They'd likely change some other aspects of the system at that point too, in order to keep up with other products. It wouldn't make sense as an incremental change (we're not talking desktops here, which can take this methodology; Intel actually does this for the Tick cycles on desktop parts), and the product development planning (roadmap) is key. Though mistakes can occur, such as suddenly losing a vendor that's critical to the design. But again, there should be some warning, and plans made, even when everything looks good, as "stuff" happens. :p
 

Designers, engineers, manual labour, more production lines, lost efficiency, etc.


Your conversation is more indecisive than a little kid given a choice between flavours of ice cream at a Mr Whippy van.

First you disagree, THEN you agree, etc.

Hopefully the EU/market regulators will get their A into G, or Apple finds a nice alternative. Intel isn't the only company capable of making CPUs and chipsets; AMD is the only other competent CPU/GPU maker. Otherwise we're stuck with crappy GMAs or facing a price rise.

Oh God, what if it's the "triumphant" return of PowerPC. /Sarcasm... or is it??? XD
 
I understand your point, but the agree/disagree bit was conditional. It makes sense from a systems POV, yes, and is well tolerated in the desktop market. There's enough PCB real estate to do such things. It's not so easy in laptops or portable devices given the board area.

If we were talking about an audio or ethernet chip, it would make sense, especially if the same board is used in multiple systems (feature changes to separate the models in a line).

But it's a GPU, which isn't small (for PCB reasons, including the trace routing it would require), nor exactly "inexpensive" to place & solder (to leave disabled).

Others could take on the market, but they'd need time to realize any products. So AMD/ATI is the only possibility for the short term, and would have a major advantage in experience even if a competitor does arise. Board makers would be SOL in a sense, as the choices would be extremely limited in the beginning, and prices would almost certainly increase, even on the budget end, due to supply issues (real or manipulated). Then there may be performance and QC issues with a new competitor's offerings. It could be really ugly for a while. :eek: :p
 

Intel needs to be shot. I would say AMD is the only self-sustaining company, but then they rely on Intel's processing standard.

Wait, no, Intel is being childish ATM. They can't admit that their precious CPU is slowly becoming less relevant, left to do little more than assign resources. So they throw a hissy fit. I hope nVidia revokes their SLI license and says it only covered the X series. That would be just the kick in the balls Intel needs. AMD/nVidia could really screw Intel over if they tried hard enough.
 
I guess it's not good for competition, seeing nVidia leave the chipset business. I also hope that the new iMacs that are reportedly coming are based on Intel's chipset.
 
As it happens, Intel had to license AMD's 64-bit instruction set, as they couldn't get it right on their own. ;)

But AMD's not in the financial position to take Intel head-on. They've been having some issues as well, and are intent on going fabless, for the CPU segment at least. Not that they can't compete this way, but it can have consequences, such as losing control over the materials/processing stages and design revisions/verification (testing tape-outs & fixing problems). It slows you down, and also has QC implications.
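A small illustration of that licensing point: on most operating systems the 64-bit ISA shows up under AMD-derived names ("x86_64" on Linux/macOS, "amd64" on the BSDs) no matter whose CPU is underneath, while Intel's marketing name for the same thing is "Intel 64" (earlier "EM64T"). A Python sketch; the `ISA_NAMES` table and `describe` helper are purely illustrative, not any real API:

```python
import platform

# Map machine strings (as returned by platform.machine() / `uname -m`)
# to friendlier ISA descriptions. Note the 64-bit names trace back to AMD.
ISA_NAMES = {
    "x86_64": "x86-64 (AMD64 / Intel 64)",
    "amd64":  "x86-64 (AMD64 / Intel 64)",
    "i386":   "32-bit x86 (IA-32)",
    "i686":   "32-bit x86 (IA-32)",
}

def describe(machine: str) -> str:
    return ISA_NAMES.get(machine, f"unknown ({machine})")

print(describe("x86_64"))           # → x86-64 (AMD64 / Intel 64)
print(describe(platform.machine())) # whatever this machine reports
```

The same "x86_64" string comes back on both Intel and AMD chips, which is the cross-licensing made visible.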

Intel has a lot of clout due to things like market share and financial resources. Then there's their influence on various standards boards, on MS, ...

Ultimately, time will tell, as things are going to have to shake themselves out. I'm sure Intel's more than aware of the shifts going on, and Larrabee is a clue that this is the case. I don't think the CPU will disappear, but more operations will be tasked to the GPU, and multiple GPUs will fuel the new "processor" war. But this isn't going to happen overnight.

I fully expect to see Intel enter the GPU fray before the smoke clears.

No, it's not. That effectively leaves AMD as the sole competitor. Better than nothing at all though. :D
 

(Hypothetical Scenario)

Companies will ask AMD to make a chipset for Intel CPUs. AMD delivers; the people rejoice.

Since demand for the 9400 got pretty high, people want an alternative. It's a mixed compliment, asking a company to make a chipset for its competitor's CPU. :rolleyes: The ATi IGP chips are better than the nVidia IGP chips, though.
 
I don't think it's all that hypothetical. ;) Now that there's less competition in this market, I'd think they'd welcome it.

They can develop their own chipsets and a new line of mobile GPUs. The market would also last a while: if you take a Nehalem and strip away the IMC and QPI (for those that contain that portion), you essentially end up with Penryn cores. So the mobile market will continue with this tech for a bit, as Intel's just getting the quad-core mobile parts ready.
 