Well, the fact that old PPC chips can still keep up with current Intel chips shows how good PPC chips are. I'm sure if IBM and Apple were still working together they'd have something blowing Intel out of the water. The PPC ISA is much cleaner than the x86 instruction set, not to mention that Apple has had to release specific security updates for Intel-based Macs that simply don't affect the PPC ones.

Yes, but really only the latest G5s can even compare to just about any Intel Mac. And IBM pretty much made it clear that it didn't want to work with Apple anymore. The G5 was supposed to hit 3.0+ GHz and never did. Instead, IBM stuck with server chips and with making PPC chips for consoles (the 360, the Wii, and to a lesser extent the PS3). Apple would have had to settle for whatever IBM felt like making. Intel, however, is continually innovating and competing with/displacing its own products in the market (hmmm, who else does that sound like?).

Look at Microsoft's component model. What differentiates HP from Dell, etc.? Not the OS. Maybe HP offers an integrated card reader and Dell doesn't, for example.

True, but Dell, HP, Toshiba, etc. all compete on price as well, something Apple simply does not do. And, to use your example, not one Apple computer offers an integrated card reader. The reason Apple is set apart in this industry is that it doesn't compete the way the other companies do. And while I did say the hardware is all the same, Apple is a vertically integrated systems supplier, while Microsoft, as you mentioned, uses a component model. Apple's approach has worked well for them: they've shown unheard-of growth in a stagnant market, and people are forgoing Dells and HPs to buy a more expensive Apple. It's not because of an Intel chip, Samsung RAM, and Toshiba hard drives; it's the OS, which is directly tied to the user experience.

A perfect example of this is what Toshiba is doing by integrating Cell into their laptop. They are differentiating themselves and trying to improve the user experience with hardware, not software.

Interesting about the Cell chip. I believe they will ultimately fail on the user-experience front, though, because, like many other companies that try to emulate Apple (and that's not to say Apple is the only one that makes nice interfaces), they probably don't actually "get" user experience. Few companies do.

The iPhone differentiates itself with hardware as well. Its touchscreen supports multi-touch, versus other touchscreens that only accept single-finger input. Why do you think Apple is patenting all the multi-touch on the hardware side?

The success of the iPhone, I believe, has more to do with the user experience than the hardware. (I'm big on the whole user-experience thing, in case you hadn't noticed :rolleyes:) The touchscreen and multi-touch are just means to an end.

So it's not just the OS; the hardware does matter. It's 50/50.

Sure, and I never intended to say that the hardware doesn't matter. I'd say, though, that it's more like 80/20 for the OS/hardware. Given identical machines, OS choice plays a bigger role in the user experience than the hardware does. See Boot Camp, for example.

EDIT: And sorry for the late reply. I skipped a lot of other posts.

EDIT 2: Fukui: GPUs are insanely fast at floating-point calculations, faster than a CPU, and are great for things like calculating physics. Additionally, nVidia's more recent GPUs are GP-GPUs (General-Purpose GPUs), and with CUDA (nVidia's programming platform) a programmer can offload general-purpose computing to the GPU. ATI is also working on something like this, but I don't know as much about their implementation. diamond.g: based on this information, I really think OpenCL is going to be a generic set of APIs that can tap into the GPU much like CoreImage does, and not a software implementation of SLI/Crossfire.
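The offload model here can be sketched in plain Python (no real CUDA calls, since those need nVidia hardware; every function name below is made up for illustration). The key idea is that a kernel describes the work for a single element, and the GPU applies that same kernel to thousands of elements independently, which is why data-parallel work like a physics update maps onto it so well:

```python
# Hypothetical physics step: integrate the positions of N particles.
# A CPU walks the elements one at a time; a GP-GPU launches one
# lightweight thread per element, all running the same kernel body.

def integrate_one(p, v, dt):
    # body of the would-be GPU kernel: one particle, one "thread"
    return p + v * dt

def step_serial(pos, vel, dt):
    out = []
    for i in range(len(pos)):          # CPU style: sequential loop
        out.append(integrate_one(pos[i], vel[i], dt))
    return out

def step_data_parallel(pos, vel, dt):
    # GPU style: the same kernel applied to every index independently;
    # map() stands in for a parallel launch over N threads
    return list(map(integrate_one, pos, vel, [dt] * len(pos)))

pos, vel = [0.0, 1.0, 2.0], [2.0, 2.0, 2.0]
assert step_serial(pos, vel, 0.5) == step_data_parallel(pos, vel, 0.5)
```

The two versions compute identical results; the GPU wins only because the per-element work has no dependencies between elements, so all of it can run at once.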
 
I'm seeing OpenCL as being like OpenGL: an open API. Considering that DirectX 11 is offering a standard API for using shaders to do computations, I see the DX11 GPGPU API vs. OpenCL much like DirectX vs. OpenGL of days gone by. I think that might just torpedo CUDA if true, because it creates a standardized platform for GPGPU coding.

BTW, PCPerspective has an article weighing in on the chipset and they think Apple might go with an Nvidia platform: http://www.pcper.com/article.php?aid=598
 
Actually after reading that editorial, a move to Nvidia seems likely since:

1) It still allows use of an Intel CPU
2) Improved IGP capability for MacBooks
3) Hybrid SLI means a discrete GPU can be turned on/off depending on demand for power saving
4) Likely support for CUDA/OpenCL/whatever GPGPU API that Nvidia allows down the line with those discrete GPU options

As pointed out, Apple developing its own chipset is out of the question right now, especially on the Core 2 platform, since a year from now mobile Nehalem will be here.
 
Except wasn't there some issue with defective nVidia chipsets a while ago? Think that might ruin their appeal?
 
diamond.g: based on this information, I really think OpenCL is going to be a generic set of APIs that can tap into the GPU much like CoreImage does, and not a software implementation of SLI/Crossfire.
I'm seeing OpenCL as being like OpenGL: an open API. Considering that DirectX 11 is offering a standard API for using shaders to do computations, I see the DX11 GPGPU API vs. OpenCL much like DirectX vs. OpenGL of days gone by. I think that might just torpedo CUDA if true, because it creates a standardized platform for GPGPU coding.

BTW, PCPerspective has an article weighing in on the chipset and they think Apple might go with an Nvidia platform: http://www.pcper.com/article.php?aid=598

I sure hope OpenCL doesn't disappoint. I wonder if the API will be too general to really make either card shine. ATI has always had hardware that was much stronger in math than nVidia's, but I fear the API may hold that back.
 
Yes, that's true, it might, although ATI teamed up with Apple to support OpenCL. Math-wise, theoretical FLOPS currently favor ATI cards, though not by a whole lot: 1.2/1.0 TFLOPS vs. 933 GFLOPS.
 
Except wasn't there some issue with defective nVidia chipsets a while ago? Think that might ruin their appeal?

Yes, but it was supposedly a manufacturing defect and not a flaw in the chip design itself. Basically, nVidia does its fabbing through TSMC, since it's a fabless company, but supposedly one party or the other chose inferior silicon, thus creating more defects. I'm certain that if nVidia picked up a major OEM like Apple, they'd choose better materials this time around.

It certainly hurts their reputation right now, but a lineup of good products would help. And we might finally get our wish for MacBooks with up-to-date graphics, at least.
 
"... improved power consumption.." AND COST!

Monopolies that drive up the prices of their products also drive the market and their competitors. The ability to put other vendors' hardware in Apple products will spur more competition, improving technology and driving prices lower for the consumer.

... Goooo Apple!!!:D:D
 
Yes, that's true, it might, although ATI teamed up with Apple to support OpenCL. Math-wise, theoretical FLOPS currently favor ATI cards, though not by a whole lot: 1.2/1.0 TFLOPS vs. 933 GFLOPS.
It'll favor ATI a lot more once the 4870 X2 comes out (~2 TFLOPS), and the 55 nm GT200 most likely won't increase clocks much.
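For what it's worth, those theoretical peaks fall out of simple arithmetic: shader count times clock times FLOPs issued per shader per clock (ATI counts a multiply-add as 2; nVidia's G200 counts MAD plus the extra MUL as 3). A quick sanity check, treating the exact shipping clocks here as assumptions on my part:

```python
# peak FLOPS = shaders x clock (GHz) x FLOPs per shader per clock
def peak_gflops(shaders, clock_ghz, flops_per_clock):
    return shaders * clock_ghz * flops_per_clock

gtx280 = peak_gflops(240, 1.296, 3)   # G200: 240 SPs at 1296 MHz shader clock
hd4870 = peak_gflops(800, 0.750, 2)   # RV770: 800 SPs at 750 MHz core clock
x2     = 2 * hd4870                   # 4870 X2: two RV770s on one board

print(round(gtx280), round(hd4870), round(x2))  # 933 1200 2400
```

If the X2 keeps the single card's 750 MHz, that's 2.4 TFLOPS on paper, so "~2 TFLOPS" is if anything conservative; real-world throughput is another matter entirely, since ATI's peak assumes all five slots of each VLIW unit stay busy.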
 
BTW, PCPerspective has an article weighing in on the chipset and they think Apple might go with an Nvidia platform: http://www.pcper.com/article.php?aid=598

Very compelling case made by this article. It makes sense on many levels, with Apple basically maintaining a balance of power by using chips and chipsets from ARM, AMD, Intel, and NVIDIA across its various products. Can't wait for September!
 
Yeah, after re-reading the article and thinking about it a bit, it is definitely pretty compelling, especially since Nvidia could probably offer Apple a competitive package: Apple gives them a halo brand for their notebook chipset division, while Apple maintains its ties with Intel via CPUs and basically tells both that prices will be set by bidding. And since most complaints nowadays are about poor graphics performance, I can see a move to Nvidia happening.
 
Yeah, after re-reading the article and thinking about it a bit, it is definitely pretty compelling, especially since Nvidia could probably offer Apple a competitive package: Apple gives them a halo brand for their notebook chipset division, while Apple maintains its ties with Intel via CPUs and basically tells both that prices will be set by bidding. And since most complaints nowadays are about poor graphics performance, I can see a move to Nvidia happening.

I also think nVidia is a likely choice. nVidia will give better graphics performance than Intel, and they are already licensed for Intel's CPU and FSB.
 
Actually, I have a technical question on this issue. From what the article says, the NVIDIA "chipset" has no north or south bridge, but I guess it still has an FSB. So can it be considered halfway between Montevina and Calpella? Or will Calpella still have north and south bridges, with only the FSB disappearing to make room for the faster and more efficient QuickPath?
Reading an article on Ars about the VIA and Atom processors, they pointed out that the Intel northbridge was a huge source of heat dissipation and thus inefficiency. Does the NVIDIA design, without these bridges (please enlighten me on what replaces them), provide significantly lower power consumption than Intel's chipset?

Hopefully, these questions make sense :) Thanks in advance!
 
My least realistic but most wanted dream: Apple to go back to PowerPC.

My 2nd most wanted dream: Apple merging the MacBook/Pro lines into one. All aluminium except one $999 plastic MacBook (iBook replacement) and a 13" aluminium MacBook (a true 12" PowerBook G4 replacement). Also, make all the Macs cheaper, even if the price cut is small.

My 3rd most wanted dream: Apple making custom chip-sets to differentiate themselves from PCs (if Apple does continue to use Intel processors).
 
Actually, I have a technical question on this issue. From what the article says, the NVIDIA "chipset" has no north or south bridge, but I guess it still has an FSB. So can it be considered halfway between Montevina and Calpella? Or will Calpella still have north and south bridges, with only the FSB disappearing to make room for the faster and more efficient QuickPath?

AFAIK they simply combined the NB and SB into one chip, but I haven't caught up on Nvidia notebook chipsets in a while. That's how they do it for their mATX budget boards, though.

Reading an article on Ars about the VIA and Atom processors, they pointed out that the Intel northbridge was a huge source of heat dissipation and thus inefficiency. Does the NVIDIA design, without these bridges (please enlighten me on what replaces them), provide significantly lower power consumption than Intel's chipset?

Hopefully, these questions make sense :) Thanks in advance!

Actually, Nvidia desktop chipsets are even bigger power hogs/heat sources than Intel's. The problem is that both have their memory controllers on the northbridge, which means that at the high frequencies DDR2 and DDR3 are clocked at, they get very hot. AMD chipsets have very low power draw and heat on the northbridge, but that's because AMD CPUs have an IMC (Integrated Memory Controller), which moves the big source of heat/power consumption off the chipset.

But moving to smaller processes tends to lower heat and power use, as with the P35 -> P45 transition for Intel.
 
My least realistic but most wanted dream: Apple to go back to PowerPC.

If we had stayed with PowerPC, we'd be so far behind right now that Apple would have a very hard time selling Macs to traditional Mac users, let alone the teenagers and college students they've picked up. Neither Freescale nor IBM would be investing much of their own money in it, so whatever profits Apple was making would be going to chip development.
 
And Jon Stokes strikes back. A change of chipset is not so likely, according to him:
http://arstechnica.com/news.ars/pos...e-wont-drop-intel-chipsets-any-time-soon.html
That and this post at AppleInsider...

mdriftmeyer said:
What the analyst misses

is that most of Apple's design projects that reach the level of rumors have been in the design and development process for at least 3 years.
An Apple-Intel-PA Semi chipset would have been in development since 2005 or 2006, which is probably enough time for them to have created an Apple-only chipset.

So either Apple will use a custom chipset, or they're going to use NVIDIA just this one time before Nehalem.

ArsTechnica said:
The problem with this theory, however, is that Snow Leopard is scheduled to arrive sometime in the summer of 2009, which is also when Intel's Larrabee is set to launch. And I've heard from a source that I trust that Apple will use Larrabee
Well well well, what do we have here? ;)

I suppose the Apple-Intel relationship is still going strong then.
 
Interesting read.

Can't see Apple switching chipsets so early in their relationship with Intel, but hey, as long as the Nvidia chipsets work and work well, I don't care whose silicon is inside.
Apple would most likely switch back to Intel with Nehalem.

Also, people generally care less about the chipset than the processor, so Intel will still get recognition. Not to mention that Apple doesn't advertise the chipset on its website (at least I don't think so).
 
Based on nVidia's site, they only make "desktop class" MCPs. So does this preclude the use of Intel Mobile CPUs?

If so, I can't see how Apple can adopt them, since the Wolfdale TDPs are significantly higher than those of the current mobile Penryn units.
 