Because it would be insanity. Do you know how much Intel spends on R&D, and then on the factories to make the chips? Apple would soon burn through its cash pile.
Ultra-low-power, specialised processors I can see happening. Desktop/laptop/workstation processors, no way, and rightly so. Leave those to the pros.
I still believe that the G5/G4 architecture was superior even to Intel's current chips... there is too much legacy lying around on Intel chips, dating back to 1978.
The real dark horse here is the possibility of a co-processor, maybe modelled on the vector cores in Cell. But I'm not sure this is realistic, considering the efforts put forth to make sure the coming embedded GPUs are OpenCL compatible. The way I see it, if Apple is going to add features, they will have to be features it can't get from standard components on a SoC.
Dave
By being at the bleeding edge of size, power, and capabilities, with leading-edge I/O, Apple's products will not only be more advanced and more desirable, but more useful to more people.
The Wall Street Journal points out, however, that this trend runs contrary to most big electronics firms, which have moved towards outsourcing components in an effort to reduce costs.
So they bring (limited) R&D in-house and outsource the fabrication. That still gives them control of, and exclusive access to, key components. Non-key components would be bought in the usual manner.
Jobs reportedly told P.A. Semi engineers that he specifically wanted to develop chips within Apple to prevent knowledge of them leaking out. Apple, of course, is well known for its secrecy. That secrecy has been harder to maintain in recent years due to the number of partners Apple must work with in launching a product.
I'm sorry, but that's just wrong.
Not sure I disagree with your conclusion so much, but it has to be pointed out that "the Pros" actually work at Apple now.
The people they have hired recently represent some of the biggest and best names in the field. The lead designer at P.A. Semi is the guy who almost single-handedly created the two best chips ever made. There's no one more "pro" than these guys; there are no "better" chip designers they should defer to, only equal or (mostly) lesser designers.
Proprietary parts and design. Increased per-unit costs. Sounds like a great fit for Apple. Brilliant.
You have missed the point - the trick to making electronics cheap is to cut the chip count. This saves manufacturing costs, improves reliability, and cuts power consumption. It also allows you to reduce the size, which is a big issue for Steve. If by designing their own chips Apple can cut the chip count in the device, the savings would probably more than offset the increased cost of the individual chips. And if it makes the device thinner and gives it better battery life, it makes it cooler.
Makes perfect sense to me
Nah, if you want graphics the Pro is the only way to go. You'll just be disappointed with a MacBook. They do get warm, but that's to be expected, I think, with desktop-replacement machines, especially since the graphics are as good as they are.
And to answer your question: hot. I frequently get temp readings just shy of 170°F and rarely around 180°F (but not for long), and it's really nothing the cooling system can't handle.
It's like Ren and Stimpy. A plot to take over the world.
I need a job. Good vibes, thoughts, and prayers, please. My first-year anniversary is in a few days, and my vacation was cancelled as I was let go. Hugely depressed and bummed.
I suspect this is Apple's response to a phenomenon originating in China. I've been reading up on the Chinese knock-off cellphone, or "shanzhai phone", industry. It has become an incredibly lucrative business. Why? Because it's now possible for a company made up of a small group of 3-5 people to design, build (or rather contract to a factory), and market phones. The technology has gotten to the point where most of the difficult technical design hurdles have been removed by the presence of a cellphone-on-a-chip, so to speak. Sound familiar to anyone around here?

So you have these tiny, agile startups able to compete with the big boys. A lot of the big boys are still in denial, and will probably try to respond to the shanzhai phone industry by attempting to get the Chinese government to crack down on it. Not Apple; their approach will be to out-innovate these players by playing off the one weakness they have: that they cannot design their own cellphone-on-a-chip.
Build quality is a function of (a) product design and engineering and (b) manufacturing standards and tolerances. Controlling the former is obvious, but the latter is accomplished through the ODM specs, and "PC" vendors, particularly the ones behind those bottom-of-the-barrel systems often dug up for specious comparisons, simply don't pay for that level of work because their retail price doesn't allow for it.
Some of the more expensive PCs have excellent build quality--but it always comes down to the question of how much one is willing to pay for fabrication improvements and the R&D to make a workable design. It's less common for PC vendors to start with a truly good design, though, because, as with most people here, the intricacies of design and engineering are seen as either voodoo or a waste, in either case detracting from the bottom line. But consider this: you can follow the blueprints for a house down to the millimeter, but if it's a bad design, the south wall is going to collapse, and that reflects on "build quality" to the buyer--even though in fact it's not. It's a design/engineering flaw.
At the end of the day, commodity vendors just can't afford it...but customers can't evaluate its value and most commodity machines are "good enough", so the industry bar isn't set too high for Apple.
No, it isn't. It's an issue that customers and companies simply aren't willing to pay to resolve. The solution is simple, obvious, and practical. The only problem is the economics of the computing market.
You present a binary where none exists. There is a great deal of granular control in the availability, quality, and pricing of all bulk parts. "Off the shelf" isn't a measure of quality, and the essential comparison isn't between "good" and "poor" quality, but a rather more subtle distinction.
Huh?