Really looking forward to this year's iPhone X iteration. A new-generation Apple-made GPU, and maybe some custom battery and power-management chips from Apple, with the kinks and slowdowns of the current hardware and software worked out.

WWDC is just around the corner. I skipped the X but might have to save up for the next X plus.

Regarding this story, I certainly hope Apple doesn't think they can drop Intel from their desktop and laptop lineup. I'm sure the A12X, or whatever they call it, will be a monster, and letting it run fast and hot with a massive heatsink in a desktop would deliver huge performance, but it isn't that easy to transition away from the x86 architecture.

Using it as a co-processor is much more likely. I wonder what kind of performance is possible with a desktop amount of RAM. Intel is fast, and Apple can't really put 32 GB of memory in the A12 package, so I wonder whether it would benefit or be hindered by using "external" memory. Also there is the question of the GPU setup. It flies on the iPhone screen at 2436×1125 pixels, but it's a whole other story when people want it to drive a 5K monitor or take on some GPGPU tasks.
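For scale, here's a quick back-of-the-envelope pixel count (assuming the standard 5K resolution of 5120×2880, which is what the iMac and LG UltraFine 5K use):

```python
# Rough pixel-count comparison: iPhone X panel vs. a 5K display.
iphone_x = 2436 * 1125   # 2,740,500 pixels
five_k   = 5120 * 2880   # 14,745,600 pixels

ratio = five_k / iphone_x
print(f"A 5K display has {ratio:.1f}x the pixels of the iPhone X screen")
# → roughly 5.4x the pixels to push every frame
```

So a 5K desktop workload is over five times the raw fill burden the mobile GPU was designed around, before GPGPU work even enters the picture.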

They could of course make a chip package without memory and GPU for the desktop, but that adds the cost of maintaining yet another chip design.
 
The next iPhone looks good, but please add some memory. Cheap Android phones are shipping with 4, 6, or even 8 GB. Please, Apple: add 4 GB to your phones and 8 GB to the iPad lineup.

Apple sells feature sets, not chipsets. Android needs ridiculous amounts of RAM to get anywhere NEAR the (perceived) speed of iOS, due to the incredibly inefficient way Android was slapped together, using Java and being the total free-for-all that it is.

Who told you Macs would be using the A12? The speculation about this assumed CPU architecture change is precisely that, and already you presume to know the specifics of which chip? Forget the A12, forget A-anything, forget the nonsense of attempting to attach a brand name to a processor or chipset which, as far as ANYONE outside Apple R&D is concerned, is merely a daydream.
...holy ****. First off, Apple did a massive renovation when they bought the place, so the reporting on what was made there before doesn't have any bearing on what they're doing there.

Second, you insisted they don't own a fab; they clearly do. So you went from insisting that they don't own any such thing to now knowing what they're actually doing with the place.

On the very basic premise this started on: does Apple own a chip foundry (regardless of what it's used for or what it's tooled for)? Yes? Yes.

[image: "cancel all my meetings, someone on the internet is wrong"]


:rolleyes:
 
That's not likely. Nvidia ****ed Apple over with the faulty GPU chips they sold them, and Apple is not likely to forget that.
Because all those underclocked Mac Pro 7770s/7870s/7970s (oops, I mean the "professional" D300s, D500s, and D700s) are super reliable, right?
AMD and Apple have a good business relationship. Apple and nVidia do not.
Doesn't change the fact that the customers are the ones who lose out.
 