Mobile and desktop dies have always been exactly the same. There is no difference in BRANDING between the GPU dies currently in Nvidia's lineup, but the mobile GPUs are heavily downclocked compared to the desktop ones. That's why you get a GTX 1060 in laptops currently. The GTX 1050 is coming soon with an under-75W thermal envelope on desktop and around 40W in mobile.

AMD already offers the RX 470 in laptops, branded in the same manner as Nvidia's parts. You can argue with Apple, but they will not use Pascal GPUs. Look no further than the kexts in macOS Sierra: there is already support for the RX 460, 470, and 480, and no sign of Pascal, or even Maxwell, GPUs in the system.
This isn't surprising; Apple would have to be completely off their rocker to even consider putting an Nvidia chip in a Mac.

By the way, macOS Sierra rocks big time. It looks like a release that most people can transition to right away with little in the way of problems.
 
This is complete and utter nonsense.

No, it's not.

I'm not talking about app programming shops or the graphic design places, where OS X is still the leader.

I'm talking about banks, corporations, conglomerates, financial institutions, and just about the very foundation that the modern digital era runs on.

It runs on Windows and it runs on *nix. It's almost entirely x86 and PowerPC.

Apple has done a fantastic job of convincing everyone, through media sponsorship, product placement, and student pricing, that Apple is everywhere, but it is not.

https://www.netmarketshare.com/operating-system-market-share.aspx?qprid=10&qpcustomd=0

And this is only for desktops.

This doesn't include servers, which are 99% x86/PowerPC, since there is no modern Apple server or server OS.
 
Maybe a custom mobile AMD Zen SoC? A separate GPU would be anathema to Apple's recent focus on dumbed-down Mac hardware. If eGPU support is refined, it may not even be necessary. Imagine if MacBook users could jack into a high-end video card when on AC power. Sure, sure, they can technically do it now, but as I understand it, it remains a buggy solution.
 
You clearly have no idea what it entails to move from Intel, not just for Apple but for every single developer and app that runs on the Mac, period.
It isn't a big deal at all for mainstream app developers who code in high-level languages. Don't believe me? Then look at the number of apps in the various Linux repos running on ARM.

The only developers who suffer big time are those who don't code well (crap coding) and those who leverage processor-specific features.
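
To make the distinction concrete, here is a minimal, hypothetical C sketch of what "leveraging processor-specific features" looks like in practice. The portable routine just recompiles on any architecture; the SIMD routine is the kind of code that has to be rewritten (SSE intrinsics on x86, NEON intrinsics on ARM) in an Intel-to-ARM move. The function names and the example itself are illustrative, not taken from any particular app:

```c
#include <stddef.h>

#if defined(__SSE__)
#include <xmmintrin.h>   /* x86-only SIMD intrinsics */
#elif defined(__ARM_NEON)
#include <arm_neon.h>    /* the ARM equivalent, with a different API */
#endif

/* Portable version: a recompile is all it takes on any architecture. */
void add_arrays_portable(const float *a, const float *b, float *out, size_t n)
{
    for (size_t i = 0; i < n; i++)
        out[i] = a[i] + b[i];
}

/* Processor-specific version: the source itself changes per architecture,
 * and this is the part that actually costs effort when switching ISAs. */
void add_arrays_simd(const float *a, const float *b, float *out, size_t n)
{
    size_t i = 0;
#if defined(__SSE__)
    for (; i + 4 <= n; i += 4)            /* 4 floats per SSE operation */
        _mm_storeu_ps(&out[i],
                      _mm_add_ps(_mm_loadu_ps(&a[i]), _mm_loadu_ps(&b[i])));
#elif defined(__ARM_NEON)
    for (; i + 4 <= n; i += 4)            /* 4 floats per NEON operation */
        vst1q_f32(&out[i], vaddq_f32(vld1q_f32(&a[i]), vld1q_f32(&b[i])));
#endif
    for (; i < n; i++)                    /* scalar tail, also the fallback */
        out[i] = a[i] + b[i];
}
```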
 
  • Like
Reactions: poorcody
It isn't a big deal at all for mainstream app developers who code in high-level languages. Don't believe me? Then look at the number of apps in the various Linux repos running on ARM.

The only developers who suffer big time are those who don't code well (crap coding) and those who leverage processor-specific features.
There are a few distros of Linux made for ARM, but many of them are not full-featured, and many have much, MUCH reduced repositories. Just because the OS is ported doesn't mean you'll be able to have a full solution.

It's the same issue that Microsoft ran into with Windows RT. Just because they made the OS and provided a high-level language doesn't mean everyone jumps on board when it means supporting potentially two or more different platforms.
 
It seems to me this isn't a problem at all for Apple. Either they:

- update now with Skylake and then wait another year and a half for an update with Coffee Lake,

- or they could just update now with Skylake and then make a minor update at the end of next year with the same processor, like the last time.
 
They will not be obsolete. Quad-core Skylake CPUs for mobile are with us, according to the roadmap, until 2018. Kaby Lake is exactly the same architecture as Skylake, just with 200 MHz higher core clocks. There is no difference in IPC according to the latest leaked SiSoft Sandra benchmarks; all of the difference in performance comes from the higher Turbo clocks on Kaby Lake CPUs. They cannot even get past a Skylake CPU clocked 200 MHz higher. Why would they be obsolete? Because they are named differently?

The times when architecture changes brought very large increases in performance are past. Better get used to the fact that with every die shrink we will not see any improvements, apart from... increased efficiency.

Very true.

The problem is that Apple saves on the BOM without passing anything on to Mac users. Do we get cheaper MacBooks since Apple uses old silicon? Does Apple sink the extra money into other performance enhancements, for example a higher-performance GPU for use when plugged in? That could make for a better laptop - use a slightly slower CPU and spend more on a fast GPU. Even some extra ports would let users avoid carrying so many hubs and adapters.

The answer is always no. No new features. Instead they take features away and say it's courageous innovation.

So let's ask: if the user sees no benefit from a generation-old CPU/chipset, then why not just use Intel's latest and best? Assuming the goal is to sell more MacBooks, of course.
It isn't a big deal at all for mainstream app developers who code in high-level languages. Don't believe me? Then look at the number of apps in the various Linux repos running on ARM.

The only developers who suffer big time are those who don't code well (crap coding) and those who leverage processor-specific features.

What about booting natively into Windows? That's a huge feature that enables more users to transition to Macs despite needing to use one or two Windows-only apps or to boot into Windows for casual gaming.
 
It's not about "courage," it's about the engineering. Intel FORCES companies to take the iGPU now; you can't buy a mobile chipset without it. So you're always going to be using mobile graphics... there's no way to turn it "off" in terms of battery life. Your only option is to add an expensive and large third-party GPU if you want better. You won't get Apple's simplicity of computer design then... it's simply not an option for the tiny notebooks now.

I'm definitely leaning toward a macOS ARM-based machine. Intel has backed PC makers into a corner on the graphics issue for almost 10 years now. AMD simply doesn't cut it for mobile workstations... so going to an AMD solution with better graphics isn't an option either. We haven't seen what an A10X looks like yet... with extra graphics or cores it would be a monster and would still sip battery.
Ultimately, an ARM-based laptop would be built upon slightly different hardware than is seen in the A-series chips today. In an ideal world the chip would be running several cores from a big cache; I'm thinking six to eight cores here. That sounds like a lot, but it actually isn't a big chip at all; if you look at chip layouts already on the net, eight cores and a big cache would be easy.

As for a GPU, it could be external or a second die using TSMC's new techniques. If we break the GPU out onto a separate chip, the GPU is free to evolve independently of the processor system. It is interesting to note that some of Intel's apparent issues with Apple-suitable chips revolved around actually getting the advanced GPUs to work correctly. Separate dies greatly reduce the risk of technology advancements being gated by one subsection or the other.

I'm at the point where I believe Apple could produce an ARM-based Mac if they really wanted to. These would be Macs with performance comparable to today's laptops, possibly even substantially better for heavily parallel code. It is just a matter of them being willing to go the extra mile and do it. What should be obvious is that they need to do "silicon" if they expect to make progress on machine design. At this point a laptop's design is basically contained in one or two pieces of silicon, with little in the way of room for external engineering.
 
It's time for Apple to switch to their A10 Fusion processors for the Mac lineup. The situation is very similar to 11 years ago, when Steve Jobs decided to switch from PowerPC processors to Intel.
The situation might seem similar on the surface, but it isn't. An A10 running something intensive (let's pick Resolve for now) will overheat in a couple of minutes. Right now it can only handle short bursts of full throttle. Apple has a long way to go to even maintain performance parity with Intel chips, much less pass them.
 
If Apple moved away from Intel, I would be very hesitant to continue investing in the macOS platform. I don't want to be stuck in an Apple-only ecosystem (not saying this would necessarily happen, but that would be my primary concern).

After this almost 500-day delay in the MBP, I don't think I am alone in waiting to see where Apple is heading with regard to the Mac platform. Thankfully my current MacBook Air is still going strong (it just ran out of AppleCare at the end of August). I don't necessarily need a dGPU, the latest chips, etc. But I don't want the MBP to turn into the Mac Pro or Mac Mini with regard to updates. I don't want to go backwards in functionality (e.g., by replacing the headphone jack with Lightning). I basically don't want Apple to turn everything into an rMB (aka an iPad with a keyboard - nothing wrong with that, but I need a more flexible, well-rounded computer).

I'll be waiting to see how this all shakes out. Best case scenario, Apple figures out a solution for the 15" MBP that doesn't delay or screw up updates for the 13" MBP and/or MBA and they get back on a regular yearly upgrade schedule with Intel processors (and this year was just a terrible aberration). Worst case scenario, I buy a new battery for my current 2013 MBA, use it until it dies, and live with the fact that it may be the best laptop I'll ever own...:(
 
Still having trouble finding them around Canada for less. Retail seems to know they can hold onto the $500+ CAD price tag, because Nvidia has priced their newest GPUs really high here. As I mentioned, the 1070s are all over $600 CAD, and the 1080s are all over $900 CAD. Even compared to USD pricing this is several percentage points higher than the CAD-to-USD conversion, meaning that Nvidia is purposely raising their prices for us above the US market.

Again, this has meant that those deep discounts on AMD's comparable last-gen products aren't materializing. I'd have to buy used to get any savings. Nvidia really screwed up the Canadian market with their pricing being so high, giving no incentive for last-gen chips to drop in price here.
http://www.newegg.ca/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007708 601107975

Sorry, I broke my train of thought in that post and responded elsewhere.

That was supposed to be the Mac Mini.

Drive across the border!
 
Drive across the border!

I do that from time to time. It's a 2.5-hour trip to Buffalo. Right now, though, with the dollar disparity it's not really all that cost-effective for me. Hopefully as the dollar balances out, I'll start doing more drives.

It's the only way I get my Cherry Coke Zero fix.
 
It depends on what your graphics needs are. I'm happy for someone more knowledgeable to weigh in, but as I understand it, the A-series processors like the A9 and A9X in the iPhone 6s and iPad Pro use a PowerVR hex-core and dodeca-core chip respectively, and those can perform ~150/350 single-precision (FP32) GFLOPS. Compare that to a desktop Nvidia GTX 1080, which can do 8,000+ single-precision GFLOPS. The Intel Iris GT3e line is around 800. GFLOPS alone are not everything in graphics performance (memory bandwidth, bus bandwidth, CPU-GPU bottlenecking, programming and instruction sets, algorithms, etc. all matter a lot too), but they give you a general sense of where iGPUs stand compared to high-end dGPUs.

To compare to consoles, the PS3 / Xbox 360 are around 200 SP GFLOPS, and the next-gen PS4 and Xbox One are around 1,500 GFLOPS.

So don't expect too much graphics performance soon out of shading, pixel fill, vertex calculation, etc. from the Apple chips (as they are). It still seems to make sense to go with a discrete GPU if Apple is serious about supporting things like VR in the years to come, even if they ditch Intel for the general-purpose CPU.

You need to remember that the PowerVR solution is built to meet power requirements as much as it is built to meet performance requirements. Apple can get substantially better performance if they want to through a number of techniques; given that, they could easily produce a better GPU than what Intel ships. If they wanted to, that is; it is a matter of allocating die space. One thing that is obvious on current A-series chips is that a lot of space goes to "other" functionality. If that space can effectively be used for a laptop/desktop solution, the performance of an Apple-designed laptop GPU won't be a problem.
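
For context on where those GFLOPS figures come from: peak single-precision throughput is just arithmetic over shader-core count and clock, assuming each core can retire one fused multiply-add (two floating-point operations) per cycle. A quick sketch using Nvidia's published GTX 1080 specs (2560 CUDA cores, roughly 1.73 GHz boost) - treat it as a theoretical ceiling, not measured performance:

\[
\text{Peak FP32 GFLOPS} \approx N_{\text{cores}} \times f_{\text{GHz}} \times 2
\]
\[
\text{GTX 1080:}\quad 2560 \times 1.73 \times 2 \approx 8{,}858\ \text{GFLOPS}
\]

Real-world shading, fill, and compute rates land below that ceiling, which is why the quoted numbers are only a rough yardstick.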
 
I agree with many key points that people have brought up. Apple has simply waited too long to update all of the Mac hardware platforms; too little love has been given to these great little devices.

Intel GPU? The latest versions are sufficient for normal office / web / video-watching activities. The MBP needs to have a discrete GPU regardless of what people think Apple's position is.

AMD? It is possible that Apple has been waiting for an AMD CPU/GPU/APU, and that is part of the reason it has taken so long for us to get new Mac hardware. If AMD can make a Zen APU that is rock and roll on a budget and Apple wants to use it, that's cool with me.

ARM? It would be possible for Apple to make their own CPU and GPU using A11+ architectures. It would even be possible for them to optimize the OS for their own hardware. Would this make marketing sense? If they were able to make ARM versions of the MB and MBP that had touch screens, good performance, etc., would it be a good marketing decision? Would it possibly hurt their iPad market share as people switched to this new Mac ARM platform?

It is hard to say. I like my iPad mini, but if you told me I could get a 12" Retina MB with an ARM processor and good enough performance for a similar price to an iPad, I would ditch the iPad. Other people might not want an ARM-based Mac and would be much happier with the iPad. If I had a crystal ball and could predict this market, I could find a way to make myself rich.
 
Mobile and desktop dies have always been exactly the same. There is no difference in BRANDING between the GPU dies currently in Nvidia's lineup, but the mobile GPUs are heavily downclocked compared to the desktop ones.

The branding doesn't match up to the dies the same way. Nvidia is using the same 'name' (1060) but for a different die on desktop and mobile.

"... Which is not to say that the mobile parts are 100% identical to their desktop counterparts, or even configured in exactly the same manner. NVIDIA’s goal, after all, is near performance parity, so how they get there is, in essence, up to NVIDIA. We’ll dive into specs in depth on the next page but case in point will be the GTX 1070; the desktop version has 1920 CUDA cores, but the mobile version has 2048 cores clocked at a slightly lower clockspeed. Presumably, it’s more power efficient for NVIDIA to go wide and slower than pushing the clocks quite so hard. ... "
http://www.anandtech.com/show/10564...series-for-notebooks-unveiled-launching-today

They are clocking down, but it is a combination of having more cores clocked down to a substantially slower pace (and turning off subsections they hopefully don't need). Under modest loads it is a net lower power draw, but when pushing the outer edges of workloads it likely isn't.
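
A rough back-of-the-envelope sketch (ignoring memory bandwidth and boost behavior) shows why the wider-but-slower approach can reach near parity: peak throughput scales with cores times clock, so the 2048-core mobile GTX 1070 only needs about 94% of the desktop part's clock to match the 1920-core desktop chip on paper.

\[
\frac{1920}{2048} \approx 0.94 \quad\Rightarrow\quad 2048 \times \left(0.94\, f_{\text{desktop}}\right) \approx 1920 \times f_{\text{desktop}}
\]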
 
They are targeting only one niche: the Apple ecosystem. On the other hand, I will give you a pretty steep task: show me a laptop from another company that has a high-resolution screen, a large amount of fast RAM, a fast SSD, a quad-core CPU, and a GPU with the performance of a GTX 960M/RX 480M, all locked into an 85-90W power supply.

As far as I know, some ultrabooks have that power supply, but not such a configuration.

Sorry, but which Apple laptop with an 85-90W power supply is fast, has a large amount of RAM, and has the graphics performance of a GTX 960M? The GTX 960M chip alone takes 60 watts, and Apple only offers integrated graphics. Add the 60W GPU to Apple's 90W machine and you're up to 150W.

So I'll one-up you. You didn't specify thickness, so I could pick a 1.5"-thick Alienware, but I'll do it in a 0.7"-thick form factor to disprove the Apple myth that you need to sacrifice performance to make it that thin. Razer Blade 2016 edition QHD+: 14" 3200x1800 display, Intel i7-6700, GTX 1060 6 GB, 16 GB of RAM standard, standard PCIe M.2 SSD with up to 1TB preinstalled.

Before you start poking holes and looking for the one insignificant detail where Apple wins: it has Thunderbolt 3, it has USB-C, stereo speakers, built-in Bluetooth, etc. It's also a capacitive multi-touch screen and a gorgeous machine with an RGB-backlit keyboard. And it's cheaper than the 15" MBP.



And if you really want thinness, the Razer Blade Stealth matches the 0.52"-thick rMB. But Razer still gives you the quad-core i7, a full 4K screen, a larger battery, and it's cheaper. And you get to keep the RGB keyboard from its bigger brother. It will run rings around any current Mac laptop, but you do lose the dGPU, so it doesn't meet your original requirement.
 
The problem for me is that while I personally like using Apple computers, for my job I cannot use OS X. End of story. No questions about it. Our application and our DB back end support Unix or Windows. That's it. They have ZERO interest in OS X and have outright said "OS X support will never come."
I work in automation / manufacturing and see the same thing with Windows-specific software; we don't even get a Linux option. However, Macs aren't accepted here anyway; you almost never see an automation engineer with one working in a plant.

The funny thing (well, not so funny) here is that a good portion of our software can't run on the latest Windows versions either. We are often forced to run the software on XP VMs, so compatibility isn't a foregone conclusion in the Windows world anymore.
We're talking about enterprise-level database systems and programming languages used in dozens of financial institutions around the world.

So, if Apple did go ARM, I would never be able to use their computers for work again.
Yep, but it isn't really a big deal, because Apple is already locked out of a lot of enterprise functions anyway.
This goes for the front end and about 99% of enterprise/corporate computing. OS X has made good inroads in consumer usage, but that's it. They have ZERO enterprise / corporate presence outside of the occasional iPad or iDevices using OWA.
Which is my argument that going ARM doesn't really matter. You can't lose business you never had.
One thing that many consumers don't realize is that the vast majority of the world is built on x86 and PowerPC. As mentioned before, OS X accounts for roughly only 5% of the world's desktop users.
Actually, in automation the Windows machines are almost never laptops or even desktops. It is an interesting world, really; I've seen machines with their own rack systems, two or four columns wide, sealed up with their own air conditioning and all the other goodies associated with a data center installation. These are parked right next to the "machine" on the production floor. What I'm saying is that a good number of Windows installs have nothing to do with the desktop, and it is an arena Apple doesn't play in.
If you can live 100% within the macOS universe, yes, you'll likely be OK. But for the rest of us? The 95% of us? Moving from Intel would be suicidal.
You've got the percentages all wrong! Moving to ARM would at most harm 5% of Apple's user base. Consider the iOS economy; it is pretty clear Windows isn't needed there. For many macOS users the greater need for compatibility comes from the ability to use Linux-derived software, and there ARM isn't a negative at all.

I've been through this myself, only coming back to Apple due to the ability to run Intel hardware and the UNIX underpinnings of macOS. It has taken some time for me to realize that the only thing I really need supported on an ARM-based machine is Java, and that is due to a couple of software packages. In the end my usage of Windows has simply reverted to zero. The majority of users have no need for Windows support on a Mac laptop. This isn't the Windows world at all.
One reason I suspect this is the case: according to this article (Jan 2015), Apple buys Intel processors for between $180 and $300 per machine. Their own A-series processor costs them around $25.


I really doubt that Apple is paying $25 a pop for the A-series processors. For one, they are always on the bleeding edge, and that in itself means high prices. Second, these are big, complex chips using, again, a bleeding-edge assembly process. The chips are still cheap compared to Intel, of course, but I wouldn't be surprised to see them costing Apple as much as $65 a pop, and certainly more than $25.
 
Sorry, but which Apple laptop with an 85-90W power supply is fast, has a large amount of RAM, and has the graphics performance of a GTX 960M? The GTX 960M chip alone takes 60 watts, and Apple only offers integrated graphics. Add the 60W GPU to Apple's 90W machine and you're up to 150W.

So I'll one-up you. You didn't specify thickness, so I could pick a 1.5"-thick Alienware, but I'll do it in a 0.7"-thick form factor to disprove the Apple myth that you need to sacrifice performance to make it that thin. Razer Blade 2016 edition QHD+: 14" 3200x1800 display, Intel i7-6700, GTX 1060 6 GB, 16 GB of RAM standard, standard PCIe M.2 SSD with up to 1TB preinstalled.

Before you start poking holes and looking for the one insignificant detail where Apple wins: it has Thunderbolt 3, it has USB-C, stereo speakers, built-in Bluetooth, etc. It's also a capacitive multi-touch screen and a gorgeous machine with an RGB-backlit keyboard. And it's cheaper than the 15" MBP.



And if you really want thinness, the Razer Blade Stealth matches the 0.52"-thick rMB. But Razer still gives you the quad-core i7, a full 4K screen, a larger battery, and it's cheaper. And you get to keep the RGB keyboard from its bigger brother. It will run rings around any current Mac laptop, but you do lose the dGPU, so it doesn't meet your original requirement.
Simple: 2 cm thickness, 2 kg, 15-inch, at least a 3K display (2560x1440), RX 480M / GTX 960M, 16 GB of RAM, quad-core CPU, SSD, 85-90W PSU.

Is there anything on the market that has something like this?

P.S. If you like the Razer Blade, why won't you buy it instead of the Apple offering? It's a serious question.
 
There are a few distros of Linux made for ARM, but many of them are not full-featured, and many have much, MUCH reduced repositories. Just because the OS is ported doesn't mean you'll be able to have a full solution.
All the mainstream software runs fine. Beyond that, just because an individual developer hasn't ported doesn't mean it can't be done.
It's the same issue that Microsoft ran into with Windows RT. Just because they made the OS and provided a high-level language doesn't mean everyone jumps on board when it means supporting potentially two or more different platforms.

I really doubt the idea that ARM was the big reason Windows RT failed.

However, you seem to be distracted by the idea that supporting two platforms is a big chore; it isn't for a properly trained developer. Even in the Mac world we have developers supporting two platforms at the same time, sometimes three. The Mac and the iPhone often have apps from the same developers with mirrored functionality, and you can throw the iPad into the mix to get three platforms. The same developers might have a Windows app too. An ARM-based Mac wouldn't be any different for the majority of developers out there than a current Mac. All APIs would be identical; in fact Apple has gone to great lengths to make sure key libraries are the same on both macOS and iOS, so clearly this isn't the problem.

In the end, Windows RT died from a combination of ignorance in the developer community and similar issues in the user community.
 
Simple: 2 cm thickness, 2 kg, 15-inch, at least a 3K display (2560x1440), RX 480M / GTX 960M, 16 GB of RAM, quad-core CPU, SSD, 85-90W PSU.

Is there anything on the market that has something like this?

P.S. If you like the Razer Blade, why won't you buy it instead of the Apple offering? It's a serious question.

Simple: is there anything from Apple like that?

The GPU you're spec'ing consumes 60 watts all by itself. All those other parts take power too, not to mention the CPU.

Why not require it to have a GTX 1080 and run for a month off a single AA cell?
 
I really doubt that Apple is paying $25 a pop for the A-series processors. For one, they are always on the bleeding edge, and that in itself means high prices. Second, these are big, complex chips using, again, a bleeding-edge assembly process. The chips are still cheap compared to Intel, of course, but I wouldn't be surprised to see them costing Apple as much as $65 a pop, and certainly more than $25.
$26.90 for the A10 Fusion, according to this MacRumors article.
 
I do that from time to time. It's a 2.5-hour trip to Buffalo. Right now, though, with the dollar disparity it's not really all that cost-effective for me. Hopefully as the dollar balances out, I'll start doing more drives.
Well, no matter who gets elected as our new president, I see big economic problems for the USA. So next year you might be driving a lot.
It's the only way I get my Cherry Coke Zero fix.

Cherry Coke Zero >> mother's milk. Love that stuff.
 
Pretty hilarious that Apple moved away from IBM chips to Intel because they didn't want to be held back by IBM's slow development cycle. And now they're hamstrung by Intel's cycle. The sooner Apple can bring the whole CPU and GPU development in-house, the better. Even if it means buying AMD or another fabricator.

It wasn't so much IBM's slow cycle as Apple's unwillingness to pay for the whole thing themselves. Sony and Microsoft asked for custom PPC processors and got them. Apple wanted custom stuff at x86 prices, but with orders of magnitude lower volume than x86. It just doesn't work (it's not IBM's or Motorola's fault that Mac volume was and is what it is). Apple went x86 far more to get onto a more affordable, broader cost-sharing base than because something was wrong with PPC. (I.e., PPC's major problem was that no one else but Apple was using the chip in volume for mainstream PC settings. Apple was and still is a relatively small, single-digit player in the classic mainstream PC market. There is no bulk volume relative to the larger market.)

The same thing appears here. Apple can't make Intel do things for which there is only a narrow, small market. Most of Apple's competitors have tried GT3 and GT3e desktop stuff and balked. It wouldn't be surprising if Intel stopped roadmapping them in the higher-TDP contexts because discrete GPUs (dGPUs) were winning more designs in that space. Apple by itself can't make it go. (Even Apple has to go through gyrations to create enough volume for bulk leverage. The same CPU package in the MBA, entry Mac Mini, entry iMac... that is all in order to buy in larger allotments.)

Competitors to the MBP 15" in that same price space mostly all have dGPUs.


P.S. It would be interesting, and not on these roadmaps, if Intel were planning to do a dGPU at some point in the 2018 timeframe: take the experience they have built up in making larger GPUs and just use a whole die instead of sharing space with the CPU and PCH. A 100% mobile-focused part, not trying to go mid-to-high-end desktop at all. Or moving to GT2 only would let them focus on a more space-optimized design so they can get more onto a single die (with GPU performance just left to dGPUs from AMD and Nvidia).
 
All the mainstream software runs fine. Beyond that, just because an individual developer hasn't ported doesn't mean it can't be done.


I really doubt the idea that ARM was the big reason Windows RT failed.

However, you seem to be distracted by the idea that supporting two platforms is a big chore; it isn't for a properly trained developer. Even in the Mac world we have developers supporting two platforms at the same time, sometimes three. The Mac and the iPhone often have apps from the same developers with mirrored functionality, and you can throw the iPad into the mix to get three platforms. The same developers might have a Windows app too. An ARM-based Mac wouldn't be any different for the majority of developers out there than a current Mac. All APIs would be identical; in fact Apple has gone to great lengths to make sure key libraries are the same on both macOS and iOS, so clearly this isn't the problem.

In the end, Windows RT died from a combination of ignorance in the developer community and similar issues in the user community.

"An ARM based Mac wouldn't be any different for the majority of the developers out there than a current Mac."

What about companies like Adobe (Creative Suite), Autodesk (AutoCAD), and MS (Office) that make Mac desktop software? I can't see big software suites like the ones created by such companies being easy to port over to an ARM-based Mac.
Well, no matter who gets elected as our new president, I see big economic problems for the USA. So next year you might be driving a lot.


Cherry Coke Zero >> mother's milk. Love that stuff.

"Well no matter who gets elected as our new president I see big economic problems for the USA."

Why is that?
 