Quad GTX 1080 Ti outperforming a single Vega 64 - true, but I would expect that the majority of quad-GTX setups are found in servers rather than single-user workstations. Many people have access to compute servers that make a quad-GTX machine look like a joke compute-wise. I think Apple's line-up over the last few years has been aimed at OK workstation performance connected to strong servers in the background doing the heavy lifting. This is not a particularly unique setup.

A single GTX 1080 Ti outperforms a Vega 64.

Doing ML training in server farms is not viable/practical - there are many factors, such as network bandwidth and configuration, etc. (though Google is trying this concept for the near future, and Amazon sells GPGPU server time). Server farms in ML are best suited for production services such as Siri or Alexa, not for R&D, which requires a lot of flexibility and non-trivial customization.

For my information: are these 4-16 GPUs used for iOS development? Are these machines used as servers, or as workstations connected to servers?

Sadly, currently no one uses more than 2 GPUs in iOS development, because that is the maximum configuration available in the tcMP - lazy Apple managers didn't care to update it.

In Android (and much other, unrelated development) you may need all those GPUs (and even more, if available) to train an AI again and again until you manage to get it to do what you want; then you train it for production, and finally the AI is sent to production servers or to the mobile devices, where inference is far less demanding and can be handled by a simpler GPU or a dedicated NN.


I am not apologising for anything - just drawing conclusions based on a possible strategic scenario: 1. The largest earner is iOS devices. 2. Developers need hardware to develop iOS apps. 3. The current line is probably sufficient.

Your conclusions are naive. No matter whether current hardware is OK for general-purpose apps, developers on the cutting edge, competing with other platforms, need cutting-edge hardware.

The current line would be enough if nobody in the past 4 years had ever mentioned ML, AR, or VR.
 
Mac hardware needs to be able to develop for tomorrow's iOS. There needs to be a refocusing on that.

This post by Mago is quite relevant:

... you may need all those GPUs (and even more, if available) to train an AI again and again until you manage to [get it to] do what you want; then you train it for production, and finally the AI is sent to production servers or the mobile devices, where inference is far less demanding and can be handled by a simpler GPU or dedicated NN.
This is exactly what we're doing with the big CUDA servers that we have: training and retraining and retraining again - then pushing the inference models to simpler systems. One production system was built to retrain one model daily (new data is always arriving, and can be subtly different from older data). (This server has 72C/144T, 2 TiB RAM, and quad GTX 1080 Ti.)
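For readers curious what that retrain-then-push loop looks like in miniature, here is a hypothetical Python sketch (a toy linear model fit with plain gradient descent; the function names and the merge-and-retrain-daily policy are illustrative only, not the actual production code):

```python
def train(data, epochs=2000, lr=0.01):
    """Fit y = w*x + b by gradient descent on (x, y) pairs.
    This is the expensive step - the one that eats the GPUs at scale."""
    w, b = 0.0, 0.0
    n = len(data)
    for _ in range(epochs):
        gw = sum(2 * (w * x + b - y) * x for x, y in data) / n
        gb = sum(2 * (w * x + b - y) for x, y in data) / n
        w -= lr * gw
        b -= lr * gb
    return {"w": w, "b": b}

def export_for_inference(model):
    """Keep only the weights the target device needs."""
    return {"w": model["w"], "b": model["b"]}

def infer(model, x):
    """Cheap inference step - the part that ships to simpler systems."""
    return model["w"] * x + model["b"]

def daily_retrain(old_data, new_data):
    """Retrain from scratch on the merged set, since new data arrives daily."""
    return export_for_inference(train(old_data + new_data))

# Toy data drawn from y = 2x + 1; retraining recovers the weights.
data = [(float(x), 2.0 * x + 1.0) for x in range(5)]
model = daily_retrain(data[:3], data[3:])
```

In a real pipeline the `train` step is what justifies the quad-GPU box; `infer` is the lightweight artifact that gets pushed out to production servers or devices.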
 
A single GTX 1080 Ti outperforms a Vega 64.

Doing ML training in server farms is not viable/practical - there are many factors, such as network bandwidth and configuration, etc. (though Google is trying this concept for the near future, and Amazon sells GPGPU server time). Server farms in ML are best suited for production services such as Siri or Alexa, not for R&D, which requires a lot of flexibility and non-trivial customization.



Sadly, currently no one uses more than 2 GPUs in iOS development, because that is the maximum configuration available in the tcMP - lazy Apple managers didn't care to update it.

In Android (and much other, unrelated development) you may need all those GPUs (and even more, if available) to train an AI again and again until you manage to get it to do what you want; then you train it for production, and finally the AI is sent to production servers or to the mobile devices, where inference is far less demanding and can be handled by a simpler GPU or a dedicated NN.




Your conclusions are naive. No matter whether current hardware is OK for general-purpose apps, developers on the cutting edge, competing with other platforms, need cutting-edge hardware.

The current line would be enough if nobody in the past 4 years had ever mentioned ML, AR, or VR.

Who said the servers need to be off-site? What do you think the 10 Gb Ethernet port in the iMac Pro will be used for? See what Aiden has set up: local servers. Aiden, are the workstations as powerful as the servers? Naive? Hardly - open-minded to solutions other than a big box full of graphics cards under your desk. If iOS development needs extensive compute power, Apple will release a Mac Pro or server for just that.

I still do not get why Apple should compete with HP for general-purpose workstations. They have not succeeded particularly well in the last two decades. You have not provided any arguments about this.

Are you saying the current hardware is sufficient for iOS app development unless it is centered around ML, AR, or VR?
 
I still do not get why Apple should compete with HP for general-purpose workstations. They have not succeeded particularly well in the last two decades. You have not provided any arguments about this.

I for one couldn't give a toss what "software developers" need in their hardware. Content creators are arguably a bigger market for high-end Apple hardware than developers, and we also need stonking big (multiple) GPUs, the ability to replace them, and the ability to choose better / better-suited monitors than Apple chooses.

Apple's failure in workstations is because Apple refuses to offer a good enough product, pure and simple. It's a vicious circle: no decent / decently priced hardware (or industry-standard APIs) -> developers not wasting time optimising products for Macs / macOS -> claim there's no demand for decent hardware -> repeat.

It's like Ned Flanders's parents refusing on principle to discipline their child, but wanting him to behave. "We've tried nothing, and we're all out of ideas." Oh, but here's another "Nuts & Gum, together at last" workstation-priced disposable appliance - now why aren't we succeeding in the high-end content creation market? Why is HP using our products as their marketing message?

Now personally, I don't think Apple should compete with HP in markets where they clearly don't want to commit. I think the fact that macOS isn't directly compatible with Windows (i.e. there's platform-specific software) should see this laughable construct that Windows and macOS are one "PC" market abandoned, macOS recognised as a single market over which Apple is wielding consumer-hostile control, and governments intervene to structurally separate Apple's hardware and software businesses if they won't enter into FRAND licensing of macOS to any PC maker who wants it.
 
An updated single- or dual-socket cheese grater would be welcomed by many, but I don't believe Mr. Jony Ive would allow it; its main advantage is that it would allow a quicker Mac Pro return.

People keep suggesting this, but it isn't any quicker. The internals would have to be redesigned for current chipsets, and all it does is distract from the next Mac Pro. You'd be talking at least a year to restart the classic Mac Pro design, especially once you include the time to get the factories churning them out again.

It really doesn't make a lick of sense, especially because you'd have to toss the Thunderbolt ports that Apple has already started transitioning to.

And if you solve how to put Thunderbolt ports in a Mac Pro tower... well you've already solved the next Mac Pro then.

The production lines for the classic Mac Pro were dismantled long ago, along with the classic Mac Pro engineering team. It's not like just flicking back on a light switch.
 
Who said the servers need to be off-site? What do you think the 10 Gb Ethernet port in the iMac Pro will be used for? See what Aiden has set up: local servers. Aiden, are the workstations as powerful as the servers? Naive? Hardly - open-minded to solutions other than a big box full of graphics cards under your desk. If iOS development needs extensive compute power, Apple will release a Mac Pro or server for just that.

I still do not get why Apple should compete with HP for general-purpose workstations. They have not succeeded particularly well in the last two decades. You have not provided any arguments about this.

Are you saying the current hardware is sufficient for iOS app development unless it is centered around ML, AR, or VR?
I personally use 10GbE with a slave compute server (a Xeon-D with a single Nvidia GP100). In my operation I do a lot of double-precision compute on this host, and I'd like not to need a black box running Linux alongside my iMac (a non-pro 1st-gen 5K iMac with a TB2 10G adapter). I can do this thanks to Nvidia's and Intel's cross-platform remote toolchains. Of course, I'm not programming anything to run on iOS - that's impossible today; nothing in Xcode (ML, AR) allows such a setup (remote cross-platform debugging, compiling, etc.). And need I remind you, Adobe Premiere doesn't do video editing on external compute servers, and a long etc.
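As a rough sketch of that kind of remote offload, assuming nothing more than plain `ssh` to the compute box (the host address, user, and `solver` binary below are made-up examples, not my actual setup):

```python
import shlex

def remote_compute_cmd(host, executable, args, user="me"):
    """Build (but don't run) an ssh invocation that executes a compute
    job on a remote box - e.g. the Linux server holding the fast GPU.
    Each argument is shell-quoted so it survives the remote shell."""
    remote = " ".join(shlex.quote(a) for a in [executable, *args])
    return ["ssh", f"{user}@{host}", remote]

# Hypothetical double-precision job dispatched to the slave server.
cmd = remote_compute_cmd("10.0.1.2", "./solver",
                         ["--precision=double", "input.dat"])
```

`subprocess.run(cmd)` would then block until the remote job finishes; the heavy lifting happens on the Linux box, not the Mac. The point of Mago's complaint is that Xcode offers no equivalent of this for ML/AR work.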

Aiden was specific about their production server, not their R&D workstations; all we know is what he uses.

The problem is, Apple needs to release a product for iOS development well before it is needed; otherwise iOS developers will forever be in Android developers' mirrors. The AR/ML/VR trends were obvious even when Apple launched Siri (one of the first widely available AI products).

Luckily, Siri was developed 100% on Linux and then integrated into iOS (it was intended to be multi-platform before Apple purchased Siri's creators). This is not an easy toolchain for AI, and neither does Core ML (aka ML-kit) foresee such a toolchain; it requires running ML training on the Mac.
People keep suggesting this, but it isn't any quicker. The internals would have to be redesigned for current chipsets, and all it does is distract from the next Mac Pro. You'd be talking at least a year to restart the classic Mac Pro design, especially once you include the time to get the factories churning them out again.

It really doesn't make a lick of sense, especially because you'd have to toss the Thunderbolt ports that Apple has already started transitioning to.

And if you solve how to put Thunderbolt ports in a Mac Pro tower... well you've already solved the next Mac Pro then.

The production lines for the classic Mac Pro were dismantled long ago, along with the classic Mac Pro engineering team. It's not like just flicking back on a light switch.

I mean a classic tower-like design like the cheese grater, NOT forcing new motherboards into the legacy cheese-grater chassis. It's much quicker to release a cheese-grater-like solution, since it's more flexible and faster to develop than a trash-can-like one. Don't be scared about motherboard design, either: it's almost automated, and an all-new motherboard based on Intel or AMD reference designs costs less than $50K for a dozen or more prototype boards. Apple could commission the chassis to one of thousands of Chinese factories already doing that. A cheese-grater-like machine won't include 5.25" bays, maybe not even spinner bays. It will always be quicker and cheaper than a trash-can-like design, or the hybrid tc/cheese-grater designs, which require trickier R&D on thermals, form and function, etc. Apple has now spent almost 6 months on that; most PC manufacturers come to market quicker than that starting from zero.
 
We are on the same page regarding the need for a big Mac Pro to meet requirements for developers and content creation. I only argue that there are viable setups other than big boxes with lots of graphics cards.

I totally agree that Apple dropped the ball regarding users needing lots of power. Look, for instance, at how they have practically given away 3D modeling and ray tracing to the Windows community. There are more apps under Windows, and not least NVIDIA cards for ray tracing and high-FPS performance in huge models. I know AMD can be used, but not all software works with or is optimised for AMD.

The lack of commitment from Apple in content-creation areas is for me an indication that they see the Mac Pro as a development platform for iOS devices plus some content creation, rather than a strong general workstation for content creation or ML. Now that ML and AR/VR (8K? And particle simulations!) have gotten their attention, we can hope for more versatile machines to meet the demands in these areas.
 
Local servers. Aiden, are the workstations as powerful as the servers?
The users mostly have MacBooks, and some Windows laptops - mostly ssh shell windows to the servers; nothing with much graphics.

A few have low-end workstations with mid-range graphics for the people who want three or four 4K displays - but the local GPU is only for pushing more pixels than a laptop can handle.
 
Does anyone think the existing 2013 Mac Pro 6,1 will continue to be sold when the new 7,1 is finally out? Also, will the price drop to a reasonable range to take into account that it's old tech?

Since they have the design in production - which is pretty slick from an aesthetic perspective - they should repurpose the can as a new mid-range standalone. I'd figure the thermal issues wouldn't be present using consumer-grade components. Just keep the options simple, like the Mini: a 3.x i5/QC or a 4.x i7, slot the GPU (oh the horror!), and price it relative to the iMac minus the display (plus the hotness :D).
 
A Mac Pro with options for lots of graphics cards + RAM / dual or single CPUs can scale both ways - downstream and upstream - for the content-creation market, be they developers, DCC professionals, or other markets where compute power is a requisite.
 
The users mostly have MacBooks, and some Windows laptops - mostly ssh shell windows to the servers; nothing with much graphics.

A few have low-end workstations with mid-range graphics for the people who want three or four 4K displays - but the local GPU is only for pushing more pixels than a laptop can handle.
Thank you. Weak "workstations", strong servers. I assumed that was the setup but wanted to make sure.
 
Since they have the design in production - which is pretty slick from an aesthetic perspective - they should repurpose the can as a new mid-range standalone. I'd figure the thermal issues wouldn't be present using consumer-grade components. Just keep the options simple, like the Mini: a 3.x i5/QC or a 4.x i7, slot the GPU (oh the horror!), and price it relative to the iMac minus the display (plus the hotness :D).

It would be beautiful, but for such a purpose they consider the iMac Pro better. I think the tcMP may inspire the next Mac mini, but we will hardly see it reborn with a Core i7-7800K + dual RX 570.

About the possible choices for the next Mac Pro, I missed the new Zen APU for compute: an 8-16 core Zen + Vega 56/64 GPU and 12-16 GB HBM2 on the same MCM. AMD is readying this monster APU for exascale computing solutions and will likely couple two of these APUs on the same motherboard/blade in servers like HPE Moonshot.

A Mac Pro with a single/dual high-performance APU in a 600-800 W TDP could recycle the trash-can concept, allowing from 8 to 32 total cores and from a single Vega 56 to dual Vega 64 - but that means more AMD dependency, in a design unlikely to be adapted to Nvidia/Intel solutions.
 
Like I said before, it would make sense and it would be awesome; everyone would be catered for, with Xeon-W in the nMP as a 1S workstation, and the ncMP (new cMP, for lack of a better name) as a bad-a$$ 2S workstation and server (tower design of course, a follow-up to the cMP) with Purley and Xeon-SP.
If after this setup anyone still had anything to whine about, trolling comes to mind :)
nMP with dual Polaris 580, dual Vega 56, or dual Vega 64, dialed down to fit the power budget.
On the ncMP, full-fat GPUs would be a must, of course.
 
I mean a classic tower-like design like the cheese grater, NOT forcing new motherboards into the legacy cheese-grater chassis. It's much quicker to release a cheese-grater-like solution, since it's more flexible and faster to develop than a trash-can-like one. Don't be scared about motherboard design, either: it's almost automated, and an all-new motherboard based on Intel or AMD reference designs costs less than $50K for a dozen or more prototype boards. Apple could commission the chassis to one of thousands of Chinese factories already doing that. A cheese-grater-like machine won't include 5.25" bays, maybe not even spinner bays. It will always be quicker and cheaper than a trash-can-like design, or the hybrid tc/cheese-grater designs, which require trickier R&D on thermals, form and function, etc. Apple has now spent almost 6 months on that; most PC manufacturers come to market quicker than that starting from zero.

I don't think even this is as simple as you make it sound, especially considering Apple would have to support the thing for the next 5-6 years. They're not just going to throw some rebranded generic Chinese box on the market and hope it holds up. You're also talking about the software that has to go with it: card drivers (the current Vega and 1080 drivers do not cut it), firmware, QA, etc...

This whole idea also doesn't solve the Thunderbolt problem, which I know you probably don't care about, but which Apple would never overlook given how strategic Thunderbolt is to them. No Chinese manufacturer has a ready-to-go design that does PCIe GPUs and Thunderbolt in a package Apple would accept.

The upgraded-can Mac Pro idea seems more realistic than this mess.

Again, especially when that time could go to the Mac Pro.

It's just better Apple isn't distracted and that they move right on to the next Mac Pro.

Apple already knows anyone who basically wants the cheap generic-box idea can do a Hackintosh anyway. And besides, who's going to buy that thing if they can just wait 6-12 months and get a quality Mac Pro anyway?
 
Thunderbolt 3-equipped motherboards are becoming more and more common daily. You have to look for them, but they are available. If the PC world can do it, it should be a piece of cake for Apple.

Ok. Link me one that supports Thunderbolt graphics from the dGPU without a breakout PCIe card. Bonus points if you can find me one that supports Xeon.

There are boards out there that only support the iGPU without the breakout card. But Xeon has no iGPU, and that's not very Pro.

The PC world has not done it, which is why it's not a piece of cake for Apple.
 
Ok. Link me one that supports Thunderbolt graphics from the dGPU without a breakout PCIe card. Bonus points if you can find me one that supports Xeon.

There are boards out there that only support the iGPU without the breakout card. But Xeon has no iGPU, and that's not very Pro.

The PC world has not done it, which is why it's not a piece of cake for Apple.

Do you mean like a Thunderbolt connection to the display? If that's the case, then OK, I see what you are talking about. I personally hope they don't go down that route. DisplayPort is fine by me.
 
Do you mean like a Thunderbolt connection to the display? If that's the case, then OK, I see what you are talking about. I personally hope they don't go down that route. DisplayPort is fine by me.

I think DisplayPort could be an option on the next Mac Pro (especially if it takes dGPUs which will probably already have a DisplayPort), but I don't see any way Apple ships something that doesn't support Thunderbolt displays.

Intel mandates that Thunderbolt ports must support displays anyway, and with USB-C Apple will want to support the DisplayPort alternate mode for USB-C as well. So it's pretty inescapable.

I don't know if Apple's next display will be Thunderbolt or not, but with Apple's previous two sanctioned pro displays being Thunderbolt, and their MacBook Pros shipping only with Thunderbolt/USB-C, keeping all their products compatible will be a problem if the new machine doesn't support video over Thunderbolt/USB-C.
 
I predict the new 7,1 will feature a modern take on a throwback to the old cheese-grater design. I can see them repurposing the current trash cans with a silver paint job for the new mini - make it a little wider and a bit shorter.

I think it's going to be an SFF unit - not like a NUC, but bigger, yet lighter than a traditional SFF. I predict it'll feature user-upgradeable PCIe SSD, RAM, and other neat features. Was the CPU upgradeable on the old Pros?
 
I think DisplayPort could be an option on the next Mac Pro (especially if it takes dGPUs which will probably already have a DisplayPort), but I don't see any way Apple ships something that doesn't support Thunderbolt displays.

Intel mandates that Thunderbolt ports must support displays anyway, and with USB-C Apple will want to support the DisplayPort alternate mode for USB-C as well. So it's pretty inescapable.

I don't know if Apple's next display will be Thunderbolt or not, but with Apple's previous two sanctioned pro displays being Thunderbolt, and their MacBook Pros shipping only with Thunderbolt/USB-C, keeping all their products compatible will be a problem if the new machine doesn't support video over Thunderbolt/USB-C.

The best-case scenario would be not having the Thunderbolt display connector on the graphics card but still running through the graphics card. I say that because otherwise the GPU would need a Thunderbolt port, which no GPU has, and it would kind of go against the whole idea of making things swappable and interchangeable in the new system.
 