The DX12 feature is Explicit Asynchronous Multi-GPU, and it's Windows-only because it isn't designed around any open GPU standard. The closest thing is AMD's XConnect, which blends rendered scenes (actually data) into one specific framebuffer. It needs bandwidth proportional to the framebuffer size and frame rate: a 5K display at 120 fps (the maximum) could require the equivalent of 8 dedicated PCIe lanes. Consider that even the Nvidia GV100 barely saturates PCIe x8. It's quite possible over PCIe 3, but nothing like that is working on macOS to enable heterogeneous GPUs, nor on Linux; only on Wincrap DX12, and it's neither stable nor pretty.
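As a rough sanity check on that figure (just a sketch; the 4-byte pixel format and the 120 fps target are the assumptions stated above, not measurements), the arithmetic works out like this:

[CODE]
import Foundation

// Back-of-the-envelope bandwidth for streaming a full 5K framebuffer between GPUs,
// assuming 5120x2880, 4 bytes per pixel (e.g. BGRA8), and a 120 fps target.
let width = 5120.0, height = 2880.0
let bytesPerPixel = 4.0
let framesPerSecond = 120.0

let bytesPerFrame = width * height * bytesPerPixel      // ~59 MB per frame
let bytesPerSecond = bytesPerFrame * framesPerSecond    // ~7.1 GB/s

// PCIe 3.0 carries roughly 0.985 GB/s of payload per lane, so x8 is about 7.9 GB/s:
// just enough headroom, which is where the "8 lanes" figure comes from.
let pcie3x8GBps = 0.985 * 8
print(String(format: "need about %.1f GB/s; PCIe 3.0 x8 offers about %.1f GB/s",
             bytesPerSecond / 1e9, pcie3x8GBps))
[/CODE]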
Which video?

I thought there was a project like this for Linux, dgfx or dfgx or something (I honestly don't remember the project name). I think the aim of the project was to do other things, like a virtual unified desktop over multiple workstations, but it could be 'hacked' to do 'framebuffer tunnelling' between devices on the same host. It's been a while and I forget all the details, but I didn't think the concept was as big of a speedbump as it's made out to be here. The kids over at Level1Techs talked about an IOMMU/VM project where they routed output from VMs running with headless GPUs back to the host with something like 1 ms latency. I didn't hear anything about a homogeneous GPU environment being a requirement for either...
 
This is ridiculous, given that the only obstacle to Apple making more money every quarter off Macs is mostly their update schedule or lack of it. The reality is that Apple doesn't need pros and hasn't for a decade now. The only question is whether they think it's still important enough as a fringe group to keep invested in the platform. I hope they are, but the people who posit Apple's failure to cater to pros as Apple's doom are the ones who are stuck in the 90s. The stakes aren't that high at all.


I agree Apple doesn't need pros, and hasn't for a long time.
I'm wondering if Apple wants to keep Macs around. And for that, I maintain the opinion that a line of traditional towers would be required.
Well, I'm a freelancer ("non-pro" according to you -- a gratuitous insult but I'll let it pass) and I regularly wish I had a second monitor in my 5K iMac setup (and if I get an iMac Pro I'll wish for the same thing).


Sorry for the misunderstanding.
I'm a freelancer as well; I'd been referring to a comment made earlier in the thread, re. freelancers not being professionals. ;)
 
I'm wondering if Apple wants to keep Macs around

Apple would like you to keep buying as many Apple products as possible. If that means you have to buy Macs because iOS devices can't do something, then Apple is happy selling you some Macs.

That by itself suggests Macs have some future.
But the day it doesn't make financial sense to keep Macs around, Apple will want to make a quick exit.
 
It lets you take an image from one GPU and load it onto another.
More or less, I do that every day; I run a 1070 as an eGPU (for CUDA compute). Doing that is "possible" now, but it's not the way to do graphics. Metal has true HSA support (as long as the GPUs are all AMD). Doing it the way you think is possible means coding for CUDA and Metal (and every other platform) to support every GPU, and while that's technically possible, right now it's very inefficient, since data between framebuffers has to be moved by the CPU and the CPU's DMA.
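To make that concrete, here's a minimal Metal sketch of that CPU-staged path; the device selection and the 1 MB buffer size are purely illustrative assumptions, not my actual setup:

[CODE]
import Metal

// Sketch of the CPU-staged path: a result computed on one GPU has to pass
// through CPU-visible system memory before a second GPU can consume it.
let devices = MTLCopyAllDevices()
guard let eGPU = devices.first(where: { $0.isRemovable }),
      let internalGPU = devices.first(where: { !$0.isRemovable }) else {
    fatalError("expected both an eGPU and an internal GPU")
}

let length = 1 << 20  // 1 MB of results, purely illustrative

// Results the eGPU wrote into a CPU-visible (shared) buffer...
let resultOnEGPU = eGPU.makeBuffer(length: length, options: .storageModeShared)!
// ...must be copied by the CPU into a buffer owned by the other device
// before that GPU can read them. No GPU-to-GPU DMA happens here.
let inputOnInternal = internalGPU.makeBuffer(length: length, options: .storageModeShared)!
inputOnInternal.contents().copyMemory(from: resultOnEGPU.contents(), byteCount: length)
[/CODE]

The point is the middle step: the data sits in system memory between the two GPUs, which is exactly the performance tax I'm talking about.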
I thought there was a project like this for Linux
It's now part of Vulkan's pending HSA-related features; sadly, Apple axed Vulkan.
 
right now it's very inefficient, since data between framebuffers has to be moved by the CPU and the CPU's DMA.

Which circles us back around to the original point: It's now efficient enough that Crossfire does this instead of requiring a bridge.

It was efficient in Apple's demo 8 years ago even.
 
Which circles us back around to the original point: It's now efficient enough that Crossfire does this instead of requiring a bridge.

It was efficient in Apple's demo 8 years ago even.
No, it is not. True HSA framebuffer sharing requires direct DMA access across the GPU buses; the video relies on the CPU's DMA (which means going through system RAM, not just PCIe). New GPU architectures from AMD and Nvidia can take control of the PCIe bus and start DMA transfers between GPUs, but so far that implementation is proprietary, not standard, and certainly not heterogeneous.

I'll stop arguing with you about this topic here.

I suggest you read about this topic (HSA) for Metal at developer.apple.com and for Vulkan/OpenCL 2.0 at the Khronos Group.
 
I'll stop arguing with you about this topic here.

DirectX 12 does this without any special hardware support, and I literally sent you a video of Apple doing this 8 years ago without special hardware support. Both are cross vendor.

If you refuse to believe this is possible, with video proof from Apple, then you're just putting your head in the sand. You can go on about how it's not possible because it needs this and that and whatever. But you'd be wrong.

You even told me exactly how it is done, but then said it wouldn't be possible.

Yes, many years ago it was inefficient, but hardware has caught up.

And just to put a final nail in the coffin, there are PC vendors already doing this exact same thing to solve this exact Thunderbolt problem.
 
I agree Apple doesn't need pros, and hasn't for a long time.
I'm wondering if Apple wants to keep Macs around. And for that, I maintain the opinion that a line of traditional towers would be required.

Sorry for the misunderstanding.
I'm a freelancer as well; I'd been referring to a comment made earlier in the thread, re. freelancers not being professionals. ;)

I think the Mac will stick around for quite some time. Apple needs machines to develop the OS on, iOS doesn't yet have a development environment they can dogfood, and having the Mac as a nice separate revenue source that's not intrinsically tied to the fortunes of the iOS ecosystem is a nice hedge. Plus, at the current moment Apple seems to believe in two separate tailored OSes they can get people to double-dip on hardware for, rather than MS's one-OS-to-rule-them-all strategy.

There's also the fact that Apple under Tim Cook definitely seems a bit more... nostalgic, I suppose, than under Jobs. I think they'll want to keep the Mac around for no other reason than that's the product that shaped Apple the longest.

On an infinite timescale, though, the Mac's going to change, and whether that means it dies away or simply morphs into something different enough that the old terms don't apply I don't know. I feel like at some point, despite Apple's current reticence, iOS and macOS are going to collide. That could mean pretty much nothing from an end-use case, if it still runs on the same hardware or Apple makes the oft-discussed ARM transition as seamless as PPC->Intel was. Maybe things continue on a similar track as now, you just pick the form factor you like and you get a full-featured OS no matter what. Maybe desktops stay the way they are and laptops and iPads collide into a single convertible line.

On the other hand, I guess the question is whether Apple will see the virtue in continuing 'traditional' PCs in the future. I think the form factor is definitely 'dead' compared to tablets, phones, wearables, etc., but I also think that, just like the traditional tower workstation specifically, the PC is going to stick around in some capacity for niche cases (like when you just need power, not mobility, which will always be a thing, even as definitions of 'power' and 'mobility' change). So I can see the concern that in the future it won't just be pro users Apple will be fine leaving behind, but the Mac experience as well.

But this is all years away to some degree. And currently Apple has renewed their commitment to keeping a segment of pros in the fold, so we'll see what the future holds.
 
If you refuse to believe this is possible,
For that, first Apple has to support Nvidia hardware in Metal; second, Nvidia has to support AMD (and vice versa) in its drivers (set up the right DMA addresses, handshake, etc.). Even if all the parties agreed to do that, it's not trivial work, and the biggest barrier is Metal, whose HSA functionality relies on AMD.

I'm a big Nvidia fan here, but I don't see any major sign that Apple is willing to support Nvidia again. And even if Apple did, what's the point of redoing Metal (and the AMD drivers) just to allow mixing GPUs, when all the Mac Pro needs is powerful GPUs with a custom DisplayPort-TB3 link, either through a proprietary PCIe + DP interface (as on the tcMP) or through some inelegant solution like a DP 1.4 feedback cable?

Sorry, even if you were right on the technical side (you're wrong), all the expense this requires isn't worth the trivial gain (allowing a standard PCIe second GPU), something few pros would seriously consider. AMD people will buy AMD GPUs; Nvidia people want single or dual Nvidia GPUs, not an AMD GPU set up to share PCIe bandwidth with an Nvidia GPU. It's an insult to the intelligence, not the way a pro sets up a system; even redneck DIYers wouldn't see that as good. It's like installing a Ferrari engine and axle to help a Camaro run faster. It makes little sense in a pro machine: tolerable in laptops, not elsewhere.

HSA is a goal for the IT industry, but there's still a very long road ahead.
On the other hand, I guess the question is whether Apple will see the virtue in continuing 'traditional' PCs in the future. I think the form factor is definitely 'dead' compared to tablets, phones, wearables, etc.,
My prediction is that AI will change everything. In the future, cars, homes, and businesses will run a "brain," a new hardware class derived from the GPU/NN chips used in machine learning. This brain will let less intelligent terminals (phones, tablets, laptops, and desktops) do complicated things that previously required a purpose-built system, and the brain will be something you buy in modules, all identical: you just add as many as you can afford to fill your chassis (and reach your desired compute power).

I think this will happen even before the IT industry agrees on HSA; the true HSA is a neural network.
 
My prediction is that AI will change everything. In the future, cars, homes, and businesses will run a "brain," a new hardware class derived from the GPU/NN chips used in machine learning. This brain will let less intelligent terminals (phones, tablets, laptops, and desktops) do complicated things that previously required a purpose-built system, and the brain will be something you buy in modules, all identical: you just add as many as you can afford to fill your chassis (and reach your desired compute power).

I think this will happen even before the IT industry agrees on HSA; the true HSA is a neural network.

We like the same movies! ;)

Well, in the real world, technology needs to adapt to users, who are analogue.
I see a future where there will be a growing demand for less automated, more user-accessible devices.
Less digital, so to speak.

Mobile OSs and touch screens have proven to be very limiting; while a majority might be content with that trend, the more ambitious won't be.
AI is in its infancy, and will also face belated yet rapidly evolving scrutiny from lawmakers - and thank god for that.
 
For that, first Apple has to support Nvidia hardware in Metal; second, Nvidia has to support AMD (and vice versa) in its drivers (set up the right DMA addresses, handshake, etc.). Even if all the parties agreed to do that, it's not trivial work, and the biggest barrier is Metal, whose HSA functionality relies on AMD.

No, they don't. They're just moving an RGB image between cards. That's supported purely at the Metal driver layer. Yes, that would require involving the CPU, but the overhead really isn't bad at all. This is exactly what DirectX 12 is doing and what Apple's demo video is doing.

This really isn't complicated.

Are you really telling me you can't move an RGB image from one graphics card and put it on another? Can't be done? Impossible? Requires a hardware bridge? I mean, this is a really silly conversation where you just can't admit you are wrong.

We all know you can upload an RGB image to a GPU. You can download an RGB image from a GPU. All you have to do is connect the two together. Download the RGB image that results from rendering the window server from one card, upload it to another, and then display it from that GPU.
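For what it's worth, that round trip is only a handful of Metal calls. This is just a sketch of the idea, with sourceTexture and displayDevice as placeholders, and it assumes the source texture's storage mode allows CPU reads:

[CODE]
import Metal

// Sketch of the round trip described above: read the rendered frame back from
// the GPU that produced it, then hand it to the GPU that drives the display.
func copyFrame(from sourceTexture: MTLTexture, to displayDevice: MTLDevice) -> MTLTexture? {
    let bytesPerPixel = 4  // assumes a 4-byte format such as BGRA8
    let bytesPerRow = sourceTexture.width * bytesPerPixel
    var staging = [UInt8](repeating: 0, count: bytesPerRow * sourceTexture.height)
    let region = MTLRegionMake2D(0, 0, sourceTexture.width, sourceTexture.height)

    // 1. Download: source GPU -> system RAM.
    sourceTexture.getBytes(&staging, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)

    // 2. Upload: system RAM -> the GPU driving the display.
    let descriptor = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: sourceTexture.pixelFormat,
        width: sourceTexture.width,
        height: sourceTexture.height,
        mipmapped: false)
    guard let destination = displayDevice.makeTexture(descriptor: descriptor) else { return nil }
    destination.replace(region: region, mipmapLevel: 0, withBytes: staging, bytesPerRow: bytesPerRow)

    // 3. Display: encode a pass on the destination GPU that presents this texture.
    return destination
}
[/CODE]

Yes, the CPU touches the pixels, but that's the whole trick, and it's the same basic trick shown in the demo.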

(Also, Nvidia hardware is already supported by Apple on Metal. If you didn't know that, then... this is really a waste of time.)

Look, if this is still confusing, watch the Apple video. They literally have a step by step graphic on how they're synchronizing the AMD and Nvidia GPU. It's exactly what I just described, but if you're not absorbing it from me, maybe the video will do the trick.
 
Sorry for the misunderstanding.
I'm a freelancer as well; I'd been referring to a comment made earlier in the thread, re. freelancers not being professionals. ;)

Not a problem, and I'm sorry for accusing you of something you didn't do. I keep up with this thread, but missed that. I was feeling somewhat thin-skinned that day.
 
So have any of you even considered the new Z machines?

Yep, will wait to see what Apple's new offering will be. But not optimistic, so have been checking out HP and Dell workstations. My next wish will be for an open-source phone/OS... then I can completely disconnect from the Apple ecosystem.
 
Are you really telling me you can't move an RGB image from one graphics card and put it on another? Can't be done? Impossible? Requires a hardware bridge? I mean, this is a really silly conversation where you just can't admit you are wrong.
No, but it's inefficient: you actually upload the RGB to system RAM using the CPU, then the CPU moves the RGB to the main GPU's RAM. This is something I do routinely, and it's not the same as HSA/XConnect, where the GPUs don't tie up system RAM, only PCIe bandwidth; tying up system RAM is a huge performance tax.

Still, there's the question of the market appeal of this solution (HSA/XConnect): it's tolerable on laptops, but I doubt anyone would seriously buy a workstation to mix GPUs; even low-budget gamers would never do that.
 
Yep, will wait to see what Apple's new offering will be. But not optimistic, so have been checking out HP and Dell workstations. My next wish will be for an open-source phone/OS... then I can completely disconnect from the Apple ecosystem.
The new HPs are modular and similar to the old Pros. I can't find much on Dell's workstations, but few of their case interiors are easy to navigate for dropping in or taking out components. As you said, people are going to wait until Apple releases something and decide then.
 
Yep, will wait to see what Apple's new offering will be. But not optimistic, so have been checking out HP and Dell workstations. My next wish will be for an open-source phone/OS... then I can completely disconnect from the Apple ecosystem.

https://puri.sm/shop/librem-5/

They have FOSS phones and computers. I met the owner at SuSECON last year; he seems like a cool dude. Their stuff is Apple-expensive, but it's all decent.
 
So have any of you even considered the new Z machines?

If you are in the market for a workstation that isn't the Mac Pro (depending on how Apple handles this redesign, and you're OK spending in the ballpark of what Apple may quote for it), do consider other vendors apart from HP: Dell, Lenovo, Supermicro, Titan, Puget, etc. I am sure I am missing other system builders in that list.
 
If you are in the market for a workstation that isn't the Mac Pro (depending on how Apple handles this redesign, and you're OK spending in the ballpark of what Apple may quote for it), do consider other vendors apart from HP: Dell, Lenovo, Supermicro, Titan, Puget, etc. I am sure I am missing other system builders in that list.
Support quality, configuration options, and hardware freshness should also weigh heavily in your choice. I'm familiar with Supermicro but had to look up Puget Systems. Titan doesn't list anything. Puget's chip offerings are somewhat old, where Intel has moved on. There's another builder out of Aurora or Denver, IIRC, who might have the latest and greatest. Puget's offerings are a little older, but they offer better prices on the same hardware in some instances, such as video cards and high-end SSDs. It's really up to the individual to decide what they want within their budget.

The only flaw I'm seeing with certain builders is offering RAM capability that exceeds Win10 Pro and Win10 Enterprise limits. Microsoft did come out with a Windows 10 Pro for Workstations which raises the limits and brings in some additional features that protect data. Problem is, it's difficult to get without jumping through even more expensive hoops.

On the other hand, I'm really loving what AMD are doing with their newer server and mainstream consumer chips. It adds variety where there was little beforehand. I'd love to see Apple offer both Intel and AMD in the future for Mac Pros or other devices.

Have a good one!
I should add that I'm confident Apple will come out with something like the old system but modern. And with better hardware than what Dell or HP offer. At a premium, obviously. However, Apple knows they have a legion of Pro fans who don't want to go to Windows 10 because it's Windows. And no matter how stable and great Windows is, they'd prefer to use something they've evolved with over the last 20 years. It just works.
 
It just works.

I would add 'used to' to that. It's a debatable point nowadays with Apple. But let's see. One last chance, if for nothing else.

I am waiting for some word on the Mac Pro’s build and availability... 2018 end would be pushing it.

Puget, on its blog posts, seems to be pretty engaged, with regular updates and benchmark tests (not those generic synthetic ones but real-work-based ones), and while its processor lineup in the Xeon category is a generation behind, everything else seems to be current.

Only HP and Dell seem to be showcasing the latest Xeons. The rest don't seem to be.

I just specced out a Threadripper-based system on their site with 4x Titan Xps. Seems nice, though about 30% more expensive than the highest-spec iMac Pro minus the monitor... but with a lot of storage and some serious GPU rendering muscle... besides, by the time the Mac Pro is revealed we might see a 2950X version... or even an Epyc with Ryzen-level speeds...
 
I would add 'used to' to that. It's a debatable point nowadays with Apple. But let's see. One last chance, if for nothing else.
Only true if you constantly upgrade and get pushed into half-baked OS updates. There used to be a saying years ago for updating your BIOS that still rings true. If it isn't broken, don't fix it.
 
Only true if you constantly upgrade and get pushed into half-baked OS updates. There used to be a saying years ago for updating your BIOS that still rings true. If it isn't broken, don't fix it.
... I have worked in studio environments as well as on my own for the last decade... I know these basics

But if we apply that yardstick, one should never update, because what works now will work forever until it becomes obsolete... or redesigned workstations take over, which circles us back to the reason all this talk over a Mac Pro is taking place... so "it just works" doesn't ring true if what worked earlier is no longer an option.
 
... I have worked in studio environments as well as on my own for the last decade... I know these basics

But if we apply that yardstick, one should never update, because what works now will work forever until it becomes obsolete... or redesigned workstations take over, which circles us back to the reason all this talk over a Mac Pro is taking place... so "it just works" doesn't ring true if what worked earlier is no longer an option.
No. You're confusing yourself. Hardware is rarely if ever going to be faulty coming out of a factory. If it's going to fail, it's going to fail within initial operation hours. Software, OS updates and drivers are very different. You can upgrade hardware, and unless it's faulty, it's going to work the first time and will continue to work until it dies of old age, so to speak.

I'm referring to people, mainly Windows people, who'll update graphics, sound, third-party hardware drivers any chance they get without sitting back and reading up on any bug reports people have submitted. People know MacOS is buggy and the quality has come down in recent years, but they'll still update and then spend anywhere from a few hours to a week troubleshooting and restoring from an old image if they can't address or rollback the update.

With Windows users, the major issues I've seen are sound drivers causing crashes. Bad graphics drivers that cripple performance or introduce visual bugs. And up until a few years ago, SSDs on Windows had an issue with mobos that supported hot plug n play. SSDs would sometimes turn off and crash the system. Later updates addressed these issues. No damage to the SSDs, but it made people grumpy.

I've brought up the rarity of faulty modern hardware before elsewhere and someone linked me to an article about Sony burning through Mac Pros for some Marvel movie. Which I would assume wouldn't be the case if they used a cluster. The article, IIRC, made it seem as if sections were edited and rendered to final product on single Mac Pros.

Yeah, the graphics processor on the nPro is iffy, but so is whatever Sony did.


Looping back now. If MacOS Sierra is working just fine for you and your NLE is working smoothly without a single hiccup, why dick around with updating MacOS to High Sierra when it reaches gold status? It's still going to be buggy for some or cause major issues that hadn't popped up with testers.

Wait 2-3 months and let others suffer before your workload, your stress and most importantly, your business suffer. Now do you understand why I said what I said?

FWIW, I recently updated my graphics cards' drivers after having not done so for 3 years. Because nearly every release between the time I bought them and just two days ago was filled with bugs that either crippled performance or introduced bugs in various software ranging from gaming to video editors.

To me, buying a high end workstation or building one myself, means I want to control when I do an update and I'll let others take the plunge before I update. So I know what I may encounter down the road. And, most importantly, if I can rollback or restore an image if things go south.


Edit: Not completely related, but I use two Dell IPS monitors on my self-built Windows-based workstation. In the time I've had these monitors (bought at around 1,200 each, IIRC), I've put 30,096 hours on them according to their information page. They've had software and/or driver updates from Dell and Microsoft. Never installed them. It works. Color is accurate. The backlight is in great shape because I run them at 40-45% brightness 90% of the time.
 