It’s very likely that everyone involved has been working together for some time; they’re just formalizing it with this announcement. If you search for the company names and “USD” (except for JDF?), they’ve all got websites that predate this announcement defining their engagement with and support of the format.
 
RISC chips are so cheap that companies are placing them in the bases of light bulbs and using them to control the LEDs. At the low end they can cost just pennies. ARM is more mature, and its high end reaches much further.

But remember, using ARM does NOT force the use of an SoC. There are ARM computers that have their RAM in DIMM slots and their I/O on a PCIe bus, just like a classic Windows PC. Apple went with an SoC design for performance reasons.
😮 In light bulbs!? Wow, I used to be in the know on leading-edge tech, but that was last true in the mid-2010s (work and tech burnout). I completely forgot about RISC.

Right! I should have been more specific, and that’s on me. My issue isn’t with ARM, it’s with Apple’s SoC. I’d go back and edit my posts but - I’m tired lol

Thanks for the info!

Oh, and someone else suggested I read up on Apple’s history - I’ve been using Macs since 1999, mostly Power Macs and then Mac Pros. I sold my 2019 Mac Pro for a Mac Studio Ultra and finally said goodbye to towers for the first time in 20+ years. Thankfully, I already have a Thunderbolt array; I just needed to grab another one for the internal drives that were in my Pro’s MPX module. I’ll miss the cleaner all-in-one tower, but the Studio Ultra is blazingly fast and incredibly quiet. I don’t have a problem with ARM systems, just with SoCs, as (right now) Apple doesn’t allow users to upgrade the GPU, RAM, etc. I’m hoping that changes for the Mac Pro (and RAM for iMacs and laptops). I’m 45, so I’m used to having upgradable towers/ATX. Moving to an SoC desktop system has been an adjustment.
 
Apple’s been burned by every CPU supplier (Motorola, IBM, Intel) they’ve ever used, and people around here still question why Apple is building their own SoCs now? Good grief…please try to grasp the obvious.
No one is questioning the obvious; some of us are merely lamenting the inability to upgrade components on Apple’s SoCs. As others have mentioned, that’s a design choice inherent in SoCs (not ARM). That’s all.
 
Get Unity and Unreal on this too if they aren't already! A single pipeline for animations, materials, UVs, meshes, etc. would be awesome! (although probably a huge challenge)
They are already…
I think Pixar put a LOT of work into it internally before releasing its initial iteration, and it’s proven to be robust for the large scale 3D industry.
 
They are so "brilliant" nowadays (just look at how they nailed the MacBooks with the notch) that I'm imagining the deal they did with NVIDIA this time: "OK, ok, we'll let you in, but with one condition: you won't try to change the fact that the Mac Pro is an integrated-GPU box, and it will be like that forever."
 
And yet the history is well known and documented.

The last Nvidia GPU in a Mac was the 2015 iMac, whereas the GPU issue on MacBooks was in 2007/2008, so even Nvidia dumping the costs on Apple (while giving millions of dollars to the likes of Dell over the same issue) wasn't the cause of Nvidia disappearing from macOS.

The Mojave beta had support for Nvidia GPUs, and that would have continued if Nvidia had been prepared to supply Metal-only drivers without insisting on CUDA.

Apple: moving forward, we want only the Metal API to be used.
Nvidia: we insist on CUDA support in our drivers for your systems as well.

When one of those two is the customer, the customer wins.

Apple insisted on Metal, and they were doing the buying, so they were the customer here. Nvidia insisted on including CUDA, so they got told no thanks.

Whilst Apple is not blameless here, Nvidia was just as pig-headed in insisting on something their customer was saying NO to.

Of course the losers were the end users.

I think Apple was helping us by boycotting CUDA. Apple didn't want any Mac software to become reliant on CUDA, because that would have made it harder for them to switch to AMD if AMD gave them a better bid for a GPU in the future, and it would have made their switch to Apple Silicon harder.

Getting locked into CUDA would have been bad for users too. If apps had started using CUDA, it would have been harder for Apple to consider AMD bids, which could have resulted in higher prices for us. And the transition to Apple Silicon would have been rougher for us when we lost apps' CUDA-dependent features. Or, if CUDA had spread deeply and widely enough, we might still have power-hungry discrete GPUs in our laptops today.
 
All these huge companies stacked with smart people and they couldn’t come up with an acronym that’s not already well known for something else?
 
Working with Nvidia would’ve been really useful a few years ago
Just out of curiosity, cuz I'm completely out of the loop--I didn't even know there was beef between the two. But what would Apple have worked with Nvidia on a few years ago? Is it their GPUs and DLSS? Because I think work on Apple Silicon was already underway, and apparently Apple has their own, albeit inferior, version of DLSS called MetalFX.

So maybe that's why Apple never bothered using Nvidia's GPUs in their Mac lineup? Just guessing.

OT: This alliance is great news. Glad to see Apple working collaboratively with other players. It makes sense for file and other standards as it's a win-win for everyone (same reason they should've jumped to USB-C much sooner but oh well).
 
Is Apple doing Pixar's bidding here, or just spending $10K per year to give the appearance of supporting this open standard? There have been several attempts at similar open standards in the past, but none have taken off. It needs better integration with C4D and similar apps before it's widely adopted. With Autodesk onboard, it's doubtful Maxon will jump in quickly. Adobe is trying everything to get people onboard with Substance. Great tools, but their prices are a bit steep for independents, especially when they're not integrated with AE very well (yet).

WTF are you talking about? Both Painter & Designer are industry standard in texturing and material creation. Neither Mari nor 3DCoat is anywhere near their adoption rate.

And as a freelancer I had no problem getting their 3D package that comes with Painter, Designer & Sampler, even though I don't use Sampler.

If you still find it costly, you can get a perpetual license on Steam.

I don't know which industry you are in, but if you are a 3D production artist, Substance Designer + Painter (Painter more so) are the go-to industry standards, along with Maya & ZBrush, that you must learn to get a job at big studios.
 
USD is just a file interchange format like FBX or Collada, not much to do with GPUs, so why would this herald a new era for the Apple/Nvidia relationship? On the other hand, if Apple is required to license Nvidia's APIs as part of this, that would be something else again.
USD has been around for a while and is already supported in Motion/FCPX.

Would love to see one great interchange format in 3D.
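For anyone curious what authoring USD actually looks like, here's a minimal sketch using Pixar's open-source Python bindings (the usd-core package on PyPI); the file name and prim paths are just made-up examples, not anything from a real pipeline:

```python
# Minimal USD authoring sketch using Pixar's open-source Python bindings
# (pip install usd-core). File name and prim paths are illustrative only.
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("hello_usd.usda")       # .usda = human-readable text flavor
world = UsdGeom.Xform.Define(stage, "/World")       # a transform prim as the scene root
ball = UsdGeom.Sphere.Define(stage, "/World/Ball")  # a simple sphere prim under it
ball.GetRadiusAttr().Set(2.0)                       # author an attribute on the sphere
stage.SetDefaultPrim(world.GetPrim())
stage.GetRootLayer().Save()                         # write hello_usd.usda to disk
```

The resulting .usda file should open in anything that speaks USD (Blender, Houdini, Omniverse, etc.), which is the whole point of a common interchange format.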
 
😮 In light bulbs!? Wow, I used to be in the know on leading-edge tech, but that was last true in the mid-2010s (work and tech burnout). I completely forgot about RISC.

Right! I should have been more specific, and that’s on me. My issue isn’t with ARM, it’s with Apple’s SoC. I’d go back and edit my posts but - I’m tired lol

Thanks for the info!

Oh, and someone else suggested I read up on Apple’s history - I’ve been using Macs since 1999, mostly Power Macs and then Mac Pros. I sold my 2019 Mac Pro for a Mac Studio Ultra and finally said goodbye to towers for the first time in 20+ years. Thankfully, I already have a Thunderbolt array; I just needed to grab another one for the internal drives that were in my Pro’s MPX module. I’ll miss the cleaner all-in-one tower, but the Studio Ultra is blazingly fast and incredibly quiet. I don’t have a problem with ARM systems, just with SoCs, as (right now) Apple doesn’t allow users to upgrade the GPU, RAM, etc. I’m hoping that changes for the Mac Pro (and RAM for iMacs and laptops). I’m 45, so I’m used to having upgradable towers/ATX. Moving to an SoC desktop system has been an adjustment.
This is the future: including memory and the GPU in the SoC. x86 will also move more and more in this direction, and upgradeability will suffer, but speed will improve and latency will drop.
 
Apple realizes the Mac Pro is worthless in the pro market without an Nvidia GPU. At least end users got to suffer for the petty dispute all these years. I'm really not buying into any more tech by anyone; it gets abandoned too fast and loses support overnight due to childish behavior by childish executives and lame stockholders. Can't get a reliable workflow going to run a business. This ride sucks!
While I'm not sure I'd go as far as "worthless", I was a bit surprised that Apple didn't come up with a way to use internal GPU-only (or even full M-series CPU/GPU) daughter cards to turn the Mac Pro into a potential render engine for high-end graphics applications.

This setup probably wouldn't have much of a real-time effect on gaming (FPS et al) because of the speed bottleneck between the compute engines...but I would have thought it would have a major effect on render times.

Perhaps in the future...

Meanwhile, I absolutely love the M-series CPUs and the SoC concept for my purposes. And the fact that I have to use an actual fire to roast marshmallows rather than my Intel MacBook is a feature I'm not willing to sacrifice for more speed. But I can imagine the frustration of users on the very high end.
 
Cool, I've used USD files in Blender a few times before, and it definitely feels a little like magic how most things just work. Usually 3D formats are kind of a hassle, especially the older ones like OBJ or FBX. It's always nice to see tech corps working together on improving new open standards; that doesn't happen as often nowadays as it used to.
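If anyone wants to try the Blender route, the built-in USD importer and exporter are exposed as operators. A rough sketch for the Python console (the file paths are placeholders, and the available options vary between Blender versions, so treat this as illustrative rather than definitive):

```python
# Rough sketch of round-tripping USD in Blender's Python console (Blender 3.x+).
# File paths are placeholders; exporter/importer options differ by version.
import bpy

# Import a USD scene into the current Blender file
bpy.ops.wm.usd_import(filepath="/tmp/scene.usd")

# ...edit the scene in Blender as usual...

# Export the current scene back out as USD
bpy.ops.wm.usd_export(filepath="/tmp/scene_edited.usd")
```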
 
This is the future: including memory and the GPU in the SoC. x86 will also move more and more in this direction, and upgradeability will suffer, but speed will improve and latency will drop.
True. I just hate e-waste - if we can't upgrade RAM, GPU, etc. (which does extend system life), then I wish a simple SoC replacement could be possible instead of junking an entire desktop system. I'm old school, so I keep imagining swapping in a CPU-style upgrade - a simple chip swap. I know, that's a dream considering all the other components that would have to work with a new SoC, but hey, I like to dream. :)
 
True. I just hate e-waste - if we can't upgrade RAM, GPU, etc. (which does extend system life), then I wish a simple SoC replacement could be possible instead of junking an entire desktop system. I'm old school, so I keep imagining swapping in a CPU-style upgrade - a simple chip swap. I know, that's a dream considering all the other components that would have to work with a new SoC, but hey, I like to dream. :)
Instead, what you find is that the focus is more on making the system recyclable, in that it can be stripped down and the parts recycled.

So whilst it may not be used for as long, the components can be recycled more easily.

It's a different way of looking at it; however, less of my Mac Studio will end up as landfill/e-waste than my old PC towers did.
 
Just listened to the SIGGRAPH 2023 keynote by Nvidia. So, has anyone here used Omniverse? They were really pushing how it uses USD and interoperates with other apps to do interesting things (they claimed Mercedes uses it to design their EVs and their ADAS system).
 