The risk is: if my job consists of developing Windows x86(-64) software or working with programs that do not run on macOS, I don't buy a Mac. If I do so anyway (for whatever reason - design, preference for macOS, you name it), I have to be aware that I am using this Mac in a way that is not relevant to Apple when it makes decisions about its future.

The majority of developers using a Mac then.
 
Are you saying that the majority of developers using a Mac do not develop for the Apple ecosystem? If so, something must be fundamentally flawed with the non-Apple hardware offerings …

Yes, I'm saying that. Apple's hardware and macOS are preferred by the majority of developers who use macOS to develop for platforms other than the Mac.
 
  • Like
Reactions: ssgbryan
Yes, I'm saying that. Apple's hardware and macOS are preferred by the majority of developers who use macOS to develop for platforms other than the Mac.
That's quite interesting. Do you have any numbers or is it gut feeling?

As much as I prefer to work with macOS instead of Windows, I still can't grasp why people insist on using a Ferrari to transport washing machines instead of a lorry. I also don't use a hammer to fasten the screws of my furniture.
 
That's quite interesting. Do you have any numbers or is it gut feeling?

As much as I prefer to work with macOS instead of Windows, I still can't grasp why people insist on using a Ferrari to transport washing machines instead of a lorry. I also don't use a hammer to fasten the screws of my furniture.

I want to say it’s gut feeling, but most places I’ve been prefer Macs for developers, and I’ve only targeted the Apple platform once in my life; even the ops guys were using Macs. Amex, Goldman, and Amazon, just to mention a few.

As a platform, macOS seems developer friendly. The toolset, not to mention the shell and Homebrew, makes the ecosystem rich. Certainly better than running some Linux distros.
 
The risk is: if my job consists of developing Windows x86(-64) software or working with programs that do not run on macOS, I don't buy a Mac. If I do so anyway (for whatever reason - design, preference for macOS, you name it), I have to be aware that I am using this Mac in a way that is not relevant to Apple when it makes decisions about its future.


Ad 1: I did tinker with DAZ Studio some years ago. It's sad that proprietary CUDA is so prevalent in the 3D world.

Ad 2: Well, as I said, if I rely on non-Mac software for my living, I shouldn't buy a Mac, or I should accept that my workaround might break any day.

Ad 3: This might be one of the reasons this switch is coming. No one knows what an Apple chip with the cooling and power capabilities of a notebook or desktop computer is capable of.

Ad 1: I'm sorry. Daz Studio is worth what you paid for it. Ten years on, parts of the online help files still say "If you are having trouble, find someone to help you." Because "someone" doesn't work at DAZ.

It isn't sad that CUDA is so prevalent - it is the right tool for the right job. If AMD could get more companies using the ProRender engine (it is a much better solution - it doesn't care if you are using Nvidia or AMD GPUs), then the problem would solve itself. They have not been successful to this point.

Ad 2: That is why the power user base for Apple is not growing. In the real world, there are many, many areas that are Windows only. People use VMs on OSX for the exact same reason I was using VMs for Windows applications in OS/2 25 years ago. OSX has Unix underpinning it. Having an x86 architecture means that a Windows product in a VM is a one-click solution. That won't be the case with ARM. I remember emulation on the PPC.

Ad 3: If you want to be the beta testers for Apple, knock yourself out. People that depend on their software won't be doing that.

Someone earlier was mentioning an ARM-based Mac Pro...

Who would buy this? The sweet spot on a 7,1 is around US$10,000. These things are going to have a 5-7 year life cycle. Good luck getting the bean counters to sign off on a replacement inside of 72 months.

The folks circling the airport? They aren't moving to the 7,1 - they certainly aren't going to move to something that requires them to replace both hardware and software in one shot.
 
  • Like
Reactions: ct2k7
Ad 1: I'm sorry. Daz Studio is worth what you paid for it. Ten years on, parts of the online help files still say "If you are having trouble, find someone to help you." Because "someone" doesn't work at DAZ.

[…]

It was free and just for fun, hobbyist use, so it was okay for me. I agree that it seems a bit out of place for professional use.
 
And honestly, I know a few engineers and 3D designers because my sister is in that field. She constantly tells me how most 3D designers don’t even like Macs because most of what they need isn’t available.

And 3ds Max apparently is old software that won’t be updated again, period. So if you’re still using that, you might want to change over in the next couple of years to something newer that is still being updated.
 
It was free and just for fun, hobbyist use, so it was okay for me. I agree that it seems a bit out of place for professional use.

I use it to move assets into Poser. For that, it is great. Pity that is all it is good for. It is easily the most poorly coded 3d app I have used in the past 15 years.

And honestly, I know a few engineers and 3D designers because my sister is in that field. She constantly tells me how most 3D designers don’t even like Macs because most of what they need isn’t available.

And 3ds Max apparently is old software that won’t be updated again, period. So if you’re still using that, you might want to change over in the next couple of years to something newer that is still being updated.

That is true - and neither the 6,1 nor the 7,1 Mac Pro increased the professional base of OSX users.

Yes, my copy of 3ds Max is an old version. All I use it for is converting assets from the 3ds Max native format to .OBJ format. Same thing with my copy of LightWave.
 
  • Like
Reactions: futureisfilm
The past year's trade war and tariffs cause China to panic. They direct several teams to design their own laptop-grade chips that can run at 10, 25, and 50 watt power levels. These might be of an entirely home-grown architecture, or RISC-V, but let's say they're Arm-based. Never mind that the silicon design is far from optimized. For software, they select Linux and OpenOffice. Eventually, Windows/Arm is also an option.

It’s likely China will move to RISC-V given its open nature. The trade wars have also seen ARM restricting licensing to the Chinese. Given that a number of big players are moving to RISC-V, it’s a future we may all see in all devices.
 
Compiling large codebases for x86 machines is a simple yet important workflow in my mind.

It runs faster on an ARM N1 AWS server TODAY.

Cross-compiling isn't a new thing in 2020. We have always been compiling ARM code on x86 machines.
There are plenty of developers out there who develop on a Mac but don’t target the Mac platform per se.

Why does that matter?

It's not as if you are coding on a machine that runs the same CPU as the target anyway.

Remember, even Intel CPUs have instruction set differences, and I always turn on AVX-512 when compiling for Xeon, even on my laptop, which cannot run those instructions at all.
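
To make that concrete, here is a minimal sketch of building for an instruction set the build machine may not be able to execute. It assumes a clang++ or g++ toolchain; the file name avx_sum.cpp and the compile command in the comments are illustrative, not taken from this thread:

[CODE]
// avx_sum.cpp - hypothetical example: the target ISA is a compiler flag,
// not a property of the machine you happen to be typing on.
//
// Assumed build command (works on any x86-64 host, even one without AVX-512):
//   clang++ -O2 -mavx512f avx_sum.cpp -o avx_sum
#include <immintrin.h>
#include <cstddef>
#include <cstdio>

// Sums n floats, 16 at a time, using 512-bit AVX-512 registers.
static float sum_avx512(const float* data, std::size_t n) {
    __m512 acc = _mm512_setzero_ps();
    std::size_t i = 0;
    for (; i + 16 <= n; i += 16) {
        acc = _mm512_add_ps(acc, _mm512_loadu_ps(data + i));
    }
    // Horizontal sum of the 16 lanes, plus a scalar tail for leftovers.
    alignas(64) float lanes[16];
    _mm512_store_ps(lanes, acc);
    float total = 0.0f;
    for (int k = 0; k < 16; ++k) total += lanes[k];
    for (; i < n; ++i) total += data[i];
    return total;
}

int main() {
    float data[64];
    for (int i = 0; i < 64; ++i) data[i] = 1.0f;
    std::printf("sum = %f\n", sum_avx512(data, 64));
    return 0;
}
[/CODE]

Only running the resulting binary needs a CPU with AVX-512; building it works on any host, and that same separation of build host from target is what makes compiling ARM code on x86 machines routine.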
 
  • Disagree
Reactions: chikorita157
No, they are perfectly capable of licensing chips from ARM and customizing them to their liking. They haven’t licensed the ARM Neoverse N1 before because they haven’t been making workstations with their chips yet. But I am assuming that when and if they do replace Intel with ARM in their pro machines, they will use that as a base.

But they don’t have to. They could keep doing what they are doing now. Doing that will allow them to save money. Or licensing more chip designs from ARM will mean they need to invest heavily and bump up production of new CPUs. If they do that, then expect prices to stay the same on the new MacBooks and Macs. The Apple tax might even get a little higher for a little bit.
Apple doesn’t license chips from Arm. They license the instruction set architecture, and they design the chips themselves (the microarchitecture, the logic design, the circuit design, and the physical design).
 
Ad 1: I'm sorry. Daz Studio is worth what you paid for it. Ten years on, parts of the online help files still say "If you are having trouble, find someone to help you." Because "someone" doesn't work at DAZ.

It isn't sad that CUDA is so prevalent - it is the right tool for the right job. If AMD could get more companies using the ProRender engine (it is a much better solution - it doesn't care if you are using Nvidia or AMD GPUs), then the problem would solve itself. They have not been successful to this point.

Ad 2: That is why the power user base for Apple is not growing. In the real world, there are many, many areas that are Windows only. People use VMs on OSX for the exact same reason I was using VMs for Windows applications in OS/2 25 years ago. OSX has Unix underpinning it. Having an x86 architecture means that a Windows product in a VM is a one-click solution. That won't be the case with ARM. I remember emulation on the PPC.

Ad 3: If you want to be the beta testers for Apple, knock yourself out. People that depend on their software won't be doing that.

Someone earlier was mentioning an ARM-based Mac Pro...

Who would buy this? The sweet spot on a 7,1 is around US$10,000. These things are going to have a 5-7 year life cycle. Good luck getting the bean counters to sign off on a replacement inside of 72 months.

The folks circling the airport? They aren't moving to the 7,1 - they certainly aren't going to move to something that requires them to replace both hardware and software in one shot.

CUDA is cancer.
It isn't offering anything to the AI market it is dominating right now.

The reason people use NVIDIA is not CUDA but cuDNN and many more proprietary NVIDIA libraries, which are in fact GPU assembly rather than CUDA code.

NVIDIA intentionally did not optimize OpenCL performance on their cards to push people toward CUDA, but that failed. Now they are using cuDNN to keep AI users on their platform.

I have hardly seen any AI data scientist use CUDA directly. All their work can be ported to a new GPU without issue as long as someone provides the backend for Torch/TensorFlow.
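
As a sketch of that last point: framework-level code never touches CUDA or cuDNN directly, so swapping the backend does not mean rewriting the model. The example below uses the C++ libtorch API purely for illustration (the same pattern applies to the Python torch and TensorFlow APIs), assumes libtorch is installed, and the layer and batch sizes are made up:

[CODE]
// device_agnostic.cpp - hypothetical sketch: the model is written against the
// framework, not against a GPU vendor API.
#include <torch/torch.h>
#include <iostream>

int main() {
    // Pick whichever accelerator backend this build of the framework provides.
    // Porting to a different GPU means changing this selection (or the
    // framework build), not the model code.
    torch::Device device = torch::cuda::is_available()
                               ? torch::Device(torch::kCUDA)
                               : torch::Device(torch::kCPU);

    // A tiny model; the sizes are arbitrary for the example.
    torch::nn::Linear model(128, 10);
    model->to(device);

    // Run a random batch on whatever device was selected.
    torch::Tensor input = torch::randn({32, 128}, device);
    torch::Tensor output = model->forward(input);

    std::cout << "output shape " << output.sizes()
              << " on " << output.device() << std::endl;
    return 0;
}
[/CODE]

If a vendor ships a working backend for the framework, this code runs there unchanged, which is exactly the portability argument above.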
 
  • Like
Reactions: Zdigital2015
Apple doesn’t license chips from Arm. They license the instruction set architecture, and they design the chips themselves (the microarchitecture, the logic design, the circuit design, and the physical design).

o_O Sorry I wasn’t technical enough. But what you said is exactly what I mean.

Which is also what Amazon did with the Graviton2 (ARM called it Neoverse N1.)

Also, I don’t know the name of the company, but Apple hires someone to actually produce/manufacture the chips. The original article actually mentioned that company.
 
o_O Sorry I wasn’t technical enough. But what you said is exactly what I mean.

Which is also what Amazon did with the Graviton2 (ARM called it Neoverse N1.)

Also, I don’t know the name of the company, but Apple hires someone to actually produce/manufacture the chips. The original article actually mentioned that company.

Neoverse N1 is a "hardware" IP from ARM. Not designed by Amazon directly.
Multiple different companies are using this N1 design including Ampere eMAG.

You can think of this N1 as a reference ARM server CPU core design that anyone can buy and integrate into their own chip.
The A13 is a fully Apple-designed core.
 
o_O Sorry I wasn’t technical enough. But what you said is exactly what I mean.

Which is also what Amazon did with the Graviton2 (ARM called it Neoverse N1.)

Also, I don’t know the name of the company, but Apple hires someone to actually produce/manufacture the chips. The original article actually mentioned that company.

They use TSMC as the fab. But Apple provides TSMC with the masks (the instructions that the equipment uses) for each layer. TSMC does nothing other than take those instructions and run them through its equipment to produce the chips.

In other words, for each layer of the semiconductor (polysilicon, metal 0, metal 1, metal 2, etc., and the vias between the layers), Apple draws each and every one of the billions of polygons on each layer.
 
Neoverse N1 is a "hardware" IP from ARM. Not designed by Amazon directly.
Multiple different companies are using this N1 design including Ampere eMAG.

You can think of this N1 as a reference ARM server CPU core design that anyone can buy and integrate into their own chip.
The A13 is a fully Apple-designed core.

But Amazon licensed the ARM Neoverse N1 (Amazon is using the CMN-600 variant). And I know Apple’s A13 is based on the Armv8 architecture.

And both companies designed their own custom chips, but based on two different ARM designs. Which is fine. They are meant for two different devices.
 
But Amazon licensed the ARM Neoverse N1 (Amazon is using the CMN-600 variant). And I know Apple’s A13 is based on the Armv8 architecture.

And both companies designed their own custom chips, but based on two different ARM designs. Which is fine. They are meant for two different devices.

No, you are confusing hard IP and soft IP. Neoverse N1 is a pre-designed core; the microarchitecture is already designed by Arm. https://www.arm.com/-/media/global/...revision=1cf46d42-3a6b-4995-809c-94b14109b805

Apple starts with a clean sheet of paper; the only thing they get from Arm is the set of instructions that the CPU will know how to execute.
 
No, you are confusing hard IP and soft IP. Neoverse N1 is a pre-designed core; the microarchitecture is already designed by Arm. https://www.arm.com/-/media/global/products/processors/N1 Solution Overview.pdf?revision=1cf46d42-3a6b-4995-809c-94b14109b805

Apple starts with a clean sheet of paper; the only thing they get from Arm is the set of instructions that the CPU will know how to execute.

Ah, that makes sense.

Either way, I am pretty sure the Neoverse N1 is more powerful than any Apple chip as of right now. And that Apple will make their own version of it for the Mac Pro.
 
CUDA is cancer.
It isn't offering anything to the AI market it is dominating right now.

The reason people use NVIDIA is not CUDA but cuDNN and many more proprietary NVIDIA libraries, which are in fact GPU assembly rather than CUDA code.

NVIDIA intentionally did not optimize OpenCL performance on their cards to push people toward CUDA, but that failed. Now they are using cuDNN to keep AI users on their platform.

I have hardly seen any AI data scientist use CUDA directly. All their work can be ported to a new GPU without issue as long as someone provides the backend for Torch/TensorFlow.
I would go so far as to say NVIDIA is a cancer...but, perhaps I go too far.
 
They JUST released this amazing laptop that fixed all the problems and they just couldn't resist toying with it again and (probably) ****ing it up. It's so frustrating as a consumer to constantly go into forums and read "Yeah, everyone agrees year X is the best but the company refuses to rebuild it."

Well, I wouldn't say fixed *all* the problems. They just fixed the largest problem. It's all relative.
 
Ah, that makes sense.

Either way, I am pretty sure the Neoverse N1 is more powerful than any Apple chip as of right now. And that Apple will make their own version of it for the Mac Pro.
Well, Apple will make its own version of *something*, but not of Neoverse N1. Apple has its own microarchitecture and will design it all themselves. Assuming they wish to target Mac Pro-style performance, for example, they would likely extend the rumored A14 Mac chip so that instead of just 8 high-performance cores and 4 low-power cores, it has many more of each. It already must have a good on-chip bus architecture, but they’d likely have to scale it up to handle more cores (say 16 + 8, or 20 + 10, or whatever). They could increase the effective size of the caches, and increase the bus width in and out of them, for another boost in performance. They know what they are doing, and presumably it would blow away Amazon’s thing, since, per core, they already do.
 
Yup, that's why they built a 6K monitor for 5 grand with performance comparable to a $50,000 Dolby Digital reference monitor.

It is only comparable in Apple's marketing material, according to 3rd party reviews.
I would go so far as to say NVIDIA is a cancer...but, perhaps I go too far.

Sounds like GPU envy to me.

Like it or not, CUDA is very useful for many different types of computing. Just not on the Apple platform.
 
The big thing with the Mac Pro at the moment is the ASIC (the Afterburner card). Apple really should be making those standard on the Mac Pro.

If Apple can get app developers on board to support that properly, then the big number-crunching work doesn't need the CPU so much to get done. Which leaves Apple free to switch the CPU out for something else without losing as much performance in Afterburner-accelerated applications.

Will it work out that way? Who knows. But I see Afterburner as Apple's exit strategy from Intel at the high end.
 