Folks who render for a living don't care about power consumption.
Then said folk are in for a rude awakening as societies are forced to deal with energy costs.
> UltraFusion is just Apple’s marketing name for TSMC’s Integrated Fan-Out (InFO) technology. Apple didn’t create their own off-chip interconnect, they are using a TSMC packaging technology.

Hasn’t TSMC been working on their own interposer “fabric”? Who’s to say Apple hasn’t adopted theirs because it makes the fab process easier?
> They only tested this with 3rd-party security libraries and not Apple’s 1st-party security library, that should be a sign 🤪

That's not correct; it isn't a sign of anything.
> Or maybe there will never be an M3 Max.

I think you might have meant M3 Ultra or M3 Extreme. The M3 Max is well into production and has been readily available in top-end M3 MacBook Pro machines for months.
> UltraFusion is just Apple’s marketing name for TSMC’s Integrated Fan-Out (InFO) technology. Apple didn’t create their own off-chip interconnect, they are using a TSMC packaging technology.

Thanks, I couldn’t tell if they were using TSMC’s solution wholesale or co-developed it with them.
IIRC, the M1 and M2 Ultras both use the same “InFO-LSI” (LSI = Local Silicon Interconnect) packaging, but that is obsolete now. TSMC has moved on to what they are calling “InFO-oS” (oS = on Substrate), which is similar but improved. Indeed, they have announced 10x improvements in substrate performance, but it’s unclear whether that applies to InFO; it may only apply to CoWoS and/or SoIC.
So the most likely reason the M3 Max looks different is that the new and improved “UltraFusion” packaging is different, not that it is absent.
> Don’t know - but when I watched the Nvidia show, it reminded me of the Apple presentations when Jobs was still at the helm. No “mother nature,” but one more thing. Apple once was super innovative, but that was a long time ago - 2011 was the last time I saw him on stage.

Oh please. You would be the first to say "Apple has strayed from its core technology and has no business making robots. Massive fail! And it should be ⅓ the price anyhow!"
> Don’t know - but when I watched the Nvidia show, it reminded me of the Apple presentations when Jobs was still at the helm. No “mother nature,” but one more thing. Apple once was super innovative, but that was a long time ago - 2011 was the last time I saw him on stage.

I love the concept of bemoaning the loss of innovation at Apple, in a thread about Apple Silicon…
It makes a difference whether you can feel the emotion and the excitement, or whether a Tim just enters the stage and tells you a monotonous tale built around the words incredible, amazing, and awesome - it is drop-dead boring. Hope mother nature stays in bed this year…
> Then said folk are in for a rude awakening as societies are forced to deal with energy costs.

This isn't complicated.
Let’s extrapolate that forward a few generations. Unless the GPU industry puts real effort into breakthroughs in perf/watt, people are going to have to upgrade their home circuit breakers to handle the load.
Meanwhile, Apple has a massive amount of headroom to dial up the power, but they’re a long-term thinking company and have spent years focusing on making things as efficient as possible.
NVIDIA/AMD/etc. are going to hit a ceiling for professional users in the not-too-distant future unless people are willing to install server-grade power systems…
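The breaker-upgrade worry above is easy to sanity-check with arithmetic. A minimal sketch, assuming a standard US 15 A / 120 V branch circuit and the common 80% continuous-load guideline (both figures are assumptions, not from the thread):

```python
# Rough headroom check for a render box on a household circuit.
# Assumed: 15 A / 120 V branch circuit, 80% continuous-load derating.

CIRCUIT_AMPS = 15
CIRCUIT_VOLTS = 120
CONTINUOUS_LOAD_FACTOR = 0.8  # derate sustained loads to 80% of rating

max_continuous_watts = CIRCUIT_AMPS * CIRCUIT_VOLTS * CONTINUOUS_LOAD_FACTOR

def headroom(system_watts: float) -> float:
    """Watts left on the circuit after the render box is plugged in."""
    return max_continuous_watts - system_watts

print(max_continuous_watts)  # 1440.0
print(headroom(1000))        # 440.0 -- little room for a second GPU
```

On those assumptions, a single 1,000 W system already eats most of a normal circuit, which is the ceiling the post is pointing at.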
> Then said folk are in for a rude awakening as societies are forced to deal with energy costs.

So much this. We can’t talk politics here, but the world is going to look *very* different as the age of abundance comes crashing down in the not-too-distant future…
Energy is cheap enough today that these home-based renderers don’t care that their computer is using 1,000+ W. If their energy prices doubled or tripled, they’d be singing a different tune about efficiency.
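That “different tune” is just kWh arithmetic. A quick sketch for a machine drawing 1,000 W around the clock; the $/kWh rates below are illustrative assumptions, not figures from the thread:

```python
# Annual electricity cost of a render box at a constant draw.
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_cost(watts: float, usd_per_kwh: float) -> float:
    """Cost in USD of running `watts` continuously for a year."""
    kwh = watts / 1000 * HOURS_PER_YEAR
    return kwh * usd_per_kwh

# Illustrative rates: a typical rate, then doubled, then tripled.
for rate in (0.15, 0.30, 0.45):
    print(f"${annual_cost(1000, rate):,.0f}/yr at ${rate:.2f}/kWh")
```

At the assumed base rate that is on the order of $1,300 a year for one box; doubling or tripling the rate scales the bill linearly, which is the post's point.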
> I love the concept of bemoaning the loss of innovation at Apple, in a thread about Apple Silicon…

Yeah, people who don’t know that Jobs started Apple Silicon tend to do so. Apple Silicon was Jobs’s last gambit. It started in 2008 when Apple bought P.A. Semi - Tim’s Apple today had nothing to do with it, and it cannot be called innovative that Apple didn’t stop this long-planned transition.
> Bitcoin miners have entered the chat.

Do you have a specific point you’re looking to discuss, or is this just another airing-of-perceived-grievances style post?
Look, you know all the fabulous movies and TV shows Apple and the rest of Hollywood produce? How much energy do you think they consume? Prolly the output of several nuclear power plants. I’ve got more bad news for you: none of it is rendered with Macs. Zip, zilch, nada, a big fat zero. So much for Apple signaling that they are saving the planet. Hilarious.
> (I'm not a chip designer.) I wonder if the M3 Ultra could have the ultra connection on both sides, allowing a 3+ configuration.

Oh, M3 Centipede!
> If it's a failure then I'd love to fail like Apple. Oh dear, they are making more money with M-series chips than with Intel or AMD.

Because they are charging more.
Oh, hello!! So if the Ultra is standalone and - as the article suggests - has the POSSIBILITY of a new UltraFusion connection… then that would be the mythical (but reported-on) M3 Extreme.
That would not only allow the Mac Pro to be a valid machine (if the Extreme were limited to it exclusively), but presumably that redesign would allow for, say, cards and stuff? Which would justify keeping the Mac Pro chassis in production in the first place. #MacPro
Or maybe they could try some actual courage, and admit that they're not good at building GPUs. It would be nice to have Nvidia options on the Mac again.
If you want to know what innovation looks like:
> Performance matters more than power consumption. Mac can't even do that.

The 4090 is like 3.5x faster but consumes 7.5x the power (not to mention the M2 power figure includes the CPU), and that’s the M2 with no RT hardware.
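The two ratios in that reply fold into a single perf-per-watt number. A minimal sketch using only the figures quoted in the post (3.5x speed, 7.5x power), which are the poster's estimates, not benchmarks:

```python
# Efficiency comparison implied by the reply's own numbers.
SPEEDUP = 3.5      # 4090 throughput relative to the M2
POWER_RATIO = 7.5  # 4090 power draw relative to the M2

# If the 4090 does 3.5x the work on 7.5x the power, the M2 delivers
# POWER_RATIO / SPEEDUP times more work per watt.
m2_perf_per_watt_advantage = POWER_RATIO / SPEEDUP
print(round(m2_perf_per_watt_advantage, 2))  # 2.14
```

So on the post's own numbers, the M2 comes out roughly 2x better in work per watt even while losing on raw throughput.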
> Folks who render for a living don't care about power consumption.

Talk to the people who scale out data-center-size render farms. Over the lifetime of rows of racks, they spend more total $$$ on power and cooling than on the CPUs and GPUs. The CFOs of the customers who pay for these render farms certainly care.
> Probably more reliable than Gunman, pardon me, Gurman.

Gurman said there would be no M2 Studio after the M2 Mini and Mini Pro came out. I believed it, even though I needed the Studio, so I bought the Mini Pro; 4 months later the M2 Studio came out. Until Apple says something, I don't believe anyone whose primary goal is to make money by making videos.