At some point, cheap power is going to become a lot less cheap, and PCMR gaming PCs are going to cost more to run than they're worth for the average consumer. Those who have started prioritizing performance per watt will be rewarded, and those who thumb their nose at it are in for a rude awakening. Being able to recharge my M1 MacBook Pro from a battery I charged with a 100 W solar panel gets really enticing really fast when intermittent rolling blackouts and higher energy costs start kicking in. Even where I live in BFE, we have an issue with someone attacking substations and knocking out power. This is going to get worse, not better. Prepare now.
I had the same thinking, but then I actually did the calculation. As much as I would like it to be the case, the added cost for an Apple computer won't be balanced out by power consumption, even at one dollar per kWh under heavy use. There is an environmental angle, and of course a few hundred bucks a year is a nice saving if you're buying an Apple anyway, but don't try to make it the primary argument for the added purchase cost.

My main gripe about Apple Silicon is that I (stupidly) underestimated how much markup Apple is willing to add to a product. I'm positive that the cost per unit for Apple using Apple Silicon is much lower than using Intel, and I was dreaming that they would use that to make their platform more accessible. With "everything" integrated in the chip, the manufacturing cost is very low. Instead (which everyone but me saw coming), they use their chip advantage as a selling point to charge more. I wonder if anyone would still be buying Windows if Apple created a $500 laptop and a $200 Apple TV-sized "Mac Nano". But the current strategy probably makes them more money, so I don't blame them.
 
Nah it's fine. Energy is a major cost and constraint these days, even in our office!

We don't need faster chips, we need more grunt-per-watt and that's what Apple are doing.
Will it ever get to the point that electricity prices spike so much that using M chips over P will save you that money or more?
 
The only difference between the Max and the Pro is the GPU, so not really. The M1 Pro has the same SC and MC scores as the M1 Max. The M2 Pro has two more cores, and all of those cores run faster, so it's not impressive, it's inevitable. But it will run hotter, so I'm interested to see what the difference is on longer processing tasks.
A lot of people on here are saying that, but then how come the M1 Max is faster than the M1 Pro by 2000 on multicore?
 
Why would I buy a Mini for $200 less in this situation?
Because it's smaller, lighter, newer, and cheaper.

Not to mention more ports, and better thermal management.
That's not a foregone conclusion. The M2 Pro's TDP is believed to be a bit more than half that of the 2018 model's Intel i7, and the cooling on the 2018 model performed well under sustained workloads. So there's good justification to believe that the 2023 Mini's M2 Pro will be adequately cooled.
 
Umm, the performance hike is just about the same as the M2 Pro vs. the M1 Pro. The difference between the M2 MacBook Air and the M1 MacBook Air is about 11% in single-core and about 19% in multi-core. Weird how the narrative can change people's opinions without any real evidence.
I didn't have to read it, as I mentioned in my post. For me personally, testing it gives me an edge over those just reading about it. Others may have different needs, but it didn't do it for me. It didn't warrant upgrading equipment, but I'm looking forward to the M3, which may just have the performance jump I need to upgrade multiple units.
 
Which would only really matter to some PC laptop user trying to do a high-demand CPU or GPU task while on battery. That is such a tiny niche.

Most people doing that kind of work would be using a desktop, Mac or PC, and if on a laptop, they'd be plugged in. Sure, it's great that the MacBooks don't need to be plugged in... but even if I were doing those tasks on a MacBook, I would plug in, just because I'd think it was draining the heck out of my battery, even though my fans don't kick in.
So is electricity FREE whilst using it as a desktop?
 
So if you need to get a high-demand CPU/GPU job done using a desktop... then yes, I think most people would want the fastest option.

It is great that Apple does it better in terms of performance per watt, and if you are some extreme environmentalist, or power costs are super high where you are, that will matter. However, most people using a desktop for these kinds of workloads do it for a living, and time is money; they simply do not care about performance per watt, they care about performance.

I have a large desk at home. My M1 Mac mini, with a 27-inch 4K monitor, is on one end. I do all of my "computing" on it (typing this now) and it is silent, and I love that. It sips power with my needs: email, web browser, Teams, Zoom, VPN into work, SecureCRT/SSH into various devices.

On the other end of my desk is my gaming PC: an 11700K, a 3070 Ti, 64 GB of RAM, and 3 TB of PCIe M.2 SSD (1 TB boot drive, 2 TB drive for games). I have it in a large case, a Phanteks 500, with three 140 mm fans in front, one 140 mm rear fan, and a Noctua 15S CPU fan. While not even close to the Mac mini in terms of quiet, it is actually quiet, because those fans are PWM and they never get loud, even after an hour of gaming. The power usage in comparison is off the charts, I have no doubt... but I do not care, because I want to game.
No doubt your parents still pay the bills; that's the only reason not to care... alternatively, an addiction to gaming beyond control 😏
 
…and since the bulk of Apple’s Mac business is ultra-portable laptops and ultra small-form-factor desktops, that’s the key measure of their success. Also, this is just Geekbench - application benchmarks that take advantage of the media engine, neural engine etc. may show a greater advantage over Intel.
This is what people don't understand, and why I'm not a fan of these benchmarks currently. For what I do, the base M1 beat out my 10th-gen i9 and 3080 Ti setup. It's DUE to those media engines, neural engines, etc. that it wins for my workflow.
 
I mean, even if they had them, there is an embargo. No one is allowed to show them yet, so what did you expect? Don't blame someone's thumbnail. Educate yourself on the NDA dates and know when reviews are allowed.
I did one better and educated myself on “Never watch a MaxTech video”. :) Never have, never will!
 
What are energy prices in Europe? Realistically, running my wife's M1 eight hours a day (at full blast), 365 days per year, would cost me about $9 per YEAR. My much more powerful desktop at full load would be about $90/year.
That's a huge relative difference, but it's NOTHING compared to my 3 kW air conditioner, 2.5 kW stove, or the 15 kW emergency heating element on a heat pump. These are all one or two orders of magnitude more energy-hungry.

What most people do on their desktop devices doesn't make energy consumption a "deciding" factor.
Where I live, energy prices for businesses went up 10x or even 20x, depending on your contract. So even in a small company that runs only 10 computers, there will be a difference.

Consumers are less impacted than businesses but some people are learning to live with layers of clothes inside their homes instead of turning the heating on so I guess everything matters.
 
If the difference in cost between the M2 Pro and the M2, specced out the same for RAM and SSD size, is $300, is it worth it? You get a 10-core CPU and 16-core GPU vs. an 8-core CPU and 10-core GPU, two extra Thunderbolt ports, and HDMI 2.1 support. Wondering what everyone's thoughts on this are? Thx
Absolutely, I think it's a great value, and that's what I'm going to order. Faster single-core and multi-core performance, two extra Thunderbolt ports, HDMI 2.1, as well as the option of adding a third monitor.
 
At some point, cheap power is going to become a lot less cheap, and PCMR gaming PCs are going to cost more to run than they're worth for the average consumer. Those who have started prioritizing performance per watt will be rewarded, and those who thumb their nose at it are in for a rude awakening. Being able to recharge my M1 MacBook Pro from a battery I charged with a 100 W solar panel gets really enticing really fast when intermittent rolling blackouts and higher energy costs start kicking in. Even where I live in BFE, we have an issue with someone attacking substations and knocking out power. This is going to get worse, not better. Prepare now.
Yep. When possible I game on my macs vs my 3080 Ti desktop due to power and heat concerns.
 
Will it ever get to the point that electricity prices spike so much that using M chips over P will save you that money or more?
Where I live, electricity has at times been above one dollar per kWh. Let's imagine you are saving 125 watts with the less power-hungry option (I'm grabbing that number out of thin air to make the math easy). If you run your computer 8 hours per day, 250 days a year, that's 250 kWh saved, or $250 per year. Amortize over 6 years (which you never would in business) and that's $1,500 saved over the lifetime of the computer. If you're running 24/7, you're probably not doing so non-stop for 6 years before replacing... probably half that. So maybe a few thousand dollars saved, at worst-case electricity prices (divide by 6-7 for current US prices...). For a normal consumer at current US electricity prices, using the PC an average of 4 hours per day, 250 days per year, that's 125 kWh saved, or something like 20 bucks. Still at my completely arbitrary 125-watt saving.

Interestingly, in terms of cost, the difference in power consumption will probably matter MORE for businesses running desktop workstations than for consumers running laptops, mostly because they will be running more hours.
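For anyone who wants to plug in their own numbers, the back-of-envelope math above fits in a few lines of Python. The 125 W delta, hours, and prices are the same illustrative figures from my post, not measurements:

```python
def annual_savings(watts_saved, hours_per_day, days_per_year, price_per_kwh):
    """Yearly cost saved (in dollars) by a machine that draws fewer watts."""
    kwh_saved = watts_saved * hours_per_day * days_per_year / 1000
    return kwh_saved * price_per_kwh

# Worst case: $1/kWh, 8 h/day, 250 days/year, 125 W saved
print(annual_savings(125, 8, 250, 1.00))   # 250.0 -> $1,500 over 6 years

# Typical US consumer: ~$0.16/kWh, 4 h/day
print(annual_savings(125, 4, 250, 0.16))   # ~$20
```

Swap in your own wattage delta and local rate to see where the break-even actually lands for you.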
 
I mean, even if they had them, there is an embargo. No one is allowed to show them yet, so what did you expect? Don't blame someone's thumbnail. Educate yourself on the NDA dates and know when reviews are allowed.
Agree with the "don't blame someone's thumbnail". If you've ever been on the business side of YouTube, you know you need to attract new viewers. Shocking thumbnails drive views, which drives the algorithm.
 
Yeah, they did so by "boosting" the clock well above 5 GHz, even approaching 6 GHz...
Intel i9-13900K: 2220 @ 5800 MHz
AMD Ryzen 9 7950X: 2190 @ 5700 MHz
Apple M2 Pro: 1950 @ 3500 MHz

By the way, in multi-core scores...

Intel i9-13900K @ 5.8 GHz w/ 24 cores (260 W)... 25,388
AMD Ryzen 9 7950X @ 5.7 GHz w/ 16 cores (230 W)... 23,027
Apple M1 Ultra @ 3.2 GHz w/ 20 cores (60 W)... 23,325

The Ultra uses cores that are two generations old (A14 vs. A16). However, based on the multi-core performance increase between the M1 Pro and the M2 Pro (~24%), we should get...

Apple M2 Ultra @ 3.5 GHz w/24 cores... ~29,000

And that's still a core design that's not the latest generation. (A15 vs. A16)

So how exactly does Apple need to up the performance side of things? And these still use a fraction of the power Intel and AMD CPUs use to achieve those scores.
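To make the efficiency point concrete, here's a quick Python sketch using the Geekbench multi-core scores and power figures quoted above. The M2 Ultra line is my own ~24% extrapolation, a guess, not a real benchmark:

```python
# (multi-core score, watts) pairs from the figures quoted above
chips = {
    "Intel i9-13900K": (25388, 260),
    "AMD Ryzen 9 7950X": (23027, 230),
    "Apple M1 Ultra": (23325, 60),
}

for name, (score, watts) in chips.items():
    print(f"{name}: {score / watts:.0f} points per watt")
# The M1 Ultra lands near 389 points/W, vs. roughly 98 for the
# 13900K and 100 for the 7950X: about a 4x efficiency gap.

# Hypothetical M2 Ultra: scale the M1 Ultra score by the ~24%
# M1 Pro -> M2 Pro multi-core uplift (an extrapolation, not a measurement)
print(round(23325 * 1.24))  # ~29,000
```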
I just built a PC at Micro Center: AM4 platform, a cheap B450 motherboard, a Ryzen 5600G APU with no dedicated graphics card, 32 GB of 3600 MHz RAM, a 512 GB NVMe Gen 3 SSD, a cheap case, and a 500 W power supply. All told it was close to $600 in parts, and there's no USB-C on the board. My base M1 Mac mini will run circles around it in HandBrake encoding. I enjoy tinkering with the PC, but with the base M2 Mac mini at a lower price, it's a no-brainer.

I'm not a gamer, never was, but I think for basic users the price point is what really matters. For all my important everyday tasks, I use the Mac mini; I know it will always work, no fuss. The prejudice among Windows users, the myth that Macs are only for creatives, etc., still exists. I've built PCs for years, and I have found Apple silicon replacements for every program I use on a PC. Even some of the more famous YouTube PC builders wear Apple Watches or use an iPhone now.

As for Apple not having AV1 hardware encoding, it will take years before AV1 is adopted outright. The only GPUs that really support it are the new Intel Arc cards, which gamers won't use because the drivers are still in development. By the way, Apple is part of the AV1 group as well, and it is open source.
 
Because it's smaller, lighter, newer, and cheaper.


That's not a foregone conclusion. The M2 Pro's TDP is believed to be a bit more than half that of the 2018 model's Intel i7, and the cooling on the 2018 model performed well under sustained workloads. So there's good justification to believe that the 2023 Mini's M2 Pro will be adequately cooled.
It's a desktop, so the smaller (yes, shorter only), lighter, newer, and cheaper (arguably, barely) arguments are totally irrelevant, as I pointed out. Who cares if my desktop is a bit taller and marginally heavier?

Yes, the M2 Pro in the Mini may very well be adequately cooled, but if you stress the workload, Apple will undoubtedly throttle the M2 Pro to compensate for heat, a situation less likely to happen in the Studio.
 
A lot of people on here are saying that, but then how come the M1 Max is faster than the M1 Pro by 2000 on multicore?
The M1 Max has double the memory bandwidth.
It's because there are two versions of the M1 Pro: an 8-core (~10,000 multicore) and a 10-core (~12,000 multicore). The 10-core M1 Pro has an identical CPU to the M1 Max (~12,000 multicore).
 
For the M2 Pro to beat the M1 Max in both SC and MC after just one generation is impressive.
I wonder if the M2 Max can approach the level of the M1 Ultra: in SC yes, but in MC?
But I still suppose the GPU will be more powerful on the M1 Max compared to the M2 Pro.
That's not the way it works. CPU-wise, the M1 Pro and M1 Max are (nearly) identical (besides memory bandwidth, which mainly matters for the GPU inside the SoC). So the CPU part of the M2 Pro/Max will win over the M1 Pro/Max. But the M1 Ultra is twice the CPU part of one M1 Pro/Max, so no, the M2 Max won't reach that level.
 
A lot of people on here are saying that, but then how come the M1 Max is faster than the M1 Pro by 2000 on multicore?
It's not. They are very similar: the 16" M1 Max MacBook Pro scores 12,191 and the 16" M1 Pro (10-core) MacBook Pro scores 12,141. Are you sure you aren't looking at the binned Pro (8-core)?
 
Creditable but not earth-shattering, and I still have a gut feeling that Apple weren't that impressed with the M2 in general, especially with delays putting it on the heels of the M3, where I believe a much greater gain in both energy efficiency and productivity will occur. I also question why they bother with different implementations of the M1/M2 rather than going with the best design to start with, and possibly a dual-chip implementation for the higher end; it costs a lot less to produce one implementation. I know it's an SoC, and I know much of the benefit comes by virtue of it being an SoC, but surely Apple could sort that out. I know that with the M1 the top end was basically a dual chip, but cost and even performance could be enhanced by having one M3 chip based on the best Apple could do: the performance hike on every device would be worthwhile, while still not precluding power-hungry users from having dual chips, etc.

I was very unimpressed with the base M2 Air, both for its SSD slowdown and its less-than-impressive performance hike.

After testing the M2 MBA, I've postponed any thought of updating the iMacs, Mac minis, and MacBooks we have, as they still do their job well, but I'll revisit it when the M3 shows up in a device.

I don't upgrade for prestige, nor for the colour of the device, and it does make me smile when I see the only criticism is "they don't do it in x colour"...
M3 will be based on a new manufacturing process, so it should deliver pretty good improvement.
 