My guess is the chips are normally tuned for burst performance/race to idle, and this mode retunes them for sustained performance (with higher power draw and heat), probably with some loss in burst. Love the options, Apple! Keep 'em coming!
 
I find myself more and more wishing Apple would offer more options on their laptops. I would love to see a 14 or 16 inch with a simple M1. I don't need the power of the M1 Pro or Max, but I would love a larger screen at a lower price. At this point I think Apple should offer the MacBook Pros and a simpler MacBook. The MacBook could have fewer ports and less power to keep the price down. Perhaps the rumored larger MacBook Air will fit my needs. But it's still just a rumor at this point.
 
If we're doing color grading on a MacBook Pro, what's left for Mac Pros to do? :)

Two things, without even leaving the video editing category (and there are lots of other categories).

First, Apple's benchmarks about how super fast the M1 Max is are all ProRes 422, not ProRes RAW. The 8K ProRes comes out as a "proxy" for RAW, for quick-and-dirty color adjustments in the field, versus a primary recording that is made in RAW.


Second, that primary RAW recording could be in something other than ProRes. RedRAW will require more horsepower. Blackmagic RAW... more horsepower. Etc.

ProRes 4444, if it is hyper-nitpicky color correction, is also not covered as well.

The Mac Pro will be more flexible. The MBP 16" will be more mobile (and cheaper). There will be an ever-growing pile of ProRes 422 coming out of phones, prosumer mirrorless cameras, etc. There will be lots of footage to chew through. But for folks who are looking for the most accurate color adjustments, compressed 422 may not be the primary correction format.

However, the meme of "you can't be a 'real' pro unless you are using an ultra-narrow set of hardware... blah blah blah" is much weaker now. It has been getting weaker and weaker over an extended period, like the rest of the PC market absorbing what used to be 'big iron' workloads.
 
High Power Mode sounds like it can run at high power consumption without throttling but with higher fan speed, vs a High Performance Mode that boosts clocks and performance. This is video editing anyway, so it should be using the hardware video encoders, which according to the presentation are supposed to be low power. Conflicting message.

The de/encoders each just do one specific thing. They either turn a compressed, encoded video into unencoded native video, or they go the opposite way (native/unencoded) into some format that will be persistently stored on disk.

Color correction does not work on encoded data. You have to decode the data, color correct it, and then re-encode it back onto the storage media. Color correction is that part in the middle. There would be much more work on top of the color correction if the SoC didn't offload the de/encode subtasks onto the fixed-function logic.

A major reason the fixed-function logic takes so much less power is that it is so myopically focused on a relatively narrow task. It does that task extremely well and other tasks not at all. Color correction needs to adjust the colors in a color space; de/encoding is solely about mapping color data into a format that takes less space. It is trying to keep the same color in that transition, not change it into something else.

In short, the en/decoders let you do more with less. You have more computational cycles for color correction because you are not "burning them up" in de/encoding overhead before and after the corrections. Are those ProRes de/encoders going to help with RedRAW? Nope. So it is a fixed, narrow-path solution.
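The split being described can be sketched in a few lines. This is a toy model, not Apple's actual API: the decode/encode stubs stand in for the fixed-function ProRes engines, and only the middle step touches color. Every function name and value here is hypothetical.

```python
# Toy decode -> correct -> encode pipeline. On an M1-family SoC the decode
# and encode steps would run on fixed-function hardware, leaving CPU/GPU
# cycles free for the color work in the middle.

def decode(compressed):
    """Stand-in for a hardware decode: compressed bytes -> 0..1 pixel values."""
    return [b / 255.0 for b in compressed]

def color_correct(pixels, gain=1.2):
    """The only step that actually changes color: a simple exposure gain."""
    return [min(p * gain, 1.0) for p in pixels]

def encode(pixels):
    """Stand-in for a hardware encode: 0..1 pixel values -> compressed bytes."""
    return bytes(round(p * 255) for p in pixels)

frame = bytes([0, 128, 255])
graded = encode(color_correct(decode(frame)))
print(list(graded))  # [0, 154, 255]
```

The point of the shape is that only `color_correct` competes for general-purpose cycles; swap the codec (say, to RedRAW) and the fixed-function stubs no longer apply, which is the narrowness described above.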
 
I'm glad it supposedly can handle processing 8K, but honestly, do we really have a need for 8K yet?
8K is pure marketing. I've never really seen an 8K display, but all that I've read and seen about them is that you need a super large screen to really tell any difference between 4K and 8K. I have a 50" screen and can hardly notice a difference between 1080 and 4K. It is noticeable, but I'm never upset with 1080. I imagine you need at least a 100" screen to even begin to notice 8K. Not only that, but it also requires relatively decent internet speeds. Not only do I think it will be a long time for 8K to become a standard, I doubt whether it ever will at all.
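The screen-size point can be sanity-checked with back-of-envelope math, assuming the common ~1 arcminute rule of thumb for visual acuity (the function name and the rule of thumb are my own illustration, not anything from the thread):

```python
import math

def max_useful_distance_inches(diagonal_in, horizontal_px, aspect=16 / 9):
    """Farthest viewing distance (inches) at which individual pixels still
    subtend ~1 arcminute; sit farther back and extra resolution is invisible."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch = width_in / horizontal_px
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch / math.tan(one_arcmin)

for px, label in [(1920, "1080p"), (3840, "4K"), (7680, "8K")]:
    d = max_useful_distance_inches(50, px)
    print(f'50" {label}: pixels resolvable out to ~{d:.0f} in')
```

On a 50" screen this puts the 4K limit around a meter and the 8K limit around half that, which matches the intuition that 8K only pays off on very large screens or at very short viewing distances.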
 
I'm glad it supposedly can handle processing 8K, but honestly, do we really have a need for 8K yet? 4K has been around for years, and only recently have we seen TV manufacturers move to 4K as the standard format. You still can't stream much 4K over most services, cable, satellite, etc. And OTA TV is staying at 1080i, 720p, and 480i for a long time to come. As for HDR, well, good luck streaming much of that at all! With the industry still trying to catch up with 4K, it will be a long time before 8K becomes a part of the average household. At present it is a small niche market only. I bought my first HDTV in the early 2000s (2002?). For years most programming was still SDTV and it looked terrible on the first HDTV sets. That is why I waited until this year to buy a 4K HDR TV. The bad experience of jumping the gun with HDTV was a hard lesson to learn. 8K TV, well, if it comes it will be a long time from now...
I hear what you're saying. However, wouldn't it be nice to watch some of your favorite older movies in native 4K? Movies shot today in 8K will be played on 4K until TVs and mass markets adopt 8K as the "new" standard. At that point, these movies can be played in their native 8K renderings.

BL, I’ll gladly take a new MBP that can perform massive tasks I’ll never use at the same price I paid for a MBP several years ago.
 
Ah yes, color grading 8K ProRes videos, I do that all the time!

It's my favorite thing to do when I wake up and before I go to sleep.

But I just wish it could color grade 16K ProRes videos as well :(
 
I mean yea that’s a professional use. So cool. I’m more snarky at the idea that everyone with a camera is a Professional in the sense that they have achieved something in a profession. This is an entire rabbit hole.

On the other hand, I disagree with the original post. Gaming is not irrelevant. People find joy in it and it takes up a lot of people's time. I don't think it's a stretch to desire to be able to play games on these machines, considering what they are and their quality. I find myself wanting something like this all the time and going back and trying it out, but until the main games I play are either native or Apple does some magic, I'm finding it easier to use a Windows device for games and my iPad for other things.
I mean, Macs have never been the center of the gaming world, so nothing has really changed here other than the new Macs are certainly capable…now it's up to game developers' business models whether it's worth it to do it on the Mac.

As for the rise of social media personalities, I hate it, but they’re working just as much as TV personalities are “working”. So I’m not going to gripe that what constitutes “work” has changed. I’m a SysAdmin…I don’t begrudge the fact that I’m not doing “real work” in a coal mine. Times change.
 
8K is pure marketing. I've never really seen an 8K display, but all that I've read and seen about them is that you need a super large screen to really tell any difference between 4K and 8K. I have a 50" screen and can hardly notice a difference between 1080 and 4K. It is noticeable, but I'm never upset with 1080. I imagine you need at least a 100" screen to even begin to notice 8K. Not only that, but it also requires relatively decent internet speeds. Not only do I think it will be a long time for 8K to become a standard, I doubt whether it ever will at all.

You don't need an 8K display to benefit from 8K. I play 8K YouTube on a 4K display to compensate for the loss of detail from video compression. Also, it's nice to be able to play local 8K media without transcoding down to 4K. And 4K is commonplace, with 8K just around the corner, if not already present on phones for 8K recording. So it makes sense that whatever you're buying now is 8K ready.
 
Apple is designing machines for field use: portable for work at a location, then brought home for more work. The reality is that the people who need the ultimate power are largely desktop-based and not pushing laptops to the limits outside of their typical setup. Work from home for the foreseeable future helps Apple a lot here, but the companies paying for them are not buying into the M1/M1x systems yet. Might be very different 6-12 months from now.
Even at home a laptop is great. Huge flexibility!
 
This really does seem like a whole new mindset taking hold at Apple, as if they have finally remembered that these are tools for doing work, not status symbols for influencers and coffee shop posers.
Well said! I'm very close to upgrading my late 2015 15" rMBP to a new 16" M1 Max because I finally feel like Apple listened to the pro users (video, animators, photo, graphics, producers, etc.) on what they needed vs what Apple felt they needed. I don't remember the last time I was this excited about computers from Apple. They're rocking it with their own silicon.
 
That's so good. I remember copying 4K video from my Panasonic GH4 in December 2014, and barely being able to view it on a mid-2012 MacBook Pro quad-core i7 running at 2.6 GHz with 16 GB of RAM and an SSD.

I'm glad that the people in charge can finally see that we need more useful power, not just numbers on a sheet.
 
Still wondering about the scenario where the M1 Max is definitely going to be obsolete in 2026 and won't be able to run the latest macOS due to specs/limitations in 2028. Looking forward to the M7 Pro Max SOC Nano. Battery included. Runtime: 1 month.
 
Color grading occurs in a linear color space, so there is lots of math going on, especially at 8K30p!

In real time? You need to convert to get there, and then again to get back to the original (for viewing the changes).

That would be a very useful benchmark!
 
Color grading occurs in a linear color space, so there is lots of math going on, especially at 8K30p!

In real time? You need to convert to get there, and then again to get back to the original (for viewing the changes).

That would be a very useful benchmark!

Something like that. Apple's wording will mislead many into thinking they will be able to grade 8K in real time; that's why I said "yeah right" in the first post. You can grade 8K almost in real time, but to do that you need dual Xeons or a Threadripper with two 3080 desktop GPUs. Then Scratch can run 8K offline files for color grading in almost real time; as you keep adding layers and masks, the frame rate drops like hell.
 
People tend to pick the right tool for the job. If Apple wants to get into AAA gaming, it will do so in its own good time, as there's clearly a market.

Q-6
I'm honestly surprised more people aren't talking about this... Apple is doing it (getting into higher-end gaming) right now. It's happening right before us with their recent moves, including what we've seen with these new MBPs, and a lot of people haven't seemed to realize it yet.

One of the biggest reasons games have not been ported to the Mac over the years has been market share. The market share of the Mac and the demographics of its users don't make it worthwhile for a lot of game companies to port a lot of games.

Enter M1: it can run applications meant for iPhone and iPad... there's a level of compatibility between the platforms, because they now share the same CPU architecture, that does not even come close to existing between popular traditional computers and smartphones/tablets elsewhere... devices which make up a very, very large market. All three combined comprise a user base of over a billion people worldwide. That's a fat market, one in which users have consistently shown a strong willingness to pay for games and applications.

Suddenly, porting to the Mac isn't about going after the dollars of a couple of tens of millions of Mac users, many of whom are business/creativity focused; rather, porting to Apple Silicon is about going after the dollars of hundreds of millions (well over a billion, last I looked it up) of users, many of whom are gamers.

The next couple of years will bear this out... while not every game will make it to the Mac (not every game makes it to every gaming platform), many game companies that would have avoided a Mac port in the past will absolutely do an Apple Silicon port, because otherwise they'll be leaving a lot of money on the table, and the shareholders/board of directors will be none too happy about that.
 
I mean, Macs have never been the center of the gaming world, so nothing has really changed here other than the new Macs are certainly capable…now it's up to game developers' business models whether it's worth it to do it on the Mac.

As for the rise of social media personalities, I hate it, but they’re working just as much as TV personalities are “working”. So I’m not going to gripe that what constitutes “work” has changed. I’m a SysAdmin…I don’t begrudge the fact that I’m not doing “real work” in a coal mine. Times change.
That is a nonsensical comparison. Nobody is saying anything about coal mines or real "work". What I'm calling into question is the so-called association of pro creativity with buying a $4,000 laptop. I play piano and am a trained musician, and I have a hard time reconciling even calling that "creative" per se. This is the only parallel here. And attributing "professional" to someone usually means they are skilled. That is what I'm calling into question: are they skilled, or do they just have mom and dad's credit card and time? In other words, the words and how they are used are what I was being snarky at in my original post.
 
Even more sad that the 2024 14" M3 Max will destroy the M2 Max. Why do chips improve every year? :eek:
Evolution. But what Apple is probably trying to do here is make buying a new MacBook predictable for people. And it will probably be every 2 years, with programs like the iPhone Upgrade Program. Upgrading them massively every year makes no sense for Apple, as these are not phones; they cost way more. I expect internal upgrades every 2 years and generational changes every 4 years. But let's see. :D
 