In regards to Face ID, Apple said customers love the experience of using Touch ID on the Mac
False. Touch ID sucks on the Mac. Touch ID sucks period when it isn't part of a natural flow. It was fine on iPhone inside the Home Button, because it was part of the flow. It sucks on iPad, and it sucks on the Mac. Face ID would be infinitely better on the Mac. They can't do it, however. Not without a miracle miniaturization of components. But they won't admit that.
 
False. Touch ID sucks on the Mac. Touch ID sucks period when it isn't part of a natural flow. It was fine on iPhone inside the Home Button, because it was part of the flow. It sucks on iPad, and it sucks on the Mac. Face ID would be infinitely better on the Mac. They can't do it, however. Not without a miracle miniaturization of components. But they won't admit that.

The FaceID system looks pretty thin. If it’s currently just too thick then it doesn’t seem too far from being viable.

 
Should HDR kick in when viewing HDR YouTube videos, or does it have to be certain other HDR content?

Also, if HDR does turn on for HDR YouTube, couldn't we just open an HDR YouTube video in the corner of the screen, so that the screen stays brighter for everything else?


Great question! I like the way you think outside the box.
 
This is probably an unpopular opinion, but Macs are not built to be gaming machines. They're for productivity. If you want to game buy a gaming laptop, PS5, XSX or build a PC.
Basically, because I don't want to have to buy two machines, when one should be able to do all of it. My 2015 iMac in Boot Camp used to be able to play almost all games well. Unfortunately, I'll have to buy a separate gaming PC soon, which I absolutely don't want to do (but absolutely have to).
 
The FaceID system looks pretty thin. If it’s currently just too thick then it doesn’t seem too far from being viable.

It's housed in an iPhone, not a MacBook Pro lid.

It'll be some time before they can add it, because shrinking those components hasn't made much progress.

I'd prefer Touch ID on a phone and FaceID on a MacBook. I honestly think Apple is @$$-backwards on these security/biometric decisions, but we have to take what they give us.
 
I might not keep the 16" then if the brightness is the same. 1000 sustained seemed to be misleading.
If this report is true, then yes, 100%, what Apple has stated is completely misleading, BUT NOT A TECHNICAL LIE.

If they had said 1,000 nits sustained *while viewing HDR content* and 500 nits typical, then I would trust them more.

But now? Sorry, Apple you're a liar in my eyes and I'll need to do complete due diligence before spending any money on your products.

Your deceptive marketing jargon is super gray area and I don't like it one bit.

Give the people the unfiltered information and let us make a purchase decision.
 
False. Touch ID sucks on the Mac. Touch ID sucks period when it isn't part of a natural flow. It was fine on iPhone inside the Home Button, because it was part of the flow. It sucks on iPad, and it sucks on the Mac. Face ID would be infinitely better on the Mac. They can't do it, however. Not without a miracle miniaturization of components. But they won't admit that.

Huh? TouchID on my Mac has been great! My parents are techy and can use it easily. I am a tech person and I enjoy using it.

My hands are on the keyboard already, like an inch away. Obviously if I am already looking at the screen then FaceID is absolutely "easier", but that hardly makes moving my finger an inch "suck".
 
Huh? TouchID on my Mac has been great! My parents are techy and can use it easily. I am a tech person and I enjoy using it.

My hands are on the keyboard already, like an inch away. Obviously if I am already looking at the screen then FaceID is absolutely "easier", but that hardly makes moving my finger an inch "suck".
Apparently you don't understand flow.

You don't move your finger an inch... you take your whole hand and fingers off the keyboard, where they are positioned in the "flow" of using a computer (whether you're an average or an advanced user).

And you have to take your eyes off the screen to find the TouchID key/sensor.

Unless you registered your pinky finger and have good reach for the TouchID key, you're probably doing what I described.

Which disrupts the "flow" you might not understand.
 
If this report is true, then yes, 100%, what Apple has stated is completely misleading, BUT NOT A TECHNICAL LIE.

If they had said 1,000 nits sustained *while viewing HDR content* and 500 nits typical, then I would trust them more.

But now? Sorry, Apple you're a liar in my eyes and I'll need to do complete due diligence before spending any money on your products.

Your deceptive marketing jargon is super gray area and I don't like it one bit.

Give the people the unfiltered information and let us make a purchase decision.
Yea, they did this intentionally. Maybe I’ll see how it is with the mini-led but if it doesn’t wow me, I guess I’ll wait one more year.
 
Just imagine this,

You're buying something online on your MacBook. You type in all your personal details, hit the Tab key to enter credit card information - BOOM, FaceID autofill, and your hands didn't leave the keyboard. All you do is enter the CVV or PIN # and you're done.

Your eyes never left the screen and your hands never left the keyboard.

FLOW
 
It’s not the slightly slower speeds that are messed up, it’s that the only available material at the time orders opened claimed the two sizes had “exactly the same” chips and cooling solutions. It’s a bit of a bait and switch, and since you can’t swap your order around without incurring potentially months of delays, that does suck for anyone who went with the smaller size explicitly for the “same” performance. It won’t matter in the real world for 99% of users, but it still does suck and is kinda messed up. Also, it’s not just that the 14” throttles a little sooner or more, as we might have expected, there’s an entire “high power mode” that wasn’t disclosed until orders had been live for three days. It’s not going to be 2% different in high power mode, that’s within margin of error. More likely it’s 10% faster.
I was flip-flopping between a 16" Max with the 24-core GPU and a 14" Max with the 32-core GPU, both configs with 32GB and the exact same price. I then settled on the 14". If it turns out that the 24-core 16" has better performance because of thermals, I am so returning this one, as it really feels like I and many others were misled by not having this tiny detail at order time!
 
Yea, they did this intentionally. Maybe I’ll see how it is with the mini-led but if it doesn’t wow me, I guess I’ll wait one more year.
Weird thing to me is, if iPhone Pros can reach a normal 800 nits in daily outdoor use, why not try and make the MacBook Pros do the same?

People have mentioned washed-out colors, but that doesn't happen on an iPhone.

Different screen technology? Maybe.

OLED has its problems with burn-in, but WE WANT BRIGHTNESS (and maybe set a quick screen saver? LOL)
 
Yeah, really messed up this wasn’t disclosed before orders went live. It’s not like we can wait for reviews to decide what to buy, if we do that it’ll be mid 2022 before we get the machines! I wanted to go for the 14” but grabbed a 16 because of battery life and the back of my mind being concerned about potential throttling. This is worse than throttling though, and totally sucks for anyone who ordered a 14” Max.

Of course you should wait! Why in the world would anyone pay $$$$$ for a just released laptop knowing little about it. There will be loads of reviews with loads of additional information in just a couple of weeks.

Simply cancel your order. Stat!
 
In regards to Face ID, Apple said customers love the experience of using Touch ID on the iPhone for everything from unlocking their iPhone to filling in passwords online, changing accounts, and making secure purchases with Apple Pay, but the company unsurprisingly changed to FaceID to sell more iPhones.
They’ll probably include it next time, new paradigm lol
 
Of course you should wait! Why in the world would anyone pay $$$$$ for a just released laptop knowing little about it. There will be loads of reviews with loads of additional information in just a couple of weeks.

Simply cancel your order. Stat!
NO, don't wait! Tomorrow isn't guaranteed; spend as much money as you can now on things you don't know much about.

It's a new shiny thing. BUY IT MEOW!
 
I was flip-flopping between a 16" Max with the 24-core GPU and a 14" Max with the 32-core GPU, both configs with 32GB and the exact same price. I then settled on the 14". If it turns out that the 24-core 16" has better performance because of thermals, I am so returning this one, as it really feels like I and many others were misled by not having this tiny detail at order time!
I'd bet you'll be fine, at least GPU-wise. There's no way high power mode is 20+% different performance-wise. Honestly, I think you did it right. You might not have the ultimate peak performance, but if you were considering the 24-core version, you'll still beat that for the same money with more portability.
 
This is a little off topic, but I preordered my 16" through Best Buy and will be able to pick it up on 10/26. They also offer 36 months same as cash! I thought this was great. Just wanted to share.
 
If I understand correctly, the unified memory in Apple Silicon is shared and used both as "regular" RAM and also as "VRAM." Is this correct? And if so, would people want to actually consider upgrading to higher memory levels in Apple Silicon computers compared to what they're used to?

Because, for example, if you nearly reached the limits of 16GB RAM and had a 4 GB graphics card before on an Intel machine, and now that memory is being shared for other additional uses, would that mean you'd want more like a minimum of 20+ GB RAM (obviously 32 GB is the next step up, though) to get a similar ceiling, depending on what you're working with?

If anyone could help me understand/explain why this is or isn't true, that would be very helpful.
Fair question, curious to know about this too.
Couple of things to think about:
  1. When using a classic GPU, the data in VRAM often exists duplicated in system RAM. This is because if the program wants to make changes to a mesh, texture, material, etc. at runtime, the CPU needs to access it, apply the changes, and send a copy to VRAM (with all the sync pain points, pipeline stalls, etc. that this triggers). Not the case with M1s, where the data can safely exist only once.
  2. Since the beginning, mobile games have been using clever real-time compression tricks for data, namely textures, which can take a huge memory footprint. Lately, with the latest "ASTC" compression advancements, some content can be compressed massively with almost imperceptible visual loss... where a 4096x4096 texture would take 16MB, ASTC compression could bring it down to half of that, or 20%, or even 15% if it has no alpha channel and some minor artifacts are deemed acceptable (it's very flexible on the quality to choose from). The cool part is, whereas a JPG needs to be decompressed, paying the full memory price once on the GPU, an ASTC texture is kept as is, loaded onto the GPU with no changes (or in the case of M1, just loaded), and used as is. Hope we get access to play with that like it's already doable with Unity/Unreal.
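To make that texture math concrete, here's a quick back-of-the-envelope sketch. My own assumptions (not from the post above): the uncompressed baseline is 8-bit RGBA (4 bytes per texel), and ASTC stores a fixed 16-byte block per tile regardless of tile size, so larger tiles mean fewer bits per texel.

```python
# Rough texture-memory math: uncompressed RGBA8 vs ASTC block compression.

def rgba8_bytes(width, height):
    """Uncompressed 8-bit RGBA: 4 bytes per texel."""
    return width * height * 4

def astc_bytes(width, height, block_w, block_h):
    """ASTC stores one 16-byte block per block_w x block_h texel tile."""
    blocks_x = -(-width // block_w)   # ceiling division
    blocks_y = -(-height // block_h)
    return blocks_x * blocks_y * 16

mib = 1024 * 1024
w = h = 4096
print(f"RGBA8:    {rgba8_bytes(w, h) / mib:.0f} MiB")       # 64 MiB
print(f"ASTC 4x4: {astc_bytes(w, h, 4, 4) / mib:.0f} MiB")  # 16 MiB (8 bits/texel)
print(f"ASTC 8x8: {astc_bytes(w, h, 8, 8) / mib:.0f} MiB")  # 4 MiB (2 bits/texel)
```

So on a unified-memory machine, a texture that would cost 64 MiB uncompressed can sit in shared memory at 16 MiB or less, with no second CPU-side copy; which is roughly the point being made above.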

Weight of the 16" is why I'm contemplating a 14" instead of a 16". I find the weight of both machines to be a bit disappointing.
I have a 15" MBP from 2014; in the end I didn't use it much in portable mode because of how clunky it feels, even though it was one of the most ultraportable machines for that size and category. If I were to buy one, I would contemplate going 14" too, "high performance switch" or whatever be damned... I even turn off the Turbo Boost thing on my iMac.
 
The 500-nit cap is complete garbage; I was so excited for a 1,000-nit sustained screen for working outside. Sure it will drain the battery, but if the screen can do it, why limit it to the same last-gen brightness? Hell, throw us a bone and give us even 600-800 nits, something that's at least brighter than last gen.
 
Basically, because I don't want to have to buy two machines, when one should be able to do all of it. My 2015 iMac in Boot Camp used to be able to play almost all games well. Unfortunately, I'll have to buy a separate gaming PC soon, which I absolutely don't want to do (but absolutely have to).
The issue here isn't, or rather won't be once these are out, raw GPU power. It's that it's a completely different API layer for a relatively small number of machines. They dropped OpenGL, don't do DirectX, and have a fraction of the install base of AMD or Nvidia. Metal is great, but it's a completely different API to build an engine for.

I gave up and got a shadow.tech account for gaming. Would love to drop it, but I'm not expecting to be able to anytime soon.
 
If I understand correctly, the unified memory in Apple Silicon is shared and used both as "regular" RAM and also as "VRAM." Is this correct? And if so, would people want to actually consider upgrading to higher memory levels in Apple Silicon computers compared to what they're used to?

Because, for example, if you nearly reached the limits of 16GB RAM and had a 4 GB graphics card before on an Intel machine, and now that memory is being shared for other additional uses, would that mean you'd want more like a minimum of 20+ GB RAM (obviously 32 GB is the next step up, though) to get a similar ceiling, depending on what you're working with?

If anyone could help me understand/explain why this is or isn't true, that would be very helpful.
This is exactly why I jumped from 32 to 64.
 
I wonder if eGPUs are even coming back. I actually used one for a couple of years, but the reality was decidedly less awesome than the theory. I can't think of anything that gave me more stability headaches on a Mac, basically ever. It wasn't unusable, but it wasn't what I'm used to.

Seems like SoC is where things are heading, particularly in Apple land.

I think the M-series Mac Pros are going to tell us a lot about this. Are the video systems going to be upgradable? Are we going to see drop-in SoCs and motherboards with sockets? Is it just going to be PCIe?
 