Doesn't it say a lot when Apple doesn't trust Touch ID to be the only option? Instead, a password is always the last resort.


It's actually not that at all. Apple's approach to Touch ID is quite sophisticated, and the fingerprint information is stored in encrypted form in Apple's Secure Enclave:

https://support.apple.com/en-ca/105095#:~:text=Secure%20Enclave&text=It%20isn't%20possible%20for,only%20to%20the%20Secure%20Enclave.


and



Your fingerprint data is encrypted, stored on disk, and protected with a key available only to the Secure Enclave. Your fingerprint data is used only by the Secure Enclave to verify that your fingerprint matches the enrolled fingerprint data. It can’t be accessed by the OS on your device or by any applications running on it. It's never stored on Apple servers, it's never backed up to iCloud or anywhere else, and it can't be used to match against other fingerprint databases.

If you search for the related security exploits for phone and computer biometrics I think you'll find that some of Apple's competitors who have simpler systems have been breached, while it seems that Apple has not (yet?).
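A rough way to picture the architecture quoted above is a sealed component that holds the template privately and only ever answers yes or no. This is purely illustrative Python, not how the real Secure Enclave works internally:

```python
# Toy model of the design described above: the fingerprint template
# lives only inside the "enclave" object, and the host OS can only
# ask a yes/no question. It never reads the template itself.
# Purely illustrative; not Apple's actual implementation.

class ToyEnclave:
    def __init__(self, enrolled_template: bytes):
        # The template is held privately; nothing outside should read it.
        self.__template = enrolled_template

    def matches(self, candidate: bytes) -> bool:
        # The only output is a boolean verdict, never the template.
        return candidate == self.__template


enclave = ToyEnclave(b"ridge-pattern-123")
print(enclave.matches(b"ridge-pattern-123"))  # True -> unlock
print(enclave.matches(b"someone-else"))       # False -> fall back to password
```

The point of the design is in that last line: when the match fails (or the sensor can't read your finger), the system falls back to the password rather than exposing anything about the stored template.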
 
Indeed, and it really feels like Apple's approach to biometric authentication has been more as an added convenience to the user, rather than to replace password-based authentication entirely. Sort of a "nice to have" thing that makes entering passwords and waking the computer from sleep easier, but not something that gives unlimited access to your device.

One can always log in with their passcode/password on Apple devices (and with macOS, you absolutely need to use your password on initial login and in many cases after that for some software installations, updates and so forth). The almighty password is still very much alive and well.
 
Doesn't it say a lot when Apple doesn't trust Touch ID to be the only option? Instead, a password is always the last resort.
Requiring the password prevents a hardware attack, like taking a programmed fingerprint module out of one laptop and putting it into another. But I don’t know for certain.
 
Requiring the password prevents a hardware attack, like taking a programmed fingerprint module out of one laptop and putting it into another. But I don’t know for certain.

Correct, because the password is required to decrypt the information protected by the Secure Enclave.
 
I understand his perspective: If the fingerprint method is perfect, why should you ever require a password? I think it’s ok to ask that.
But who said the fingerprint method is perfect? As I posted before, a finger injury will cause it to stop working until the injury heals. That's an inherent problem with all biometric authentication methods: the biometric indicator can change, so there needs to be a backup method to get in.
 
I understand his perspective: If the fingerprint method is perfect, why should you ever require a password? I think it’s ok to ask that.
I don't, but I have been in the IT biz for decades, so maybe that is skewing my perspective. Now I wish the OP would take a gander at our perspective and understand how layered security helps.
 
Or a rash, or ...
Chapped fingers in the wintertime... this happens to me pretty much every year, and renders Touch ID mostly useless on all of my Apple devices that support it. I can't tell you how many times I've had to reregister fingerprints on all devices during the colder season.

As for facial recognition, that can also be problematic under certain lighting conditions and at certain angles. Hence the need for multiple layers and methods of access...
 
You guys have been fantastic. My understanding has improved dramatically. Here's the part that still doesn't make sense to me: if macOS uses the native resolution of the monitor, why would it be blurry compared to using a 4K monitor? Why would Windows look better on the 29" monitor? What is it about macOS that renders text worse than Windows at lower resolutions?

Thank you!
The best answer is that a 30” 1080p display will be blurry simply because it is low resolution for such a large screen. But a 30” 4K screen at 1080p resolution will have very sharp text. The Mac still sends a 4K video signal to the screen even though you tell the Mac to display in 1080p. This gives you very smooth, larger fonts.

I use an LG 4K display with my Mac and it’s great. I use all different resolutions to suit what I’m doing. The full 4K setting gives me the largest desktop; I can arrange more windows and so on. But menus and other objects can be small for me, so I usually use a lower setting, in between 1080p and 4K.

If I used a larger 4K monitor, then I wouldn’t have the problems with small objects looking too small, but now you’re moving your head all around looking for stuff. I don’t think Windows is any better in this respect.

BTW, if you buy a 4K display for your Mac, use a DisplayPort connection, not HDMI through a dongle. It will cost less and work better, because you won’t get 4K60 10-bit through an HDMI dongle; you’ll get 8-bit at best and maybe only 4K30. It’s a limitation of the dongles, not of HDMI, and the dongle vendors don’t point this out.
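The 4K60 limitation comes down to raw bandwidth. Here's a back-of-envelope calculation (ignoring blanking intervals and link encoding overhead, so real requirements are somewhat higher; the HDMI throughput figures are the commonly cited nominal rates):

```python
# Back-of-envelope bandwidth for 4K video, to see why an older
# HDMI 1.4-class dongle tops out at 4K30 or 8-bit color.
# Figures ignore blanking intervals and link-encoding overhead,
# so real-world requirements are somewhat higher than these.

def video_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

uhd_60_10bit = video_gbps(3840, 2160, 60, 30)  # 10 bits per RGB channel
uhd_60_8bit  = video_gbps(3840, 2160, 60, 24)  # 8 bits per RGB channel
uhd_30_8bit  = video_gbps(3840, 2160, 30, 24)

print(f"4K60 10-bit: {uhd_60_10bit:.1f} Gbit/s")  # ~14.9
print(f"4K60  8-bit: {uhd_60_8bit:.1f} Gbit/s")   # ~11.9
print(f"4K30  8-bit: {uhd_30_8bit:.1f} Gbit/s")   # ~6.0

# HDMI 1.4 carries roughly 10.2 Gbit/s; HDMI 2.0 roughly 18 Gbit/s.
# Only 4K30 fits comfortably in an HDMI 1.4-class link, which is
# why older dongles drop either the refresh rate or the color depth.
```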

OTOH, when I edit digital photos (I’m a photographer), then I use the full 4K mode and it’s great. I use my image editor a lot so I know where all my tools are and I can see all the details in my photos when I’m looking for blemishes or stray hairs that I want to clone out and so on.

I used a Windows 11 laptop for work and absolutely hated it. I'm much happier using a Mac. There are little dislikes I have, but I live with them. For instance, Windows has Maximize and Restore buttons on each window, but on a Mac there is no true Maximize. If you click the green button, it creates a full-screen window on a new virtual desktop. Instead, if you double-click a window's title bar, it makes the window large but not full screen.

Another annoyance is that when you minimize windows, you can't Cmd-Tab to switch to them. You have to use the mouse to click the minimized window's icon in the Dock to restore it. I installed a third-party utility to get around that.

There are so many advantages to Mac compared to Windows that these minor annoyances are not that big a deal to me.

My best advice is this: If you have a Windows app you need to use, try to find a Mac-native alternative instead of emulating Windows. Just start a new thread here describing what your app is and what it does, and if there’s a Mac alternative, you’ll find out.
 
Okay. First of all, you can't use two external monitors with a MacBook Air (M1/M2), even if you close the lid; you need a MacBook Pro for that. I think the newest M3 MacBook Air does allow two external monitors if you close the lid.

Now, there are some ways around that with certain docks and the like, but I always just use one 4k monitor with the MBA's screen as the second, so I haven't had to look into this.

As far as how the Mac handles scaling, it's both better than Windows and worse. First, it renders everything oversized (at 4K or higher, for instance) and then downsamples it to the 1080p or 1440p or whatever scaled look you choose. This means no tiny text when installing programs, and everything renders very sharp.

However, it sucks with less than 4k resolution--and it does take some getting used to.
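The supersampling described above can be sketched numerically, assuming the usual 2x backing scale macOS applies to scaled modes (the specific resolutions here are just illustrative):

```python
# Rough sketch of macOS scaled-mode rendering: draw at 2x the chosen
# "looks like" resolution, then downsample to the panel's native pixels.
# Illustrates the common 2x backing-scale behavior; numbers are examples.

def backing_store(looks_like):
    w, h = looks_like
    return (2 * w, 2 * h)

panel = (3840, 2160)        # 4K panel, native pixels
looks_like = (2560, 1440)   # a popular scaled mode

render = backing_store(looks_like)
print(render)               # (5120, 2880): rendered larger than the panel

# Downsampling 5120x2880 -> 3840x2160 is what keeps text crisp.
scale = render[0] / panel[0]
print(f"downsample factor: {scale:.2f}x")  # 1.33x
```

On a sub-4K panel the same trick has far fewer native pixels to downsample into, which is why the result looks soft there.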

Good luck.

To get better text rendering on screens with less than 4K resolution, I recommend https://www.fontsmoothingadjuster.com/; it sharpens up fonts nicely.

I use multiple monitors just fine (three, in fact) on my M1 MacBook Pro with the help of a DisplayLink dongle, but you need to get one of the newer 4K 60 Hz devices such as the Plugable USBC-6950U. The only drawback is not being able to play back DRM content. Other than that, performance is perfectly fine for any standard desktop use. High-end gaming maybe won't work so well, but a MacBook Air isn't a gaming device to begin with anyway.
 
To get better text rendering on screens with less than 4K resolution, I recommend https://www.fontsmoothingadjuster.com/; it sharpens up fonts nicely.

I use multiple monitors just fine (three, in fact) on my M1 MacBook Pro with the help of a DisplayLink dongle, but you need to get one of the newer 4K 60 Hz devices such as the Plugable USBC-6950U. The only drawback is not being able to play back DRM content. Other than that, performance is perfectly fine for any standard desktop use. High-end gaming maybe won't work so well, but a MacBook Air isn't a gaming device to begin with anyway.
Well, that would have been useful before I bought all 4k monitors. :D :D
 