I didn't say it was right because they did it, I said it was right because it improved the user experience, which it does.
It really doesn't. I've tried to use plenty of apps on my old Dell Venue 7 and just gave up trying to tap buttons that are way too small for touch. I was trying to use FileZilla and BulletProof FTP, and their interfaces were clearly designed with a mouse in mind.
No, it's positioned in a way that forces you to look away from your screen to use it, and it's often obscured by your hands while typing. A vertical placement would be a hell of an improvement, but while they're at it, why not skip the secondary display entirely and just let me touch the display I'm already looking at?
Ergonomically speaking, it's fine.
I didn't say I love the Touch Bar, but for long periods of usage, it's fine. For certain interactions, like scrubbing a QuickTime video, it's great. For tapping F1, F2, F3, it's certainly not better than dedicated keys; I'd prefer physical keys for that.
I find it odd that you're complaining about a usability issue with the Touch Bar while ignoring the usability issues of a touchscreen Mac, which in many ways are more critical to the user experience than the Touch Bar.
I strongly disagree. There's plenty of need for touch now that the Mac is going to run touch-based apps. If they wanted people to redesign their apps, then maybe don't start by running touch apps directly on a non-touch display. It's going to be a mediocre-at-best experience, and I predict it's something we're going to see brought up in every review. Everything they're doing signals that touch is coming, regardless of what they say.
Agree to disagree. A touchscreen Mac will undoubtedly have compromises in the design. A trackpad can certainly handle most iPad apps on the Mac.
Don't know what you mean. Apple wants people to use SwiftUI, which makes it quite easy to bring apps to the Mac. If Apple were really making a touchscreen Mac, they would have ported UIKit to the Mac wholesale instead of creating Project Catalyst and SwiftUI; I don't see why they would otherwise invest so heavily in figuring out controls that work on both Mac and iPad.
It wouldn't compromise anything, so there's no need to worry about that. Adding touch is a known quantity; they don't need to reinvent the wheel here. And if it really is that much of a challenge, then maybe they're not as skilled as they should be.
No way the lid remains the same thickness. A touch layer would at the very least add thickness and weight, and compromise battery life (which, sure, is already more than enough, but it's still a compromise). Not to mention the added cost to the product.
And yet iOS apps are designed for touch interaction first and foremost, so incorporating that into the macOS experience makes the most sense. Why not give all their devices the same input options, so users don't have to adjust their habits to fit a gimped experience on one device versus another? Being stubborn about keeping things separate isn't good for the user experience. Again, this is something we've seen other laptop makers do, and it brings plenty of benefits with basically no drawbacks. And if you really hate it, just don't use it. Simple.
Giving users many input options adds complexity for both users and developers. Have you seen Android? Some Android phones have foldable screens, yet how many apps take advantage of that? Very few. I remember having a Droid with a physical keyboard, and most apps were awkwardly stretched out when typing with it (because sliding out the keyboard forced you to use the device in landscape mode with a very short screen).
Then you have to worry about whether users should even get the option to add a touchscreen to their Mac. Or should Apple put touchscreens in all Macs and force everyone to pay $100 extra for something they don't need?
If all Macs had touchscreens, developers would be forced to test their apps against both keyboard/mouse and touchscreen input. If only *some* Macs had touchscreens, then users would need to figure out whether the app they're about to buy supports touch input.
I've used my iPad Pro with the Magic Keyboard. I tried using the trackpad pointer to navigate the ESPN app, but the interface wouldn't let me click on anything, so I had to switch back to touch input. If Apple can't even get the Magic Keyboard to work well with every app, what makes you think things will work great when Apple introduces a touchscreen Mac?
TL;DR version:
You're skipping over very important details that users, developers, and Apple would need to worry about when introducing touch to the Mac. How *exactly* does it get implemented across the entire user experience?

- Buying a Mac: is touch an option, or is it forced across all Macs? What about the Mac mini and Mac Pro? How does that work?
- Developer support: are developers forced to support touchscreens? Do they ship iPhone and iPad flavors alongside a keyboard-driven Mac flavor, or optionally support Mac touchscreens?
- Buying an app: is it an iPhone app? An iPad app? Or is it just the difference between a Mac app with keyboard support, a Mac app with touch-only support, and a Mac app with keyboard, mouse, and touch support?
- Using the app: do you flip the MacBook screen back to use an iPad app? If so, how do you switch between Mac and iPad windows? Or is it natively a touch-enabled Mac app, in which case the user swaps between keyboard and touchscreen? Or do you force developers to always support both keyboard and touch?
It's not as simple as adding a touch layer to the Mac.