Put an M1 in the Apple TV, you cowards.
IMHO Apple needs to push the "video and audio quality" angle harder. (Of course that requires them to stop making mistakes in this space, like the ongoing Atmos issues...)
Of course some content comes out as 4K HDR 120Hz Atmos, and that's great. But there is a ton of older content of lesser quality in various ways, all the way down to 70s NTSC material.
TVs compete on their "AI" chips to improve the visual quality of this stuff, and Apple should do the same thing and do it better.
To be more precise, Apple already does some of this:
- Their Display Pipe tries to add contrast and dynamic range to SDR content and does it well, better than my LG CX
- Their Display Pipe tries to do AI-based upscaling and does it "adequately": better than a dumb scaler, not as well as the LG.
- Their Display Pipe *appears* to work purely one frame at a time, with no frame-to-frame data propagation. This limits how much they can improve frame n+1 based on features of frame n, and it means they are terrible at temporal interpolation/motion smoothing compared to my LG.
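To see why that matters, here is a toy sketch (mine, not Apple's actual pipeline): synthesizing a frame at time t between frame n and frame n+1 requires holding both frames, which a strictly one-frame-at-a-time design can't do at all. Real motion smoothing estimates per-block motion vectors; this naive per-pixel blend in Swift, with frames as flat luminance arrays, is just the simplest possible two-frame operation.

    // Blend frame n and frame n+1 at fractional time t (0...1).
    // A single-frame pipeline can't do even this much.
    func interpolateFrame(between frameN: [Float], and frameN1: [Float], at t: Float) -> [Float] {
        precondition(frameN.count == frameN1.count, "frames must match in size")
        return zip(frameN, frameN1).map { (1 - t) * $0.0 + t * $0.1 }
    }

    // 24Hz -> 120Hz means 4 synthetic frames between each real pair:
    // (1...4).map { interpolateFrame(between: a, and: b, at: Float($0) / 5) }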
Basically what the aTV HW should be able to do is take any video stream (even something like a low-quality 480i DVD stream) and do a good job of
- adding HDR (contrast, color)
- upscaling intelligently all the way to 4K
- removing interlacing artifacts
- resampling temporally to 120Hz
- (and applying the same sweetening and improvements to the audio)
before sending that stream to the TV.
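For flavor, here is roughly what one stage of that pipeline looks like if you build it yourself on Apple's public stack. This is my sketch, not anything Apple ships in the aTV: CILanczosScaleTransform is a real Core Image filter, but it's a classical resampler, i.e. exactly the "dumb scaler" baseline that an ML upscaler should beat.

    import CoreImage

    // Upscale a frame to UHD width with stock Lanczos resampling.
    func upscaleToUHD(_ input: CIImage) -> CIImage? {
        let scale = 3840.0 / input.extent.width   // e.g. a 720x480 DVD frame needs ~5.3x
        guard let filter = CIFilter(name: "CILanczosScaleTransform") else { return nil }
        filter.setValue(input, forKey: kCIInputImageKey)
        filter.setValue(scale, forKey: kCIInputScaleKey)
        filter.setValue(1.0, forKey: kCIInputAspectRatioKey)
        return filter.outputImage
    }

The other stages (deinterlacing, HDR inference, temporal resampling) are where the hard, model-driven work lives, and that's precisely where Display Pipe silicon could add value over this kind of baseline.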
Yes, yes, yes, we all know that you're a movie snob who only watches The Criterion Collection on your TV in "Cinema Mode" and is happy to rant for hours about the violation of human rights that occurs when a TV "modifies the filmmaker's intent" by performing any of these sorts of manipulations. Whatever. The rest of us just want to watch our random content, whether it's 80s sitcoms, old DVDs, or PAL documentaries, at what looks to us like the best possible quality.
This is a space where Apple (in theory) can compete and Roku et al. cannot. The intelligence is built into the Apple chips (the Display Pipe has been doing these sorts of things for the past 7 years or so) but
(a) Apple doesn't push this angle in product design. The aTV seems to get this stuff essentially because it's there, not because anyone on the team is looking at the HW and saying "OK, this is a great chip with this functionality; now how can we push it even harder than the phone and laptop guys are pushing it, for the purposes of improving low-quality video/audio?"
(b) Apple doesn't push this angle in marketing. One of the pillars of the first aWatch ads was "this is just a better watch! It's atomic-clock accurate, it gives easy functionality for things like alarms, it automatically handles time zones and daylight saving time." aTV ads should do the same thing: "This is just a better TV device. It makes all your content, even the old stuff, look better."
(c) One whole track of Apple SoC display quality has to do with automatically tracking the brightness and white point of the environment and modifying the display accordingly. (The brand name for this is True Tone; a variant is the Night Shift stuff, if you are one of those people who want to remove blue from what you see when reading late at night.) aTV makes ZERO use of this.
One could imagine an aTV with a built-in sensor for brightness and white point (and perhaps a small separate wireless sensor for people who like to put their aTV in a drawer or behind the TV). Then your TV would automatically shift its brightness and color to match sunlight in the room vs. night with the lights on vs. night with the lights off.
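To make the core operation concrete, here is a hand-wavy sketch on public API, again mine rather than whatever Apple's silicon actually does. CITemperatureAndTint is a real Core Image filter; the kelvin reading from the room sensor is my invention for illustration.

    import CoreImage

    // Shift a frame's white point toward the room's ambient light.
    // `ambientKelvin` would come from the hypothetical sensor described above;
    // 6500K (D65) is the assumed mastering white point of the source.
    func adaptWhitePoint(of frame: CIImage, toAmbientKelvin ambientKelvin: CGFloat) -> CIImage? {
        guard let filter = CIFilter(name: "CITemperatureAndTint") else { return nil }
        filter.setValue(frame, forKey: kCIInputImageKey)
        filter.setValue(CIVector(x: 6500, y: 0), forKey: "inputNeutral")
        filter.setValue(CIVector(x: ambientKelvin, y: 0), forKey: "inputTargetNeutral")
        return filter.outputImage
    }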
It's fine to play up the aTV+ angle, but
- many people just aren't interested in the sensibility of aTV+. Very little of it interests me, for example
- aTV+ is not a great scheme for selling aTVs since it is available elsewhere!
aTV is both HW and a UI experience. Apple sells it purely as the UI experience.
They ought to be selling it more as the HW (which is a LOT harder to replicate...)