So basically no... unless you really hate the old remote (which you can buy separately), or you need eARC support.
That's the wrong takeaway.
The real questions are:
(a) are you upgrading from an HD rather than a 4K?
(b) what sort of TV do you have (or plan to get soon)?
If you have an HD, then the new 4K just feels faster. You may think the old HD was fast enough, and sure, it mostly is. But the new one just feels faster in ways you can't really articulate -- the same way your iPhone from two years ago feels fast enough, but you try the newest one and it's just smoother, with fractionally less waiting.
I never owned the old 4K, so all I can discuss is the new 4K, in the context of an LG C1, which is an OLED Dolby Vision 120fps device. (I upgraded my 15-year-old TV along with the Apple TV!)
Obviously the Dolby Vision/HDR stuff is spectacular, but the old 4K had that.
What's not clear to me is how much "massaging" is happening within the aTV itself -- and whether this differs in the new 4K.
We all know that when you buy a new TV, the TV company makes a big deal of all their various modes and supposed AI -- technology that's supposed to upscale an HD or SD signal to 4K, that's supposed to motion interpolate out 24fps judder, that's supposed to automatically detect the type of content and optimally video process it, etc etc. The LG TV has plenty of that -- and it's exactly as haphazard, disorganized, and spread all over the menu system as you would expect.
HOWEVER (and this is the important part), when you connect your new aTV in default mode (which is what I did), it feeds the TV nothing but a 120Hz 4K Dolby Vision signal; and the TV responds by making almost all that image-massaging stuff either impossible or much harder to get to. The TV seems willing to admit that, yeah, if you're sending me a signal that's optimized along every dimension, I can't do much to "improve" it.
OK, so now suppose you're watching content that's substantially lesser than 120Hz 4K Dolby Vision, eg something that began life as a DVD. How EXACTLY does Apple upconvert that from 24fps to 120fps? From 720p to 4K? From SDR to HDR? Obviously they could do the bare minimum, like replicating each frame 5 times (24->120Hz) and applying a super-basic bilinear scaler to get from 720p to 4K. But I've got to say the images I saw looked pretty good!
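To make concrete what that "bare minimum" would look like, here's a toy sketch in Python. This is purely illustrative -- Apple has never published its actual pipeline, and both function names are mine -- it just shows naive frame replication and bilinear upscaling:

```python
# Toy sketch of the "bare minimum" upconversion described above.
# Not Apple's code -- just the naive approach for comparison.

def replicate_frames(frames, src_fps=24, dst_fps=120):
    """24 -> 120Hz by holding each source frame for dst_fps // src_fps refreshes."""
    factor = dst_fps // src_fps  # 120 / 24 = 5, a clean integer ratio
    return [f for f in frames for _ in range(factor)]

def bilinear_scale(img, new_w, new_h):
    """Very basic bilinear upscale of a 2D grid of pixel values."""
    old_h, old_w = len(img), len(img[0])
    out = []
    for y in range(new_h):
        fy = y * (old_h - 1) / (new_h - 1)
        y0 = int(fy); y1 = min(y0 + 1, old_h - 1); wy = fy - y0
        row = []
        for x in range(new_w):
            fx = x * (old_w - 1) / (new_w - 1)
            x0 = int(fx); x1 = min(x0 + 1, old_w - 1); wx = fx - x0
            # blend the four neighboring pixels by distance
            top = img[y0][x0] * (1 - wx) + img[y0][x1] * wx
            bot = img[y1][x0] * (1 - wx) + img[y1][x1] * wx
            row.append(top * (1 - wy) + bot * wy)
        out.append(row)
    return out

frames = ["frame_a", "frame_b"]  # stand-ins for decoded 24fps frames
print(len(replicate_frames(frames)))              # 2 frames -> 10 frames
print(bilinear_scale([[0, 10], [10, 20]], 3, 3))  # 2x2 grid -> smooth 3x3 grid
```

Whatever Apple is actually doing clearly goes beyond this, but that's the baseline to beat.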
And when I tried LG's solution to these problems, the results did not (to my quick eye) look any better. (You can tell the aTV to use "content matching," in which case it feeds the signal to the TV as originally encoded -- so 24fps 720p SDR content arrives as a 24fps 720p SDR signal. I tried this to see what it was like.)
As I said, I didn't see LG doing a better job than Apple of "upsampling" the signal. AND if you use content matching, there's a noticeable delay whenever you switch from one mode of content to another as the TV adjusts itself -- not nearly as slow and irritating as my old TV, which looked like it was having a seizure every time this happened, but a noticeable second or two of black screen.
So the point of this long excursion is that
- Apple seems to be doing more than just the bare minimum when it upsamples older content
- this means you don't have to rely on your TV to do this (and Apple seems to do as good a job as the best TVs)
- AND it means that the new 4K might be doing a better job given that it has a lot more oomph (more CPU, more GPU, an NPU) to throw at the task.
Upsampling quality is a hard thing to measure -- like trying to see whether this year's TV (LG C1) does a better job than last year's (LG CX) or the year before's (LG C9) -- even apart from the religious wars and snobbery that surround the whole issue...
All I can say is that I watch a variety of content, some of it the newest bestest stuff that's created from the start in Dolby Vision 4K (Apple TV+ has lots of this, though I don't think I've seen any of it at 120Hz), some of it captured off broadcast TV via Channels DVR, some of it old movies at about DVD level. And it *all* looks a lot better on the new TV than the old. I'm very happy with the upscaling (resolution, dynamic range, and fps) being done by the Apple TV, and I've seen no glitches that irritated me and ruined the illusion. (I don't game or watch sports, so I can't comment on those. But I do watch action movies, and those seem smoother, without the panning glitches I seem to be very sensitive to -- I notice them in movie theaters, not just on 3:2 pulldown TVs.)
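For anyone curious why a 120Hz panel sidesteps that panning judder, the arithmetic is simple. A toy sketch (the function is my own illustration -- real TVs alternate the 3,2,3,2 cadence rather than front-loading the extras as I do here):

```python
# Why 24fps judders on a 60Hz panel but not a 120Hz one: illustrative only.

def pulldown_pattern(src_fps, panel_hz):
    """How many panel refreshes each source frame gets held for (one second's worth)."""
    base = panel_hz // src_fps
    extra = panel_hz - base * src_fps  # leftover refreshes to distribute
    # front-loaded for simplicity; a real 3:2 cadence alternates 3,2,3,2,...
    return [base + (1 if i < extra else 0) for i in range(src_fps)]

print(set(pulldown_pattern(24, 60)))   # {2, 3}: uneven hold times -> judder on pans
print(set(pulldown_pattern(24, 120)))  # {5}: every frame held equally -> smooth
```

With 60Hz, 24 doesn't divide evenly, so some frames linger on screen 50% longer than others -- which is exactly what your eye catches during a slow pan. At 120Hz every frame gets the same 5 refreshes and the problem vanishes.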
So before you assume that the old and new Apple TV 4K deliver essentially the same signal, test out the quality of upscaling DVD-level content to a 120Hz 4K Dolby Vision screen. The difference in upscaling quality may be more than you expected...
(I assume the eARC stuff works, but I can't comment on it because I got a soundbar with the new TV, so I use that rather than HomePods.)