Well, I wrote about this (in my market report) last week, just based on the 4K trailers promoting the shows pre-launch. Absolutely the best color, resolution, and audio I've had from any source. I am using an Apple TV 4K on a 5-year-old 78" Samsung 4K set, without Atmos or HDR, and it's still amazing. Beats the Olympics or anything I've seen in 4K heretofore. My connectivity is AT&T gigabit fiber (930 Mb/s on Ethernet and 400-600 on WiFi). On the 78" Samsung with AT&T TV Now I get intermittent stutter and audio lag that I can't correct or figure out, but that has nothing to do with Apple TV. And on the other Apple TV 4K (a 65" Samsung) there is no stutter and the picture is also totally perfect, just not as immersive as the 78". This is so close to the clarity I see at the trade shows that I wonder why I would spend money to upgrade to QLED or similar.
 

There is something flawed here. Why would you compare the Olympics, which is typically live-streamed, to VOD? The assets themselves are encoded AND delivered differently.

The Samsung QLED is decent, but I always advise people to go for an LG OLED instead. The colors on the OLED are significantly better in the real world, outside of showroom and trade-show demos. You don't actually need HDR to see the color differences between those two panels. Even if you did, the fact that LG OLEDs typically support both HDR10 and Dolby Vision is a bonus too; Samsung TVs only support HDR10/HDR10+, not Dolby Vision.
This is no surprise to me. Apple TV+ offers supreme dynamic range with bottomless blacks. No visible artefacts or colour banding, even in typically problematic dark scenes. For me, watching on an OLED panel, ATV+ is vastly superior to both Netflix and Amazon. Amazon in particular seems hit and miss on black depth, and Netflix sometimes appears to have a problematic implementation of HDR.

The bandwidth costs for ATV+ are minuscule compared to Netflix/Amazon, so it's not really a good comparison. There are sacrifices made by the large streaming companies that Apple doesn't have to worry about (yet?)
 
Bitrate only tells part of the story; you would still need to know which codec is being used before claiming it is better or worse than the competition.

You would need to know the codec in order to make a meaningful technical analysis, but the majority seem to agree that the results speak for themselves. Apple TV+ is, in my opinion, far and away the highest quality streaming service available bar none. I already knew this long before I knew the bitrate or any other technical specification.
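
If you want a rough, codec-agnostic way to put a bitrate number in context, bits-per-pixel (bitrate divided by width × height × frame rate) is a common back-of-the-envelope metric. A minimal sketch using the ~35 Mb/s figure mentioned in this thread and an assumed 3840x2160 @ 24 fps source; the resolution, frame rate, and the Blu-ray comparison number are my assumptions, not published specs:

```python
def bits_per_pixel(bitrate_mbps: float, width: int, height: int, fps: float) -> float:
    """Average bits spent per pixel per frame -- a rough quality proxy."""
    return bitrate_mbps * 1e6 / (width * height * fps)

# ~35 Mb/s at 4K/24p, per the rough figure in this thread (assumed numbers).
print(f"ATV+ 4K:  {bits_per_pixel(35, 3840, 2160, 24):.2f} bpp")
# A typical 1080p Blu-ray video track around 25 Mb/s, for comparison.
print(f"BD 1080p: {bits_per_pixel(25, 1920, 1080, 24):.2f} bpp")
```

Even then, bits-per-pixel across different codecs (say HEVC vs. AVC) isn't an apples-to-apples comparison, which is exactly the point about needing to know the codec.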
 
As a gearhead, I’d be interested to know if this bitrate is achievable on WiFi on Apple TV or if Ethernet is required. Obviously Ethernet is always better, but I always wonder where the technical limits of WiFi streaming are.
I have no trouble exceeding 350 Megabit/s (around 10x the bitrate of these shows) on Speedtest.net on my AppleTV over WiFi 802.11ac. Of course, this will depend on your wifi router, interference from neighbors, distance to the router, etc.
 
The bandwidth costs for ATV+ are minuscule compared to Netflix/Amazon, so it's not really a good comparison. There are sacrifices made by the large streaming companies that Apple doesn't have to worry about (yet?)

Respectfully, of course it's a good comparison. ATV+ looks superior to the others. The reasons are moot. As a consumer of the content, the quality of the content is paramount. If Apple has an advantage, then it has an advantage. I understand that bandwidth may become a factor at a later time, but until that happens, we don't know how Apple will cope with its server infrastructure. It's not all running off a single MacBook Pro on Tim's desk ;)

I suspect that they will keep on top of it and maintain extremely high quality, but of course it remains to be seen. In the meantime ATV+ looks better than the other services :cool:
 
Is this really perceptible in a common man's living room or is this just a spec brag that is real only on paper??
Definitely the latter
Definitely with night scenes. Think GoT "The Long Night". That episode looked like trash, mainly due to the bitrate.
It's weird. I watched that episode on NOW TV in the UK (which has a bitrate of less than 5 Mb/s) on an average Sony 1080p TV with a 50 Mb/s connection and didn't notice any banding. Well, there may have been some, but I didn't notice it, and that's something that would really annoy me due to the industry I work in. I think with that episode there are so many variables that could affect the quality, with the average bitrate being just one of many, but not necessarily the most important.
 
I'd love to subscribe, but neither my newish Samsung Smart TV nor my 3rd-generation 4K Fire device offers the relevant Apple TV app.

Why would you want to use the Samsung Apple TV app and not an Apple TV? There is a thread discussing the poor performance of the Samsung app and the reasons to go the Apple TV route.

I always advise people to go for an LG OLED instead.

A recent comparison between the top-end QLED and a 65" Sony OLED actually came out as a virtual tie:

 
I keep reading this, and have since my first Internet connection in 1998. Like a constant, whining note.
That whining sound is Apple’s stock price and market cap going up and up and up. You need AirPods Pro with ANC to block the sound of how much Apple is getting it right.

Honestly, as a complete skeptic I have loved everything I have watched so far. The Morning Show is my favorite (binged all 3 episodes in a night), Dickinson was quirky and witty.
I really liked Dickinson, but I think you have to be invested in Steinfeld to get into it. I'll have to revisit See sometime; I couldn't get past the first 7 minutes. I'm saving For All Mankind for last because it's considered the best of the bunch.
 
Doesn't that depend on the scene and the camera being used, etc.? Too low a bitrate in darker scenes can certainly look bad. But I agree with you that there should be a limit beyond which one cannot see the difference between the same stream at a higher bitrate; the question is where that line goes. Like I said, I think it will vary quite a bit depending on the production. Wouldn't newer productions using newer cameras reasonably require a higher bitrate before you stop noticing a difference in picture quality?

Where would you say the line goes beyond which a higher bitrate is totally pointless? It sounds strange to me that Apple and the people behind Blu-rays would use way more bitrate than necessary if there were no point at all in doing so.

Yes, that's why I mentioned at the end that if a low ISO was used to film, the end file size is relatively small. The camera doesn't matter much. Dark scenes don't matter much either; it's the variation of colours. If it's a pure black screen, like end credits (without the text), it's just one colour... but dark scenes where there are small differences throughout will make a difference. All my old movies from the 80s and way back, or films where there's a lot of noise, ALWAYS end up being huge files. Each frame has specks of colour that are different from the previous frame... the way it works with codecs is that they reference previous frames and use data from them to build the next frame (I'm oversimplifying a bit, but you get the point).
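
Here's a toy way to see why noise and grain inflate file sizes while flat dark scenes don't. This is not a real video codec, just delta-coding synthetic frames and compressing the differences, and the frame sizes and pixel values are made up for the demo:

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)

def encoded_size(frames):
    """Store each frame as the difference from the previous one, then compress."""
    prev = np.zeros_like(frames[0], dtype=np.int16)
    total = 0
    for frame in frames:
        delta = frame.astype(np.int16) - prev
        total += len(zlib.compress(delta.tobytes()))
        prev = frame.astype(np.int16)
    return total

h, w, n = 240, 320, 30  # two tiny 30-frame "clips"
dark_flat = [np.full((h, w), 16, dtype=np.uint8) for _ in range(n)]           # unchanging dark scene
grainy    = [rng.integers(0, 256, (h, w), dtype=np.uint8) for _ in range(n)]  # film-grain-like noise

print("flat dark scene:", encoded_size(dark_flat), "bytes")
print("grainy scene:   ", encoded_size(grainy), "bytes")
```

The unchanging scene compresses to almost nothing because nearly every delta is zero, while the noisy frames barely compress at all.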

Whether bitrate is pointless will depend on what's being shown on screen, actually. Say you're looking at a crowd of people from a bird's-eye view and the camera is quickly panning through it... well, that's a lot of detail to cover in a short period, so you'll need a higher bitrate to preserve all that info. However, if your movie is something like... say, mostly slow-moving clouds where it's mostly just blues and whites, well... you don't need a high bitrate for that.

Ideally what you want is a VARIABLE bitrate across a film. When there isn't much on screen, the bitrate drops; when there's a lot happening, like an action scene, the bitrate increases. The end result is an optimized copy of a movie. At 30-40 Mb/s I suspect there's a lot of wasted data.
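
If you want to see that variation on one of your own files, ffprobe can dump per-packet sizes, which you can then bucket into one-second bins. A sketch, assuming ffmpeg/ffprobe is installed and on your PATH; `movie.mp4` is just a placeholder for a local file:

```python
import subprocess
from collections import defaultdict

# Dump the timestamp and size of every video packet.
out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "packet=pts_time,size", "-of", "csv=p=0", "movie.mp4"],
    capture_output=True, text=True, check=True,
).stdout

# Sum packet sizes into one-second buckets to approximate instantaneous bitrate.
bits_per_second = defaultdict(int)
for line in out.splitlines():
    pts_time, size = line.split(",")[:2]
    if pts_time in ("", "N/A"):
        continue
    bits_per_second[int(float(pts_time))] += int(size) * 8

for sec in sorted(bits_per_second)[:20]:
    print(f"{sec:4d}s  {bits_per_second[sec] / 1e6:6.2f} Mb/s")
```

On most films you'll see exactly that pattern: quiet dialogue scenes sitting well below the average, busy action scenes spiking above it.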

Take a look at a couple of stills from a 4K rip of The Matrix I've got (still using H.264!):

Here's the data for the movie:


A 2 hr 16 min movie got down to an 18.4 GB file (audio excluded) for an average bitrate of 19.4 Mb/s. That's pretty damn efficient.
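
For what it's worth, that average checks out if the 18.4 GB is actually gibibytes, which is how most tools report file sizes:

```python
size_bits = 18.4 * 1024**3 * 8           # 18.4 GiB video track, in bits
runtime_s = 2 * 3600 + 16 * 60           # 2 h 16 min = 8160 s
print(f"{size_bits / runtime_s / 1e6:.1f} Mb/s")  # -> 19.4 Mb/s
```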

Now check out Bumblebee:



The difference here is that The Matrix is 3840x1600, whereas Bumblebee is the full 16:9 aspect ratio, a full 4K image at 3840x2160, and it still comes out to a smaller file size (although it is the shorter film and, of course, other variables come into play).

Anyway, bitrate isn't the whole story, CLEARLY. Don't let @oneMadRssn, @twolf2919, @Iconoclysm or @ItalianGabriele fool you.
 

You realize video conversion bitrate has nothing to do with stream bitrate, right? Someone tell him.🤡
 
I have ZERO confidence in Apple knowing how to make a movie or TV series.

There have always been plenty of sceptics who had zero confidence in Apple knowing how to make a good MP3 player, a good cell phone, a good tablet, a good smartwatch, and so on and on. And believe it or not, plenty of them have their hilarious, pessimistic comments frozen in time in this forum, as solid proof of how wrong they were to even doubt it. Seems that time just keeps repeating itself around here.
 
I got a refurb Apple TV 4K; it works great on my 720p TV. But I'm waiting for Black Friday deals to buy a 4K TV.

You'd be better off putting one on layaway. Black Friday always brings the lower-quality products. But you could go to Walmart now, find a good 4K set in any budget, put it on layaway between now and Christmas, and come out with a better product, even if you just end up spending what you were going to spend on Black Friday.
 
Still consumer-garbage quality. A 1080p Blu-ray is light years ahead of compressed streaming 4K, and nothing can touch a UHD Blu-ray. Don't even get me started on lossless sound...
 
Curious, because last night we tried watching Dickinson on our ATV 4K connected to a 4K TV, and every time we tried anything on ATV+, the little spinner just spun. After about a minute, I gave up. After multiple tries, I rebooted the ATV 4K and Dickinson immediately started playing. To be honest, I'm not sure if it was 4K or 1080p.

Does it take considerable time to buffer a 4K stream before it starts playing?

I'm actually pretty certain that at least some Apple TV 4Ks have bunk network chips. I have to restart one of mine all the time to get even 1080p video to stream at all. The higher the bandwidth, the more of an issue. The upside of the 4K is that it restarts faster than the 1080p versions, but... it "just works"? Especially with my 802.11ac Time Capsule that, I guess, used to "just work"? I'd be curious if others have the same issue. It always works fine after a restart, but it flakes out several times a day. A+.

The audio side of my system is much nicer than the video side, though spending $2K+ on a 65" probably puts me somewhat ahead of many; not trying to be snobby, just realistic. I don't consider that high end, though I'll say I've noticed less artifacting on this content, which is welcome relative to much of what I've streamed on other services. The audio side of my system makes about anything sound decent, though it is certainly revealing of poorly produced/mastered content. I have no complaints about the audio quality. It's nice to get some more legitimate break-in on my Atmos speakers.

But the content...

At least For All Mankind is marginally entertaining and interesting. The rest is rough, and even For All Mankind is B-level prestige thus far on the writing front. And I think I'm being generous there because I enjoy Apollo-era history. The rest seems overwrought if not outright painful.

What I’ve most gotten out of this thread so far is...

1) There's a guy who fussed with Handbrake who thinks he understands everything about video compression because... he fussed with Handbrake? So do a lot of us. It's still fussing. There are countless factors, for sure. Bitrate isn't the end-all, but it's a good sign of thoughtful encoding in terms of quality at the very least. Relative to the rest of the streaming landscape and the perceived quality on my system, I'd say it's a step in the right direction. Maybe the content will catch up? Though they certainly must build in a setting to limit bandwidth for users who need it due to ISP/orifice limitations.

2) I'm surprised, but also not, that an Apple forum likes much of this content, because it's Apple, of course. I could let that bother me, but I'll let it slide. Week one.

And last but most importantly, 3) I’m really glad I don’t have a data cap.
False.


You'd be better off putting one on layaway. Black Friday always brings the lower-quality products. But you could go to Walmart now, find a good 4K set in any budget, put it on layaway between now and Christmas, and come out with a better product, even if you just end up spending what you were going to spend on Black Friday.

I don't quite get what you're saying about layaway, but to be certain, Black Friday deals are most often unique SKUs/models produced strictly for Black Friday, with stripped-down features to meet a price point. Black Friday is a tremendous scam far more often than not. I wouldn't wish a typical Black Friday deal TV on anyone. You get what you pay for, especially on Black Friday.
 
Reading this from the biggest anti-Apple evangelist, and seeing the same people liking his post as usual, I'm convinced that Apple TV+ may actually be great!

By the way, "Apple's inability to produce a solid product even in their own tech market"? Are you kidding me?

Here is what an Android fanboy, who is also a long time Pixel user, and an audio/photo expert says about recent Apple products:

"Apple's 2019 Pro releases:
- Powerbeats Pro, sensational.
- Beats Solo Pro, even better.
- iPhone 11 Pro, best phone ever.

And now AirPods Pro with the most sought-after addition of noise cancelling.
Only thing left is a MacBook Pro with a good keyboard."

Kind of off-topic, but what the AirPods show is that users go for convenience over quality. And that sums up Apple's philosophy these last few years.
 
Kind of off-topic, but what the AirPods show is that users go for convenience over quality.

I’m a huge audio nut who’s worked in the music industry in production and audio engineering and has spent at least ten times what most spend on a TV on simply the audio system, and I think it’s cheap. I still love the AirPods and the new AirPods Pro have their place. They’re not audiophile headphones, but I think they’re far and away the best day-to-day headphones money can buy. I think the Pros sound better than the 1st and 2nd gen originals if only due to the seal (perhaps new driver too), but I don’t find them so comfortable. I’m half tempted to keep my 2nd gens and the Pros for when I want comfort and when I need noise cancelling, though I think my ears are strange and I don’t usually like in-ear models from a comfort standpoint, though these Pros feel far better than most.

Bottom line: even as an audio snob, I’d never besmirch someone for digging any of the AirPods. The utility they bring and relative quality far outweigh the downsides from a day to day standpoint. Most people don’t even know what to listen for to tell the difference anyways. When I want quality sound, I don’t listen to AirPods. It’s like an iPhone camera and a DSLR in a crude way. Sure, absolutely a DSLR is going to be better quality, but it’s not so convenient. I hope that in 5-6 years, the AirPods catch up to the quality of camera the iPhones now have. In the meantime, I love the convenience and utility. AirPods are without a doubt my favorite post-Jobs Apple product.
 

I agree that AirPods are very good as a product and an experience. In the same way, Netflix and TV+ are superior to Blu-ray as a full-fledged product/experience, yet have objectively worse picture quality.
 
Not quite - I'd love to subscribe, but neither my newish Samsung Smart TV nor my 3rd-generation 4K Fire device offers the relevant Apple TV app - so no go at the moment. Why haven't Apple organised with others for the swift release of working apps? Seems short-sighted in terms of sales!

I get your point. I have a 2017 Samsung 4K TV; unfortunately, only the 2019 models have the Apple TV app. The older models won't be getting it. Sad but true.
 
I was pleasantly surprised to notice that the ATV+ shows are in 4K, but this only highlights the absence of 4K TV content in the general iTunes Store. Is there any reason for this, given how much content is made in 4K these days?
 
Yeah, about that - I watched the first episode of "For All Mankind" the other day in 4K and found it to have excellent picture quality, but it was laggy during fast movement. I have a 1-gigabit internet connection and gigabit wired Ethernet to the Apple TV box.
 