Speaking of audio: in the series For All Mankind, the audio was fantastic; it just comes out of nowhere with good clarity...
Well, I wrote about this (in my market report) last week, based just on the 4K trailers promoting the shows pre-launch. Absolutely the best color, resolution, and audio I've had from any source. I am using an Apple TV 4K on a 5-year-old 78" Samsung 4K set, without Atmos or HDR, and still it's amazing. Beats the Olympics or anything I've seen in 4K heretofore. My connectivity is AT&T gigabit fiber (930 Mbps on Ethernet and 400-600 on WiFi). On the 78" Samsung with AT&T TV Now I get intermittent stutter and audio lag that I can't correct or figure out, but that has nothing to do with Apple TV. And on the other Apple TV 4K (a 65" Samsung) there is no stutter and the picture is also totally perfect, just not as immersive as the 78". This is so close to the clarity I see at the trade shows that I wonder why I would spend money to upgrade to QLED or similar.
This is no surprise to me. Apple TV+ offers supreme dynamic range with bottomless blacks. No visible artefacts or colour banding, even in typically problematic dark scenes. For me, watching on an OLED panel, ATV+ is vastly superior to both Netflix and Amazon. Amazon in particular seems hit and miss on black depth, and Netflix sometimes appears to have a problematic implementation of HDR.
Bitrate only tells part of the story; you would still need to know what codec is being used before claiming it is better or worse than the competition.
> As a gearhead, I'd be interested to know if this bitrate is achievable on WiFi on Apple TV or if Ethernet is required. Obviously Ethernet is always better, but I always wonder where the technical limits of WiFi streaming are.

I have no trouble exceeding 350 Megabit/s (around 10x the bitrate of these shows) on Speedtest.net on my Apple TV over WiFi 802.11ac. Of course, this will depend on your WiFi router, interference from neighbors, distance to the router, etc.
The bandwidth costs for ATV+ are minuscule next to Netflix's or Amazon's, so it's not really a fair comparison. There are sacrifices made by the large streaming companies that Apple doesn't have to worry about (yet?).
> Is this really perceptible in a common man's living room, or is this just a spec brag that is real only on paper?

Definitely the latter.
> Definitely with night scenes. Think GoT "The Long Night". That episode looked like trash, mainly due to the bitrate.

It's weird: I watched that episode on NOW TV in the UK (which has a bitrate of less than 5 Mbps) on an average Sony 1080p TV with a 50 Mbps connection and didn't notice any banding. Well, there may have been some, but I didn't notice it, and that's something that would really annoy me given the industry I work in. I think with that episode there are so many variables that could affect the quality, with the average bitrate being just one of many, and not necessarily the most important.
I'd love to subscribe, but neither my newish Samsung smart TV nor my 3rd-generation 4K Fire device offers the relevant Apple TV app.
I always advise people to go for an LG OLED instead.
A recent comparison between the top-end QLED and the 65" Sony OLED actually came out as a virtual tie:
[image: QLED vs. OLED comparison results]
> I've been reading this since my first Internet connection in 1998. Like a constant, whining note.

That whining sound is Apple's stock price and market cap going up and up and up. You need AirPods Pro with ANC to block out the sound of how much Apple is getting it right.
> Honestly, as a complete skeptic, I have loved everything I have watched so far. The Morning Show is my favorite (binged all 3 episodes in a night), and Dickinson was quirky and witty.

I really liked Dickinson, but I think you have to be invested in Steinfeld to get into it. I'll have to revisit See sometime; I couldn't get past the first 7 minutes. I'm saving For All Mankind for last because it's considered the best of the bunch.
Doesn't that depend on the scene and the camera being used, etc.? Too low a bitrate in darker scenes can certainly look bad. But I agree with you that there should be a point where one can no longer see the difference between the same stream at a higher bitrate; the question is where that line goes. Like I said, I think it will vary quite a bit depending on the production. Wouldn't newer productions using newer cameras reasonably require a higher bitrate before you stop noticing a difference in picture quality?
Where would you say the line goes beyond which a higher bitrate is totally pointless? It sounds strange to me that Apple and the people behind Blu-rays would use way more bitrate than necessary if there were no point at all in doing so.
Yes, that's why I mentioned at the end that if a low ISO was used to film, the final file size is relatively small. The camera doesn't matter much. Dark scenes don't matter much either; it's the variation of colours. If it's a pure black screen, like end credits (without the text), it's just one colour... but dark scenes with small differences throughout will make a difference. All my old movies from the 80s and earlier, or films with a lot of noise, ALWAYS end up being huge files. Each frame has specks of colour that are different from the previous frame, and the way codecs work is that they reference previous frames and use data from them to build the next frame (I'm oversimplifying a bit, but you get the point).
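Just to make the "codecs reference previous frames" idea concrete, here's a toy sketch of the same principle (nothing like a real H.264/HEVC encoder, just frame differencing plus a generic compressor; numpy and zlib are the only assumptions): a clean, static dark scene deltas away to almost nothing, while a grainy dark scene changes every pixel every frame and defeats the prediction.

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)

# Two "dark scenes", 24 frames each: one clean and static, one with film-grain-like noise.
static_scene = [np.full((240, 320), 20, dtype=np.uint8) for _ in range(24)]
grainy_scene = [rng.integers(0, 40, size=(240, 320), dtype=np.uint8) for _ in range(24)]

def delta_coded_size(frames):
    """Toy inter-frame coding: compress each frame as a difference against the previous one."""
    total = 0
    prev = np.zeros_like(frames[0])
    for frame in frames:
        residual = (frame.astype(np.int16) - prev.astype(np.int16)).tobytes()
        total += len(zlib.compress(residual))
        prev = frame
    return total

print("clean static dark scene:", delta_coded_size(static_scene), "bytes")
print("grainy dark scene:      ", delta_coded_size(grainy_scene), "bytes")
# The grainy version comes out far larger even though both scenes are equally "dark".
```

Same darkness, wildly different sizes, which is basically why old grainy transfers balloon.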
Whether extra bitrate is pointless will actually depend on what's being shown on screen. Say you're looking at a crowd of people from a bird's-eye view and the camera is quickly panning through it: that's a lot of detail to cover in a short period, so you'll need a higher bitrate to preserve all that info. However, if your movie is something like... say, mostly slow-moving clouds where it's mostly just blues and whites, well... you don't need a high bitrate for that.
Ideally what you want is a VARIABLE bitrate across a film. When there isn't much on screen, the bitrate drops; when there's a lot happening, like an action scene, the bitrate increases. The end result is an optimized copy of a movie. At a constant 30-40 Mb/s I suspect there's a lot of wasted data.
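If you want that variable-bitrate behaviour in your own encodes, the usual approach is quality-targeted (CRF) encoding rather than a fixed bitrate. A rough sketch driving ffmpeg from Python; it assumes ffmpeg with libx265 is installed, and the file names and CRF value here are just placeholders, not anything from this thread:

```python
import subprocess

SOURCE = "source_4k.mkv"  # hypothetical input file

# Fixed-bitrate style encode: spends ~30 Mb/s everywhere, even on scenes that don't need it.
subprocess.run([
    "ffmpeg", "-i", SOURCE, "-c:v", "libx265",
    "-b:v", "30M", "-maxrate", "30M", "-bufsize", "60M",
    "-an", "fixed_30mbps.mkv",
], check=True)

# CRF encode: constant quality, variable bitrate. The encoder raises the bitrate
# for busy scenes (fast pans, grain) and drops it for static ones.
subprocess.run([
    "ffmpeg", "-i", SOURCE, "-c:v", "libx265",
    "-crf", "20", "-preset", "slow",
    "-an", "crf20.mkv",
], check=True)
```

The CRF output usually lands at a fraction of the fixed-bitrate file size for the same perceived quality, which is exactly the "wasted data" point above.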
Take a look at a couple of stills from a 4K rip of The Matrix I've got (still using H.264!):
Here's the data for the movie:
[image: data for The Matrix rip]
A 2 hr 16 min movie got down to an 18.4 GB file (audio excluded), for an average bitrate of 19.4 Mb/s. That's pretty damn efficient.
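If anyone wants to sanity-check that 19.4 figure: it works out if the 18.4 GB is a binary (GiB) size, which is what most rip tools report. Quick back-of-envelope in Python, using only the numbers from the post above:

```python
size_bytes = 18.4 * 1024**3          # 18.4 GiB of video (audio excluded)
duration_s = (2 * 60 + 16) * 60      # 2 h 16 min runtime
avg_bitrate_mbps = size_bytes * 8 / duration_s / 1e6
print(f"{avg_bitrate_mbps:.1f} Mb/s")  # ~19.4 Mb/s
```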
Now check out Bumblebee:
[image: data for the Bumblebee rip]
The difference here is that The Matrix is 3840x1600, whereas Bumblebee is the full 16:9 aspect ratio, a full 4K image at 3840x2160, and it still comes out to a smaller file size (although it is the shorter film, and of course other variables come into play).
Anyway, bitrate isn't the whole story, CLEARLY. Don't let @oneMadRssn, @twolf2919, @Iconoclysm or @ItalianGabriele fool you.
I have ZERO confidence in Apple knowing how to make a movie or TV series.
I got a refurb Apple TV 4K; it works great on my 720p TV. But I'm waiting for Black Friday deals to buy a 4K TV.
Curious, because last night we tried watching Dickinson on our ATV 4K connected to a 4K TV, and every time we tried anything on ATV+, the little spinner just spun. After about a minute, I gave up. After multiple tries, I rebooted the ATV 4K and Dickinson immediately started playing. To be honest, I'm not sure if it was 4K or 1080p.
Does it take considerable time to buffer a 4K stream before it starts playing?
False.
You'd be better off putting one on layaway. Black Friday deals always come on the lower-quality products. But you could go to Walmart now, find a good 4K TV at any budget, put it on layaway between now and Christmas, and come out with a better product, even if you just end up spending what you were going to spend on Black Friday.
Reading this from the biggest anti-Apple evangelist, and seeing the same people liking his post as usual, I'm convinced that Apple TV+ may actually be great!
By the way, "Apple's inability to produce a solid product even in their own tech market"? Are you kidding me?
Here is what an Android fanboy, who is also a long-time Pixel user and an audio/photo expert, says about recent Apple products:
"Apple's 2019 Pro releases:
- Powerbeats Pro, sensational.
- Beats Solo Pro, even better.
- iPhone 11 Pro, best phone ever.
And now AirPods Pro with the most sought-after addition of noise cancelling.
Only thing left is a MacBook Pro with a good keyboard."
> It's like an iPhone camera and a DSLR in a crude way.

Kind of off-topic, but what the AirPods show is that users would rather have convenience than quality.
I'm a huge audio nut who has worked in the music industry in production and audio engineering and has spent at least ten times what most people spend on a TV on the audio system alone, and I think it's cheap. I still love the AirPods, and the new AirPods Pro have their place. They're not audiophile headphones, but I think they're far and away the best day-to-day headphones money can buy. I think the Pros sound better than the 1st- and 2nd-gen originals, if only due to the seal (perhaps the new driver too), but I don't find them as comfortable. I'm half tempted to keep both my 2nd-gens and the Pros, for when I want comfort and for when I need noise cancelling, though I think my ears are strange: I don't usually like in-ear models from a comfort standpoint, though these Pros feel far better than most.
Bottom line: even as an audio snob, I'd never besmirch someone for digging any of the AirPods. The utility they bring and their relative quality far outweigh the downsides from a day-to-day standpoint. Most people don't even know what to listen for to tell the difference anyway. When I want quality sound, I don't listen to AirPods. It's like an iPhone camera and a DSLR, in a crude way. Sure, a DSLR is absolutely going to give better quality, but it's not as convenient. I hope that in 5-6 years the AirPods catch up the way the iPhone cameras now have. In the meantime, I love the convenience and utility. AirPods are without a doubt my favorite post-Jobs Apple product.
Not quite - I'd love to subscribe, but neither my newish Samsung smart TV nor my 3rd-generation 4K Fire device offers the relevant Apple TV app, so it's a no-go at the moment. Why haven't Apple organised with others for the swift release of working apps? It seems short-sighted in terms of sales!