1) That has nothing to do with bitrate. Name me a single Blu-ray title whose bitrate would exceed a theoretical streaming throughput of 15 Mibps. Blu-rays are 25 and 50 GB, and 15 Mibps sustained over eight hours is 52.7 GiB.
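For reference, here's the quick arithmetic behind figures like that, as a rough Python sketch (the eight-hour window and the two-hour film runtime are just assumptions for illustration):

```python
# Back-of-the-envelope: how much data a constant 15 Mbps stream moves over a
# given runtime, versus the 25/50 GB capacity of a Blu-ray disc.  Runtimes
# here are assumptions, not figures for any particular title.

def stream_size(mbps, hours, binary_units=False):
    """Data delivered by a constant-rate stream, in GB (or GiB)."""
    bits_per_mb = 2**20 if binary_units else 1_000_000
    giga = 2**30 if binary_units else 1_000_000_000
    return mbps * bits_per_mb * hours * 3600 / 8 / giga

print(f"15 Mbps  x 2 h -> {stream_size(15, 2):.1f} GB   (a typical film)")
print(f"15 Mbps  x 8 h -> {stream_size(15, 8):.1f} GB")
print(f"15 Mibps x 8 h -> {stream_size(15, 8, binary_units=True):.1f} GiB")
print("Blu-ray capacities: 25 GB (single layer), 50 GB (dual layer)")
```

A two-hour film at a constant 15 Mbps is only about 13.5 GB, so disc capacity by itself doesn't say much about the bitrate a given title actually uses.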

If you have two people sitting around a table talking, the bitrate needed to provide a high quality picture is far lower than when you have (for instance), a Transformer transforming, with all these moving parts.

Blu-rays are capable of providing a very high bitrate when needed... I believe a max of 40 Mbps... far higher than streaming's 15 Mbps.

UltraHD Blu-ray has three maximum bitrates. 50 GB discs with 82 Mbit/s, 66 GB discs with 108 Mbit/s, and 100 GB discs with 128 Mbit/s.

Compare that to streaming's 15Mbps?
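To put those peak figures in context, here's the same kind of rough arithmetic (Python, using the peak rates quoted above) showing how long each disc could actually sustain its maximum rate before running out of space:

```python
# How long each disc could sustain its quoted peak rate before filling up.
# This is pure arithmetic on the numbers above; real discs vary their
# bitrate scene by scene rather than running flat-out.

discs = [
    ("Blu-ray BD-50 (1080p)",  50, 40),   # capacity in GB, peak in Mbps
    ("UHD BD-50",              50, 82),
    ("UHD BD-66",              66, 108),
    ("UHD BD-100",            100, 128),
]

for name, capacity_gb, peak_mbps in discs:
    hours = capacity_gb * 8_000 / peak_mbps / 3600   # GB -> Mbit, then Mbps -> s
    print(f"{name}: {hours:.1f} h at a constant {peak_mbps} Mbps")
```

The UHD figures are peaks rather than sustained averages; a two-hour film can't run flat-out at those rates and still fit on the disc.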
Theoretically, shouldn't they be playable in 4K on retina Macs with Kaby Lake chips (hardware decoding)?

Theoretically, I would assume.
So how does 4K content from iTunes compare to 1080p Blu-ray?

Nobody knows yet.
 
If you have two people sitting around a table talking, the bitrate needed to provide a high quality picture is far lower than when you have (for instance), a Transformer transforming, with all these moving parts.

Blu-rays are capable of providing a very high bitrate when needed... I believe a max of 40 Mbps... far higher than streaming's 15 Mbps.

UltraHD Blu-ray has three maximum bitrates. 50 GB discs with 82 Mbit/s, 66 GB discs with 108 Mbit/s, and 100 GB discs with 128 Mbit/s.

Compare that to streaming's 15Mbps?

Just keep in mind you're using a minimum sustained bandwidth for streaming (and one that is much higher than the data itself) and a maximum throughput of a local disc. Those aren't equal footing for making a comparison of "bitrate needed to provide a high quality picture."
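As a concrete (and entirely hypothetical) illustration of the peak-versus-average point, the average bitrate of a disc is just its video payload divided by the runtime:

```python
# Hypothetical example: a 2.2-hour film whose video track occupies ~55 GB on
# a UHD BD-66.  These numbers are made up purely for illustration.

payload_gb = 55
runtime_s = 2.2 * 3600

avg_mbps = payload_gb * 8_000 / runtime_s
print(f"Average video bitrate: {avg_mbps:.0f} Mbps (against a 108 Mbps peak)")
```

Meanwhile the 15 Mbps figure for streaming is closer to a sustained or minimum bandwidth requirement, so the two numbers aren't measuring the same thing.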
 
HRD is cancer for the eyes.
Both photography and movies.

1) Do you mean HDR, or is this an initialism for High Resolution Disc, *Display, or something else?

2) If so, I'll bite. Tell me how it's such an awful feature that you'd compare it to cancer.
 
Just keep in mind you're using a minimum sustained bandwidth for streaming (and one that is much higher than the data itself) and a maximum throughput of a local disc. Those aren't equal footing for making a comparison of "bitrate needed to provide a high quality picture."

I'm not saying that iTunes 4K content won't be high quality. I'm simply stating that a properly done Blu-ray disc or UltraHD Blu-ray disc has technological advantages over a streaming version that can be viewed with 15Mbps, and the proof is in the picture shown on our screens.
 
I'm not saying that iTunes 4K content won't be high quality. I'm simply stating that a properly done Blu-ray disc or UltraHD Blu-ray disc has technological advantages over a streaming version that can be viewed with 15Mbps, and the proof is in the picture shown on our screens.

For most content and most people, 1080p Netflix or 4K will be fine. I'm actually really impressed by Netflix original shows even in 1080p; they look really good. That said, if you really start to look at the details, Blu-ray will always be better.
 
For most content and most people, 1080p Netflix or 4K will be fine. I'm actually really impressed by Netflix original shows even in 1080p; they look really good. That said, if you really start to look at the details, Blu-ray will always be better.

Absolutely! Heck, I know people who don't care about HD over SD.
 
If you have two people sitting around a table talking, the bitrate needed to provide a high quality picture is far lower than when you have (for instance), a Transformer transforming, with all these moving parts.

Blu-rays are capable of providing a very high bitrate when needed... I believe a max of 40 Mbps... far higher than streaming's 15 Mbps.

UltraHD Blu-ray has three maximum bitrates. 50 GB discs with 82 Mbit/s, 66 GB discs with 108 Mbit/s, and 100 GB discs with 128 Mbit/s.

Compare that to streaming's 15Mbps?

That's disingenuous. The bitrates are overkill; the proof is that we can compress that data significantly and it remains visually indistinguishable. For example, if you rip a CD and encode it losslessly, the bitrates typically hover around 900 kbps. Overwhelmingly, a well encoded 256kbps AAC file will sound exactly the same (and for most people including myself, even 256kbps is overkill). You need to have insanely good hearing paired with good equipment to distinguish the two. And I don't mean saying 'oh, this file has sweet highs and warm bass'. I mean taking a double blind test and statistically being able to tell which one is which.
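If anyone wants to try that, a basic ABX run is easy to script. Here's a minimal sketch in Python (the file names are placeholders, and it assumes you've decoded both versions to WAV beforehand and play each clip yourself when prompted):

```python
import random
import shutil

# Minimal ABX harness sketch.  Decode both versions to WAV first so the file
# format gives nothing away -- these names are placeholders, not real files.
FILE_A = "clip_original.wav"     # lossless source, decoded to WAV
FILE_B = "clip_from_256k.wav"    # the AAC encode, decoded back to WAV
X_COPY = "clip_X.wav"            # the blinded copy you actually listen to
TRIALS = 16

answers = []
for trial in range(1, TRIALS + 1):
    x_is_a = random.choice([True, False])
    shutil.copyfile(FILE_A if x_is_a else FILE_B, X_COPY)   # blind the trial
    print(f"\nTrial {trial}: play A, B, and X ({X_COPY}) in your own player.")
    guess = input("Is X the same as A or B? [a/b]: ").strip().lower()
    answers.append((guess == "a") == x_is_a)

correct = sum(answers)
print(f"\n{correct}/{TRIALS} correct.")
print("You'd want roughly 12 or more out of 16 before ruling out guessing.")
```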

The same concept applies to movies. A lot of BluRay remuxes I see on the internet hover around 20-30Mbps for their video bitrate. I'd be hard-pressed to tell the difference between that and one encoded at 8Mbps. And I don't mean while they're playing. I mean that even taking screenshots and flipping between the two, it would be difficult if not impossible to tell them apart. These codecs are incredibly sophisticated and ingeniously use psychovisual models to get rid of and compress information that you can't (or likely won't) see. Again, differences might be seen on really high end equipment or with scrutiny, but overwhelmingly not for the majority of people or setups.
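If you want to check that kind of claim yourself, a sketch like the one below (Python driving ffmpeg, which you'd need installed with libx264; the file names are placeholders) re-encodes a source at roughly 8 Mbps and reports SSIM against the original:

```python
import subprocess

# Sketch: re-encode a high-bitrate source at ~8 Mbps and measure how close it
# stays to the original using ffmpeg's SSIM filter.  Requires ffmpeg with
# libx264 on your PATH; "remux.mkv" is a placeholder for your own file.
SRC = "remux.mkv"
OUT = "encode_8mbps.mkv"

# Single-pass encode targeting ~8 Mbps video (audio copied as-is).
subprocess.run(
    ["ffmpeg", "-y", "-i", SRC,
     "-c:v", "libx264", "-b:v", "8M",
     "-c:a", "copy", OUT],
    check=True,
)

# Compare the encode against the source frame by frame; a score near 1.0
# means the two are very close.  The result appears in ffmpeg's log output.
subprocess.run(
    ["ffmpeg", "-i", OUT, "-i", SRC,
     "-lavfi", "ssim", "-f", "null", "-"],
    check=True,
)
```

SSIM isn't proof of "visually indistinguishable", but it gives you a number to argue about before resorting to screenshot flipping.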
 
The biggest thing you will notice is what I think they call banding, where the colors just don't get shown correctly and you see obvious distinctions between let's say dark gray and lighter gray. There are also some compression artifacts that can happen at the lower bit rates. I bet 90% or more will not notice it unless someone points it out, so I would suggest NOT looking it up. Ignorance is bliss after all! I used to have no idea what tearing was in video games, then I looked it up and started noticing it everywhere :( . (Vsync sucks!)
 
Careful, Cox got rid of unlimited data and imposed a 1 TB data cap per month. You can blow through 1 TB with a few 2-hour movies.
Last I saw, Gigablast is the only one without that limitation in Arizona.

EDIT: Gigablast has 2TB cap now
 
I'm not saying that iTunes 4K content won't be high quality. I'm simply stating that a properly done Blu-ray disc or UltraHD Blu-ray disc has technological advantages over a streaming version that can be viewed with 15Mbps, and the proof is in the picture shown on our screens.
True. There are tradeoffs. However, I never have to worry about losing a disc, buying expensive specialized players for those discs, accidentally scratching or damaging a disc, or upgrading a disc. When I go on vacation or to a friend's house, every iTunes movie I own (365 as of right now) automatically travels with me and I can watch them from anywhere in the world on an iPhone, iPad, Mac, or Apple TV. This week Apple is upgrading all my content to 4K HDR and I don't have to lift a finger or pay a cent; that's amazing.

A few years ago I made the decision to completely abandon physical media. That includes music, movies, games (console and iOS), and books. I've been through the very expensive purchasing and upgrading (and by "upgrading" I mean "completely purchasing again") cycle since VHS, then SVHS, then laserdisc, then DVD, then Blu-ray, and I finally got sick of it and decided I wasn't going to do that anymore. So far I haven't regretted it. It's moves like the one Apple is making to upgrade their content that validate the "digital lifestyle" and make me feel comfortable fully committing to it. That's good for me, and ultimately good for Apple and good for content providers, because it means I'm less hesitant to lay out cash for it.
 
It was very nice of Apple to upscale all of my existing purchases. :)

Ya, particularly when I'll never get a 4K TV... I'm happy with 32"; always have been.

Good thing it's a 'free' upgrade, 'cos I ain't gonna pay for it.
 
That's disingenuous. The bitrates are overkill; the proof is that we can compress that data significantly and it remains visually indistinguishable.

It's not visually indistinguishable. Sure, I have some titles that probably are, but usually only on titles where the studio didn't put proper effort into the disc.

I'd be hard-pressed to tell the difference between that and one encoded at 8Mbps.

I have many friends and family with poor eyesight who probably wouldn't be able to tell the difference either. Ditto with the difference between 4K and 1080p.

Again, differences might be seen on really high end equipment or with scrutiny, but overwhelmingly not for the majority of people or setups.

Differences can be seen on bargain-basement TVs with no scrutiny, but I agree with the last part. For the majority of people, the differences are negligible.
 
I don't usually use iTunes services, and having had my fingers burnt by the previous Apple TV, which was disappointing for me, I won't be jumping on a 4K one quickly. However, hopefully this launch will increase the range of 4K movies offered by the likes of the Amazon and Google Play stores.
 
The biggest thing you will notice is what I think they call banding, where the colors just don't get shown correctly and you see obvious distinctions between let's say dark gray and lighter gray. There are also some compression artifacts that can happen at the lower bit rates. I bet 90% or more will not notice it unless someone points it out, so I would suggest NOT looking it up. Ignorance is bliss after all! I used to have no idea what tearing was in video games, then I looked it up and started noticing it everywhere :( . (Vsync sucks!)

Banding can show up as an encoding artifact, but it can also come from the TV or monitor you're using (even when viewing high quality content). The HDR standard requires displays to be able to produce more colors (higher color depth) and achieve larger contrast ratios (greater luminance range). Hopefully that helps reduce banding on the display's side. And since the content itself comes from masters with more color and contrast granularity, good compression should hopefully avoid it as well.
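As a rough illustration of why the extra bit depth matters for banding, this little sketch (numpy, with toy numbers) counts how many distinct steps an 8-bit versus 10-bit signal gets across the same subtle gray ramp:

```python
import numpy as np

# Toy illustration of banding: quantize the same smooth dark-gray-to-gray
# ramp at 8-bit and 10-bit precision and count how many distinct levels
# survive.  Fewer levels across the ramp = bigger, more visible steps.
ramp = np.linspace(0.10, 0.20, 1920)   # a subtle gradient across screen width

for bits in (8, 10):
    levels = (1 << bits) - 1
    quantized = np.round(ramp * levels) / levels
    steps = len(np.unique(quantized))
    print(f"{bits}-bit: {steps} distinct levels across the ramp")
```

More steps across the same gradient means each step is smaller and harder to see, which is part of what a 10-bit HDR pipeline buys you.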

And if you really want to get rid of tearing without resorting to vsync (or freesync / gsync alternatives), pray high frame rate content and displays become the norm. Tearing still exists, but I rarely notice it on a 144Hz display when sustaining frame rates well above 144.
 
For example, if you rip a CD and encode it losslessly, the bitrates typically hover around 900 kbps. Overwhelmingly, a well encoded 256kbps AAC file will sound exactly the same (and for most people including myself, even 256kbps is overkill). You need to have insanely good hearing paired with good equipment to distinguish the two. And I don't mean saying 'oh, this file has sweet highs and warm bass'. I mean taking a double blind test and statistically being able to tell which one is which.

I once did an ABX test between lossless audio and 320kbps mp3 files. I can absolutely hear the difference even though I wouldn't consider my equipment to be that insanely great.
 
I dunno why Apple can't just offer two qualities... and it would still be free if you wish to go to 4K. All Apple has to do is check whether you've already purchased the movie on your Apple ID, and if you have bought the same movie, you'd qualify for the free 4K alternative.

I guess you'd also wanna keep your purchases section clean of duplicates.
 