How could you tell it was 720p? Is there some indication in the TV app? The visual difference would probably be unnoticeable on a screen that small.
I pulled it from my iPhone 7 and opened it in MediaInfo on my Mac.
1) That has nothing to do with bitrate. Name me a single Blu-ray title that would actually exceed a sustained 15 Mbps if streamed. Blu-ray discs hold 25 and 50 GB, and a 15 Mbps stream works out to roughly 6.75 GB per hour of video.
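Quick back-of-the-envelope in Python, just to show where that per-hour figure comes from (the 15 Mbit/s rate and the two-hour runtime are assumptions for illustration, not anyone's official spec):

```python
# Rough size of a constant-bitrate stream; numbers are illustrative assumptions.
bitrate_mbps = 15        # assumed sustained stream rate, in megabits per second
runtime_hours = 2        # assumed movie length

size_gb = bitrate_mbps * 1e6 * runtime_hours * 3600 / 8 / 1e9
print(f"{bitrate_mbps} Mbit/s for {runtime_hours} h ≈ {size_gb:.2f} GB")
# 15 Mbit/s for 2 h ≈ 13.50 GB
```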
Theoretically, shouldn't they be playable in 4K on Retina Macs with Kaby Lake chips (hardware decoding)?
So how does 4K content from iTunes compare to 1080p Blu-ray?
If you have two people sitting around a table talking, the bitrate needed to provide a high-quality picture is far lower than when you have, for instance, a Transformer transforming, with all those moving parts.
Blu-rays are capable of providing a very high bitrate when needed... I believe a max of 40 Mbps... far higher than streaming's 15 Mbps.
Ultra HD Blu-ray has three maximum bitrates: 50 GB discs at 82 Mbit/s, 66 GB discs at 108 Mbit/s, and 100 GB discs at 128 Mbit/s.
Compare that to streaming's 15 Mbps.
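Rough sketch of what those ceilings mean over a typical film, assuming a two-hour runtime (these are theoretical maxima; no disc or stream sits at its peak rate for the whole movie):

```python
# Theoretical maximum data over a 2-hour runtime at each format's peak rate.
# Peak rates are the ones quoted above; the 2-hour runtime is an assumption.
peak_rates_mbps = {
    "15 Mbps stream":            15,
    "Blu-ray video max":         40,
    "UHD Blu-ray (50 GB disc)":  82,
    "UHD Blu-ray (66 GB disc)":  108,
    "UHD Blu-ray (100 GB disc)": 128,
}
runtime_s = 2 * 3600
for name, mbps in peak_rates_mbps.items():
    gb = mbps * 1e6 * runtime_s / 8 / 1e9
    print(f"{name:26s} {mbps:3d} Mbit/s -> up to {gb:5.1f} GB over 2 hours")
```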
HDR is cancer for the eyes.
In both photography and movies.
Eye cancer is an actual thing that exists, so you'll need another metaphor.
Just keep in mind you're comparing a minimum sustained bandwidth requirement for streaming (one that is well above the stream's actual average data rate) against the maximum throughput of a local disc. Those aren't equal footing for comparing the "bitrate needed to provide a high quality picture."
I'm not saying that iTunes 4K content won't be high quality. I'm simply stating that a properly done Blu-ray disc or Ultra HD Blu-ray disc has technological advantages over a streaming version delivered at 15 Mbps, and the proof is in the picture shown on our screens.
For most content and most people, 1080p or 4K Netflix will be fine. I'm actually really impressed by Netflix original shows even in 1080p; they look really good. That said, if you really start to look at the details, Blu-ray will always be better.
Careful, Cox got rid of unlimited data and imposed a 1 TB data cap per month. You can blow through 1 TB with a few 2-hour movies.
Last I saw, Gigablast is the only one without that limitation for Arizona.
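For anyone curious how far 1 TB actually goes, here's the arithmetic (the bitrates are illustrative assumptions, and this ignores everything else on the connection):

```python
# How much of a 1 TB monthly cap a single 2-hour stream eats, at assumed bitrates.
cap_bytes = 1e12                     # 1 TB the way ISPs count it (decimal)
runtime_s = 2 * 3600
for mbps in (15, 25, 40):            # illustrative stream bitrates
    movie_bytes = mbps * 1e6 * runtime_s / 8
    print(f"At {mbps} Mbit/s: {movie_bytes / 1e9:.1f} GB per movie, "
          f"roughly {cap_bytes / movie_bytes:.0f} such movies per 1 TB")
```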
True. There are tradeoffs. However, I never have to worry about losing a disc, buying expensive specialized players for those discs, accidentally scratching or damaging a disc, or upgrading a disc. When I go on vacation or to a friend's house, every iTunes movie I own (365 as of right now) automatically travels with me, and I can watch them from anywhere in the world on an iPhone, iPad, Mac, or Apple TV. This week Apple is upgrading all my content to 4K HDR and I don't have to lift a finger or pay a cent; that's amazing.
It was very nice of Apple to upscale all of my existing purchases.
That's disingenuous. Those bitrates are overkill; the proof is that the same data can be compressed significantly and remain visually indistinguishable.
I'd be hard-pressed to tell the difference between that and a version encoded at 8 Mbps.
Again, differences might be visible on really high-end equipment or under close scrutiny, but overwhelmingly not for the majority of people or setups.
The biggest thing you will notice is what I think they call banding, where smooth color gradients don't get reproduced correctly and you see obvious steps between, say, dark gray and lighter gray. There are also some compression artifacts that can show up at the lower bitrates. I bet 90% or more of people will not notice it unless someone points it out, so I would suggest NOT looking it up. Ignorance is bliss, after all! I used to have no idea what tearing was in video games, then I looked it up and started noticing it everywhere. (Vsync sucks!)
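If you do want to see banding on purpose (despite the advice above), here's a tiny sketch. It assumes NumPy and Pillow are installed, and it fakes the effect by crushing a smooth gradient down to 16 levels rather than by actually running a video codec:

```python
# Banding demo: quantize a smooth grayscale gradient down to a handful of levels.
# Requires numpy and Pillow (pip install numpy pillow); parameters are arbitrary.
import numpy as np
from PIL import Image

width, height = 1024, 200
gradient = np.tile(np.linspace(0, 255, width, dtype=np.float32), (height, 1))

levels = 16  # 256 shades crushed to 16 -> clearly visible steps
banded = (np.round(gradient / 255 * (levels - 1)) / (levels - 1) * 255).astype(np.uint8)

Image.fromarray(gradient.astype(np.uint8)).save("smooth.png")
Image.fromarray(banded).save("banded.png")
print("Compare smooth.png and banded.png to see the stepped bands.")
```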
For example, if you rip a CD and encode it losslessly, the bitrate typically hovers around 900 kbps. Overwhelmingly, a well-encoded 256 kbps AAC file will sound exactly the same (and for most people, including myself, even 256 kbps is overkill). You need insanely good hearing paired with good equipment to distinguish the two. And I don't mean saying "oh, this file has sweet highs and warm bass". I mean taking a double-blind test and statistically being able to tell which one is which.
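The raw math backs that up; a quick sketch (the ~900 kbps lossless figure is just the typical average mentioned above, not a fixed number):

```python
# Uncompressed CD audio bitrate vs typical lossless and lossy encodes.
sample_rate = 44_100   # Hz
bit_depth = 16         # bits per sample
channels = 2

raw_kbps = sample_rate * bit_depth * channels / 1000
print(f"Uncompressed CD audio: {raw_kbps:.0f} kbps")                       # ~1411 kbps
print(f"Typical lossless rip:  ~900 kbps ({900 / raw_kbps:.0%} of raw)")   # ~64%
print(f"256 kbps AAC:          {256 / raw_kbps:.0%} of raw")               # ~18%
```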