I just bought a Philips 4K Blu-ray player... I compared the same exact Blu-ray scenes from The Bourne Identity on my 5-year-old Sony Blu-ray player and on the 4K Philips... the Philips was NOT as good as the old Sony... this 4K is a fraud, unless you have a very special TV and a connection for two cables. I am sending the Philips back... a complete fraud, and it had NO ability to stream anything.

I'm not saying your Philips player is good or bad, but you shouldn't judge it by a remaster of a 15-year-old movie. You should judge it against a modern release of a movie mastered in 4K (or higher). Even then, a particular Blu-ray can be crap if it wasn't transferred or mastered well.

Example: in the early HD days, HD Blu-rays were encoded in MPEG-2 of all things, not H.264. It wouldn't surprise me if many of the current UHD Blu-rays were mastered in H.264 instead of H.265.
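If you're curious which codec a given file actually carries, here is a minimal Swift sketch using AVFoundation; the file path is hypothetical, and the four-character codes are the giveaway ('avc1' = H.264, 'hvc1'/'hev1' = HEVC):

import AVFoundation
import CoreMedia

// Hypothetical path: point this at any local video file.
let asset = AVAsset(url: URL(fileURLWithPath: "/path/to/movie.mp4"))

// Render a FourCharCode such as 'avc1' as readable text.
func fourCC(_ code: FourCharCode) -> String {
    let bytes = [24, 16, 8, 0].map { UInt8((code >> $0) & 0xFF) }
    return String(bytes: bytes, encoding: .ascii) ?? "????"
}

for track in asset.tracks(withMediaType: .video) {
    for case let desc as CMFormatDescription in track.formatDescriptions {
        print(fourCC(CMFormatDescriptionGetMediaSubType(desc)))
    }
}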

Also, and I am by no means a Microsoft fan, you should know that the Xbox One S plays 4K UHD Blu-ray discs, and given all the deals these days it is usually cheaper than a standalone 4K Blu-ray player. I picked one up last Black Friday and paid less than any 4K/UHD player would have cost. I only use it for playing UHD discs.
 
Just when I was about to splurge on a few ATV4s for DirecTV Now. I think I will wait, although I do need something to act as my HomeKit hub.
 
About damn time. Hopefully this means that already purchased movies will be provided in 4K automatically.
 
Apple is like the last company on earth to support 4K. I thought they would be among the first.

Because Apple was the first to jump on Blu-ray (oh right, we're still waiting...)? Apple pushes the technology it cares about and lags on the technology it doesn't.

With the limited content and the minimal difference (vs. 1080p), 4K doesn't interest me in the slightest. I'll admit I might be psychologically biased because I don't have a 4K TV set, but I think the real factor is that, despite how much I love new technology, I have never once been even slightly tempted to upgrade to a 4K TV. It's marketing to get people to buy newer and "better" things. 3D flopped, so they tried 4K. Now HDR is being added to the mix. If my TV died tomorrow, sure, I'd want 4K and HDR to future-proof (plus the majority of large TVs seem to have them anyway), but even then I don't think I'd care much about watching content in 1080p vs. 4K.
 
I just bought a Philips 4K Blu-ray player... I compared the same exact Blu-ray scenes from The Bourne Identity on my 5-year-old Sony Blu-ray player and on the 4K Philips... the Philips was NOT as good as the old Sony... this 4K is a fraud, unless you have a very special TV and a connection for two cables. I am sending the Philips back... a complete fraud, and it had NO ability to stream anything.

You might want to give it another try with a different 4K movie. If you look around at review sites, that Bourne Identity 4K transfer is pretty universally trashed for its low-quality video. Some reviewers said the same as you: that it looked no better than the Blu-ray, or even worse.

What has happened is that some studios are using 2K source material to make a 4K disc, and it shows.

http://4kblurays.com

Here is a good site that lists true 4K transfers. You might try one of those before you throw in the towel. I've read that the Passengers 4K is a particularly good video transfer.
 
Of course a new Apple TV is coming this year. At WWDC they barely talked about the Apple TV, and Tim said much more was to come later on. He clearly implied they didn't want to spill the beans for now, but expect an Apple TV refresh in September, alongside the new iPhone and the new Apple Watch.

Apple will have refreshed most of its product line this year...
 
____________________________
I just bought a Philips 4K Blu-ray player... I compared the same exact Blu-ray scenes from The Bourne Identity on my 5-year-old Sony Blu-ray player and on the 4K Philips... the Philips was NOT as good as the old Sony... this 4K is a fraud, unless you have a very special TV and a connection for two cables. I am sending the Philips back... a complete fraud, and it had NO ability to stream anything.

You have a copy of The Bourne Identity with true 2160p resolution, not an up-convert? That seems unlikely for a movie that was filmed 16–17 years ago.
 
Such as...

Personal computers (the first commercially available ones came out around 1973, and Apple's first, the Apple I, in 1976)?

The GUI? That was invented at Xerox; Steve Jobs saw it while touring the Xerox PARC facility.

The 3.5" floppy drive? That was invented by Sony, who persuaded Apple to put it in their computers.


Also, 4K isn't a technology; it's just a larger number of dots, and that has been going on practically forever.

I corrected a few of your claims above.
 
Seems like a good step. I'll never own a display or screen big enough to truly appreciate the difference, but knowing you could is nice...

I think smaller screens are better: the pixel density is higher. When you sit close to a big-screen HDTV you can see the individual pixels, whereas on a smaller TV at the same resolution it's not as noticeable.
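That claim checks out with simple arithmetic. Here's a quick Swift sketch, with example panel sizes of my own choosing and the standard one-arcminute acuity rule of thumb:

import Foundation

// Pixels per inch: diagonal pixel count divided by diagonal inches.
func ppi(width: Double, height: Double, diagonalInches: Double) -> Double {
    (width * width + height * height).squareRoot() / diagonalInches
}

// The same 1080p grid spread across different panel sizes.
print(ppi(width: 1920, height: 1080, diagonalInches: 40)) // ~55 PPI
print(ppi(width: 1920, height: 1080, diagonalInches: 65)) // ~34 PPI

// Rule of thumb: pixels blend together beyond roughly 3438 / PPI
// inches of viewing distance (one arcminute of visual acuity).
print(3438.0 / ppi(width: 1920, height: 1080, diagonalInches: 65)) // ~101 inches

By that estimate a 65" 1080p panel only hides its pixel grid from about eight and a half feet back, while a 40" panel manages it from roughly five feet.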

I have a 47" LG 4K TV, and the picture is amazing, with a noticeable difference between HD and 4K.
 
You have a copy of The Bourne Identity with true 2160p resolution? That seems unlikely for a movie that was filmed 16–17 years ago.
And the Bourne Identity UHD has been slammed in reviews as a bad example of the quality possible on UHD. Plus it's from a 2K DI, if I recall correctly.

Look at something like the Sully UHD to see the difference. And obviously you need a UHD player, a good TV or projector setup, and discs from true 4K DIs to see the best of what the format can do.
 
You have a copy of The Bourne Identity with true 2160p resolution, not an up-convert? That seems unlikely for a movie that was filmed 16–17 years ago.
That is a common misunderstanding. You would be correct if it had been digitally recorded at a low resolution, but real film actually resolves more detail than 4K. So older movies that were shot on film can be rescanned and moved over to 4K at full 4K resolution.
 
Hopefully the quality will be high enough to make the 4K HDR meaningful. Many reviews suggest a lot of digital 4K sources can't match a 1080p Blu-ray for quality. Image quality isn't just about resolution.

Photography is everything, and no technical jargon or fancy abbreviations are going to magically turn everything into David Lean's Lawrence of Arabia or Ridley Scott's Blade Runner (the sequel looks like a cardboard CGI set).
 
Something most or all of you fail to recognize is that Apple wasn't going to trot out a 4K ATV until there was iTunes content to match. There is also licensing with all the studios, and negotiations on pricing and on whether already-purchased content would be auto-upgraded or would have to be purchased separately.

It's so much more complex than just building a streaming box.
 
That is a common misunderstanding. You would be correct if it had been digitally recorded at a low resolution, but real film actually resolves more detail than 4K. So older movies that were shot on film can be rescanned and moved over to 4K at full 4K resolution.

That's like saying you can take a 256kbps iTunes track and convert it to 512kbps, and then use the excuse that it works because the masters are even higher quality. Unless you go to the source, you're not going to get 4K from a 1080p upconvert.

Keep in mind that I used the term upconvert, which doesn't apply when you use a 4K or better source.
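To make the analogy concrete, here's a toy Swift sketch of the simplest possible upconvert, nearest-neighbour doubling; the numbers are stand-ins for pixel data, nothing more:

// Doubling the sample count by repetition: the output is larger,
// but it contains no detail the input didn't already have.
func upconvert(_ samples: [Int]) -> [Int] {
    samples.flatMap { [$0, $0] }
}

let original = [10, 20, 30, 40]    // stand-in for 1080p detail
let upscaled = upconvert(original) // [10, 10, 20, 20, 30, 30, 40, 40]
print(upscaled.count)              // 8 samples, still 4 values' worth of detail

Real scalers interpolate more cleverly than this, but the principle is the same: you can't synthesize detail that was never captured.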
 
All I want is to be able to download my purchased movies to my Apple TV. Is that so much to ask?
 
Yes, but wide colour gamut and HDR arguably contribute more than the extra pixels do.

LOA is a spectacular movie visually and quality-wise, even in 1080p.

Even on VHS it was incredible.

Wide gamut can help, but it depends on the talent of the post-production grading. If they just saturate and curve the **** out of everything, then it's all hyper-colour for nothing.

Human vision isn't as colourful or as razor-sharp as what these digital processes are creating. It's one reason many people still prefer the grainy, slightly soft look of film. Anyway... bah, enjoy what you want.
 
So here we go, people: the real reason behind the HEVC push. They did not just suddenly decide that moving to more modern compression would be great for no reason. They did it because it had to be baked into all the OSes and devices for a reason that was yet to be announced. That announcement is 4K HDR content on iTunes.

This all dovetails with the WSJ article about the new Apple TV doing 4K. They will be able to push out 4K HDR at a bitrate similar to solid 1080p H.264 content. The end game was always about getting you to buy 4K HDR content. The offshoot was: hey, since we have to bake this into our OS and hardware anyway, here are some amazing side benefits, like smaller file sizes for your iPhone's 4K video and amazing compression for photos.

Before you say "but Final Cut Pro X needed it": they could have just tossed that in as a codec pack for the software. They made it so deeply intertwined so that selling you content would be thoughtless. If you have the latest OS, you're set: click, buy, and watch HDR.
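As a rough sanity check on that bitrate claim, here's a back-of-envelope Swift calculation; every number below is a rule-of-thumb assumption of mine, not anything from Apple:

// All inputs are illustrative assumptions.
let solid1080pH264 = 8.0 // Mbps, a typical good 1080p H.264 stream
let pixelRatio = 4.0     // 3840x2160 has 4x the pixels of 1920x1080
let hevcFactor = 0.5     // HEVC is often quoted at ~half the H.264 bitrate

// Bitrate scales sub-linearly with pixel count in practice, so treat
// this naive product as an upper bound rather than a prediction.
print(solid1080pH264 * pixelRatio * hevcFactor) // ~16 Mbps

Naively that lands closer to double a solid 1080p H.264 stream than "similar", though sub-linear scaling and encoder tuning narrow the gap in practice.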
 
I'm assuming this is a US thing so far. I checked my UK account, and my purchase of Fantastic Beasts still lists as 'Film (HD)'. I'm not on the 4K bandwagon yet, but I do have a decent 1080p TV + Denon AV amp + M&K speakers. I've been happy enough with iTunes purchases, with the exception of the all-time favourites, which I have on Blu-ray (Blade Runner, the Kubricks, classic Disney/Pixar/Ghibli, etc.).

Barring the odd Blu-ray, ALL of our TV watching is through either the Fire TV (Now TV, Prime) or the Apple TV (iTunes, Netflix, DisneyLife). A 4K Apple TV and upgraded content à la 'iTunes Plus' would be a strong argument for a TV + 4K amp upgrade.
 
There are a lot of infrastructure and planning reasons why, in my opinion, it was deferred. Again, Apple is rarely on the cutting edge of technology; it wasn't first to adopt 4G LTE, for example.

Market research shows only a sliver of homes had 4K TVs by the end of 2016, a full year after the ATV4 launched. Apple is a consumer-focused company now, and with the ATV already holding a small market share, I'm sure the cost/benefit analysis said bringing 4K to market in late 2015 wasn't worth it.

Additionally, 4K streaming uses H.265, and streaming is basically the only way to get content onto your Apple TV. Not coincidentally, iOS 11, macOS High Sierra, and the new version of tvOS are all venturing into H.265 this fall, so the stars are aligning.
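For the curious, the new releases expose this directly. A minimal Swift sketch using VideoToolbox (macOS 10.13 or later):

import VideoToolbox

// True when this machine can decode HEVC in hardware; older Macs
// may still manage it in software, just with more CPU load.
let hevcInHardware = VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC)
print("Hardware HEVC decode: \(hevcInHardware)")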

Many content providers still do not offer 4K. Netflix offers it for a handful of shows, but you are speaking as if everything is in 4K nowadays and Apple is years behind, which absolutely isn't true.

Yeah, yeah, yeah. Defense. Defense.

Once again, I'll point out that at the time of the Apple TV 4's launch, just about EVERYTHING else Apple makes had already embraced 4K... just this ONE thing clung to 1080p. In fact, in the very same keynote at which the Apple TV 4 launched, Apple had just touted the incredible 4K capabilities of another Apple product.

It's funny how quick we are to rationalize why this ONE product does NOT have 4K, when you don't see such rationale being blasted at Apple for embracing it in all the other products. Does it really make little sense here, in this ONE thing, but perfect sense there, in everything else? Or does it make sense there because Apple has chosen to offer it there, and no sense here only because Apple has chosen NOT to offer it here yet?

Answer carefully. Possibly in just a month or two, Apple may roll out a 4K Apple TV 5. If it makes no sense to embrace 4K in this thing today, it should still make no sense barely 4 or 8 weeks from now. A passionate argument against it today should have one ripping into Apple a few weeks from now for embracing it then. Of course, we know that won't happen. Instead, it will be "shut up and take my money."

And it wasn't about Apple being on the cutting edge. They were pretty much already LAST at that point, too. Of the major players, who did NOT have a 4K STB at the time?

Furthermore, Apple had already rationalized H.265 BEFORE the Apple TV 4, as the FaceTime codec. Apparently the "cost/benefit analysis" and aligning stars justified its use there?

And no, there is not a ton of 4K content, and I'm certainly not implying that 4K is everywhere. As I've shared many times before on this topic, hardware must lead. Put a bunch of 8K STBs in homes and some studio will be tempted to roll out some 8K to see if it can make a profit. If it does, more 8K will quickly follow. It never makes sense for software to be everywhere before there is much hardware on which that software can play. If we insist that software must be everywhere first, consider that there's not a single app in the App Store yet built exclusively for the iPhone 8's capabilities; perhaps the iPhone 8 should wait until "everything" in the iOS store is already iPhone 8-ready before that new hardware rolls out? Hardware always leads.

If the software is not quite there at launch, it catches up. It never, NEVER works the other way.
 
That's like saying you can take a 256kbps iTunes track and convert it to 512kbps, and then use the excuse that it works because the masters are even higher quality. Unless you go to the source, you're not going to get 4K from a 1080p upconvert.

Keep in mind that I used the term upconvert, which doesn't apply when you use a 4K or better source.

I think we are in agreement, but my point is that an older movie shot on film can be made into a true 4K disc from the source material (the film itself). I was not suggesting upconverting from a lower resolution, and I'm not sure where you got the idea that I was.
 