You're talking like you have Apple's business plan in front of you. Apple engineers don't even know all the facts about the ATV4, let alone a guy on MacRumors.

Ok, Sherlock, what about the 1000 other posts? Are you going to comment on those too? Seems like everybody is speculating around here. Same goes for me.
 
Apple should admit they were caught by surprise by their competitors offering 4K on their systems.
 
And as has been proven repeatedly, there is no science behind the chart. It's someone's estimate based on manufacturers' recommended viewing distance charts. Nothing more.

That's funny, because I haven't seen YOU prove a DAMN THING. You just state it's not true despite people (including myself) posting all kinds of evidence and studies on the matter. For god's sake, these studies use the SNELLEN charts that all eye exams are based on - the same well-known charts that determine whether a given set of glasses actually corrects your vision!

I quoted several sources and linked a test you can try at home with a printer. Resolving distances are a well-known FACT. Your opinion that science isn't science and that whatever you happen to WANT to believe should be the truth is your fracking imagination, nothing more. It reminds me of certain political groups that would rather badmouth the other side than look at the actual facts.

The chart even has a response at the bottom with actual scientific citations and studies that completely contradict it.

http://www.homecinemaguru.com/can-we-see-4kuhd-on-a-normal-sized-screen-you-betcha/

Perhaps take your own advice, read that and accept the education people are offering you. Full benefits, again, does not mean only benefits.

I don't think you know what "completely contradict" even means. The only way a person could see better than the previously posted chart indicates is if their eyes are better than 20/20. Despite the claims of that article, few people have "significantly" better vision than 20/20. I don't know many who have better than 20/10 vision, period (I have had 20/10 "corrected" before, but like most people, my vision varies a bit over the course of the day, with sinuses, etc.). At 20/10, you might be able to double the distance at which you could see a "difference", but that puts the limit for just noticing a difference at around 15 feet for a 50" set. Even if I give the article you posted the benefit of the doubt and say it's 17-18 feet, that is the JUST NOTICEABLE difference. To see ALL the detail of 4K, the maximum distance you could sit from a 50" set would be closer to 7 feet. Now, to say that the article's assessment is the "truth" while what was posted previously has "no science behind the chart" is utterly laughable, for the simple reason that the web page you quote simply accounts for differences in visual acuity (the original chart is based on 20/20 vision, not 20/10, which is not normal vision but fighter-jet-pilot-level vision that FEW HAVE).
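
For anyone who wants to check those numbers, here is a minimal back-of-envelope sketch of the resolving-distance geometry such charts rest on; the 16:9 aspect ratio and the one-arcminute-per-pixel (20/20) figure are assumptions of mine, not anyone's official chart:

```python
# Rough sketch: distance beyond which individual pixels subtend less than
# the eye's resolving limit (~1 arcminute for 20/20, ~0.5 for 20/10).
import math

def max_resolving_distance_ft(diag_in, h_pixels, arcmin=1.0):
    aspect = 16 / 9                                      # assumed 16:9 panel
    width_in = diag_in * aspect / math.hypot(aspect, 1)  # screen width in inches
    pixel_in = width_in / h_pixels                       # pixel pitch in inches
    return pixel_in / math.tan(math.radians(arcmin / 60)) / 12

print(max_resolving_distance_ft(50, 1920))       # ~6.5 ft: 1080p fully resolved at 20/20
print(max_resolving_distance_ft(50, 1920, 0.5))  # ~13 ft: the same threshold with 20/10
print(max_resolving_distance_ft(50, 3840))       # ~3.3 ft: where even 4K pixels merge
```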

In other words, the original chart is 100% based on the science of 20/20 vision. The rest of the article, about sampling, is pretty much BS. The recording end is what matters for the ultimate signal, and there is no modern technical reason you can't properly scale a higher-resolution source down to 1080p with proper-quality equipment and get the full 1080p resolution out of it. The fact that even Blu-ray compresses the signal means it will never be 1:1 any time soon at the consumer level (a typical movie would require over 7TB of data for a true uncompressed signal, or at least 2.5TB with lossless compression, and that's only 1080p).
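
To illustrate the downscaling point, a minimal sketch using Pillow; the filenames and the Lanczos filter choice are placeholders of mine, not a description of any mastering workflow:

```python
# Hypothetical example: supersample a 4K frame down to 1080p with a
# high-quality resampling filter, so the extra source detail is averaged
# into the 1080p output rather than simply discarded.
from PIL import Image

frame = Image.open("frame_3840x2160.png")   # placeholder 4K source frame
downscaled = frame.resize((1920, 1080), resample=Image.LANCZOS)
downscaled.save("frame_1920x1080.png")
```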

The funny thing is that I'm not against 4K at all. I want a huge-arse home theater, and 4K is the ticket to making it look like the real deal, particularly with a 2.35:1 screen. I'm 100% for people subsidizing the expansion of 4K with cheap and crappy sets, since faster adoption means lower prices sooner for REAL 4K equipment like projectors (currently $8k and up). So go ahead and buy lots of 4K TV sets! But don't pretend that everyone buying one is enjoying the full benefits of 4K at home, since I'd wager most are sitting way too far away for that. Not that it matters to me, as long as 4K keeps coming down in price.

You can believe whatever you want, but don't call a belief science.
 
Apple should admit they were caught by surprise by their competitors offering 4K on their systems.

Not at all. The same was said about the ATV2, which was limited to 720p when all the others supported 1080p. :Apple: wasn't surprised; they knew they were behind the curve, but they made sure what they were doing was done _well_, vs. others who were more interested in checkbox marketing than user experience.
 
You can believe whatever you want, but don't call a belief science.

I can believe my lyin' eyes.

I can walk into a room full of various TVs and easily appreciate that the 4K displays look distinctly better, are less grainy, at what in my home are normal viewing distances. Yes, I can tell the difference; yes, I have a clue (I worked in Kodak's digital cinema department). You can throw around all the "science" you like, but if I can tell the difference when your numbers say I can't, you might want to revisit your science.
 
Apple should admit they were caught by surprise by their competitors offering 4K on their systems.

There is no viable 4K content on the market.

The 4K content offered by Netflix is very limited, and the actual sound/video quality is worse than a standard Blu-ray.
 
I can believe my lyin' eyes.

I can walk into a room full of various TVs and easily appreciate that the 4K displays look distinctly better, are less grainy, at what in my home are normal viewing distances. Yes, I can tell the difference; yes, I have a clue (I worked in Kodak's digital cinema department). You can throw around all the "science" you like, but if I can tell the difference when your numbers say I can't, you might want to revisit your science.

Of course you can tell the difference.

Those TVs at the stores are running 4K media from hard drives. Is your TV at home running 4K media? Doubt it.
 
I can believe my lyin' eyes.

I can walk into a room full of various TVs and easily appreciate that the 4K displays look distinctly better, are less grainy, at what in my home are normal viewing distances. Yes, I can tell the difference; yes, I have a clue (I worked in Kodak's digital cinema department). You can throw around all the "science" you like, but if I can tell the difference when your numbers say I can't, you might want to revisit your science.

The difference doesn't come from the resolution, that's 100% sure.

It may come from newer panels being used in 4K sets, meaning a cheap 4K panel may look better than a cheap 1080p TV. But considering that many 4K sets have other deficiencies that cancel out those improvements, even that isn't certain.

I can't see any "graininess" on even the highest-end 1080p plasma, so I don't really understand this comment. BTW, those highest-end plasmas just kill any comparably priced 4K set, unless you're blind...

Also, unless you're actually comparing a good-quality 1080p set with the same native content, the whole declaration is pointless.

You must also know that almost all internet streams are crap, and if those are used as the source, it will be no contest unless the 1080p TV is getting the 4K stream too...
 
That's funny, because I haven't seen YOU prove a DAMN THING. You just state it's not true despite people (including myself) posting all kinds of evidence and studies on the matter. For god's sake, these studies use the SNELLEN charts that all eye exams are based on - the same well-known charts that determine whether a given set of glasses actually corrects your vision!

I quoted several sources and linked a test you can try at home with a printer. Resolving distances are a well-known FACT. Your opinion that science isn't science and that whatever you happen to WANT to believe should be the truth is your fracking imagination, nothing more. It reminds me of certain political groups that would rather badmouth the other side than look at the actual facts.



I don't think you know what "completely contradict" even means. The only way a person could see better than the previously posted chart indicates is if their eyes are better than 20/20. Despite the claims of that article, few people have "significantly" better vision than 20/20. I don't know many who have better than 20/10 vision, period (I have had 20/10 "corrected" before, but like most people, my vision varies a bit over the course of the day, with sinuses, etc.). At 20/10, you might be able to double the distance at which you could see a "difference", but that puts the limit for just noticing a difference at around 15 feet for a 50" set. Even if I give the article you posted the benefit of the doubt and say it's 17-18 feet, that is the JUST NOTICEABLE difference. To see ALL the detail of 4K, the maximum distance you could sit from a 50" set would be closer to 7 feet. Now, to say that the article's assessment is the "truth" while what was posted previously has "no science behind the chart" is utterly laughable, for the simple reason that the web page you quote simply accounts for differences in visual acuity (the original chart is based on 20/20 vision, not 20/10, which is not normal vision but fighter-jet-pilot-level vision that FEW HAVE).

In other words, the original chart is 100% based on the science of 20/20 vision. The rest of the article, about sampling, is pretty much BS. The recording end is what matters for the ultimate signal, and there is no modern technical reason you can't properly scale a higher-resolution source down to 1080p with proper-quality equipment and get the full 1080p resolution out of it. The fact that even Blu-ray compresses the signal means it will never be 1:1 any time soon at the consumer level (a typical movie would require over 7TB of data for a true uncompressed signal, or at least 2.5TB with lossless compression, and that's only 1080p).

The funny thing is that I'm not against 4K at all. I want a huge-arse home theater, and 4K is the ticket to making it look like the real deal, particularly with a 2.35:1 screen. I'm 100% for people subsidizing the expansion of 4K with cheap and crappy sets, since faster adoption means lower prices sooner for REAL 4K equipment like projectors (currently $8k and up). So go ahead and buy lots of 4K TV sets! But don't pretend that everyone buying one is enjoying the full benefits of 4K at home, since I'd wager most are sitting way too far away for that. Not that it matters to me, as long as 4K keeps coming down in price.

You can believe whatever you want, but don't call a belief science.

What you keep citing is not science. It is even worded in the very article as not being science: "I have estimated where it becomes noticeable..."

Estimated.

Not science. Someone's speculation that, like yours, is wrong. My article details why... with science.

On the other hand, the "better than 20/20 vision" point in my article is the author's theory as to why the scientific experiments cited earlier in the article prove your BS chart wrong. He frames it as exactly that - a theory, not a claim of accuracy. And he supports it with scientific data showing that 20/15 is the current average visual acuity of the population that has visited an eye doctor, with data going back to 1862 in which healthy individuals test better than 20/20 through age 75, and average people don't fall below 20/20 until age 60.

"Let’s first look at a series of experiments done by NHK that compared a plaster bust, model ship and butterflies to a display. These results can be found in ITE Technical Report Vol. 35, No. 16. The summary of the results are shown in the chart above. NHK claimed the tests showed 310 pixels/degree are needed for an image to reach the limit for human resolution. [snip] At a THX recommended 36 degree viewing angle this corresponds to a 11K display to hit the 310 pixels/degree limit NHK observed."

11K, to reach the human limit... at your chart's recommended levels... 11K. Nearly 3x the resolution, both horizontally and vertically...
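
That arithmetic is easy to sanity-check; both inputs come straight from the quoted article:

```python
# NHK's observed limit of 310 pixels/degree across THX's recommended
# 36-degree viewing angle, per the article quoted above.
print(310 * 36)  # 11160 horizontal pixels -> roughly an "11K" display
```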

"The ARRI film scanner results above show the results for a 2K, 4K and 10K film scanners. These scans are magnified from a section of 35mm film. I have heard in my work with the film industry that 12K is what is needed to replicate the best 35mm film."

A different experiment, showing 12K...

So yes, I haven't proven anything. I never had to prove anything... but science did. It proved everything.
 
Actually, the OP is much closer to right than you are. Physics is not your friend in this argument. The human eye cannot see the difference between 4K and 1080p on a 40-inch screen unless you're about 3 feet away.

And a large percentage of people don't have 25Mbps.

Not saying you shouldn't buy a 4K TV if you're in the market for a new TV. But a lot of people have bought 1080p sets in the last few years, and a bunch of them are smart enough to know there is no benefit in replacing their currently working TV with a 4K one.
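
The 3-foot figure is consistent with the usual one-arcminute (20/20) rule of thumb; a quick sketch, with the 16:9 geometry being my assumption:

```python
# Farthest distance at which 4K pixels on a 40" 16:9 screen still subtend
# one arcminute, i.e. where the full 4K detail stops being resolvable.
import math

width = 40 * (16 / 9) / math.hypot(16 / 9, 1)       # ~34.9" screen width
pixel = width / 3840                                # inches per 4K pixel
print(pixel / math.tan(math.radians(1 / 60)) / 12)  # ~2.6 ft, i.e. "about 3 feet"
```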

25Mbps is merely sufficient for 1080p.
 
The human eye cannot see the difference between 4K and 1080p on a 40-inch screen unless you're about 3 feet away.

My experience with 4K has shown that it does matter. The color palette is fantastic and the blacks are incredible. That matters. From a tech standpoint it needs HDMI 2.x and H.265 as well.

It hasn't been a hobby for a while now.

Not sold on this.

Not going 4K until 2017 at the earliest (pointless at the moment, IMO), but thanks to all the suckers - sorry - early adopters for driving the price down for us sensible ones.

If this gets delayed like ATV4 did it will be 2017. :eek:

Remote needs TouchID.

Indeed. Or see below:

Regardless of performance, people should stay away from this new Apple TV until they fix the Bluetooth to allow keyboards to connect to it. They claim that only Bluetooth 4 keyboards can connect, but there are none on the market at the moment.

We need a remote redesign!!!!

They should just put out an app for iPads, iPod touches, and iPhones that does it all.
 
I can believe my lyin' eyes.

I can walk into a room full of various TVs and easily appreciate that the 4K displays look distinctly better, are less grainy, at what in my home are normal viewing distances. Yes, I can tell the difference; yes, I have a clue (I worked in Kodak's digital cinema department). You can throw around all the "science" you like, but if I can tell the difference when your numbers say I can't, you might want to revisit your science.

I agree. What we need to agree on here is that this is about much more than just viewing distance and pixel size. Apparently 4K TVs provide a better viewing experience because of a number of factors beyond the sheer multiplication of pixels. Several people here have reported that, but the people who keep posting the graph conveniently ignore this aspect and stick to the pixel argument.
 
What you keep citing is not science. It is even worded in the very article as not being science: "I have estimated where it becomes noticeable..."

Estimated.

Not science. Someone's speculation that, like yours, is wrong. My article details why... with science.

It's clear to me that you don't know what "science" is. There is no "speculation" here. The studies of visual acuity have been around for a long time; Snellen charts are based on them. Where do you think measurements of vision like 20/20 come from in the first place??? Your statements are as absurd as those of people who claim there's no such thing as global warming because it snowed in Texas one year - that's not proof of overall warming one way or the other. Science deals with repeatable tests, not your opinion or your desire for it to be something else. Snellen charts accurately predict your ability to resolve written detail. 20/20 is a measure of visual acuity (https://www.nlm.nih.gov/medlineplus/ency/article/003396.htm) in human vision. Here is a PROPER web page discussing visual acuity (not viewing angle, which is relative to your seating location off the center axis in a room or movie theater):

(http://webvision.med.utah.edu/book/part-viii-gabac-receptors/visual-acuity/)

Here's a web site that discusses home theater screen size relative to seating distance (viewing arc angle) for 1080p, and it includes your THX 36-degree angle:

http://myhometheater.homestead.com/viewingdistancecalculator.html

Based on my 93" screen, the maximum OPTIMAL viewing distance to fully resolve 1080p is 12.2 feet. My couch is 11 feet away, putting it in the "slightly noticeable" range for 4K projection. Moving my couch back a foot or so would make 4K pointless unless I increased the size of the screen.
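
For what it's worth, that 12.2-foot figure falls right out of the same one-arcminute math (16:9 geometry assumed; the small difference is rounding):

```python
# Max distance at which 1080p pixels on a 93" 16:9 screen subtend one
# arcminute (20/20); past this, 1080p is fully resolved and 4K adds nothing.
import math

width = 93 * (16 / 9) / math.hypot(16 / 9, 1)       # ~81.1" screen width
pixel = width / 1920                                # inches per 1080p pixel
print(pixel / math.tan(math.radians(1 / 60)) / 12)  # ~12.1 ft
```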

According to the article YOU point to, the distance is over twice that. I have a 1080p monitor upstairs, and according to you and your article, I should be able to sit considerably further away than I can and still see the detail. I'm not buying it, based on my own observations. 720p looks exactly the same past a certain point, and 480p looks the same at about 4x that distance.

This 11K business does seem to have some basis in reality, but from what I've been reading it's not about the ability to distinguish DETAIL at those resolutions, but rather the ability to use some tricks to achieve 3D without glasses, which requires more detail to pull off the effect (you're essentially cramming more spatial information into the same image space). In other words, 11K is required to FOOL someone into thinking an object is REAL in a dark room when in fact it's only an IMAGE of an object (a hologram-like effect, assuming you don't start moving around and notice that the angle of the image doesn't change with your movement).

Does that sound neat? Yes. Will 8K do it? It wouldn't appear so. You need 11K, and that's a long way off (probably at least another decade or more). I think it will be more useful for virtual reality than cinema. There are plenty of people out there who miss actual 35mm film because of the GRAIN (i.e., too sharp and clean looks "fake" to them, as does 60fps). I saw The Hobbit at 48fps and it LOOKED like a computer rendering during scenes like that spin-around camera view of the haunted castle. It looks less believable, not more (a soap-opera-like effect).

"The ARRI film scanner results above show the results for a 2K, 4K and 10K film scanners. These scans are magnified from a section of 35mm film. I have heard in my work with the film industry that 12K is what is needed to replicate the best 35mm film."

Comparing 35mm film to digital is apples and oranges to some extent. You don't get pixels with film, you get grain, and the film emulsion and speed (and therefore the lighting you're shooting in; low light produces more visible grain, for example) and the camera quality all affect the outcome. 3840x2160 is 4K, and most sources I've seen suggest that approximates good-quality 35mm film. 8K would certainly produce an equivalent shot and better it considerably in other areas. (http://pic.templetons.com/brad/photo/pixels.html)

So yes, I haven't proven anything. I never had to prove anything... but science did. It proved everything.

Sadly, what it proved is that you didn't really understand what you read and are quick to accuse everyone else of not using science. :rolleyes:
 
I call ********. When have Apple ever superseded a brand new product inside of 12 months? Click-bait trash.

In addition to what others have pointed out, you may recall a time when Apple updated their Macs multiple times a year. Until iOS devices exploded, that was actually kind of the norm. ;)
 
I can believe my lyin' eyes.

I can walk into a room full of various TVs and easily appreciate that the 4K displays look distinctly better, are less grainy

Digital doesn't contain grain. It has pixels. I don't know what you are seeing, but it's not the difference between 1080p and 4k.

, at what in my home are normal viewing distances. Yes, I can tell the difference; yes, I have a clue (I worked in Kodak's digital cinema department). You can throw around all the "science" you like, but if I can tell the difference when your numbers say I can't, you might want to revisit your science.

If you worked at Kodak in projection, you should know the differences between analog film and digital; your comment suggests otherwise to me. On the other hand, I could work at a restaurant and not be the chef, let alone a gourmet chef, so just working at Kodak in the cinema division doesn't really tell me much about your credentials.

I've also heard the SAME type of claims made about audio with every snake-oil device: LP vs. CD sound, various DACs, etc. But when it comes to a double-blind test, I have yet to hear of a single person passing one of the more onerous claims, whether it's the green CD pen that supposedly makes CDs sound "less harsh", a $3000 DAC versus a $20 one (it's amazing how hard it is to hear a 0.03dB difference; you can see that sort of difference in the specs, but golden ears believe they can absolutely hear it and that it makes a "BIG" difference), or telling an LP from a recorded copy of the same LP (it would seem the difference they like is a DISTORTION, probably a second-order one that sounds good, like tube-amp guitar distortion). They never do better than guessing, because their ego tells them they are superhuman. "Maybe YOU and your tone-deaf ears can't tell, but it's OBVIOUS to me!" is the typical response. So forgive me if I don't take you at your word.

Of course you can tell the difference.

Those TVs at the stores are running 4K media from hard drives. Is your TV at home running 4K media? Doubt it.

You also stand a few feet away from the 4K screen. I saw them at Best Buy. They looked awesome 3-5 feet away. I backed up to 15 feet (~65" set, as I recall) and it looked like 1080p. I also noticed they had the 4K sets turned the other way, in a cubicle setup area, so you could not look directly at one and a 1080p set on the wall at the same time - or you might stand back far enough to notice that they looked the same as you backed up.
 
Regardless of performance, people should stay away from this new Apple TV until they fix the Bluetooth to allow keyboards to connect to it. They claim that only Bluetooth 4 keyboards can connect, but there are none on the market at the moment.

Umm, why would I "stay away" from a product that does a bunch of other amazing stuff that I want it to do just because it doesn't yet connect to a keyboard that I might use once every three weeks?

My experience with 4K has shown that it does matter. The color palette is fantastic and the blacks are incredible. That matters. From a tech standpoint it needs HDMI 2.x and H.265 as well.

You're not seeing anything related to 4K when you talk about color palette and blacks. 4K is about nothing but pixel density.
 
It's clear to me that you don't know what "science" is. There is no "speculation" here. The studies of visual acuity have been around for a long time; Snellen charts are based on them. Where do you think measurements of vision like 20/20 come from in the first place??? Your statements are as absurd as those of people who claim there's no such thing as global warming because it snowed in Texas one year - that's not proof of overall warming one way or the other. Science deals with repeatable tests, not your opinion or your desire for it to be something else. Snellen charts accurately predict your ability to resolve written detail. 20/20 is a measure of visual acuity (https://www.nlm.nih.gov/medlineplus/ency/article/003396.htm) in human vision. Here is a PROPER web page discussing visual acuity (not viewing angle, which is relative to your seating location off the center axis in a room or movie theater):

(http://webvision.med.utah.edu/book/part-viii-gabac-receptors/visual-acuity/)

Here's a web site that discusses home theater screen size relative to seating distance (viewing arc angle) for 1080p, and it includes your THX 36-degree angle:

http://myhometheater.homestead.com/viewingdistancecalculator.html

Based on my 93" screen, the maximum OPTIMAL viewing distance to fully resolve 1080p is 12.2 feet. My couch is 11 feet away, putting it in the "slightly noticeable" range for 4K projection. Moving my couch back a foot or so would make 4K pointless unless I increased the size of the screen.

According to the article YOU point to, the distance is over twice that. I have a 1080p monitor upstairs, and according to you and your article, I should be able to sit considerably further away than I can and still see the detail. I'm not buying it, based on my own observations. 720p looks exactly the same past a certain point, and 480p looks the same at about 4x that distance.

This 11K business does seem to have some basis in reality, but from what I've been reading it's not about the ability to distinguish DETAIL at those resolutions, but rather the ability to use some tricks to achieve 3D without glasses, which requires more detail to pull off the effect (you're essentially cramming more spatial information into the same image space). In other words, 11K is required to FOOL someone into thinking an object is REAL in a dark room when in fact it's only an IMAGE of an object (a hologram-like effect, assuming you don't start moving around and notice that the angle of the image doesn't change with your movement).

Does that sound neat? Yes. Will 8K do it? It wouldn't appear so. You need 11K, and that's a long way off (probably at least another decade or more). I think it will be more useful for virtual reality than cinema. There are plenty of people out there who miss actual 35mm film because of the GRAIN (i.e., too sharp and clean looks "fake" to them, as does 60fps). I saw The Hobbit at 48fps and it LOOKED like a computer rendering during scenes like that spin-around camera view of the haunted castle. It looks less believable, not more (a soap-opera-like effect).



Comparing 35mm film to digital is apples and oranges to some extent. You don't get pixels with film, you get grain, and the film emulsion and speed (and therefore the lighting you're shooting in; low light produces more visible grain, for example) and the camera quality all affect the outcome. 3840x2160 is 4K, and most sources I've seen suggest that approximates good-quality 35mm film. 8K would certainly produce an equivalent shot and better it considerably in other areas. (http://pic.templetons.com/brad/photo/pixels.html)



Sadly, what it proved is that you didn't really understand what you read and are quick to accuse everyone else of not using science. :rolleyes:

TLDR. Yet again.

Let's just end this here. My article proved your stupid viewing-distance chart wrong - yes, the one that you and a few others who refuse to read my article keep linking to, the one with no science behind it - using scientific experiments commissioned by the very ITU that sets the actual industry standards. And the test results show that you can double the distances it prescribes.

In fact, high-end videophile retailers are even switching from the kind of hard numbers you've got stuck in your head to wide ranges of distances when prescribing sets and sizes, since each videophile working group - e.g., SMPTE or THX (and THX even has its own range, which your calculator gets wrong) - recommends wildly different viewing angles.

The visual acuity point is his hypothesis as to why those numbers are so wrong, based on the findings of the scientific studies that were conducted. And it includes studies on visual acuity proving that the average visual acuity, with any necessary correction, is better than 20/20 (6/6), going back to 1862 when the chart was created. Don't know what 20/15 is? Read line 9 on the chart next time.

Don't like that? Too bad. You wanted science. You got science.

End of discussion.
 
Problem is that my local content is all from the iTunes Store - no ripped CDs or DVDs - so much of it has DRM, except perhaps older iTunes movies, maybe? That means it will not play in Plex for me. I have only iTunes content, no other sources of movies. So unfortunately this is not a choice that is likely to work for me.

And I do not have the skill set/experience or the software to convert my iTunes DRM files, so Plex really is not a solution that would do much for me, from what I read on their site. They specifically say that DRM-protected iTunes files will not work in Plex.

Why are you playing local content that you purchased from iTunes? You stream it from Apple, not from your computer's hard drive.
 
Digital doesn't contain grain. It has pixels.

Pixels are the grain of digital. They just happen to be well-ordered, square, equal-sized grains.
Being pedantic doesn't help your argument.

As another poster noted above, good 35mm film equates to 12K projection. Yes, I've seen top-quality film projection, and there is a difference vs digital cinema.

{SNIP}

the green CD pen

Red herrings don't help your argument.

You also stand a few feet away from the 4K screen. I saw them at Best Buy. They looked awesome 3-5 feet away. I backed up to 15 feet

I regularly watch TV at 5 feet; it's the best way to mimic a theater-sized screen without driving to a large room with sticky floors and shelling out $12. My living room is less than 15 feet long.

I'm routinely puzzled by people insisting that larger displays should be viewed at ever-farther distances, apparently in an attempt to negate any improvement in resolution. (To wit: if I swap a 42" HDTV for a 65" 4K TV, I should get a bigger living room.)

I also noticed they had the 4K sets turned the other way, in a cubicle setup area, so you could not look directly at one and a 1080p set on the wall at the same time

Costco has 'em unabashedly side by side. The difference is unmistakable at 15'.
 
Is your TV at home running 4K media? Doubt it.

Heard the same rhetoric when 1080p was coming: "Is your DVD player or satellite TV at home running HD? Doubt it." Blu-ray, HD cable, and streaming HD were available in volume months later.

Nothing wrong with addressing the looming chicken-and-egg problem by getting the hardware now and enjoying "first adopter" status as the content arrives. Other media devices already support 4K. TVs themselves often have hardware playback built in (dump the movie onto an SD card, plug it in, watch).

And yes, you CAN download high-quality 4K content _now_. Might take a little longer than instant streaming, might require downloading it completely before viewing, but that's quite tolerable.

If you're going to replace your TV anyway, there's no reason NOT to get a 4K display. Slap a cheap ATV3 on there for now, then get the 4K ATV5 when it comes out. In the meantime, download 4K movies onto removable media and let the TV play them directly.
 
Heard the same rhetoric when 1080p was coming: "Is your DVD player or satellite TV at home running HD? Doubt it." Blu-ray, HD cable, and streaming HD were available in volume months later.

Nothing wrong with addressing the looming chicken-and-egg problem by getting the hardware now and enjoying "first adopter" status as the content arrives. Other media devices already support 4K. TVs themselves often have hardware playback built in (dump the movie onto an SD card, plug it in, watch).

And yes, you CAN download high-quality 4K content _now_. Might take a little longer than instant streaming, might require downloading it completely before viewing, but that's quite tolerable.

If you're going to replace your TV anyway, there's no reason NOT to get a 4K display. Slap a cheap ATV3 on there for now, then get the 4K ATV5 when it comes out. In the meantime, download 4K movies onto removable media and let the TV play them directly.

I'm not questioning the value of 4K.

What I'm questioning is your claim that you can tell the difference between your TV at home and the 4K sets at stores from far away. The reason you can is that the 4K sets are using higher-quality sources than your TV at home.

To do a true comparison, you need two otherwise identical TVs, one with a 1080p panel and the other with a 4K panel, both fed the same source. Under those conditions, most professional reviewers say they can't tell the difference between 4K and 1080p at normal viewing distances on TVs smaller than 60 inches.

Sources make a HUGE difference. Watch a VHS tape on a 1080p TV set. Now watch a Blu-ray on a 720p TV set. Guess what? The 720p TV will look better because it's using a better source. Same thing with your experience: the showrooms are using 2160p video feeds from hard drives vs. your Netflix stream or 1080p Blu-ray disc.
 
The reason you can is that the 4K sets are using higher-quality sources than your TV at home.

Tautological reasoning. 4K content provides better quality than 1080p content. Go figure.

With the onset of HD we had exactly the same rhetoric you're using now ... and today you won't contend that 1080p is indistinguishable from 480i.

We'll have sufficiently good 4K content sourcing soon enough - which you'll be able to watch with your 4K hardware if you have it. In the meantime, upscaling (like 480i -> 1080p) will make current lower-resolution media look better.
 