It's clear to me you don't know what "science" is. There is no "speculation" here. Studies of visual acuity have been around for a long time; Snellen charts are based on them, and they are where measurements like 20/20 come from in the first place. Your statements are as absurd as those of people who claim there's no such thing as global warming because it snowed in Texas in a recent year; that's not proof of overall warming one way or the other. Science deals with repeatable tests, not your opinion or your desire for it to be something else. Snellen charts accurately predict your ability to resolve written detail. 20/20 is a measure of visual acuity (https://www.nlm.nih.gov/medlineplus/ency/article/003396.htm) in human vision. Here is a PROPER web page discussing visual acuity (not viewing angle, which is relative to your seating location off the center axis in a room or movie theater):
http://webvision.med.utah.edu/book/part-viii-gabac-receptors/visual-acuity/
Here's a web site that discusses home theater screen size relative to seating distance (viewing angle) for 1080p, and it includes your THX 36-degree angle:
http://myhometheater.homestead.com/viewingdistancecalculator.html
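As a rough illustration of where that THX number comes from, here's my own sketch in Python (not taken from that calculator), assuming a 16:9 screen measured by its diagonal in inches:

import math

# Rough sketch: the seating distance at which a 16:9 screen spans a 36-degree
# horizontal viewing angle (the THX figure mentioned above). Assumes the
# diagonal is given in inches; the function name is mine, not the calculator's.
def thx_distance_ft(diagonal_in, view_angle_deg=36.0, aspect=(16, 9)):
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)        # screen width in inches
    distance_in = (width_in / 2) / math.tan(math.radians(view_angle_deg / 2))
    return distance_in / 12.0                            # convert to feet

print(f'THX 36-degree distance for a 93" screen: {thx_distance_ft(93):.1f} ft')
# prints roughly 10.4 ft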
Based on my 93" screen, the maximum OPTIMAL viewing distance to fully resolve 1080p is 12.2 feet or less. My couch location is 11 feet away, putting it in the "slightly noticeable" range of 4k projection. Moving my couch back a foot or so would make 4k pointless unless I increased the size of the screen.
According to the article YOU pointed to, the distance is over twice that. I have a 1080p monitor upstairs, and according to you and your article I should be able to sit considerably farther away than I actually can and still resolve the detail. I'm not buying it, based on my own observations: past a certain point 720p looks exactly the same, and 480p looks the same from about 4x the distance.
This 11K business does seem to have some basis in reality, but from what I've been reading it's not about distinguishing DETAIL at those resolutions; it's about using some tricks to achieve 3D without glasses, which requires more detail to pull off the effect (you're essentially cramming more spatial information into the same image space). In other words, 11K is required to FOOL someone into thinking an object is REAL in a dark room when it's actually only an IMAGE of an object (a hologram-like effect, assuming you don't start moving around and notice that the viewing angle of the image doesn't change with your movement).
Does that sound neat? Yes. Will 8K do it? It wouldn't appear so. You need 11K, and that's a long way off (probably at least another decade or more). I think it will be more useful for virtual reality than cinema. There are plenty of people out there who miss actual 35mm film because of the GRAIN (i.e. too sharp and clean looks "fake" to them, as does 60fps). I saw The Hobbit at 48fps, and it LOOKED like a computer rendering during scenes like the spin-around camera shot of the haunted castle. It looks less believable, not more (a soap-opera-like effect).
Comparing 35mm film to digital is apples and oranges to some extent. You don't get pixels with film, you get grain, and the emulsion, the film speed (and therefore the lighting you're shooting in; low light produces more visible grain, for example), and the camera quality all affect the outcome. 3840x2160 is 4K, and most sources I've seen suggest that approximates good-quality 35mm film. 8K would certainly produce an equivalent shot and better it considerably in other areas. (http://pic.templetons.com/brad/photo/pixels.html)
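To put rough numbers on that comparison (my own arithmetic, not figures from the linked page):

# Quick pixel-count arithmetic. How many megapixels 35mm film is "worth"
# varies by source (the templetons.com page above discusses it); the digital
# resolutions themselves are exact.
for name, w, h in [("1080p", 1920, 1080), ("4K UHD", 3840, 2160),
                   ("8K UHD", 7680, 4320)]:
    print(f"{name}: {w}x{h} = {w * h / 1e6:.1f} megapixels")
# prints 2.1, 8.3, and 33.2 megapixels; each step quadruples the pixel count,
# which is the sense in which 8K has headroom over a 35mm-equivalent shot.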
Sadly, what this proves is that you didn't really understand what you read, yet you're quick to accuse everyone else of not using science.