Which programs have you watched in 4k on a TV screen that are 'far superior'?

*yawn*

1. Look, I'm into 4K, technique, recording, and following EOSHD and other indie communities. You obviously aren't. That's why I don't even bother presenting you with links - do your homework yourself.

2. And surely you bravely fought against 1080p while Apple was still stuck with 720p, didn't you?
 
W ... T ... F ...

The news story is that an iPhone can play back video files with non-native resolutions?

Isn't it fairly rare to have video files that ARE at native resolutions?

I've been playing 720p and 1080p video files on my iPhones for years. Again, WTF.

Sure, being able to play back standard-resolution videos is the easiest way to go - no need to transcode anything prior to playback. Fortunately, using hardware decoding, 8-bit (and, on the latest A7/A8 devices, even 10-bit) 1080p H.264 playback is possible even on "old" devices like the iPad 1.

Given that a lot of people are starting to shoot 4K on their dedicated cameras (LX100, FZ1000, GH4, NX1 etc.) and phones (LG G3, Note 3/4 etc. - lol, the Note 3 introduced 4K recording more than a year ago; talk about Apple not being behind...), avoiding 4K transcoding before playback is a big plus and will become increasingly handy.
 
Meanwhile in Japan they are testing 8K in limited markets.

Japan also thinks a LOT about anything related to toilets, so... Yes, I've been to Japan a lot ;-).

8K and the like are perfect for advertising, more than for movies, since people pass right next to the screen, much closer than they would sit to watch the whole picture. For that, it makes loads of sense.

I'd much prefer someone fix motion blur and improve the dynamic range of colors rather than deliver an even higher-resolution but crappy rendition of reality...
 
Many of these same pessimistic arguments were flying when Apple was still clinging to 720p as HD.

No point in making iDevices shoot 4K if the iDevice screen isn't 4K? How about shooting in 4K and then moving the video to a 4K TV? Even prosumer 4K cameras come with little screens that don't have 4K resolution. Why? Because the device you shoot with and the device you play that video on don't have to be the same. For more than 100 years now, major motion pictures have been shot on film. The viewfinder for that was often an eyepiece peering through a hole and a lens. The shooting device could never play back the film being shot at full resolution. But that was never the goal, either. For a long time now, the resolution of photos that iDevices can shoot has greatly exceeded the resolution of those iDevices' screens. No complaints there. So this is motion pictures instead of still pictures. So what.

No point in a 4K :apple:TV until there's 4K content in the iTunes Store? That's purely chicken vs. egg. We already have other sources for 4K now. Camcorders and some cameras have been able to shoot 4K for a while. Getting an :apple:TV that can play that back at 4K would deliver great utility. Once there are enough chickens, the eggs will follow.

Another way to look at that one: right now, there are no apps in the App Store that can fully exploit the A9 processors coming next year or the A10 coming the year after that. So since there is no such content in the App Store, why bother building iPhones with newer A-series processors?

But what about the memory-hogging nature of 4K on storage-starved iPhones? First, you don't have to shoot anything at 4K. Second, if you intend to shoot 4K, you know you'll need the storage, so pay up and get more of it. The cheapest price and the most demanding, latest & greatest video standards are rarely compatible.
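
A quick back-of-the-envelope on that storage point (the ~50 Mbps bitrate and ~10 GB of free space are assumptions for illustration, not figures from this thread):

```swift
import Foundation

// Rough storage math for shooting 4K on a 16 GB phone.
// Both figures below are assumptions, not specs quoted in this thread.
let assumedBitrateMbps = 50.0   // ballpark phone-class 4K H.264 bitrate
let assumedFreeSpaceGB = 10.0   // a guess at what's left on a "16 GB" device after iOS + apps

let megabytesPerMinute = assumedBitrateMbps / 8.0 * 60.0
let minutesOfFootage = assumedFreeSpaceGB * 1000.0 / megabytesPerMinute

print(String(format: "~%.0f MB/min of footage, ~%.0f minutes before the device is full",
             megabytesPerMinute, minutesOfFootage))
// ~375 MB/min, ~27 minutes - which is the whole "pay up for bigger storage" point.
```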

4K-capable iDevices would bring many benefits once all the con excuses are addressed - from pressuring competitors to roll out more competitively priced alternatives, to capturing precious moments (which you can never go back and capture again) at higher resolution. 4K iDevices would probably come with H.265 and would probably spur an H.265 4K :apple:TV too. Concerns about clashing with 16GB might motivate Apple to shift the base storage up to something more (which would be good for just about every use of an iDevice). Home movies & vodcasts could step up to 4K first, and then some studio would eventually be tempted to try some 4K offerings in the iTunes Store. If those make money, they'll roll out more and the other studios will compete.

Some of you guys crack me up. There's no software to fully exploit the next generation of hardware coming in Macs, but that doesn't stop us from pining for the latest & greatest in the next incarnation of Macs. We so look forward to that A8X or A9 in iDevices. We so look forward to the next generation of graphics cards. We want faster/better/sharper everything except this one thing, which we decide to cast as a gimmick, spinning ideas like "eyes can't see the difference" (even though we jumped all over Retina HD when Retina (non-HD) was spun as the max eyes can see).

Much like it was before Apple got around to embracing 1080p, Apple doesn't have 4K now, so "we" can find all kinds of flaws with it. But once Apple formally endorses it, we'll forget all such arguments against it. There were very passionate pro-720p, anti-1080p arguments spun before Apple endorsed 1080p (including some of the very same ones that have been repurposed into this thread). Then Apple endorsed 1080p and all of that evaporated. No one called Apple stupid for going 1080p, the whole internet did not crash, somehow those interested in 1080p found a way to store those "huge" files, "the chart" stopped being slung around, and the 1% who wanted 1080p (because apparently "99% did not") were apparently so enthusiastic about buying it that they motivated just about every offering in the iTunes Store to get a 1080p version.

Spot on. As always with your posts. Particularly the last part. These "Apple enthusiasts" and their constant attacks on everything new, useful and (still-)not-implemented-by-Apple are really tiresome.

----------

I'd much prefer someone fix motion blur

It's exactly motion blur that makes 24p cinema cinema-like. A filmmaker will never want to get rid of it - exactly the opposite. They oppose everything that could make movies "video-like", that is, shot with a much higher (say, 1/60s or even higher) shutter speed.

Of course, in a lot of other areas, motion blur is something to avoid; for example, when shooting a sweep pano video for later frame-grabbing and high-res still pano creation. Individual grabbed frames will be blurred even at 30p if the shutter speed can't be manually set to at least 1/250s, assuming wide-angle shooting.
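
For anyone wondering where those shutter numbers come from, here's a rough sketch of the arithmetic - the 180-degree shutter rule plus a crude pan-blur estimate; the 30°/s pan speed and 70° field of view are made-up example values, not figures from this thread:

```swift
import Foundation

// The 180-degree shutter rule: exposure time = 1 / (2 x frame rate).
func cinematicExposure(fps: Double) -> Double {
    return 1.0 / (2.0 * fps)
}

// Rough blur width, in pixels, smeared across a frame while panning.
// The pan speed and field of view below are illustrative assumptions.
func panBlurPixels(exposureSeconds: Double, panDegreesPerSecond: Double,
                   horizontalFOVDegrees: Double, frameWidthPixels: Double) -> Double {
    let degreesSwept = panDegreesPerSecond * exposureSeconds
    return degreesSwept / horizontalFOVDegrees * frameWidthPixels
}

print(cinematicExposure(fps: 24))   // ~0.0208 s, i.e. 1/48 s - the "filmic" exposure for 24p

let blurAt48 = panBlurPixels(exposureSeconds: 1.0 / 48.0, panDegreesPerSecond: 30,
                             horizontalFOVDegrees: 70, frameWidthPixels: 3840)
let blurAt250 = panBlurPixels(exposureSeconds: 1.0 / 250.0, panDegreesPerSecond: 30,
                              horizontalFOVDegrees: 70, frameWidthPixels: 3840)
print(blurAt48, blurAt250)   // ~34 px of smear vs ~7 px - why frame grabs want a fast shutter
```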
 
Which programs have you watched in 4k on a TV screen that are 'far superior'?

I think you can buy hard drives with 4K content on them... That's probably the only thing that would make 4K look decent, certainly not a compressed Netflix stream...

Anyway, the closest you can sit to a 1080p screen and both fill your field of vision and not see pixels with 20/20 vision is, I think, 5.5 feet for a 51-inch set (and that's the closest figure by far; other recommendations put you further away).

With a 4K screen you'd need to be 2 feet 8 inches away to not see the pixels with 20/20 vision, and at that range you're NOT seeing the whole screen no matter how good your vision is. For a computer screen that's not a problem, since you're not necessarily supposed to see the whole screen at once, but for a movie or TV it is a problem.

Motion blur is a lot worse on an LCD (especially when it's that close) than on a plasma; not sure how it is on those 4K sets, but I doubt they've improved it much if they're focusing so much on resolution.
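
Those distance figures fall out of assuming 20/20 vision resolves about one arcminute; here's a rough sketch of the geometry (a 16:9 panel is assumed, and published charts use slightly different acuity criteria, which is why their exact feet differ a bit from the numbers above):

```swift
import Foundation

// Distance at which one pixel subtends one arcminute - a common stand-in for 20/20 acuity.
func pixelLimitDistanceInches(diagonalInches: Double, horizontalPixels: Double) -> Double {
    let screenWidth = diagonalInches * 16.0 / sqrt(16.0 * 16.0 + 9.0 * 9.0)   // 16:9 assumed
    let pixelPitch = screenWidth / horizontalPixels
    let oneArcminuteRadians = (1.0 / 60.0) * Double.pi / 180.0
    return pixelPitch / tan(oneArcminuteRadians)
}

let hd = pixelLimitDistanceInches(diagonalInches: 51, horizontalPixels: 1920) / 12.0
let uhd = pixelLimitDistanceInches(diagonalInches: 51, horizontalPixels: 3840) / 12.0
print(hd, uhd)   // ~6.6 ft for 1080p, ~3.3 ft for 4K on a 51-inch panel
```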

----------

It's exactly motion blur that makes 24p cinema cinema-like. A filmmaker will never want to get rid of it - exactly the opposite. They oppose everything that could make movies "video-like", that is, shot with a much higher (say, 1/60s or even higher) shutter speed.

Of course, in a lot of other areas, motion blur is something to avoid; for example, when shooting a sweep pano video for later frame-grabbing and high-res still pano creation. Individual grabbed frames will be blurred even at 30p if the shutter speed can't be manually set to at least 1/250s, assuming wide-angle shooting.

The motion blur from plasma is massively different (and more cinema-like) than the crap that comes from an LCD. Not sure why that is. But it is terrible. Watching hockey or soccer on an LCD is a horrendous experience on every TV of that type I've ever seen, no matter how much it cost.
 

Dunno - IMHO, LCDs can render motion blur as accurately as analogue film. That is, the rendering quality on both film and LCD depends only on the shooting parameters. Most importantly, the shutter speed: anything over 1/24s will likely result in a video-like, non-cinematic look. Exactly the one you've talked about: sports events are shot with very high (much, much higher than 1/24s) shutter speeds. This is why they aren't at all film-like. They aren't supposed to look like film.
 
4K video? Wait until Apple provides the iPhone with 64GB as the entry level.
With 16GB of space, it is pretty much useless.

No, not really. 4K means H.265 streaming (not to be confused with H.264). The iPhone 6 is the only phone to support H.265 encoding and decoding :)
 
Exclusivity doesn't work and it'll just end up pissing off customers and driving them to competitors. They need to roll it out across all flagship products. I was looking at getting the new mini 3 until I found out they gimped it by not including the 2GB of RAM and the tri-core CPU from the iPad Air 2.

While I was also disappointed by Apple's departure from feature parity between the iPad mini & the full-size iPad...
When it comes to split screen, our opinions are vastly different.
I think multiwindowing ONLY makes sense on a tablet-size device. I have ZERO interest in a craptastic Samsung-like "we did it because we could, not because we should" experience on my iPhone. At all. Ever. Yuck.
 
All of you do know that this isn't that earth-shattering, right? Mobile devices have been able to play 4K video for a while...
 
Ah

I can definitely see the difference.

4K could almost be an addiction for me - I had laser eye surgery last December, and the day after, my eyes went from 20/200 to 20/15 and everything had an almost shocking clarity I had never known before.

Let me tell you, it was a big surprise that all these years my prescription glasses were really just an "approximation" of my "real" prescription.

So looking at older resolutions just looks silly to me.

I can even tell that the 4K videos I dropped onto my iPhone 6+ look slightly nicer than the 1920x1080 HD video, but remember too, many of the "demo videos" are being recorded with extremely high-grade pro cameras and lenses.

Even if your 6 could record 4K, the lens isn't going to rival the real pro-grade cine stuff.
 

Give me a fracking break... Unless you sit 3 feet away from your 50-inch TV, you won't see the difference. As for on a phone, it's even more ludicrous.

One thing that's not mentioned in these TV assessments is that 4K TVs are often better-quality TVs for reasons other than resolution: contrast, color range, black level, color accuracy, etc. All of that can make the image better regardless of resolution. Many people mistake improvements in any of these factors for improvements that come from the resolution.
 
*yawn*

1. Look, I'm into 4K, technique, recording, and following EOSHD and other indie communities. You obviously aren't. That's why I don't even bother presenting you with links - do your homework yourself.

2. And surely you bravely fought against 1080p while Apple was still stuck with 720p, didn't you?

For almost all consumers, televisions are for viewing/streaming broadcast content. If you are claiming a far superior viewing experience, then surely you have some recommendations for particular, currently available content that looks superior on a 4K television.
 
I think you can buy hard drives with 4K content on them... That's probably the only thing that would make 4K look decent, certainly not a compressed Netflix stream...

Anyway, the closest you can sit to a 1080p screen and both fill your field of vision and not see pixels with 20/20 vision is, I think, 5.5 feet for a 51-inch set (and that's the closest figure by far; other recommendations put you further away).

With a 4K screen you'd need to be 2 feet 8 inches away to not see the pixels with 20/20 vision, and at that range you're NOT seeing the whole screen no matter how good your vision is. For a computer screen that's not a problem, since you're not necessarily supposed to see the whole screen at once, but for a movie or TV it is a problem.

Motion blur is a lot worse on an LCD (especially when it's that close) than on a plasma; not sure how it is on those 4K sets, but I doubt they've improved it much if they're focusing so much on resolution.

----------



The motion blur from plasma is massively different (and more cinema-like) than the crap that comes from an LCD. Not sure why that is. But it is terrible. Watching hockey or soccer on an LCD is a horrendous experience on every TV of that type I've ever seen, no matter how much it cost.

The 'motion blur' problem comes from SD content being upscaled to HD, and/or from compression of the video stream.

Upscaling from HD to 4K is a 4x increase in the number of pixels, which isn't going to help when one pixel is smeared across 4 for playback.

There really isn't any advantage to 4K for any television under 65" since the average viewing distance in most households is 7-10 feet. I'm all for higher resolution computer screens because you sit much closer and don't need to view the entire screen at once, but for Television, it is pointless.

That said, Samsung's use of 3D/4k technology to display two different programs on the same screen at the same time is a practical use for 4K technology - particularly with multiplayer gaming.
 
For almost all consumers, televisions are for viewing/streaming broadcast content. If you are claiming a far superior viewing experience, then surely you have some recommendations for particular, currently available content that looks superior on a 4K television.

There was exactly the same debate with HD, and people like you were claiming that HD was useless because there was no content... Actually, we had the same debate when moving from 4:3 to 16:9: all the broadcast content was in 4:3, so why bother with 16:9? We didn't have the color vs. black & white debate here because we only had state TV back then (so the switch was instant and total), but you probably did.

Yet now we have plenty of HD content, it is in 16:9, and you even have color. This is a chicken-and-egg problem: you won't have content until you have a market, but once you have a market, you will have content.

And the content is already there: many movies have a 4K version (we have 12 screens showing 4K movies in the city and suburbs), and TV channels already capture and master in 4K... Moreover, the switch is easier than before. When TV was analog, moving to color or HD was difficult. Nowadays, you just have to upgrade an Internet box and you can have 4K content; you just need the bandwidth (which is less and less of a problem with FTTH).

As for particular content, if you happen to be in Paris during the spring, you should be able to watch the Roland Garros tennis championship in 4K, as we did last year. In the meantime, we have two TNT channels broadcasting sample 4K content full time. We will also have the soccer World Cup (or something; I'm not a fan) in 4K.
Very high-bandwidth 4K makes a lot of sense for sports, since sports often have a lot of very high-frequency content (grass, spectators...) that does not look good at low bandwidth (artifacts) or low resolution (aliasing).
 
The 'motion blur' problem comes from SD content being upscaled to HD, and/or from compression of the video stream.

Errr... No. You're confusing several different kinds of blur.

Motion blur happens at capture, when you're using a shutter speed that is too slow for the action in the scene. Each moving object will then have its edges and details blurred in the direction of the movement.
This is actually a good thing in cinema, because our eyes are also subject to motion blur, so they expect moving objects to be blurred. This is why cinema, despite being 24 fps, feels smooth. A 24 fps video game or a video with no motion blur feels just awful. This is also why video shot at high shutter speeds (as in sports) has that unreal feeling.
Most TVs feature 100Hz-or-higher modes that try to remove motion blur - all of them just butcher movies and make them look like amateur home video.

What you are describing is in fact what GPUs do when you play games with anti-aliasing activated: the scene is rendered at a higher resolution than the output and then subsampled with specific algorithms. This actually results in a greater feeling of definition and fewer aliasing artifacts. If the 4K-to-HD resampling is done with the proper algorithms, you will get a better sense of resolution and fewer artifacts in high-frequency scenes.
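
A minimal sketch of what that kind of "proper" resampling does - just a 2x2 box filter over a grayscale buffer; real scalers use fancier kernels (bicubic, Lanczos), so treat this purely as an illustration of the averaging:

```swift
// Minimal 2x2 box-filter downsample of a grayscale buffer (row-major).
// It only illustrates how four 4K source pixels get averaged into each HD
// output pixel instead of being thrown away.
func boxDownsample2x(_ src: [Double], width: Int, height: Int) -> [Double] {
    precondition(width % 2 == 0 && height % 2 == 0 && src.count == width * height)
    var dst = [Double](repeating: 0, count: (width / 2) * (height / 2))
    for y in 0..<(height / 2) {
        for x in 0..<(width / 2) {
            let a = src[(2 * y) * width + 2 * x]
            let b = src[(2 * y) * width + 2 * x + 1]
            let c = src[(2 * y + 1) * width + 2 * x]
            let d = src[(2 * y + 1) * width + 2 * x + 1]
            dst[y * (width / 2) + x] = (a + b + c + d) / 4.0   // average the 2x2 block
        }
    }
    return dst
}

// A 4x2 patch of alternating fine detail becomes two smooth half-grey samples.
print(boxDownsample2x([0, 1, 0, 1,
                       1, 0, 1, 0], width: 4, height: 2))   // [0.5, 0.5]
```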

Likewise, some people here speak of motion blur when comparing plasma and LCD. Now, plasma is a great (but sadly defunct) technology - owning a plasma display is part of the reason I'm holding off on the move to 4K right now. But motion blur is not one of those reasons - motion blur happens at capture, inside the camera, not on your TV. What happens on your TV is ghosting, caused by the slow response time of LCDs. Contrary to motion blur, ghosting does not feel natural, because it produces a lot of unnatural artifacts (such as color shifts) and because it also happens where motion blur doesn't (for instance, during cuts or with scrolling text). Plasma is simply better because it has more natural color, less artificial sharpness, more contrast...

Upscaling from HD to 4K is a 4x increase in the number of pixels, which isn't going to help when one pixel is smeared across 4 for playback.

Yes, it does, just like activating AA in a game helps.

There really isn't any advantage to 4K for any television under 65" since the average viewing distance in most households is 7-10 feet.

You don't have kids, do you? I can assure you their viewing distance is arm's reach...
There were old recommendations to watch TV from very far away - but they're old, dating back to the days of CRTs and flicker. Now you can, and should, sit around 1.5 m from an HD TV (less than 5 ft, if my conversion is correct).

We had the same misconception in photography. People used to think that the bigger the photo, the farther away people would stand to look at it, and thus that you didn't need to increase output resolution as you went bigger. But experience proved that wrong: display a 1.5 m x 1 m print on plexi and you will see people get as close to it as their eyesight allows, because when it's big they want to see the details...

I'm all for higher resolution computer screens because you sit much closer and don't need to view the entire screen at once

When you're sitting in the front row at the cinema, you don't see the whole screen either and it's not a problem. The screen fills more of my vision at the cinema than my monitor does at home - because the goal at the cinema is immersion.
 
4K is a very promising standard. There is no question that 4K content on a 4K display is superior in quality, especially at closer distances. The first downloadable 4K movie, from the awesome timescapes.org, is about 160GB and is unbelievably crisp and clear on a 4K display. The only question I have is how long it is going to take for the 4K standard to be adopted by broadcasters and ISPs. How long is it going to take for the infrastructure to be able to support such a demanding standard? How long are we going to endure watching 1080p content upscaled on 4K displays? 2... 3... 4 years? This interpolation of unknown intermediate pixels (upscaling) on a 4K display is something I would not want to put up with for a long period of time.

Not to mention, 4K has yet to be fully standardized, which could make current 4K displays obsolete once it is. Native 4K is beautiful for sure, but if you are shopping for a new TV, I think it would be wise to hold off on 4K for a while until these questions are answered. If someone can shed light on these issues, I would very much be interested in what they have to say, because I really want to buy a 4K TV :D
 
Not a big surprise for me. Can we use it as a media player?! lol

You most definitely can - I've just finished testing:

- all three 4K OOC MP4 clips from http://www.dpreview.com/reviews/panasonic-lumix-dmc-lx100/8

- my own 4K recordings from the Samsung Note 4.

Players allowing for direct hardware playback of iOS-native files (mp4 / m4v / mov) - nPlayer and many more; I conducted the tests in the latest nPlayer version - play back these files just fine on my Air 2.

And, what's more, A7-based devices will also play back these 4K video clips just fine (I've tested it on my rMini). This info could be added to the OP.

Going back in time, it's A6-based devices (I've tested on my iPhone 5, which, like all my other A6+ iDevices, runs iOS 8.1) that are unable to properly play back 4K videos.

All in all, apart from the lack of 4K output support to external monitors (Apple is almost two years behind the already MHL 3.0-compliant competition in this respect), you can use any of your A7- or A8-based iDevices for decoding & playing back 4K content.
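
For the curious, a minimal sketch of what such playback boils down to on the Apple side - this is not nPlayer's actual code, just stock AVFoundation with a placeholder file name; the OS decides on hardware vs. software decoding, which is why the chip generation matters in the tests above:

```swift
import AVFoundation
import UIKit

// Minimal playback sketch: hand a local .mp4/.mov to AVPlayer and let iOS pick the decoder.
// Whether a given chip keeps up with a 4K stream is exactly what the A6/A7/A8 tests probe.
// "clip-4k.mp4" is a placeholder name for a bundled test clip.
final class FourKPlayerViewController: UIViewController {
    private var player: AVPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let url = Bundle.main.url(forResource: "clip-4k", withExtension: "mp4") else { return }

        let player = AVPlayer(url: url)
        let layer = AVPlayerLayer(player: player)
        layer.frame = view.bounds
        layer.videoGravity = .resizeAspect
        view.layer.addSublayer(layer)

        player.play()
        self.player = player
    }
}
```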
 
There are 4K/60p TV channels here in Japan, available over satellite and FTTH connections. The bitrate is around 40 Mbps H.265. Other than video on demand, they are claiming the channels are in testing, but anyone with a capable TV can watch. Official broadcasting begins in the spring. The tuners require an HDMI 2.0 connection for enough bandwidth to the TV, so the real issue is that the oldest TVs capable of displaying the signal are only 6 months old.

 
Interesting. Now, what impact do the increased bitrate demands have on the overall performance of the internet infrastructure? Will the infrastructure need to be modified and improved? With compression, how much of a loss in 4K quality will we see? As far as channels go, in the USA I believe DISH Network will launch one or two 4K channels soon, which I think is nothing. How long until we have ESPN, AMC, Discovery, NatGeo, HBO and the like broadcast in 4K? Something tells me it'll be at least a year or two.
 
My FiLMiC Pro app just got updated to 2K; not sure if it's useful yet, but I will test it over Thanksgiving.
 

Interesting. Now, what impact do the increased bitrate demands have on the overall performance of the internet infrastructure? Will the infrastructure need to be modified and improved? With compression, how much of a loss in 4K quality will we see?

If they switch to H.265, even 70 Mbps would be sufficient for studio-grade stuff. (After all, the new NX1 records H.265 4K at 70 Mbps, as opposed to the H.264-only GH4, which records at 100 Mbps for the same quality.)

With H.264, 100 Mbps would be more than sufficient.

Of course, broadcasts will surely use lower bitrates. However, I don't think they'd sacrifice much image quality.
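
To put those bitrates into data-volume terms (simple unit conversion, nothing more):

```swift
import Foundation

// Pure unit conversion: how much data a given video bitrate produces per hour.
func gigabytesPerHour(megabitsPerSecond: Double) -> Double {
    return megabitsPerSecond / 8.0 * 3600.0 / 1000.0
}

print(gigabytesPerHour(megabitsPerSecond: 70))    // ~31.5 GB/h for 70 Mbps H.265
print(gigabytesPerHour(megabitsPerSecond: 100))   // ~45 GB/h for 100 Mbps H.264
print(gigabytesPerHour(megabitsPerSecond: 40))    // ~18 GB/h for the 40 Mbps Japanese broadcast
```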
 

Apple almost never leads where standards are in transition, but when they believe the industry is holding them back, they create their own solutions.

4K is still in transition. There's no loss to Apple in holding off support on mobile until it settles. See NFC and how well that has played out for Apple.

Apple doesn't need to lead the feature race, but when they enter, every current device will have it, and it will probably be the most elegant solution.
 