
Super Xander

macrumors 6502
Original poster
Nov 6, 2016
313
131
Denmark
I have an iPhone X, which I know has a 60Hz screen that (I thought) can only show 30fps.

How can I see a difference between a 60fps video and a 30fps one if the screen caps out at 30fps?

Haven’t been able to find any info on the internet.

Thanks in advance.
 
I don’t know where you got your 60Hz = 30fps info, because it’s wrong.

60Hz = capable of showing up to 60fps. When you lose the 1:1 ratio is when you get judder/stutter, like on panning scenes in most movies, where the framerate is 24fps.
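To see why losing the 1:1 ratio causes judder, here’s a rough sketch (plain Python integer arithmetic, not any real video API) of how 24 source frames have to be spread across 60 refresh cycles. The 60 and 24 are just the numbers from the example above:

```python
# Sketch: mapping 24fps content onto a 60Hz display (the classic 3:2 pulldown).
REFRESH_HZ = 60
SOURCE_FPS = 24

repeats = []
for n in range(6):
    # First refresh cycle on which frame n appears, and the one where frame n+1 takes over.
    start = n * REFRESH_HZ // SOURCE_FPS
    end = (n + 1) * REFRESH_HZ // SOURCE_FPS
    repeats.append(end - start)

print(repeats)  # [2, 3, 2, 3, 2, 3]
```

Each film frame is held for either 2 or 3 refreshes, so frames stay on screen for uneven lengths of time; that uneven cadence is what shows up as judder on slow pans.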
 
I think I got it from somewhere, but you may be right.
In this case my iPhone X is able to show content at 60fps, while the ProMotion iPads can show up to 120fps because of their 120Hz displays?
 
For old, interlaced resolutions (which refresh half the screen’s lines at a time), that’s true: 60Hz = 30fps. But pretty much all displays today are progressive scan (refreshing the whole image at once).
Thanks, so what you’re saying is that recording 4K at 60fps will make the video show the same number of frames every second as the screen is able to display?
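Yes: generalising the arithmetic, on a 60Hz screen any framerate that divides 60 evenly gets a steady hold time per frame, and 60fps content maps 1:1. A small sketch (plain Python, illustrative only) comparing cadences:

```python
# How many refresh cycles each video frame occupies on a 60Hz display,
# for a few source framerates.
REFRESH_HZ = 60

def cadence(fps, frames=4):
    """Hold time (in refresh cycles) for the first few frames of fps content."""
    return [(n + 1) * REFRESH_HZ // fps - n * REFRESH_HZ // fps
            for n in range(frames)]

print(cadence(60))  # [1, 1, 1, 1]  one refresh per frame: perfect 1:1
print(cadence(30))  # [2, 2, 2, 2]  each frame held twice, but evenly: smooth
print(cadence(24))  # [2, 3, 2, 3]  uneven hold times: judder
```

So both 30fps and 60fps play back smoothly on the iPhone X’s 60Hz panel, and 60fps additionally gives you twice as many distinct frames per second, which is the difference you see.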
 