I hope someone can clear this up. I know that a Blu-ray player attached to a TV via HDMI can adjust its output refresh rate to match the content, so movies play at 24 fps and TV content at 60.
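To illustrate why I care about the rates matching: when the panel's refresh rate isn't an even multiple of the content's frame rate, frames have to be shown for an uneven number of refreshes (the classic 3:2 pulldown judder). Here's a quick sketch of that cadence (the `repeat_pattern` helper is just my own illustration, not from any player):

```python
from fractions import Fraction

def repeat_pattern(content_fps, display_hz, frames=8):
    """How many display refreshes each of the first few source frames occupies."""
    ratio = Fraction(display_hz) / Fraction(content_fps)  # refreshes per source frame
    counts, shown = [], 0
    for i in range(1, frames + 1):
        total = int(i * ratio)  # refreshes consumed after source frame i
        counts.append(total - shown)
        shown = total
    return counts

print(repeat_pattern(24, 60))  # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven 3:2 cadence
print(repeat_pattern(30, 60))  # [2, 2, 2, 2, 2, 2, 2, 2] -> perfectly even
```

So 30 fps material sits cleanly on a 60 Hz panel, while 24 fps material judders unless the display drops to 24 Hz, which is exactly what a Blu-ray player does.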
What happens when I'm playing video on my rMBP? I have files whose metadata shows a frame rate of 24 (actually 47.952024) and others 29.970030. If I play these in VLC or QuickTime, what rate is actually being displayed on the LCD panel? How can I tell? Is it different in full-screen mode?
What about when I output to my Dell U2711 over DisplayPort or HDMI? Does either connection dynamically adjust the refresh rate? Is the player interpolating frames to match the monitor's rate? Again, is there somewhere I can see this information?
I've searched all over and haven't found any information on this topic.
Thanks for any info!
David