Excuse me for butting into this side conversation, but I've been reading it and find it confusing. The original film material (assuming it is a film source) is 24fps. Prior to color NTSC, with monochrome TV (in the US), the rate was 30fps (60i), and 3:2 pulldown converts without problems. With color NTSC the frame rate was lowered to 29.97 so the new color subcarrier wouldn't interfere with the sound carrier. The conversion was then done by slowing the film to 23.976 and doing the pulldown. So it seems to me that material with an original film source should actually be viewed at 24fps to be true to the original. Material with an original NTSC camera source should be viewed at 23.976. European DVDs would present a problem -- do they just run the film ~4.2% faster to match the 25fps frame rate? In that case they should be viewed at 24, not 25. And what frame rates are actually used now that NTSC is obsolete? I noticed that my video cam records at 29.97 (why?). I also noticed that the new broadcast standard allows both 24/30 and 23.976/29.97, but what is actually used? What does Blu-ray use, since it doesn't have to drive NTSC receivers?
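If it helps, here's how I understand the numbers, as a quick Python scratchpad - my own arithmetic, so correct me if I've misread the standards:

```python
FILM = 24.0                  # native film rate
NTSC_FILM = 24000 / 1001     # film slowed ~0.1% for color NTSC
NTSC_VIDEO = 30000 / 1001    # ~29.97 fps (59.94 fields/s interlaced)
PAL = 25.0

# 3:2 pulldown: every 4 film frames become 10 interlaced fields (3+2+3+2),
# so 24fps maps cleanly onto the old 60-field monochrome rate.
print((3 + 2 + 3 + 2) * FILM / 4)                      # 60.0 fields per second
print(f"{NTSC_FILM:.6f}")                              # 23.976024
print(f"PAL vs film: +{(PAL / FILM - 1) * 100:.2f}%")  # +4.17%
```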
It's a valid question, and I'm not trying to start a debate on what's "right" and what's "wrong," but rather what "is."
What "is" is that Blu-rays and DVDs (99.9% of the time, and excluding some oddities in a few Blu-ray releases from Japan/Australia etc.) are encoded at 23.976Hz. So OUR SOURCE is the Blu-ray - at 23.976Hz. The original FILM SOURCE will almost certainly have been 24.000Hz - or in other words 24fps exactly - but that's not the source that we as consumers have access to. 24.000Hz is not the NTSC standard to which all broadcasts (in the US) adhere, and that's why we get 23.976Hz. Is the difference tiny? Absolutely. Does it make a noticeable difference to playback of a movie? Nope (it's essentially slowed down 0.1% for NTSC). Would it make a difference in OS X? YEP! Because you'd eliminate the stutter - a repeated frame roughly every 41.7 seconds - that occurs when playing 23.976Hz material at OS X's 24p (which, as stated, is 24.000Hz, not the standard 23.976Hz).
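If you want to see where that ~41.7-second figure comes from, here's the arithmetic as a throwaway Python sketch (just the numbers, nothing specific to any player or OS):

```python
# A 24.000Hz display consumes frames 0.1% faster than a 23.976Hz source
# supplies them, so the player has to show some frame twice now and then.
display_hz = 24.0
source_fps = 24000 / 1001        # 23.976024...

drift = display_hz - source_fps  # surplus refreshes per second
print(1 / drift)                 # 41.708... seconds between repeated frames
```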
In an "ideal" world we'd get 24Hz-recorded material on Blu-ray etc. at exactly 24p (24.000Hz). It's REALLY not a big deal, since a lot of modern TVs and projectors can display at 23.976Hz, so the difference between the movie as originally filmed and the movie as you watch it at home on a 23.976Hz-capable display is exactly a 0.1% slowdown. The bigger deal is that the Mac mini can't do 23.976Hz, which in this day and age of home theater is a joke. It undermines a huge reason for having a mini if you're a movie nut (or TV nut, or DVD nut). (I say "ideal" because you also need to look at 23.976Hz as being compatible with many other standards beyond the US, which is why it's used - but that's another discussion.)
So, for giggles, think of it this way (if I have my calculation right):
When I watch a 2-hour 24p Blu-ray movie (i.e. 23.976Hz) on my 23.976Hz-capable projector with my Windows 7 Intel NUC, it takes 2 hours to watch it. The source matches the output perfectly.
When I watch that same 2-hour 24p Blu-ray movie (i.e. 23.976Hz) on my 24.000Hz-only-capable Mac mini to my projector (which can also output at 24.000Hz), there will be a repeated frame - a visible stutter - roughly every 41.7 seconds. In a 2-hour movie that's about 173 repeated frames, and 173 frames at 24fps is about 7.2 seconds. If my math is correct, those repeats (one frame at a time, every 41.7 seconds) are exactly the padding that keeps the runtime at 2 hours even though the display refreshes 0.1% fast.
So, if you instead speed the movie up 0.1% to match the frame rate of your Mac mini, you drop that 7.2 seconds of padding and would actually finish that 2-hour movie 7.2 seconds before I do! So there you go. There's an argument for using the Mac mini at 24.000Hz: you can watch more movies than I can in the same period of time.
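For anyone who wants to check my sums, here's the same calculation in a few lines of Python (plain arithmetic, no assumptions about player behavior):

```python
runtime = 2 * 60 * 60               # 2-hour movie, in seconds
interval = 1001 / 24                # ~41.7 s between repeated frames (see above)

repeats = runtime / interval
print(round(repeats))               # ~173 repeated frames over the movie
print(repeats / 24)                 # they pad the runtime by ~7.2 seconds
print(runtime * (1 - 1000 / 1001))  # the same ~7.2 s saved by a straight 0.1% speed-up
```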
When it comes to PAL, there actually aren't very many 25p movies/TV shows out there. Every movie I've bought from the UK, for example, is a 24p disc (i.e. 23.976Hz). But yes, genuine 25p material would run ~4.2% fast relative to the film rate.
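And to round it out, the same back-of-the-envelope treatment for the 25p PAL speed-up, using a hypothetical 2-hour film:

```python
runtime = 2 * 60 * 60                # 2-hour film at its native 24fps
pal_runtime = runtime * 24 / 25      # the same frames played out at 25fps

print((25 / 24 - 1) * 100)           # 4.1667 -> the ~4.2% speed-up
print((runtime - pal_runtime) / 60)  # 4.8 -> finishes ~4.8 minutes early
```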