I had to go to a "business plan" at home to get anything that was uncapped.

$130/mo for 100/10, but crucially no data cap (which I was hitting literally every month on the highest residential plan of 200/20).
 
I don't like the "soap opera look" either, but I watched a couple of movies and shows on mine through Plex and Hulu and they were perfectly fine to my eyes. I suspect it will be fine for most.
 
This is completely true: 24p was originally chosen for economy. It was a trade-off, giving the lowest number of frames that still produced smooth motion, and it's now just a legacy frame rate that has stuck.

edit: If this guy's correct, I stand corrected - it was to do with the audio!


Don't worry about it. This thing didn't become the standard for one single reason, though economics is a prime one. Shooting film at higher speed not only wastes celluloid unnecessarily, which costs money, it also costs more for equipment. The camera, tripod, crane, batteries etc. all have to be bigger (and made much more durable), need more power, and need more people to handle, all for a benefit almost no one can notice. In those early days the camera was already too big (just to hold a stable speed that could sync with sound). It was an easy and painless choice.
 
It's not doing anything with them apart from dropping them at the source. There is always a Dolby Digital or DTS backup track, which is what it's sending instead.
Think so? Even if I exclude the core track from the MKV when ripping my blu-rays?
 

Try it and see - you'd effectively be breaking every standard going by doing that, but I suspect you might well get silence, as it won't see a capable audio track to play. I'd be interested to see what happens if you give it a whirl.
 
Unless one owns a reasonably recent plasma, OLED or latest-generation FALD LCD, is 24p really the main worry in terms of achieving a truly cinematic experience?
 

Probably more so as I imagine the newer devices are better at reverse telecine and the like. It makes a huge difference on my 2006 Samsung.
 
Try it and see - you'd effectively be breaking every standard going by doing that, but I suspect you might well get silence, as it won't see a capable audio track to play. I'd be interested to see what happens if you give it a whirl.
I re-ripped two discs last night, one with DTS-HD 7.1 and the other with TrueHD 7.1, and excluded the core in both cases in MakeMKV. They must have been excluded because in both cases the files were 600MB smaller than the rips they replaced. Both played back in surround on the Apple TV with Plex set to "Unlimited 1080p". How, I do not know. I'm sure the Plex server was doing something on its end to make those tracks work on the Apple TV - but whether that process results in any loss of audio quality or if it's still a lossless stream - who knows!

One quirk however was that there was a faint popping/clicking sound coming from the speakers with the TrueHD 7.1 track. Don't know what that was all about and didn't have a chance to do any further testing to see if it's specific to that film or what.
 

Pretty interesting. I think Plex can transcode audio, so it may well have been doing something - I know there were options for forcing Dolby Digital on the old Apple TV etc.

As an audio engineer, though, I'd challenge anyone to reliably choose the TrueHD over a Dolby Digital or DTS soundtrack blindfolded. I highly doubt anyone could, even on the best equipment.
 
Probably more so as I imagine the newer devices are better at reverse telecine and the like. It makes a huge difference on my 2006 Samsung.

I meant in terms of black level, color accuracy and gamut, viewing angles, motion resolution and the like.

In other words, if you own an LCD, 24p should be among your last worries.
 

Well yes, of course - you pick the one with the best screen before you go buying one for features. But one with a good screen will have 1080p24 support, so it's a moot point. If two screens were otherwise the same and one had it and one didn't, you'd pick the one that did - but again, it's a moot point because all good screens have it. If you're picking a TV based on those things, you'll get it; if you're just buying a cheap TV, you're not bothered anyway. Strange thing to say, really.

(1080p24 falls far above any sort of smart TV features on my list, mind. Anything to do with picture quality does - I'd rather buy the best "dumb monitor" you could buy than have that horrible smart TV junk thrown in.)
 
Sadly, a great "dumb monitor" is the one thing the TV market is completely lacking and sorely needs. Everyone's so busy trying to one-up one another on meaningless specs and features (curved? 4K? "magic" remotes?) that shopping for a TV has become a matter of trying to find the least awful television that happens to have a good picture almost by accident.

I honestly believe that if someone decided to build a 50", 1080p, Flat OLED with factory-calibrated picture out of the box and zero "smart" features, "enhancers" or bogus picture modes - and sell it for under $2k - they could mop the floor with all of their competitors.
 

Totally agree. Pack in about six HDMI ports with adjacent USB power for them all and it's a cord-cutter's dream.
 
I honestly believe that if someone decided to build a 50", 1080p, Flat OLED with factory-calibrated picture out of the box and zero "smart" features, "enhancers" or bogus picture modes - and sell it for under $2k - they could mop the floor with all of their competitors.

I'd say it would have to be under $1K to "mop the floor" considering you can get pretty decent 4K LCDs for under $1K (e.g. Vizio M-series) that have a picture quality that's good enough for most people. However, yes, I'd love to be able to buy a factory calibrated, "dumb" TV that has great picture quality.
 
Sadly, a great "dumb monitor" is the one thing the tv market is completely lacking and sorely needs. Everyone's so busy trying to one-up one another on meaningless specs and features (curved? 4K? "Magic" remotes?), shopping for a tv has become a matter of trying to find the least awful television that basically happens to have good picture almost by accident.

I honestly believe that if someone decided to build a 50", 1080p, Flat OLED with factory-calibrated picture out of the box and zero "smart" features, "enhancers" or bogus picture modes - and sell it for under $2k - they could mop the floor with all of their competitors.

Yeah, the only problem is that the OLED bit is the expensive bit; all the superfluous crap they throw in costs peanuts. The dev team's wages cost more than including the "smart" software on the TVs - and considering they get to roll that rubbish out on every TV they release, they get great value for it.

What baffles me too is that Samsung seems to release about 20 models a month. There's no value to their sets anymore because there are too many of them. On Amazon the other day they had a comparison of four different 4K models from Samsung, all 65", all released in 2015, all more or less £1600. Ridiculous. They must have 30 4K models on the market right now, and each model comes in at least 3-4 sizes. I've no idea why they want to manage so many SKUs, other than the idea being to baffle the consumer so there's no must-have model and people don't wait for new versions like they do with phones.
 
Yeah, the number of models is truly ridiculous. The TV market really needs the "Casper Mattress" of televisions: someone who figures out the absolute best TV to make, and then just makes that one model in various sizes.

OLED is expensive technology, but it's hands-down the best and will certainly get cheaper (there are already 55" LG models selling for under $1,800 US). It's true they must get good "value" for all of that "smart" software, but imagine the savings of just not doing it at all and eliminating the 40 TV models in favor of one. It can't be cheap to develop and beta test all of those apps and "features", half-assed as they are - not to mention all of the tech support calls that must go along with that garbage, seeing as how rarely it actually works.
 
I was also hoping that the new ATV would have an option for 1080p24 output, but there aren't that many of us who care.

It may affect those of us from PAL countries more, as we're used to movies sped up to 25fps and played back without judder at 50Hz. In NTSC countries 3:2 pulldown has been part of the experience of watching movies at home for a long time. I have to admit that the more I've watched movies on the ATV rather than on Blu-ray, the less I notice it. (But then I notice the lack of judder when I go back to Blu-ray.)
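To put rough numbers on that PAL speedup, here's a quick Python sketch (the two-hour runtime is just a made-up example):

```python
import math

# PAL speedup: 24fps film played back at 25fps runs ~4% fast.
SPEEDUP = 25 / 24

film_minutes = 120                    # hypothetical two-hour movie
pal_minutes = film_minutes / SPEEDUP  # shorter runtime at 25fps
print(round(pal_minutes, 1))          # 115.2 minutes

# Audio pitch rises by the same ratio unless it's corrected:
semitones = 12 * math.log2(SPEEDUP)
print(round(semitones, 2))            # ~0.71 semitones sharp
```

So a PAL viewer loses nearly five minutes off a two-hour film, and everything sounds slightly sharp unless the audio is pitch-corrected.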

Reverse/inverse telecine from 1080p60 isn't all that common - recent Sony and Vizio panels and the JVC projectors seem to do it, but I don't know how well they detect and lock onto the cadence. Obviously it's easier if you can just get a 1080p24 source in the first place.

If the ATV4 had gone to 4K we might have had more luck as I imagine they would have had to support HDMI 1.4 displays/receivers etc that do 4K24/25/30 but not 4K60.

With any luck it's something that can and will be added, though unfortunately it doesn't seem to be a priority.
 
Funnily enough, I just read this fantastic piece about some TVs having a "film" option that works without telecine judder if you feed them an interlaced rather than progressive signal. So choosing 1080i on a device will let a TV in film mode give you judder-free video from a 24fps source - perfect for things like Netflix.

Have a read - https://jordanbortz.wordpress.com/2...-netflix-and-cable-on-your-120hz-or-240hz-tv/

One problem with that on the Apple TV though: the Apple TV has no interlaced output modes! It's only 720p or 1080p, bah.

The only other way to avoid judder is if you have a TV that can run at a 120Hz refresh rate, as 24 divides evenly into it. However, I'm not sure whether the Apple TV would already spoil things by doing 3:2 pulldown (as it can only output at 50Hz or 60Hz) before the TV even got the signal. I don't know how that works - maybe it would be OK - but my TV is 100Hz only, no special 120Hz mode here.
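The "divides evenly" point is simple enough to sketch - a refresh rate can show 24fps material without pulldown only when it's a whole multiple of 24 (illustrative Python, not specific to any particular TV):

```python
# A refresh rate shows 24fps content judder-free only when each
# film frame can be held for the same whole number of refreshes.
def judder_free(refresh_hz, fps=24):
    return refresh_hz % fps == 0

for hz in (50, 60, 72, 100, 120, 240):
    print(hz, "OK" if judder_free(hz) else "needs pulldown")
# 72, 120 and 240 divide evenly by 24; 50, 60 and 100 do not.
```

Which is exactly why 100Hz-only sets are out of luck, while 72Hz and 120Hz modes work.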

I'm going to add the option of interlaced output modes to the Radar report as well. I expect to see a tvOS update with this stuff in; it's far, far too basic a device right now for anything other than a kid's bedroom.

This is genius :)

I've got a Pioneer 5090 plasma, and now with my Oppo Blu-ray player or PS4 set to 1080i, I can view Netflix judder-free (I had to set the screen to 72Hz too).

Shame we can't do this on the ATV4 :(
 
Sorry mate, but I've bitten my tongue a bit here. You're completely wrong in every reply, so it's comical that you've jumped to "educate yourself" and "ignorant" and also claimed it's your job.

24fps was settled on for film because of editing, not economics. They could have gone with 25 or 23 and economically it would have been pretty similar. When you're cutting film you could easily judge how much time you were cutting or adding - half a second is 12 frames, a quarter is 6 frames, an eighth is 3 frames, etc.
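Those editing fractions all land on whole frames at 24fps, which you can check in a couple of lines (just an illustration):

```python
from fractions import Fraction

FPS = 24  # frames per second

def frames_for(seconds):
    """Frame count for a duration, using exact rational arithmetic."""
    return Fraction(seconds) * FPS

print(frames_for(Fraction(1, 2)))  # 12 frames in half a second
print(frames_for(Fraction(1, 4)))  # 6 frames in a quarter second
print(frames_for(Fraction(1, 8)))  # 3 frames in an eighth
```

24 is divisible by 2, 3, 4, 6, 8 and 12, so an editor could count frames for all the common subdivisions in their head.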

In digital this is no longer needed, but 24p is now considered the cinema standard for aesthetic reasons. I'm not being funny, but even a simple Wikipedia search can teach you this:

"Originally, 24p was used in the non-linear editing of film-originated material. Today, 24p formats are being increasingly used for aesthetic reasons in image acquisition, delivering film-like motion characteristics." - Try to educate yourself https://en.wikipedia.org/wiki/24p

Please don't reply again and get something else wrong in this thread; it's embarrassing.

The original point remains: for HD content it's very disappointing that the 4th-gen Apple TV still doesn't support native 24Hz output for content that supports it. Such a simple thing to implement.

Well, I found this reply from bennynihon on the Apple Developer forums, which makes more sense to me than the two of you:
Film or movies are shot at 24 frames per second. Historically, a TV's VSYNC ran at 60Hz to match the AC frequency at the outlet (in North America), and the first content was shown at 60 interlaced fields per second. So VCRs and the first DVD players needed to fit 24fps into 30fps (then interlaced to make 60 fields per second). This requires repeating some frames more than others to fit 24 nicely into 30 (a process called telecine, or 2:3 pulldown). This leads to judder, which is most prominent in scenes that pan from side to side.

Eventually DVD players came out that offered 24p output (24 progressive fps), the native rate of the movie. This of course still requires a TV that can properly display 24fps without applying its own telecine. TVs with a 120Hz panel work (as 120 is exactly five times 24), as do 240Hz panels.

The problem is, the Apple TV, Fire TV, Chromecast and others always output at 60Hz. They don't support 24fps output, so we're stuck watching movies with crummy telecine applied. Again, this is a feature that has been available for 10 years on DVD players. There's no excuse not to have it in the modern boxes, as I'm sure the video chipsets they're using already allow for that frequency. Are there no movie buffs working at Apple? No self-respecting movie purist would use an Apple TV if it can't support a movie's native frame rate. It blows my mind that a $200 box doesn't support this. It's akin to a music player adding spurious samples into music to fit a sampling rate other than the file's original one.
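For anyone curious, the 2:3 pulldown he describes is easy to sketch: each group of four film frames becomes ten fields (2+3+2+3), so 24 frames fill exactly 60 fields per second (illustrative Python):

```python
def pulldown_2_3(frames):
    """Expand film frames into interlaced fields with a 2:3 cadence."""
    fields = []
    for i, frame in enumerate(frames):
        # Alternate holding each frame for 2 fields, then 3 fields.
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

one_second = [f"F{n}" for n in range(24)]  # 24 film frames
fields = pulldown_2_3(one_second)
print(len(fields))                         # 60 fields = 30 interlaced frames
print(fields[:5])                          # ['F0', 'F0', 'F1', 'F1', 'F1']
```

The uneven hold times are the whole problem: some frames stay on screen for two fields and others for three, and that irregular timing is what you see as judder in pans.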
 
Well, I found this reply from bennynihon on the Apple Developer forums, which makes more sense to me than the two of you

I don't know why - he's saying exactly what I've been saying the entire thread, and why I started it in the first place!
 


It is an odd thing for them to leave out, given Apple's usual attention to detail and quality. Here's hoping they see sense at some point and push an update to add it.
 
Question: if my TV has a setting for a 24fps mode, will the Apple TV bypass this and still output at 30+ fps?
 

The TV will use the 24p mode if the source it receives is 24p. The Apple TV will not output 24p so this mode will not be used by the TV.
 