Hulu Begins Rollout of 60fps Live Streaming for Select Channels on iPhone, Apple TV, and More

Perhaps someone who knows more about this can answer: is over-the-air TV broadcast in 60fps? Or can it be?
Let's not think of broadcast TV today in terms of fps; this is video, not gaming. It is better to think of broadcast TV in terms of the refresh rate it broadcasts at, which is 60Hz. This is the NTSC standard and has been for years. (PAL is 50Hz, FYI, for our friends in Europe and other markets.)

I have the same question, will the actual broadcasts be in 60fps?

EDIT: ah OTA, afaik the signal is 30fps

Which makes me still wonder: are we actually getting 60fps from the camera all the way to the TV with this Hulu update?
All OTA (ATSC) is 60Hz (the same goes for satellite and cable). Some HD channels, like CBS and NBC, broadcast in 1080i. Others, like FOX and ABC, broadcast in 720p. Both are 60Hz, though. In the case of 1080i, the video is interlaced, meaning the motion information in each frame is shared between two fields; technically the frame rate is 30fps (well, 29.97), but there are still 60 fields per second. 720p is progressive video, so there are 60 full frames per second.
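
If the fields-vs-frames distinction is hard to keep straight, here is the arithmetic spelled out in a few lines of Python (nominal NTSC rates only; nothing here is specific to Hulu's service):

  # Spelling out the field/frame arithmetic above. Nominal NTSC rates only.
  NTSC_REFRESH = 60000 / 1001  # ~59.94 Hz

  formats = {
      "1080i60": {"interlaced": True,  "lines": 1080},  # CBS, NBC style
      "720p60":  {"interlaced": False, "lines": 720},   # FOX, ABC style
  }

  for name, fmt in formats.items():
      if fmt["interlaced"]:
          motion_samples = NTSC_REFRESH         # 60 fields per second
          frames = NTSC_REFRESH / 2             # ~29.97 "frames", each holding two fields
          lines_per_sample = fmt["lines"] // 2  # each field carries half the lines (540)
      else:
          motion_samples = NTSC_REFRESH         # progressive: every refresh is a full frame
          frames = NTSC_REFRESH
          lines_per_sample = fmt["lines"]
      print(f"{name}: {frames:.2f} frames/s, {motion_samples:.2f} motion samples/s, "
            f"{lines_per_sample} lines per sample")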

I wish not only sports or live TV, but everything, would stream at this frame rate. Just as most 4K movies have simply been upscaled, they could pre-render frame interpolation to deliver smoother playback.
This would actually serve no purpose, unless your TV/decoder just does a really poor job of matching the source content to its native display refresh rate. The vast majority of scripted programming was originally shot at 24fps, so converting this content to 60fps would mean that frames need to be repeated. Now, this is a very typical procedure done in broadcast today for sure, but it is purely for matching the source to the broadcast standard in play for that channel. It is not done to 'benefit' the video. And as most people have already commented with regard to UHD Blu-ray, when a film was mastered for disc at 2K and then scaled to 4K for the UHD disc, the video quality is far inferior to films that were natively mastered at 4K.
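
To make the frame-repetition point concrete, here is a toy sketch of a 24fps-to-60fps repeat cadence (my own illustration of the principle; for 60i broadcast the real mechanism is 3:2 pulldown on fields, but either way no new motion is created):

  # Toy 24fps -> 60fps conversion by frame repetition (pulldown-style).
  # No new motion is created; 24 unique pictures are just shown longer.
  def repeat_to_60(frames_24):
      """Repeat frames in a 3-2-3-2 cadence so 24 inputs become 60 outputs."""
      out = []
      for i, frame in enumerate(frames_24):
          out.extend([frame] * (3 if i % 2 == 0 else 2))
      return out

  one_second_of_film = [f"film_frame_{i:02d}" for i in range(24)]
  converted = repeat_to_60(one_second_of_film)
  print(len(converted))   # 60 output frames, but only 24 of them are unique
  print(converted[:5])    # frame 00 three times, then frame 01 twice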

This is all marketing ********. The only 60fps content available is Billy Lynn’s Long Halftime Walk, a movie shot in that format. All other content is shot at 24fps. Video games can pump out 60 locally. Which leaves live sporting events, which would be a new thing if they even have all the hardware set up to do it. Forget news in 60fps. Not happening. CNN would have to upgrade every camera and switcher in the signal path. HDR and 4K would be a far more meaningful upgrade.
Correct about Billy Lynn being the only major Hollywood Studio release of a film natively shot and distributed at 60fps. But for non-Hollywood content, especially sports, this has been going on for years. You would be very hard-pressed to find any sporting event in North America on broadcast TV shot at 24fps. They are shot and produced at either 720p60 or 1080i60, depending on the network.

Exactly my thought. As a secondary matter: is Hulu upconverting the signal, or is Hulu getting a better feed than what everyone else gets OTA?

As I understand it, the OTA signal can be 60Hz 720i, or 30Hz 1080i (and obviously some are 30Hz 720i at half the bandwidth). But I don't know whether any actually transmit at 60Hz.
Hulu is getting the same signal your local station, satellite provider, or cable provider receives. Again, all broadcast TV is 60Hz. Hulu is just now deciding to deliver to their customers the native 60fps of the 720p channels, and likely separating the fields of the native 1080i channels into sequential frames, which results in the same net effect of 60fps.

Mostly for live TV, yes. Live shows like SNL, live news, sports, and soap operas are all shot at 60fps and aired at 60fps.

Thankfully most news and live TV shows have been shot and aired at 60fps for quite some time now. No hardware upgrades needed. Hulu just needs to upgrade their stream quality to match what has already been airing on live TV for years.
Right on.

I'm no expert- and did NOT stay at a Holiday Inn last night- but best I know, the answer is NO. I suspect they are taking the 30fps interlaced signals and creating a 60fps progressive signal from it. I haven't seen anything that says these networks are delivering content shot at 60fps.
This is sort of correct. Again, I wouldn't term it as fps, as that does video broadcasting standards a disservice due to interlacing. Hulu is most likely taking the 60i ('30fps') interlaced source and separating the fields into a 60p version of the source to feed into their encoder (this is all likely done within the encoder, but I'm splitting it out here so that it may be easier to visualize). For native 60fps content, which accounts for roughly 30% of the channels in NTSC land, Hulu is just maintaining the input.
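
If it helps to visualize that field-separation step, here is a toy sketch (my own illustration, not Hulu's actual pipeline; it treats frames as plain 2-D luma arrays and uses naive line-doubling where a real deinterlacer would use much smarter filtering):

  import numpy as np

  def bob_deinterlace(interlaced_frames):
      """Split each interlaced frame into its two fields and lay them out as
      sequential progressive frames (60i in -> 60p out). Each field is simply
      line-doubled back to full height; this only exists to visualize the idea."""
      out = []
      for frame in interlaced_frames:          # frame: (height, width) luma plane
          top = frame[0::2, :]                 # even lines = field 1
          bottom = frame[1::2, :]              # odd lines  = field 2
          for field in (top, bottom):          # assumes top-field-first ordering
              out.append(np.repeat(field, 2, axis=0))  # half height -> full height
      return out

  interlaced = [np.random.rand(108, 192) for _ in range(30)]   # 30 toy "frames" = 1 second of 60i
  progressive = bob_deinterlace(interlaced)
  print(len(progressive))   # 60 progressive frames, each carrying distinct motion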

1080i60 vs. 1080p60

Side by side, they appear similar in terms of smooth motion, motion blur, etc. So referring to these web channels as 60fps seems appropriate.
This is a pretty accurate readout, actually. Both 1080i60 and 1080p60 have the same number of fields, so on a good TV with a good preprocessing engine, the difference between 1080i and 1080p would not be too large. The 1080p version should look sharper, due to having twice the effective vertical resolution, but the motion itself wouldn't be much different.

If they are “upconverting” 30 fps to 60fps wouldn’t that cause shows to have a weird “filmlook” effect?
Correct, but this is not what they are doing. There are two main methods to convert video from a lower frame rate to a higher one. The first is called interpolation, and this is what most TVs do when you enable their 'smooth motion' process, or whatever they choose to call it. The net effect is the 'soap opera' effect you allude to. The second option is called pulldown, or telecine. In this method, frames or fields are repeated. The net effect of this is judder. This is no doubt something everyone is used to, as it is the standard method broadcasters use to convert 24fps content to their native broadcast signal.
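
Here is a toy contrast between the two methods, purely for illustration (a 50/50 blend stands in for the motion-compensated interpolation a real TV performs, so treat this as the principle rather than any actual implementation):

  import numpy as np

  def convert_by_repetition(frames_30):
      """Pulldown-style: each 30fps frame is shown twice to fill 60fps.
      No new pictures are created, so motion judders."""
      return [f for frame in frames_30 for f in (frame, frame)]

  def convert_by_interpolation(frames_30):
      """Interpolation-style: synthesize an in-between picture. A plain 50/50
      blend stands in for the motion-compensated processing a TV really does;
      this is the path that produces the 'soap opera' look on film content."""
      out = []
      for a, b in zip(frames_30, frames_30[1:] + [frames_30[-1]]):
          out.append(a)
          out.append((a + b) / 2)   # fake in-between frame
      return out

  frames = [np.full((4, 4), float(i)) for i in range(30)]
  print(len(convert_by_repetition(frames)), len(convert_by_interpolation(frames)))   # 60 60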

No. "Upconverting" 1080i 30fps to 1080p 60fps just doubles the frames. the 60fps version tends to look/play better on non-interlaced devices like computer and mobile screens and modern TVs that are basically more computer screens than traditional TV screens. BUT, broadcasters are still married to 1080i because that is the HD broadcast standard. Your TV or STB receives a 1080i signal and dynamically doubles the frames to feed a TV capable of 60fps.

Real 1080p60 would actually have changing information in up to each of those 60 frames. This is why it's desirable to in-the-know sports fans: it allows fast-moving sports to have less blur, judder, and jumpiness. Doubling 30fps to 60fps doesn't do that. But if this is content shot at 60fps and delivered at 60fps, it may make a noticeable difference, especially in fast-moving scenes (sports, action movies, etc.).

This talks to this topic better than I can describe it: https://www.pcmag.com/article2/0,2817,2413044,00.asp

The "film look" issue is typically working with film sources rendered at 24fps and converting it to 30fps. That is not just doubling the frames but a actually involves a bit of complication.

The other "film look" is the other way- the desire to see 24fps at 24fps but not having such an option via :apple:TV until recent software updates.

This: https://www.cnet.com/how-to/1080i-and-1080p-are-the-same-resolution/ actually does a pretty good job talking to all of these topics, though I don't quite buy the final conclusion in that last line.
Again, another misconception here about what a TV is doing. If the TV were just taking in a 1080i signal and doubling the frames to 60fps, this would look exactly like how OTT sports that are encoded at 30fps would look. This is because NO TV has a native refresh rate of 30Hz. What the TV is actually doing is similar to what I describe Hulu is probably doing, which is separating each field in an interlaced channel and laying them out sequentially to maintain all of the original source motion resolution. This method does require a vertical scale, but the TV has to do that too in order to size the source for its own native resolution. To test this, you can go ahead and tune to a 1080i channel and point a camera at it that can record at 60fps (even an iPhone these days would suffice). Then view the recorded file frame-by-frame. You will see that all frames on your TV's display are unique (of course, the content has to be something like sports, and not the latest episode of NCIS or such).
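
If you would rather script that check than step through by eye, a rough sketch along these lines would do it (the filename and the "basically identical" threshold are placeholders, and it assumes you have OpenCV installed):

  import cv2  # pip install opencv-python

  # Count how many consecutive frames in a 60fps phone recording of the TV are
  # (nearly) identical. If the panel were really showing only 30 unique images
  # per second, roughly half of these pairs would come back as duplicates.
  cap = cv2.VideoCapture("tv_recording_60fps.mp4")   # placeholder filename
  prev, duplicates, total = None, 0, 0
  while True:
      ok, frame = cap.read()
      if not ok:
          break
      gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
      if prev is not None:
          total += 1
          if cv2.absdiff(gray, prev).mean() < 1.0:   # arbitrary "basically identical" threshold
              duplicates += 1
      prev = gray
  cap.release()
  print(f"{duplicates} of {total} consecutive frame pairs look identical")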

I deliver broadcast television, and I can tell you for a fact that it isn’t. In North America most content is sent to broadcast at 29.97 frames per second (technically interlaced 59.94, but it displays back as 29.97).

So basically this is a gimmick at best. Even broadcasting at 29.97, most footage is shot at 23.98, so they can play it back at whatever rate they want, but it won’t change the look any because it wasn’t shot with that many frames.

The only time 60fps makes a difference is when you are talking about footage that’s created at 60, such as video game footage and maybe some sports; I’m not sure what they shoot at. Content created at lower frame rates won’t see any benefit from a higher frame rate.
The split for HD broadcast in NTSC-land is around 70/30 for 1080i vs. 720p. But to think of video in terms of fps (and really, someone who delivers broadcast television should know this) is not correct. Almost all video broadcast by standard non-PPV channels is broadcast at 60Hz. For 720p60 on a standard 60Hz TV, it is easy to understand how each frame is displayed. For 1080i60 on a 60Hz TV, it should also be relatively easy to understand how each field is displayed, especially given my descriptions above. I do not think you could find a TV that would just discard the non-dominant field of an interlaced signal and actually display only 30fps (maybe if you go back a few years to the first 4K 'TVs' that only supported 4K @ 30Hz, that may be an example, but those were rare and really more akin to monitors than true TVs anyway).

So in short (lol I know by now if you're still with me), this is most definitely NOT just a gimmick. For any non-sports, non-news, or non-reality-tv programming (i.e. all film-based original content), then this announcement indeed has little value. But if you want to watch events like the Olympics, the World Cup, the NCAA basketball playoff, or the Super Bowl on one of these streaming services on your TV and not feel like you're watching a pirated stream, then this is fantastic news and ushers in a new era for more competitive video quality on these streaming services (and yes, I know, Hulu is a bit late to the game here so I am not just talking about them).
 
So this is going to require more bandwidth? Because my last experience with Hulu was constant buffering issues on my 100mbps connection.
 
It's not even clear if we are actually getting something different. This might be like a Verizon or AT&T move, where they make a big announcement about new offerings that are really just the same old offerings (or less than the old offerings) with a new coat of paint.

If it's legitimately 60fps, the file sizes are likely to be modestly bigger (unless they change other variables like compression). However, the difference in file sizes is not extreme. For example, if you think of this as doubling the frames (which does seem to be the message), what's stored in an H.264 file is not twice as much data, only what changes between frames.
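
A toy way to convince yourself of that (a crude stand-in for what H.264 actually does, which is motion-compensated prediction plus entropy coding, but the "only changes cost bits" idea holds):

  import numpy as np

  def make_clip(n_frames, step):
      """A bright square sliding across a black background."""
      frames = []
      for i in range(n_frames):
          frame = np.zeros((120, 640), dtype=np.uint8)
          x = i * step
          frame[50:70, x:x + 20] = 255
          frames.append(frame)
      return frames

  def toy_delta_cost(frames):
      """Crude stand-in for inter-frame coding: store the first frame whole,
      then only the samples that changed from the previous frame."""
      cost = frames[0].size                            # "I-frame": everything
      for prev, cur in zip(frames, frames[1:]):
          cost += int(np.count_nonzero(cur != prev))   # "P-frame": changed samples only
      return cost

  clip_30 = make_clip(30, step=10)   # one second of motion at 30fps
  clip_60 = make_clip(60, step=5)    # the same second of motion at 60fps
  print(toy_delta_cost(clip_30), toy_delta_cost(clip_60))   # nowhere near a 2x difference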

Until an :apple:TV could play back 1080p60, I used to make two versions of all of my 1080p60 shot video: one at 60fps (for the future) and the other at 30fps (for prior-generation :apple:TVs that choked on 60fps). File sizes of the mostly sports footage I shot would typically be only a little larger in an H.264 container.

I don't remember a 60fps render ever ending up smaller, so if Hulu used to choke for you, it will probably still choke for you if all else remains the same except this.
 
This is all marketing ********. The only 60fps content available is Billy Lynn’s Long Halftime Walk, a movie shot in that format. All other content is shot at 24fps. Video games can pump out 60 locally. Which leaves live sporting events, which would be a new thing if they even have all the hardware set up to do it. Forget news in 60fps. Not happening. CNN would have to upgrade every camera and switcher in the signal path. HDR and 4K would be a far more meaningful upgrade.

I agree. Or even just HDR. Netflix and iTunes both offer DV & HDR at 1080p. That's what all the services should be striving for at a minimum, but 4K and 60p are the buzzwords that drive big screen sales.
 
So this is going to require more bandwidth? Because my last experience with Hulu was constant buffering issues on my 100mbps connection.

It's not even clear if we are actually getting something different. This might be like a Verizon or AT&T move, where they make a big announcement about new offerings that are really just the same old offerings (or less than the old offerings) with a new coat of paint.

If it's legitimately 60fps, the file sizes are likely to be modestly bigger. However, the difference in file sizes is not extreme. For example, if you think of this as doubling the frames (which does seem to be the message), what's stored in an H.264 file is not twice as much data, only what changes between frames.

Until an :apple:TV could play back 1080p60, I used to make two versions of all of my 1080p60 shot video: one at 60fps and the other at 30fps (for prior-generation :apple:TVs that choked on 60fps). File sizes of the mostly sports footage I shot would typically be only a little larger in an H.264 container.

I don't remember a 60fps render ever ending up smaller, so if Hulu used to choke for you, it will probably still choke for you if all else remains the same except this.
These are both valid points. In order to deliver double the frame rate on sources that can take advantage of it (i.e., sports), a higher bandwidth (or a switch from H.264 to H.265, though there is no mention of that here, nor are there likely enough capable devices to warrant such a move yet) will be required. So if the top profile for a channel was previously 1080p30 @ 5Mbps, they will likely need to encode at something like 1080p60 @ 8Mbps (just using a random example here). Thankfully, the bandwidth doesn't have to scale 1:1 with the frame rate, since the differences in motion between frames/fields are smaller than at lower frame rates (i.e., at 60fps, frames 1 and 2 look more similar to each other than at 30fps), and thus the encoder's motion estimation algorithms can cope better.
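
To put rough numbers on that, reusing the same made-up 5Mbps/8Mbps figures (these are illustrative, not Hulu's actual encoding ladder):

  # Back-of-the-envelope using the hypothetical ladder above:
  # 1080p30 @ 5 Mbps vs 1080p60 @ 8 Mbps.
  fps_ratio = 60 / 30
  bitrate_ratio = 8 / 5
  print(f"frame rate up {fps_ratio:.1f}x, bitrate up only {bitrate_ratio:.1f}x")

  # The per-frame bit budget actually shrinks, which the encoder can live with
  # because consecutive 60fps frames predict each other better than 30fps ones.
  print(f"{5_000_000 / 30:,.0f} bits/frame at 30fps vs {8_000_000 / 60:,.0f} bits/frame at 60fps")
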
I agree. Or even just HDR. Netflix and iTunes both offer DV & HDR at 1080p. That's what all the services should be striving for at a minimum, but 4K and 60p are the buzzwords that drive big screen sales.
While this would indeed be ideal, what Netflix and iTunes are delivering is pure VOD content, so they have access to masters which are presented in these formats. Live streaming TV services such as Hulu only have access to the broadcast feed, and unfortunately we are still too early in the technological lifespan of 4K and HDR for there to be (m)any options in this regard.
 
So in short (lol I know by now if you're still with me), this is most definitely NOT just a gimmick. For any non-sports, non-news, or non-reality-tv programming (i.e. all film-based original content), then this announcement indeed has little value. But if you want to watch events like the Olympics, the World Cup, the NCAA basketball playoff, or the Super Bowl on one of these streaming services on your TV and not feel like you're watching a pirated stream, then this is fantastic news and ushers in a new era for more competitive video quality on these streaming services (and yes, I know, Hulu is a bit late to the game here so I am not just talking about them).

Can we just replace the article with this post?? Thank you for confirming all this! Macrumors should update the article with a brief summary of this info!
 
Yeah, though Netflix is not delivering a classic live television service like the others. Generally, some of these services do have some movie-streaming offerings (I think DirecTV Now does that too), getting 5.1 in those movies to customers. But TV and sports broadcasts almost always have 5.1 sound in them and have for many years. Doing without it to "beat the man" seems "expensive" IMO to maybe save $10 or $30/month.
While I won't try and argue for the side of the streaming providers regarding 5.1 vs 2.0 audio, there are a couple of factors at play here.

First, one of the key reasons for there not being 5.1 audio is due to licensing costs. Formats such as Dolby Digital and Dolby Digital Plus are much more expensive to license than AAC or HE-AAC. Your cable, satellite, or IPTV provider can deliver in these codecs because they have a much larger subscriber base and have negotiated much more amenable discounts because they can guarantee a certain minimum number of royalties to be paid. So partly blame Dolby on this one as well.

Secondly, until recently, delivering in 5.1 would have added little value for a majority of consumers because the decoders were not available. Take the iPhone, for example. Before iOS 11, Apple didn't provide a universal decode license for Dolby Digital (AC-3; actually not sure if they still do, but I think they do now for DD+/E-AC-3). So as a provider, if you encoded to AC-3, then you also potentially had to pay for a decoder to be distributed in your app. For devices such as the AppleTV, where the AC-3 signal could easily have just been passed through to an external decoder, this wouldn't have been an issue of course, but you have to remember that the market for 'such-and-such live TV streaming service' on an iPad or iPhone is infinitely larger than for AppleTV.

And for those wondering about 5.1 AAC encoders: yes, I know they exist, and there aren't all the licensing costs surrounding that format as there are for DD, but you need to remember two things: 1) Fraunhofer has done a terrible job licensing AAC to be embedded as a bitstream format in external receivers, especially compared to Dolby. 2) Even if you did have 5.1 AAC on your AppleTV, the AppleTV or other such streaming device would still likely decode that signal into 2.0 stereo PCM, thus negating the whole purpose of 5.1 encoding to begin with. About the only people who would be able to take advantage of this format would be customers using their computer and onboard sound card to directly decode to discrete channels, and that would be a very minuscule portion of the target audience for these services.

As I allude to above though, hopefully this will all change soon enough. Apple, at the least, appears to have provided much better tools in iOS 11 and macOS 10.13 for multi-channel audio decode. The industry is usually at least a year behind on this stuff though, so give it a few more months...
 
With all due respect, who owns DirecTV Now? Who owns Hulu? Who owns YouTube TV? Who owns PS Vue? Who owns Sling?

Are you really trying to imply that those owners may not be able to afford DD licensing? 2 of those ALREADY offer "streamed" DD content via their SATT side. Conceptually, they could bundle and/or leverage their Satt deal into getting the same for their streaming service and thus differentiate their streaming service from most of these other players.

I don't know what it costs but I can't believe Dolby demands so much that it plays a sizable factor in these services NOT delivering it. But since I don't know what it costs, I can't completely dismiss the idea. I simply find it hard to believe that that is actually an issue (of substance) here.

To "secondly", can't these services be smart enough to know when they are receiving requests from a device like a phone vs. a device like :apple:TV and then feed them a version of the file that best fits? In other words, feed a request from :apple:TV a stream with 5.1 because it's much more likely to have a surround sound setup hooked to it than a phone which is much less likely.

OR, bundle up a "with 5.1 tier" for a few dollars more each month and let those that want it pay the extra for it.
 
Until an ISP in my area (Bay Area) can provide an internet-only plan under $40-50, I'll have to pass on these cord-cutter services.
 
With all due respect, who owns DirecTV Now? Who owns Hulu? Who owns YouTube TV? Who owns PS Vue? Who owns Sling?

Are you really trying to imply that those owners may not be able to afford DD licensing? 2 of those ALREADY offer "streamed" DD content via their SATT side. Conceptually, they could bundle and/or leverage their Satt deal into getting the same for their streaming service and thus differentiate their streaming service from most of these other players.

I don't know what it costs but I can't believe Dolby demands so much that it plays a sizable factor in these services NOT delivering it. But since I don't know what it costs, I can't completely dismiss the idea. I simply find it hard to believe that that is actually an issue (of substance) here.
  1. DirecTV Now = AT&T
  2. Hulu = mostly Disney if the FOX deal goes through, plus a minority stake held by Comcast
  3. YouTube TV = Google
  4. PS Vue = Sony
  5. Sling = Dish Network
I am not at all trying to suggest that these companies themselves couldn't afford an AC-3 or E-AC3 license per subscriber/stream. I think you are missing my point. But heck, if you want to make that your main argument, then why did Apple not include AC3 from the beginning as well, or at least since iOS 7 or so when these online video services really started taking off? Surely Apple, with a market cap larger than any of those other companies, could have afforded to include a hw decoder in every iPhone since the 5s... What I am saying is that due to this extra licensing cost (note that they would have had to pay on both the encode, which is cheap, and the decode, which could be considerably more costly), and also quite importantly because of playback device limitations (after all, what good is an AC3 decoder on a device if the output is just going to be a pair of stereo headphones?), there was less incentive for these companies to initially offer 5.1 audio. Again, I'm not defending these practices, just providing some insight into how these companies in the internet streaming TV market approached audio. You also have to remember that none of these services are in and of themselves hugely profitable yet.

To "secondly", can't these services be smart enough to know when they are receiving requests from a device like a phone vs. a device like :apple:TV and then feed them a version of the file that best fits? In other words, feed a request from :apple:TV a stream with 5.1 because it's much more likely to have a surround sound setup hooked to it than a phone which is much less likely.

OR, bundle up a "with 5.1 tier" for a few dollars more each month and let those that want it pay the extra for it.
Absolutely. In fact, there are some services already doing this (not sure if any of the above do, but I know of at least two who do so outside the US). One of the major features these OTT services utilize for ABR is something called variant playlists. Essentially, certain devices receive certain playlists for the ABR stream. As you noted, the server will detect the playback device and deliver a specific manifest to the player. It is a relatively simple thing to siphon audio to certain devices vs others. Apple actually already delivers AC3 audio in their streams (both ABR and downloaded MP4s) to the AppleTV, but this is all VOD-based workflow. But at least this shows it is quite possible to achieve.
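
For anyone curious what that device-aware manifest selection might look like, here is a bare-bones sketch (the device classes and playlist paths are invented for illustration; a real service keys off its player SDK and entitlement system, not a plain string):

  # Bare-bones sketch of device-aware variant playlist selection.
  VARIANT_PLAYLISTS = {
      "tv_box": "live/ch42/master_avc_ac3.m3u8",   # pass-through-capable devices get 5.1
      "mobile": "live/ch42/master_avc_aac.m3u8",   # handsets get stereo AAC
  }

  def pick_manifest(device_class: str) -> str:
      """Return the manifest variant for the requesting device class,
      falling back to the stereo mobile playlist."""
      return VARIANT_PLAYLISTS.get(device_class, VARIANT_PLAYLISTS["mobile"])

  print(pick_manifest("tv_box"))   # live/ch42/master_avc_ac3.m3u8
  print(pick_manifest("tablet"))   # unknown class -> stereo fallback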

There just needs to be more people clamoring for 5.1, similar to the requests for 60p, for these services to take notice and do something and give their customers what they want. It is the lack of these two features which I know has personally kept me from signing up long term for these services. I suspect the reason why 60p is gaining faster traction than 5.1 audio is also because the 60p thing at least is visual, and difficult to escape, once you've seen it on a competitor. Stereo audio may be way inferior to 5.1, but for most people, viewing on their handset (and trust me, most of these services are actually placing heavier emphasis on the mobile experience than the actual to-the-tv experience), they won't either notice the difference or have the capability to notice the difference. And even then, how can you blame these services for placing less weight on the audio experience, when for years consumers have proven that they were totally fine and willing to pirate music ripped at 96-128kbps MP3 (which just sounds awful)?
Do NOT want. 60fps just looks unnatural.
It is actually quite the opposite. We're not talking about interpolating content to 60fps, which I agree just looks weird (primarily because we are just so used to judder in the NTSC world). We are talking about broadcasting content at its native 60Hz, versus what is typically done today, which is to take 60Hz content and halve the motion resolution to 30Hz. 30Hz is what looks unnatural for sports content.
 
I've been consuming Apple iTunes video since :apple:TV1 and it's always had the DD 5.1 soundtrack playback option built in. What streams from iTunes is a file with both a stereo and DD stream and the device plays the one that works on the device.

Maybe you are talking about something else? Not every Apple device needs an AC-3 hardware decoder, because these videos contain the combination of an AAC stereo track and a 5.1 AC-3 track. A device that has no use for AC-3 plays the stereo track. A device that can make use of AC-3 passes it through to another device that does the decoding. I would think that these streaming channel "files" would work the very same way.

Correct these assumptions if they are wrong but presumably:
  • these streaming service providers are receiving the source signals from the individual channels themselves.
  • it's the SAME feed that is being sent to cable & satt providers. If so, (and again) presumably,
  • they are receiving the version that ALREADY has a 5.1 audio track embedded and
  • going to the trouble of stripping that out and/or potentially converting it to stereo if there is not already a stereo audio default track already in the source signal to then push out to subscribers of these streaming services.
I'm not an insider, so I can't know that for sure. Would the sources of these channels be creating a special stereo-only version of their programming for these streaming service middlemen? Or would these middlemen be going to the trouble of stripping a 5.1 audio stream out and passing through a stereo-only audio track?

My gut guess is that they want to pinch the bandwidth and an AC-3 track eats up some extra bandwidth. So they are assuming people won't care/notice and thus stripping it out for a smaller file to stream. Maybe there's a little bit of "protect our existing partners" by keeping better audio options only for traditional cable/satt? Etc.

I'm struggling with the idea that Dolby would be so expensive in their licensing to make "the future" providers steer clear of using DD at all due to cost. Dolby can't make a nickel on zero use. If cord cutting is going to replace the traditional providers, Dolby would be somewhat cutting their own throat by not cooperating with "the future." After all, the longer "the future" gets away with NOT providing DD, the less important it apparently is to users of streaming services. Why should the streamers EVER add DD after some point of long-term indifference?
 
I've been consuming Apple iTunes video since :apple:TV1 and it's always had the DD 5.1 soundtrack playback option built in. What streams from iTunes is a file with both a stereo and DD stream and the device plays the one that works on the device.
So a couple of notes here just to clarify some things:
  1. Correct, as far as I can recall since the first AppleTV (yes, I'm counting the pre-'A'-series Intel-based model, which I owned), Apple has encoded their VOD deliverable content with at least one AAC track and one Dolby Digital track.
  2. I guess I have to be more specific here, because it seems like you want to argue every point even though we're basically on the same side... so my bad for not being clear enough in my last response. I wasn't talking about the AppleTV previously. I specifically referred to iOS, which the AppleTV doesn't run (it runs tvOS). And Apple does not include an AC3 hw or sw decoder (although there are third-party SDKs which include AC3 sw decode support certainly) in iOS devices (again, I think they do include E-AC3 support now in iOS 11), even though they certainly very easily could. Why not? My guess would be because it would cost a few pennies more per device, but who knows...
Maybe you are talking about something else? Not every Apple device needs an AC-3 hardware decoder, because these videos contain the combination of an AAC stereo track and a 5.1 AC-3 track. A device that has no use for AC-3 plays the stereo track. A device that can make use of AC-3 passes it through to another device that does the decoding. I would think that these streaming channel "files" would work the very same way.
No disagreement here. But you're not rationalizing this the way a bean counter at one of these streaming companies would. So, hypothetically, if you project that live TV streaming service 'A' is going to see a split of 80% mobile viewing versus 20% TV viewing, and the cost to include AC-3 encoding is higher than not including it, and you aren't projecting to make a substantial profit within your first year of operation, what do you think that bean counter is going to recommend? I'm not saying this is right, just telling you how it is, and that multi-channel audio is very likely a 'phase 2' kind of thing until more subscribers are on board with the service.

Correct these assumptions if they are wrong but presumably:
  • these streaming service providers are receiving the source signals from the individual channels themselves.
  • it's the SAME feed that is being sent to cable & satt providers. If so, (and again) presumably,
  • they are receiving the version that ALREADY has a 5.1 audio track embedded and
  • going to the trouble of stripping that out and/or potentially converting it to stereo if there is not already a stereo audio default track already in the source signal to then push out to subscribers of these streaming services.
Basically all correct, except the last bullet, as it really is no 'trouble' for them to convert from 5.1 to 2.0; the ingest decoder feeding the encoder does this work at almost no CPU cost. It would be more efficient for them to just pass through that DD source unaltered, of course, but most of the 5.1 audio sources are encoded at either 384 or 448Kbps, which is usually deemed too high a bandwidth solely for audio by these operators for OTT (this is where DD+ comes in, with similar audio quality to DD at half the bit rate; and yes, I know that this makes little sense when the top video profile is encoded at up to 8Mbps, but audio is treated as sort of the red-headed stepchild and can be ritually sacrificed).
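
For the curious, the 5.1-to-2.0 fold-down itself is just a handful of multiply-adds per sample. Here is a sketch using the common ITU-style coefficients (real decoders follow the downmix metadata carried in the DD bitstream, so treat this as the principle rather than the exact math any given box runs):

  import numpy as np

  def downmix_51_to_stereo(ch):
      """Fold a 5.1 mix down to 2.0: center and surrounds at -3 dB, LFE dropped.
      Illustration of the common ITU-style coefficients only."""
      k = 0.7071  # -3 dB
      left  = ch["L"] + k * ch["C"] + k * ch["Ls"]
      right = ch["R"] + k * ch["C"] + k * ch["Rs"]
      return left, right

  # one second of quiet noise per channel at 48 kHz, just to exercise the function
  chans = {name: 0.01 * np.random.randn(48000) for name in ("L", "R", "C", "Ls", "Rs", "LFE")}
  left, right = downmix_51_to_stereo(chans)
  print(left.shape, right.shape)   # (48000,) (48000,)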

I'm not an insider, so I can't know that for sure. Would the sources of these channels be creating a special stereo-only version of their programming for these streaming service middlemen?
No, as stated above, they are receiving the original 5.1 mix in most cases. There are some source aggregators out there who may be doing some conformance on the sources before sending them to the streaming operator (mostly overseas-type channels and some of the regional sports networks), but most of the channels these live TV streaming providers encode come directly from either the IRD or another original broadcast format over fiber.

My gut guess is that they want to pinch the bandwidth and an AC-3 track eats up some extra bandwidth. So they are assuming people won't care/notice and thus stripping it out for a smaller file to stream. Maybe there's a little bit of "protect our existing partners" by keeping better audio options only for traditional cable/satt? Etc.
Now we're on the same page. As mentioned, it is two-fold... the cost of the decoder (the encoder is actually pretty cheap) is one aspect, and the cost of a higher bandwidth required is another (because it does cost more to the provider via their CDN costs to deliver a higher total bandwidth, even if that difference is only 96Kbps vs 448Kbps for the audio portion). I doubt it has much to do with your latter suggestion however. There will be 5.1 for these streaming services later this year, and I don't think protecting existing ecosystems is a high priority for the majority of these services.

I'm struggling with the idea that Dolby would be so expensive in their licensing to make "the future" providers steer clear of using DD at all due to cost. Dolby can't make a nickel on zero use. If cord cutting is going to replace the traditional providers, Dolby would be somewhat cutting their own throat by not cooperating with "the future." After all, the longer "the future" gets away with NOT providing DD, the less important it apparently is to users of streaming services. Why should the streamers EVER add DD after some point of long-term indifference?
Dolby isn't 'cutting their own throat' as they are actively promoting and even recommending OTT providers go with DD+. I get where you're coming from, but logic doesn't always dictate business decisions. I'm not trying to justify any of those decisions, just hoping to provide some insight. Take it for what it is. Or don't.
 