
Nermal

Moderator
Staff member
Dec 7, 2002
20,627
3,986
New Zealand
Does anyone happen to have heard anything about an update to Compressor that supports this new profile? Since it's Apple I'm guessing not, but you never know...

Would Apple's source not be 8-bit anyway?

The studios are asked to submit content in ProRes 422 which supports 10-bit, but whether this actually happens is another question :)
 

MagnusVonMagnum

macrumors 603
Jun 18, 2007
5,193
1,442
Ha ha, I don't understand why people are praising Apple for upgrading to 1080p while only upping the file size marginally.

Some of us don't care about Apple's files. We just want hardware that can play our own encodes. ATV2 can't output 1080p. It can play 1080p files, but they are output in 720p. ATV3 can output true 1080p. That's what matters.
 

JAT

macrumors 603
Dec 31, 2001
6,473
124
Mpls, MN
You kind of sum up the flaw behind 720p and 1080p.

You won't notice a difference on a smaller screen. 1080p and 720p cannot be distinguished by the human eye on screens 52" or smaller. It's mostly a marketing hoax.

What usually makes, say, a 42" 1080p set look better than a 720p set next to it isn't the pixels, but the other hardware and software differences in the sets: Hz, refresh rate, color handling, etc. If a 720p set had the exact same hardware and software as a 1080p set, you'd never see a difference until the screen size hits around 52". The only other factor is your distance from the TV, but most people aren't sitting five inches in front of their set.

With these comparison images, you also have to factor in compression as a reason for differences in brightness and the like.

Now, people who don't want to believe the science will argue that what I said is totally wrong... because they often don't want to feel they overpaid for something they didn't need... but you can Google all this and verify it. CNET often posts articles with the hard science (on the human eye especially), and people in the comments still insist it's wrong.

The same is true of higher-end audio. Past a certain point the human ear can't hear the extra quality, making it a waste of someone's money and just bragging rights on a system.

I personally don't know if I totally buy the 52" figure... I kind of dispute the science at around 47", but that's when *distance* from the screen becomes more relevant.

LOL. Maybe if I ever get a small screen, like 52", I'll test your half-written theories.

(I've highlighted the word you should have discussed to make your post mean anything at all)

----------

It's Apple's way of promoting BD. ;)
lol!
 

faroZ06

macrumors 68040
Apr 3, 2009
3,387
1
Encoding is always a compromise between size and quality, but if you switch to a more sophisticated encoding profile you can get much better quality at the same size, or the same quality at a much smaller size, usually at the price of higher processing power requirements when decoding.

If they just compress more with the same profile they are not doing a good job, I hope (and guess) they actually employ a better encoding profile instead.

I don't think so. Why wouldn't they already use the better encoding strategy before? This technology hasn't changed recently.

I think they just want to have the "1080p" label. It's better quality than before, so I shouldn't complain, but I think these things should be labeled with the bitrate and the encoding strategy instead.

So my camcorder is MPEG-2 at 80 Mbps; this is H.264 at x kbps.
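Back-of-the-envelope, file size is just bitrate times duration, so a bitrate label would be easy to reason about. A quick sketch in Python (the rates are my own illustrative numbers, not Apple's published settings):

```python
# File size follows directly from bitrate: size = bits/sec * seconds / 8.
def file_size_gb(video_kbps: float, audio_kbps: float, minutes: float) -> float:
    total_bps = (video_kbps + audio_kbps) * 1000
    return total_bps * minutes * 60 / 8 / 1e9

# A hypothetical 2-hour movie:
print(file_size_gb(4000, 160, 120))    # ~3.7 GB at a 720p-class rate
print(file_size_gb(5000, 160, 120))    # ~4.7 GB, ~25% more, at a 1080p-class rate
print(file_size_gb(30000, 1536, 120))  # ~28 GB at a Blu-ray-class rate
```

A 15-25% size bump for 1080p, as the article observed, implies a similarly modest bitrate bump.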
 

MagnusVonMagnum

macrumors 603
Jun 18, 2007
5,193
1,442
>>It is very successful so I understand why it is there, just do not understand why people who seem intelligent would like it.<<

Yes, because I do not like this one show I have zero sense of humor.

Well... probably no more so than telling someone that an intelligent person wouldn't find Big Bang Theory funny. I simply find it amusing that you seem to think your idea of what is funny is somehow related to someone's intelligence, and that an intelligent person couldn't possibly find a Chuck Lorre show to be funny because apparently you do not (implying that you are intelligent on the one hand, and that therefore your low opinion of his shows simply MUST represent everyone else that is "intelligent"). What a hoot. :D

Arrested Development is a funny show; so are the original Office and Extras. How about Monty Python's show and movies? How is it possible I find these things funny with no sense of humor?

Well, IMHO, the fact you find those shows and movies to be funny proves nothing, since I don't find a single one of them to be funny, especially Monty Python, and the last time I checked my IQ score was in the 99th percentile, so my taste simply MUST represent everyone in that bracket. :rolleyes:

Now if you had said a show like Red Dwarf.... ;)

What it actually shows is that humor isn't easily defined. If it were, I couldn't explain why someone finds a comedian like Ron White to be funny, because I didn't laugh once at his standup routine, while John Pinette had me almost collapsing because I was laughing so hard I couldn't breathe. Meanwhile, I know people that think both of them are funny, and that must mean they're an alien hybrid! :eek:
 

faroZ06

macrumors 68040
Apr 3, 2009
3,387
1
Some of us don't care about Apple's files. We just want hardware that can play our own encodes. ATV2 can't output 1080p. It can play 1080p files, but they are output in 720p. ATV3 can output true 1080p. That's what matters.

Good point. This is what really matters: the capability.
 

davidh2k

macrumors newbie
Mar 10, 2012
20
43
The reason that the 1080p versions of the iTunes Store videos can be a good deal better without doubling the file size--or worse--can be found in the tech specs of the new AppleTV and the new iPad. The AppleTV now supports H.264 compression for 1920x1080 resolution video at 30 frames per second using High or Main Profile up to level 4.0, the iPad and the iPhone 4S the same up to level 4.1. The profile indicates what kind of decompression algorithms the H.264 decoder has on board--the "High" profile obviously has some tricks up its sleeve that the "Main" or "Baseline" profiles known to previous devices don't support. The level value indicates how many blocks or bits per second a device can handle.​

This is a bunch of misinformed garbage.
This has everything to do with how the video was encoded.
The video was re-encoded with "high profile". The decoder must support high profile to decode the video. High profile gives better quality at the same resolution and allows you to increase resolution of the output without increasing file size dramatically.

The original content was probably 1080p or 1080i and was scaled down to 720p. Using high profile and no down-scaling allows better quality with only a marginal increase in file size.

The amount of computation required to decode High Profile vs. Main Profile is significant if done in software. They are probably using the additional two GPU cores on the A5X to do the decode. iOS devices don't have dedicated chips for video decode; it's done either by the GPU or the CPU, which is why, up until the release of the iPad (3rd generation), iOS devices have not been able to support High Profile streams. They run out of CPU/GPU cycles.

Yeah, they added software decode support on the devices, but it's not just a matter of enabling it on the device, and there are no tricks in the decoder. The tricks are done in the encoder to get the file size down.
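To put rough numbers on the "level" part: each H.264 level caps the macroblock rate and frame size a decoder must handle. A back-of-the-envelope check in Python (limits copied from the spec's level table; a sketch, not a validator):

```python
import math

# Per-level decoder caps from the H.264 level table (a macroblock is 16x16 pixels).
LEVELS = {
    "4.0": {"max_mb_per_sec": 245_760, "max_frame_mb": 8_192},
    "4.1": {"max_mb_per_sec": 245_760, "max_frame_mb": 8_192},
    "4.2": {"max_mb_per_sec": 522_240, "max_frame_mb": 8_704},
}

def fits_level(width: int, height: int, fps: float, level: str) -> bool:
    mbs = math.ceil(width / 16) * math.ceil(height / 16)  # macroblocks per frame
    lim = LEVELS[level]
    return mbs <= lim["max_frame_mb"] and mbs * fps <= lim["max_mb_per_sec"]

print(fits_level(1920, 1080, 30, "4.0"))  # True: 244,800 macroblocks/sec just fits
print(fits_level(1920, 1080, 60, "4.0"))  # False: 1080p60 needs Level 4.2
```

1080p30 squeezes under the Level 4.0 ceiling with less than half a percent to spare, which is why the spec sheets stop at 30 fps.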


So Apple wants to tell us that the A/TV 3 (single-core A5, plus dual-core graphics if not also deactivated) doesn't have enough firepower to play the High@L4.1 profile, but the A/TV 2 (single-core A4) can run the same profile (SMOOTHLY!) with the little hack of XBMC (ignoring the limitation of 720p output!)?

On the other hand, Apple always speaks about Post-PC devices.
Where is the A/TV "Post-PC" if you can't attach any USB device to it, but are instead forced to attach it to a PC? Has anyone thought of this? It's the one thing that's keeping me away from buying the A/TV 3 at the moment...
 

pooprscooper

macrumors regular
Aug 5, 2008
158
1
file sizes generally increased by 15-25% over their respective 720p versions, despite the number of pixels more than doubling to reach the higher standard.

Who are they kidding? 1080p is most definitely NOT double or even more than double the pixels of 720p... That would be 2560x1440.
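For reference, the raw pixel arithmetic: the article is counting total pixels, while doubling each axis (which 2560x1440 does) would quadruple them.

```python
# Total pixel counts per resolution.
p720  = 1280 * 720    #   921,600
p1080 = 1920 * 1080   # 2,073,600
p1440 = 2560 * 1440   # 3,686,400

print(p1080 / p720)   # 2.25 -> 1080p has 2.25x the pixels of 720p
print(p1440 / p720)   # 4.0  -> 1440p doubles each axis, so 4x the pixels
```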
 

JAT

macrumors 603
Dec 31, 2001
6,473
124
Mpls, MN
Well...probably no more so than telling someone that an intelligent person wouldn't find Big Bang Theory funny.
I'm just trying to figure out why people assert intelligence has anything to do with this show. It's not about intelligence, it's about social awkwardness.

In fact, it's more about Asperger's Syndrome than intelligence, which is why the stated 180+ IQ of Cooper would have to be a raw score only. If it included the social and emotional portions of a true IQ test, those would be below 100 for the character, dragging the overall score down quite a bit, to (probably) the mere 130s.
 

MagnusVonMagnum

macrumors 603
Jun 18, 2007
5,193
1,442
I'm just trying to figure out why people assert intelligence has anything to do with this show. It's not about intelligence, it's about social awkwardness.

In fact, it's more about Asperger's Syndrome than intelligence, which is why the stated 180+ IQ of Cooper would have to be a raw score only. If it included the social and emotional portions of a true IQ test, those would be below 100 for the character, dragging the overall score down quite a bit, to (probably) the mere 130s.


I personally don't assert that at all. I merely assert that intelligent people can and do find this show to be funny (unlike the original comment I replied to, which suggests that no intelligent person could find any Chuck Lorre show funny). This is an absurd idea, since humor is more of an emotion than a logical thought process.

In any case, the very idea of an IQ is somewhat flawed, since it's a general abstraction covering any number of possible fields. Someone can be quite skilled in one thing and quite limited in something else, while others can be very skilled in both yet weaker than a specialist in any single field. For example, would Tesla have been as groundbreaking and successful if he had entered the field of medicine instead of researching electricity? It's hard to say. Some people can be brilliant in physics but can't speak articulately to save their life, while someone who's brilliant at creative writing might not be worth a darn in math. Thus, just throwing a number on "intelligence" doesn't really say much.
 

JAT

macrumors 603
Dec 31, 2001
6,473
124
Mpls, MN
Yeah. K.

Guess that 1% doesn't mean you can tell when someone agrees with you.

**** me, this place brings me down sometimes.
 

Eidorian

macrumors Penryn
Mar 23, 2005
29,190
386
Indianapolis
The fansub scene has pretty much always been at the forefront of, and pushing for, new and improved encoding technologies.

Hi10P certainly caused an uproar and lots of division, because it's not necessarily an improvement. The improvements in banding etc. over other quality encodes are very arguable; currently it's basically all about file size.

We've got some groups continuing as they were, some groups moving to Hi10P only, and other groups providing both. Personally I think it's a bad thing: reduced file sizes (should we really care that much about this, especially when it comes to archiving anime?) at the cost of moving ALL decoding to software. When cheap media streamers, SoCs, VDPAU, VAAPI etc. mature, I'll be all for it, but unlike in the past, I think some of these groups have made the wrong decision, and I don't really see it as pushing forward.
You were stuck with nightlies or SVN builds for support in MPC or VLC. The stable mainstream only got Hi10P support back in January, so that was months of telling people to use the unstable stuff. I really do not like getting into the filter wars and usually just stick with VLC. Otherwise it is not much more than pure one-upmanship over who has the best filters and settings.

It is rare for a group to be doing anything beyond Hi10P for 720p and the standard XviD for 480p. I have stumbled upon a few, though.

For archival purposes people just wait for the Blu-rays anyways.

Don't really understand what you mean by this. The H.264 world would be a better place if x264 were the only encoder. But for all these multi-billion-pound production studios, why use the best solution available for free when you can spend £100,000s on something substandard? :)
Now that is silly. I just meant that the users are informed of alternatives and the bleeding edge. It just comes down to marketing otherwise.
 

SeaFox

macrumors 68030
Jul 22, 2003
2,619
954
Somewhere Else
Would Apple's source not be 8-bit anyway? The Blu-ray spec isn't 10-bit, so there's no advantage in quality, only a slightly lower bitrate for the same quality
It's 10 bits of precision, not 10 bits of color. :rolleyes: The picture quality is better even with an 8-bit source, especially in darker scenes.

This sounds good in theory of course, but I can't think of any device (from £5 ARM SoCs to £500 graphics cards) that'll play this in hardware without issue.
To play using hardware acceleration, the hardware has to know how to play it. Those cards simply don't exist yet. ;)

If a 3+ GHz i7 struggles, a single-core ARM won't stand much chance with software decoding :)
Huh?

Processing power needed isn't much more than 8-bit High Profile. I play it on a 2.66 GHz i7 with only 20% CPU usage, and I'm also using madVR as the video renderer.

I have a friend in the Philippines that played 720p 10-bit on a Core 2 Duo laptop.

Apple are just switching to a high profile that's been 'standard' for years - Real, great and innovative news would be that Apple had switched to using x264 for their encoding!
Yup, if there was one reason I never took the idea of "buying" video files on iTMS seriously, it was that Apple was using an H.264 profile the fansub community abandoned years ago for the better quality of High Profile. I just buy the Blu-ray when it's on sale instead.

The Hollywood movie pirates are still struggling to stop using XviD/AVI; I don't know why they hold onto it, except for people watching on DivX DVD players and crappy media tanks.
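A toy sketch of the precision point above (my own illustration, not tied to any particular encoder): count how many distinct code values a subtle near-black gradient gets at 8 vs. 10 bits. More steps means less visible banding.

```python
# Distinct quantized levels across a dark gradient (luma on a 0.0-1.0 scale).
def bands(lo: float, hi: float, bits: int, samples: int = 1000) -> int:
    scale = (1 << bits) - 1
    return len({round((lo + (hi - lo) * i / samples) * scale)
                for i in range(samples + 1)})

# A dim ramp spanning 2% of the luma range, like a murky night scene:
print(bands(0.02, 0.04, 8))   # 6 steps, visible bands
print(bands(0.02, 0.04, 10))  # 22 steps, much smoother
```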
 

croooow

macrumors 65816
Jul 16, 2004
1,044
206
Well, IMHO, the fact you find those shows and movies to be funny proves nothing, since I don't find a single one of them to be funny, especially Monty Python, and the last time I checked my IQ score was in the 99th percentile, so my taste simply MUST represent everyone in that bracket. :rolleyes:
Thanks for the clarification. ;)

Even though I do not like the show, I am glad it is so well received. I am happy to see "geek culture" represented as much as it is, including with Big Bang Theory.
 

VoR

macrumors 6502a
Sep 8, 2008
917
15
UK
You were stuck with nightlies or SVN builds for support in MPC or VLC. The stable mainstream only got Hi10P support back in January, so that was months of telling people to use the unstable stuff. I really do not like getting into the filter wars and usually just stick with VLC. Otherwise it is not much more than pure one-upmanship over who has the best filters and settings.

It is rare for a group to be doing anything beyond Hi10P for 720p and the standard XviD for 480p. I have stumbled upon a few, though.

Of course, I have no problems playing media on my desktop. I also use VLC; it's perfect for click-and-play without infesting your system with dangerous codecs (bad configuration = bad system). I've got a long cable to a TV, but I still use HTPCs. Unfortunately, some don't have the horsepower for software decoding of H.264.




It's 10 bits of precision, not 10 bits of color. :rolleyes: The picture quality is better even with an 8-bit source, especially in darker scenes.

To play using hardware acceleration, the hardware has to know how to play it. Those cards simply don't exist yet. ;)

Huh?

Processing power needed isn't much more than 8-bit High Profile. I play it on a 2.66 GHz i7 with only 20% CPU usage, and I'm also using madVR as the video renderer.

I have a friend in the Philippines that played 720p 10-bit on a Core 2 Duo laptop.

Yup, if there was one reason I never took the idea of "buying" video files on iTMS seriously, it was that Apple was using an H.264 profile the fansub community abandoned years ago for the better quality of High Profile. I just buy the Blu-ray when it's on sale instead.

The Hollywood movie pirates are still struggling to stop using XviD/AVI; I don't know why they hold onto it, except for people watching on DivX DVD players and crappy media tanks.

From what I've read (I haven't experimented myself), without a 10-bit source the quality difference is non-existent, with the only potential improvement being reduced banding in animated content.

To play it in hardware, we need the software stack to be able to play 10-bit :) GPUs etc. have dedicated decoding engines and tonnes of shaders to play with. We'll probably have to wait for new generations of cards and the APIs to go with them, not that that's technically required.
My example of an i7 struggling was of course an exaggeration. One of my favourite HTPCs is a 3.16 GHz C2D which plays all my L4.1/L5.1 content without issue. But on my Fusion APU I'm stuck: ATI are famous for fantastic hardware and awful drivers... I can play L4.1 all day long, but L5.1 is completely garbled, MPEG-2 is obviously too futuristic for XvBA to even bother with, and deinterlacing chroma correctly is about as likely as, urrr, me switching to a better-supported OS.
The Fusion is infinitely more powerful than the ARM SoCs and other dedicated devices in media streamers etc.

There's lots of pirates still churning out DivX/XviDs, yes (the reasoning is slightly unknown to me also), but all the 'proper' guys moved on ages ago :) The pirate scene still provides the most choice of media and the most choice of encodes. The freedom that standard encodes of DRM-free media provide is worth my morals stretching a little. It's a little off topic of course, but everyone I know with a decent H.264 collection has DVD/Blu-ray/legit collections that dwarf those of anyone who buys into iTunes etc.
 

diamond.g

macrumors G4
Mar 20, 2007
11,100
2,440
OBX
Of course, I have no problems playing media on my desktop. I also use VLC; it's perfect for click-and-play without infesting your system with dangerous codecs (bad configuration = bad system). I've got a long cable to a TV, but I still use HTPCs. Unfortunately, some don't have the horsepower for software decoding of H.264.






From what I've read (I haven't experimented myself), without a 10-bit source the quality difference is non-existent, with the only potential improvement being reduced banding in animated content.

To play it in hardware, we need the software stack to be able to play 10-bit :) GPUs etc. have dedicated decoding engines and tonnes of shaders to play with. We'll probably have to wait for new generations of cards and the APIs to go with them, not that that's technically required.
My example of an i7 struggling was of course an exaggeration. One of my favourite HTPCs is a 3.16 GHz C2D which plays all my L4.1/L5.1 content without issue. But on my Fusion APU I'm stuck: ATI are famous for fantastic hardware and awful drivers... I can play L4.1 all day long, but L5.1 is completely garbled, MPEG-2 is obviously too futuristic for XvBA to even bother with, and deinterlacing chroma correctly is about as likely as, urrr, me switching to a better-supported OS.
The Fusion is infinitely more powerful than the ARM SoCs and other dedicated devices in media streamers etc.

There's lots of pirates still churning out DivX/XviDs, yes (the reasoning is slightly unknown to me also), but all the 'proper' guys moved on ages ago :) The pirate scene still provides the most choice of media and the most choice of encodes. The freedom that standard encodes of DRM-free media provide is worth my morals stretching a little. It's a little off topic of course, but everyone I know with a decent H.264 collection has DVD/Blu-ray/legit collections that dwarf those of anyone who buys into iTunes etc.
Back in the GeForce 6800 era (I believe the HD 2000 era for ATI), when hardware-based decoders were "rare", both parties did some decoding using shaders (ATI more than Nvidia). Now they rely on dedicated hardware. There isn't a whole lot of incentive to go back to decoding using shaders either.
 

Bdub12

macrumors regular
Feb 23, 2011
153
94
Fredericton NB Canada
Like Nermal, I'm wondering if we'll get a Compressor update with the "High" profile.

I'd like to encode some home movies for the new Apple TV, whenever they decide to send it to me.
 

VoR

macrumors 6502a
Sep 8, 2008
917
15
UK
Back in the GeForce 6800 era (I believe the HD 2000 era for ATI), when hardware-based decoders were "rare", both parties did some decoding using shaders (ATI more than Nvidia). Now they rely on dedicated hardware. There isn't a whole lot of incentive to go back to decoding using shaders either.

The incentive to use shaders is pretty large when the built-in hardware/APIs don't do what you want (or do it badly). VDPAU is pretty damn good, but certainly not faultless, whereas XvBA-through-VAAPI is pretty damn poor. No MPEG-2 and no L5.1 are perfect examples: completely vital for standard media playback, begged for for years, and still nowhere to be seen.

It's pretty sad the UVD is still so closed, especially from a company that 'fully supports' open source. ATI developers are apparently going through a code review on UVD, but I still don't hold out too much hope. Legal teams are so scared that their DRM will be broken (as if people haven't been ripping Blu-rays straight from disc for however long). If low-level UVD decoding were written and released in assembler, how would anyone get any more information than is currently available by running Catalyst through an ELF disassembler?
Hooray for lawyers :)
 

KnightWRX

macrumors Pentium
Jan 28, 2009
15,046
4
Quebec, Canada
Most Blu-rays are 25-40 Mbps. The difference in quality, especially during motion, is definitely noticeable. If only the stupid DRM people didn't saddle it with HDCP, which makes it harder for law-abiding people to actually watch it. The pirates still get their copies.

iTunes movies also require HDCP... but frankly, at this point, what about HDCP makes it hard to watch Blu-rays or other content?

Plug the HDMI connector in from source to screen... how is that harder than what we had before? Heck, HDMI is easier, if anything.

----------

You won't notice a difference on a smaller screen. 1080p and 720p cannot be distinguished by the human eye on screens 52" or smaller. It's mostly a marketing hoax.

Uh? Sit closer to your 52" or smaller TV then and you'll notice the difference immediately.

Same **** as the retina display really.

Here, this graph will tell you how far away to sit from your TV to get the full benefit of the resolution:

[Image: resolution_chart.png, a chart of viewing distance vs. screen size showing where each resolution pays off]
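The math behind charts like that is just 20/20 visual acuity, roughly one arcminute per pixel. A rough Python version of the same calculation, assuming a 16:9 panel:

```python
import math

# Farthest distance at which a 20/20 eye (~1 arcminute of resolving power)
# can still separate adjacent pixels on a 16:9 screen.
def max_useful_distance_ft(diagonal_in: float, horizontal_pixels: int) -> float:
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # horizontal panel size
    pixel_in = width_in / horizontal_pixels          # width of a single pixel
    return pixel_in / math.tan(math.radians(1 / 60)) / 12

print(max_useful_distance_ft(52, 1920))  # ~6.8 ft: farther back, 1080p is wasted
print(max_useful_distance_ft(52, 1280))  # ~10.1 ft: 720p holds up farther back
```

So even on a 52" set the difference is real; you just have to sit closer than about seven feet to see it.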


----------

I have a friend in the Philippines that played 720p 10-bit on a Core 2 Duo laptop.

I play those files on my MacBook Air flawlessly. In fact, it doesn't use much more CPU than the previous 8-bit encodes do.
 

VoR

macrumors 6502a
Sep 8, 2008
917
15
UK
iTunes movies also require HDCP... but frankly at this point, what about HDCP makes it hard to watch Blu-rays or other content ?

Plug in HDMI connector from source to screen... how is that harder than what we had before ? Heck, HDMI is easier if anything.

DRM is hardly a good thing. HDMI is a great connector and standard, half ruined by a load of DRM handshaking.
It doesn't actually affect me, but there are tonnes of people who have issues with hardware not connecting correctly. Got an old LCD? Got a bad model or a cheap Chinese TV? Still want to use them to play media? Stuffed!
Going out and buying (decent) equipment is all very well and good, but why should people bother? It's not like the DRM in these connectors actually provides any protection for the content.
I don't think we're in foil-hat territory; I think it's simply the continuing incompetence of the media world :)
Anyway, OSS driver/ffmpeg/etc. developers tell me it's a bad thing, and I'm just a cynical, ignorant user.

I play those files on my MacBook Air flawlessly. In fact, it doesn't use much more CPU than the previous 8-bit encodes do.

Yeah, exactly, the difference in decoding is minimal.
The problem with pushing to 10-bit too quickly isn't with nice big desktop CPUs. It's all the devices that get forced back onto their puny little CPUs (the ones that would struggle even with MPEG-2), despite having dedicated hardware that can chew through everything else without a problem.
 

KnightWRX

macrumors Pentium
Jan 28, 2009
15,046
4
Quebec, Canada
DRM is hardly a good thing. HDMI is a great connector and standard, half ruined by a load of DRM handshaking.
It doesn't actually affect me, but there are tonnes of people who have issues with hardware not connecting correctly. Got an old LCD? Got a bad model or a cheap Chinese TV? Still want to use them to play media? Stuffed!

Most content (Blu-ray included) will still play on a non-HDCP chain. It'll just get downscaled, so you don't get the full quality.

And if you have a cheap Chinese TV or an older model without HDCP, you probably aren't using Blu-rays or HDMI anyhow. You're probably thinking S-Video was the height of video quality.

I don't agree with DRM in general and on principle, but really, HDCP hardly makes it "much harder to legally watch content".

Yeah, exactly, the difference in decoding is minimal.
The problem with pushing to 10-bit too quickly isn't with nice big desktop CPUs. It's all the devices that get forced back onto their puny little CPUs (the ones that would struggle even with MPEG-2), despite having dedicated hardware that can chew through everything else without a problem.

Haven't had a problem with 10-bit yet. My MBA doesn't have a nice big desktop CPU either, nor do any of my streamer boxes.
 

VoR

macrumors 6502a
Sep 8, 2008
917
15
UK
I don't agree with DRM in general and on principle, but really, HDCP hardly makes it "much harder to legally watch content".

I agree. My main grievance is the problems it causes for anything that isn't an operating system made by Microsoft or Apple. Like I said, clever guys tell me it's bad, so I like to blindly jump on the bandwagon.

Haven't had a problem with 10-bit yet. My MBA doesn't have a nice big desktop CPU either, nor do any of my streamer boxes.

I class the MBA CPU as a 'nice big desktop CPU' :) Obviously an exaggeration, but compared to a 3 W TDP SoC, it might as well be.
I don't have a problem with 10-bit either, but mainly because it's only a few anime groups that are pushing it, and that's not something I watch. But are you saying something similar, or have you got a cheap streamer that plays it without issue? Again, like I've said, there's technically no issue in playing 10-bit media on any hardware that provides acceleration; it just needs to be supported, and I wasn't aware that support was anywhere near widespread.
 

nateo200

macrumors 68030
Feb 4, 2009
2,906
42
Upstate NY
Does anyone happen to have heard anything about an update to Compressor that supports this new profile? Since it's Apple I'm guessing not, but you never know...



The studios are asked to submit content in ProRes 422 which supports 10-bit, but whether this actually happens is another question :)
Boy, I would sure love to get my hands on some ProRes 422 straight from the studio :D Encoding a low-bitrate file directly from ProRes vs. encoding already-compressed content (yes, that includes Blu-rays) into even more compressed content means Apple theoretically has an advantage: they go from a visually lossless mezzanine (ProRes) to lossy once, while we go from the studio master to lossy (Blu-ray) to lossy again (our Handbrake encodes).
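A contrived toy sketch of that generation-loss argument (made-up numbers, nothing like real codec math): re-quantizing an already-quantized value can land farther from the original than quantizing the original directly.

```python
# Quantize once (master -> final) vs. twice (master -> "Blu-ray" -> final).
def quantize(x: float, step: float) -> float:
    return round(x / step) * step

master   = [0.36, 0.63, 0.86]                   # stand-in source values
direct   = [quantize(v, 0.25) for v in master]  # straight from the master
two_step = [quantize(quantize(v, 0.10), 0.25) for v in master]

avg_err = lambda xs: sum(abs(a - b) for a, b in zip(xs, master)) / len(master)
print(avg_err(direct))    # ~0.113
print(avg_err(two_step))  # ~0.137, the intermediate generation costs quality
```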

Some of us don't care about Apple's files. We just want hardware that can play our own encodes. ATV2 can't output 1080p. It can play 1080p files, but they are output in 720p. ATV3 can output true 1080p. That's what matters.

This. I am done making two different versions of an encode. I don't put 720p or even 1080p video on my iPhone/iPad for the quality; I do it so I only need one encode.

Someone in here mentioned Dolby TrueHD, so I would like to touch on audio. First off, all of these movies had better have something better than 2-channel 160 kbps Dolby Pro Logic II... having 1080p without multichannel sound is like guys who go to the gym and only work out their upper bodies; it's just bad. That said, Dolby TrueHD and DTS-HD MA are pretty much limited to Blu-ray, since I doubt we need a lossless audio stream with a bit rate several times that of the video stream itself :O

I think Apple needs to make sure everything has at minimum Dolby Digital (unless the studio explicitly does NOT master in Dolby/multichannel), and they should add DTS-ES Discrete and Dolby Digital Plus to the specs too. DTS-ES is totally underrated: you can pass all seven discrete channels down optical (and if your receiver doesn't support it, you just get the six-channel core or the seven channels matrixed into six), and regardless, it's 24-bit at 768 kbps. Dolby Digital Plus can go even lower, with up to 7.1 discrete channels; Netflix supposedly uses DD+ at as low as 256 kbps for 7.1, which is pretty low, but Dolby has always been about advanced compression (vs. DTS, which hasn't been, so I think Apple will avoid DTS).
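Putting those rates side by side (the per-channel division is my own, counting the LFE as a full channel; the AC-3 figure is the DVD-typical 448 kbps):

```python
# Rough per-channel bitrates for the formats mentioned above (kbps).
streams = {
    "iTunes stereo AAC":         (160, 2),
    "Dolby Digital 5.1":         (448, 6),
    "DTS-ES Discrete 6.1":       (768, 7),
    "Netflix DD+ 7.1 (low end)": (256, 8),
}
for name, (kbps, channels) in streams.items():
    print(f"{name:27} {kbps / channels:6.1f} kbps/channel")
```

DD+ squeezing eight channels out of a third of DTS-ES's bitrate is exactly the "advanced compression" point.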


Does anyone happen to have heard anything about an update to Compressor that supports this new profile? Since it's Apple I'm guessing not, but you never know...
I'm sure we will get it along with 64-bit support :D :D /sarcasm
 

Attachments

  • Screen Shot 2012-03-14 at 5.12.36 PM.png (266.6 KB)