
kirsch92

macrumors member
Original poster
Apr 30, 2009
96
2
My title is most likely poorly worded, but it was the best I could think of.

I just bought a Panny G5, and will be using the video features for basic home video stuff, mostly the kids doing what they do, nothing fancy, but I will be doing a little editing. Currently using iMovie 09, on a 2008 Core2Duo iMac. These videos would eventually reside in iTunes for ATV viewing.

The G5 can shoot in both AVCHD and MP4, and after browsing several forums I had concluded that I would shoot in MP4 to avoid all the AVCHD hassles.

One forum post mentioned that the video's bitrate has a larger effect on quality than the container used, which (whether or not that's true) got me thinking:

At maximum quality the G5 shoots AVCHD 1920x1080 @ 60p, ~28 Mbps.
MP4 1920x1080 @ 30p, ~20Mbps.

Does this mean the MP4's bitrate is actually higher (per frame), since the AVCHD bitrate is spread across twice as many frames per sec?

Or do I have a gross conceptual error in my thinking? Does it even matter?
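
Here's the back-of-the-envelope division behind that question (treating the quoted maximums as flat rates, which I realize is an oversimplification):

    # Rough per-frame bit budget: quoted maximum rate divided by frame rate.
    # (H.264 spreads data unevenly across frames, so this is only a loose proxy.)
    avchd_bits_per_frame = 28_000_000 / 60   # ~467,000 bits per frame at 60p
    mp4_bits_per_frame = 20_000_000 / 30     # ~667,000 bits per frame at 30p
    print(avchd_bits_per_frame, mp4_bits_per_frame)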
 

HobeSoundDarryl

macrumors G5
I'm sure someone will give you a more technical answer so take my more subjective input for what it's worth...

Bitrate in general is a measure of video quality. However, you can't read too much into it. For example, if you aimed that camera at a blank piece of white paper and could shoot at 100 Mbps vs. 1 Mbps, the outcome should be the same. Why? Because the level of detail being captured is extraordinarily modest in that situation (the camera doesn't need 100 Mbps to capture such a simple image). This idea is very important: much of what you might shoot wouldn't look much better even if you could push an infinite-Mbps button, because the ultimate output is still going to be capped by the limits of the display. Your eyes can only see so much detail too, so infinity vs. 1000 Mbps vs. 100 Mbps vs. 28 Mbps may all look about the same when played back.

I have a similar Panasonic HD camcorder and grappled with similar questions when I bought it. In the end, I shoot just about everything at 1920 x 1080 60fps AVCHD 28Mbps (in other words, max settings). Why? Because home movies are precious memories that you can only shoot once, so you might as well capture those moments at the highest possible quality. There's not a big tradeoff in file (storage) size.
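
The storage math is simple enough to sanity-check (these use the quoted maximum rates, so real VBR files will usually come in smaller):

    # Approximate worst-case storage per hour of footage at the quoted maximum rates.
    def gb_per_hour(mbps):
        # megabits per second -> gigabytes per hour (8 bits per byte, 1000 MB per GB)
        return mbps * 3600 / 8 / 1000

    print(gb_per_hour(28))   # AVCHD 1080/60p max: ~12.6 GB per hour
    print(gb_per_hour(20))   # MP4 1080/30p max: ~9.0 GB per hour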

I chose 60fps over 30fps because the camera could shoot it and higher frame rates yield smoother motion. I shoot a lot of sports, so the camera is often moving, and 60fps captures that motion with more detail than 30fps. :apple:TV3 can't play 60fps, so I have to downconvert a rendered version for it, but I keep the 60fps files as masters in hopes that there will eventually be an :apple:TV that isn't locked to 30fps.

To the question you asked, see the above. 28 Mbps at 60fps vs. 20 Mbps at 30fps is not really apples to apples. And unless there are huge changes in what is being captured, odds are that something less than 20 Mbps for either option would capture a picture that looks just about as good unless you have a very discerning eye. After it's processed and rendered as a file to play on an :apple:TV, that bitrate is very likely going to be cut down to less than either anyway... even if you try to retain very high quality. For example, I shoot everything at 28 Mbps, but the final rendered file tends to end up at an average of about 11 Mbps. It's not that I'm heavily compressing the file (I want to retain the maximum possible quality while ending up with something that will play on the :apple:TV); it's that what I'm shooting doesn't need more than an average of about 11 Mbps to yield a good final render for :apple:TV. And again, I shoot fast-moving sports. If the camera is not moving as much, the average bit rate will slide on down. I've seen iTunes Store videos with average bit rates down around 4-5 Mbps that look very good.
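
If you're curious what your own renders average out to, ffprobe (it comes with ffmpeg, assuming you have that installed) will report a file's overall average bit rate; a minimal sketch, with a made-up file name:

    import subprocess

    def average_bitrate_mbps(path):
        # Ask ffprobe for the container-level average bit rate, in bits per second.
        out = subprocess.run(
            ["ffprobe", "-v", "error",
             "-show_entries", "format=bit_rate",
             "-of", "default=noprint_wrappers=1:nokey=1",
             path],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        return int(out) / 1_000_000

    print(average_bitrate_mbps("kids_soccer_final.m4v"))   # hypothetical file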

So there are a lot of variables at play here. Since you have a camera that maxes out at 28 Mbps 60p, I'd shoot at that. You only get one chance to capture your masters as well as they can be captured. Keep those as masters so that you can re-render for future technology down the road. For example, right now the typical target is H.264, but H.265 is not too far away. Coming back to master files will yield better results than converting an H.264 render to H.265.

If you do go this way, iMovie is not a great tool. Get FCP X and Clipwrap. Use the latter to convert your AVCHD streams to ProRes 422. Import those into FCP X for editing. Export from there to ProRes 422. Use Handbrake to convert those into a final file for :apple:TV. Not only is FCP X much better with 1080p, it can also handle 60fps, Dolby Digital 5.1 audio, etc. I tried to make it all work with iMovie but got frustrated with its limitations and did the trial with FCP X. IMO, it's THE way to go for this quality of home movies.
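
Clipwrap is a GUI app; if you'd rather script the AVCHD-to-ProRes step, ffmpeg can do a similar conversion. This is only a sketch of an alternative route (not what Clipwrap itself does), and the clip name is a placeholder:

    import subprocess

    # Transcode an AVCHD stream (a .MTS file off the camera's card) to ProRes 422 in a
    # QuickTime container with uncompressed PCM audio, as an edit-friendly intermediate.
    subprocess.run([
        "ffmpeg", "-i", "00001.MTS",
        "-c:v", "prores_ks", "-profile:v", "2",   # profile 2 = standard ProRes 422
        "-c:a", "pcm_s16le",
        "00001_prores.mov",
    ], check=True)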
 

boch82

macrumors 6502
Apr 14, 2008
328
24

If it's just home movies, either one will work just fine.

As a reference, broadcast (major networks) usually requires a minimum of 50mb/s, but that's the finished product; most of the footage is usually shot at 35 mb/s.

You are probably reading way too much into it.

A typical 1080p Blu-ray will look fine at 10-12mb/s.
 

phrehdd

macrumors 601
Oct 25, 2008
4,313
1,311
"At maximum quality the G5 shoots AVCHD 1920x1080 @ 60p, ~28 Mbps.
MP4 1920x1080 @ 30p, ~20Mbps."

The AVCHD will provide more information, a sharper image on moving subjects, and more.

For typical indoor stuff with little movement, the MP4 is reasonable. For outdoor or more active stuff, go with the AVCHD.

The best deal is to test BOTH under various circumstances - shoot one then the other and see for yourself what is acceptable.

As for AVCHD, it can always be converted with minimal or no loss to another format that is perhaps easier to deal with. Given that the files are often VBR (variable bit rate), the 28 and 20 likely represent maximum bit rates rather than rates the camera holds constantly. Blu-ray movies, for example, may have static scenes at minimal bitrates and then heavily detailed or action scenes at a much higher bit rate - thus the VBR.
 

kirsch92

macrumors member
Original poster
Apr 30, 2009
96
2
Thanks for the input, everyone. I want to add a little to this info:
At maximum quality the G5 shoots AVCHD 1920x1080 @ 60p, ~28 Mbps.
MP4 1920x1080 @ 30p, ~20Mbps.

The G5 does shoot VBR, and those bitrates are maximums. If I were to shoot AVCHD at 1920x1080 @ 30p instead of 60p, the max bitrate is 17 Mbps rather than 28.

Thus my thinking has been that the bitrate in MP4 is actually higher (per frame) for this camera, and thus better at capturing detail.
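
Putting rough numbers on it (same simple division as before, using the listed maximum VBR rates, so it's only a loose proxy):

    # Per-frame bit budget at this camera's listed maximum (VBR peak) rates
    for label, mbps, fps in [("AVCHD 1080/60p", 28, 60),
                             ("AVCHD 1080/30p", 17, 30),
                             ("MP4 1080/30p", 20, 30)]:
        print(label, round(mbps * 1_000_000 / fps), "bits per frame")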

I understand that 60p will certainly allow for smoother motion, but I wonder how I would ever benefit from this unless TVs start using 60p themselves.

I was basically asking the question because a lot of what I have been reading about why people chose AVCHD over MP4, despite the editing PITA of AVCHD, was that their cameras either shot MP4 at 30i and had interlacing issues, or the quality was lower due to a lower maximum bitrate, or they only shot at 720p.
I guess the better way to phrase the question is this:

Unless I need to shoot at 60p, does my camera actually shoot a higher quality picture in MP4 than in AVCHD?

I probably DO need to just shut up and go shoot both though...

One note:
I've got tons of too-low-light, grainy footage indoors from my old Canon MiniDV that looks like garbage for the most part, but I still love seeing the kids when they were that much smaller.
So maybe it doesn't matter so much?
 

acearchie

macrumors 68040
Jan 15, 2006
3,264
104
As a reference, broadcast (major networks) usually requires a minimum of 50mb/s, but that's the finished product; most of the footage is usually shot at 35 mb/s.

Be careful with your terms. You may be right, but here in the UK it needs to be captured at at least 50Mb/s. That is to say, there is yet to be a camera on the BBC's approved list that records at less than this.

50mb/s implies to me 50 megabytes per second, which is far higher than many digital cameras record, bar high-end raw recording equipment.

I assume you meant 50Mb/s, which is megabits.
 

Siderz

macrumors 6502a
Nov 10, 2012
991
6
50mb/s implies to me 50 megabytes per second, which is far higher than many digital cameras record, bar high-end raw recording equipment.

I assume you meant 50Mb/s, which is megabits.

Just to clear it up for other people:

B = Byte

b = bit

So:

MB = Megabyte

Mb = Megabit

I'm not sure, however, whether or not the case of the 'M' (standing for 'mega') matters.

acearchie seems to believe that boch82 was referring to "megabytes" even though boch82 used a lower case 'b'.
 

floh

macrumors 6502
Nov 28, 2011
460
2
Stuttgart, Germany
I'm not sure, however, whether or not the case of the 'M' (standing for 'mega') matters.

It doesn't really matter, but in case anyone stumbles over this:

A lowercase "m" means "milli", which is one thousandth, like in millimeter (mm) or milliliter (ml). An uppercase "M" means "Mega", which is one million. So there is quite a difference - a factor of 10^9, to be precise. :)
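
And in code form, in case the bits-versus-bytes factor of eight trips anyone up as well:

    # 8 bits per byte, so:
    print(50 / 8)      # 50 megabits/s = 6.25 megabytes/s
    print(50 * 8)      # 50 megabytes/s = 400 megabits/s
    # and "M" (mega, 10**6) vs "m" (milli, 10**-3) differ by a factor of 10**9:
    print(1e6 / 1e-3)  # 1e9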
 

Siderz

macrumors 6502a
Nov 10, 2012
991
6
It doesn't really matter, but in case anyone stumbles over this:

A lowercase "m" means "milli", which is one thousandth, like in millimeter (mm) or milliliter (ml). An uppercase "M" means "Mega", which is one million. So there is quite a difference - a factor of 10^9, to be precise. :)

Ah, that sort of clears it up then.
 