My title is most likely poorly worded, but it was the best I could think of.
I just bought a Panny G5, and will be using the video features for basic home video stuff, mostly the kids doing what they do, nothing fancy, but I will be doing a little editing. Currently using iMovie 09, on a 2008 Core2Duo iMac. These videos would eventually reside in iTunes for ATV viewing.
The G5 can shoot in both AVCHD and MP4, and after browsing several forums I concluded that I would shoot in MP4 to avoid all the AVCHD hassles.
One forum post mentioned that a video's bitrate has the largest effect on quality, rather than the container used, which (whether or not that's true) got me thinking:
At maximum quality the G5 shoots:
AVCHD: 1920x1080 @ 60p, ~28 Mbps
MP4: 1920x1080 @ 30p, ~20 Mbps
Does this mean the MP4's per-frame bitrate is actually higher, since the AVCHD bitrate is spread across twice as many frames per second?
Or do I have a gross conceptual error in my thinking? Does it even matter?
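To sanity-check my per-frame math, here's the back-of-the-envelope arithmetic (treating the quoted figures as constant bitrates, which is an assumption; both modes are H.264, and inter-frame compression means bits aren't actually divided evenly across frames):

```python
# Bits per frame for the two G5 modes, from the figures quoted above.
avchd_bps, avchd_fps = 28e6, 60   # AVCHD 1080/60p, ~28 Mbps
mp4_bps, mp4_fps = 20e6, 30       # MP4 1080/30p, ~20 Mbps

avchd_per_frame = avchd_bps / avchd_fps   # bits available per AVCHD frame
mp4_per_frame = mp4_bps / mp4_fps         # bits available per MP4 frame

print(f"AVCHD: {avchd_per_frame / 1e6:.2f} Mbit/frame")  # 0.47 Mbit/frame
print(f"MP4:   {mp4_per_frame / 1e6:.2f} Mbit/frame")    # 0.67 Mbit/frame
```

So on a naive per-frame basis the MP4 mode does get more bits per frame, though the encoder's temporal prediction means the two numbers aren't directly comparable as "quality."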