No way. That's about what the iPhone 4 does at 720p. I might be wrong with my guess, but that figure is way under.
Apple bumped the H.264 profile to Main (it was Baseline on the 4), probably because they have more horsepower available for encoding with the A5. This means they can get a better picture at the same bitrate.
Picture quality at a given resolution isn't just about bitrate; the types of frames used in the encode (as determined by the profile) matter a lot as well. Encodes with the same bitrate and resolution look very different between Main and Baseline.
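If you want to eyeball that difference yourself, a rough sketch like this should do it, assuming ffmpeg with libx264 is installed; input.mov and the 10M target bitrate are just placeholders:

    import subprocess

    # Encode the same clip twice at the same target bitrate, once per profile,
    # then compare the two outputs visually.
    for profile in ("baseline", "main"):
        subprocess.run([
            "ffmpeg", "-y", "-i", "input.mov",
            "-c:v", "libx264",
            "-profile:v", profile,   # Baseline drops B-frames and CABAC; Main keeps them
            "-b:v", "10M",           # identical target bitrate for both encodes
            "-c:a", "copy",
            f"out_{profile}.mp4",
        ], check=True)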
Just realized you said my way was more complex, yet the supposedly better way takes more steps.
We both found the bitrate using a constant for quality, but yours took more steps; surely a more efficient method is a better method? That's not the real problem though: the constant you used for quality was what you guessed for the iPhone 4, not the real value.
For predicting the bitrate Apple would go for, it makes sense to assume that bitrate scales 1:1 with resolution (for the same profile). To work that out I used the bitrate from the iPhone 4 at 720p.
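Laid out as a quick sketch: the 83 MB/minute figure for iPhone 4 720p is the measured value quoted further down, MB is taken as 2^20 bytes, and the 1:1 scaling is the assumption being argued about, not a measured fact.

    MB = 1024 * 1024

    def bitrate_mbps(megabytes_per_minute):
        # Convert a recording size in MB per minute into megabits per second.
        return megabytes_per_minute * MB * 8 / 60 / 1_000_000

    iphone4_720p = bitrate_mbps(83)             # ~11.6 Mbit/s at 1280x720
    pixel_ratio = (1920 * 1080) / (1280 * 720)  # = 2.25
    predicted_1080p = iphone4_720p * pixel_ratio

    print(f"720p ~ {iphone4_720p:.1f} Mbit/s, predicted 1080p ~ {predicted_1080p:.1f} Mbit/s")

Feed in the guessed 79 MB/minute instead of the measured 83 and the 1080p prediction comes out noticeably lower, which is the whole disagreement here.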
You didn't. You essentially performed the same calculation using your guessed 720p bitrate from last time, which wasn't correct.
This turned out to be very close to real recordings: my estimate for iPhone 4 720p HD video was 79 MB a minute, and real videos came out to about 83 MB.
About 1 MB of this difference could be the audio recording.
64 kilobits/s * 60 seconds = 3,840 kilobits, or about 0.47 megabytes.
Your 3.53 megabyte difference (being 4.5% off the real value, according to your figures) is why we got different results for the same calculation. That's the bottom line: I used an exact value, you didn't.
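Just to put the arithmetic in one place: 79 and 83 MB are the figures from above, and kilo/mega are taken as 1024-based, which is how the 0.47 MB number falls out.

    MB = 1024 * 1024

    estimate_mb = 79.0                   # estimated size of a minute of 720p video
    measured_mb = 83.0                   # roughly what real one-minute clips weigh in at
    audio_mb = 64 * 1024 * 60 / 8 / MB   # 64 kbit/s audio track over 60 s ~= 0.47 MB

    unexplained = (measured_mb - estimate_mb) - audio_mb
    print(f"audio ~ {audio_mb:.2f} MB, remaining gap ~ {unexplained:.2f} MB "
          f"({unexplained / estimate_mb:.1%} of the estimate)")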
This little amount could easily vary depending on what you are recording and how well the compression works.
But that's not how it works. The iPhone aims to achieve the same target bitrate each time. If it starts dropping frames then each frame will just get more bits to utilise.
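A minimal sketch of what a fixed target bitrate means for the bits each frame gets; the 11 Mbit/s value is just the approximate 720p rate from earlier, and the frame rates are made up for illustration.

    # With rate control aiming at a constant target bitrate, dropping the frame
    # rate simply leaves more bits for each remaining frame.
    target_bitrate = 11_000_000   # bits per second, roughly the 720p figure above

    for fps in (30, 24, 20):
        bits_per_frame = target_bitrate / fps
        print(f"{fps} fps -> {bits_per_frame / 1000:.0f} kbit per frame")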