> Pretty sure ProRAW is just for stills. You sure you're not thinking of ProRes RAW?

Sorry, you're right, it seems Apple ProRAW is just for stills.
> Product differentiation, profit, and 60 FPS considered "Pro". Probably no technical reason but a sales/marketing one.

It seems like you are saying this based on absolutely nothing. While you may be right (doubtful), there are other key differences that may affect the speed of computation and thus the result: 1) RAM, 2) the extra camera, 3) LiDAR (faster autofocus, Night mode portraits, extra processing). Which actually makes sense, because those features (except RAM, of course) are not available on the selfie camera, and all models are limited to recording Dolby Vision at 30 fps.
> Wonder what the reason is for the non-pros only getting 30 FPS.

Money. It’s kinda obvious.
Curious to know what's the bitrate of the Apple ProRAW format compared to HEVC (H.265).
Just for comparison, on the Camera settings it says a HEVC 4K 60fps recording takes about 400 MB/min (= 3200 Mb/min = 53.3 Mb/s).
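If anyone wants to redo that conversion, here's a minimal sketch (assuming the Camera settings figure uses decimal megabytes, i.e. 1 MB = 8 Mb = 8,000,000 bits):

```swift
import Foundation

// Quick sanity check of the bitrate implied by the "about 400 MB/min"
// figure shown in Camera settings for HEVC 4K 60 fps recording.
let megabytesPerMinute = 400.0
let megabitsPerMinute = megabytesPerMinute * 8        // 3200 Mb/min
let megabitsPerSecond = megabitsPerMinute / 60.0      // ≈ 53.3 Mb/s
print(String(format: "HEVC 4K60 ≈ %.1f Mb/s", megabitsPerSecond))
```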
> …and it encourages people to not go through a proper post workflow.

99.999% of the purchasers of this phone don’t go through a proper post workflow. I think they’ll be fine.
> What’s that device he’s using to hold the iPhone? I have to get one of those.

Looks kind of like the DJI OM 4.
● 10-bit P3 D65 capture packaged in a 10-bit Rec.2020 container.
● 10-bit Rec.2020 capture in a 10-bit Rec.2020 container.
Dolby Vision can support 12-bit, but it doesn’t have to. The minimum requirement is 10-bit.
I’m curious why it only supports 700 million colors though? That’s the number quoted by Apple during the event. 8 bit is 16.7M colors (256x256x256), whereas 10 bit is 1.07B colors (1024x1024x1024). So, is there some middle gamut or bit depth they’re sampling from? Makes no sense. It’s not true 10 bit if it’s not reaching 1.07B.
I was wondering the same thing. Perhaps the sensors being so small can only capture that amount of colour data.
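Just to put numbers on the bit-depth comparison above, here's a quick sketch (plain arithmetic, nothing Apple-specific) showing that 700 million doesn't line up with any whole per-channel bit depth:

```swift
import Foundation

// Total colors representable at a given per-channel bit depth (3 channels).
func colors(bitsPerChannel: Int) -> Int {
    1 << (bitsPerChannel * 3)
}

print(colors(bitsPerChannel: 8))    // 16777216      (~16.7M)
print(colors(bitsPerChannel: 10))   // 1073741824    (~1.07B)
print(colors(bitsPerChannel: 12))   // 68719476736   (~68.7B)

// Working backwards from the "700 million colors" figure quoted in the event:
let levelsPerChannel = cbrt(700_000_000.0)   // ≈ 888 levels per channel
let impliedBits = log2(levelsPerChannel)     // ≈ 9.8 bits per channel
print(String(format: "≈ %.0f levels per channel, ≈ %.2f bits", levelsPerChannel, impliedBits))
```

The cube root of 700 million is roughly 888 levels per channel, about 9.8 bits, so whatever that figure is counting, it doesn't map to a clean per-channel bit depth.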