Not long ago, as soon as I received my iPhone 12 Pro, I downloaded Halide from the App Store, figuring I'd give it a whirl over the seven-day free trial period. I had high hopes for it, having always wanted to shoot RAW for tricky lighting/contrast situations.

But... after using it for a few days, I determined it wasn't for me. It felt cumbersome to use. I really wanted to like it, but didn't see a path forward where it would become second nature, like the built-in camera app from Apple is. Often I need to make a photo within a second or two of seeing a potential photograph, and it seemed like it would be in the way. So I canceled within the trial period.

Now... that's just my opinion, and perhaps I didn't give it enough time to become comfortable using it. I know Halide has a large following, and it's been around a long time. Someone who uses it regularly would provide much better feedback.

I will say that with iOS 14.3, shooting ProRAW is easy once it's enabled in Settings. Once that's done, and with the Camera app open, you just touch the RAW button at the upper right corner of the screen. It defaults to off, and turning it on is temporary: supposedly it lasts for a couple of minutes of non-use before switching off automatically. I think that's a great feature, as I'd only use ProRAW for certain situations. The .DNG RAW files it produces are rather large, at around 25 megabytes each.
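For the curious, third-party apps like Halide get at the same capability through AVFoundation. Here's a minimal sketch of the hooks Apple added in iOS 14.3; it assumes an already-configured AVCaptureSession with a photo output and a capture delegate:

```swift
import AVFoundation

// Minimal sketch (iOS 14.3+). Assumes `photoOutput` is already attached
// to a configured AVCaptureSession and `delegate` handles the result.
func captureProRAW(from photoOutput: AVCapturePhotoOutput,
                   delegate: AVCapturePhotoCaptureDelegate) {
    // Opt the output into Apple ProRAW before asking for a RAW format.
    guard photoOutput.isAppleProRAWSupported else { return }
    photoOutput.isAppleProRAWEnabled = true

    // Pick the first pixel format the output reports as Apple ProRAW.
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes
        .first(where: { AVCapturePhotoOutput.isAppleProRAWPixelFormat($0) })
    else { return }

    let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```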

From the iPhone Photos app I can AirDrop the .DNG RAW files to a folder on my M1 MBA, and then import them into Lightroom CC for processing. All of that seems to work fine. I'm happy!

Hope the above helps!
That is very helpful, thanks!

I’m in the same boat. I want to use the app (the inner photographer in me sees some value) but overall I haven’t seen much benefit, especially now that ProRAW is out.

The one thing that I don’t love is Apple’s excessive noise reduction. I like the grain and sharpness from pure RAW shots.

I opted out of Halide after the 7-day trial but wanted to understand whether there was still a benefit after ProRAW was released.
 
Please do enlighten us: how many bits per component should a RAW file have to be considered pro?
I was wondering about that 12-bit number as well. On a modern DSLR/mirrorless camera a typical RAW is 14-bit, which roughly corresponds to the dynamic range capability of the sensors in these cameras. But if iPhones can capture 2 or more shots at the exact same moment with different exposures, then the resulting computational composite can end up with more than 12 bits of depth (like bracketing 2 shots a few EV apart at the same time and ending up with an HDR composite).

This diagram illustrates how you can effectively capture 2 more stops in the shadows with 14-bit vs 12-bit:
[attached diagram]
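To make the bracketing idea concrete, here's a toy sketch of my own (not Apple's actual pipeline) showing how fusing a base capture with a +2 EV capture yields finer shadow steps than a single 12-bit frame could record:

```swift
// Toy model, not Apple's pipeline: fuse one pixel from two 12-bit
// captures of the same scene, one of them exposed +2 EV (4x the light).
let clip = 4095                        // 12-bit saturation point

func fusedValue(base: Int, plus2EV: Int) -> Double {
    if plus2EV < clip {
        // Unclipped +2 EV data, rescaled to base-exposure units: steps
        // of 0.25 code values, i.e. 14-bit precision in the shadows.
        return Double(plus2EV) / 4.0
    }
    return Double(base)                // highlights: fall back to the base frame
}
```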
 
This is a fair point (and the diagram incidentally explains dynamic range quite well).

How close are we to getting phone-sized sensors that can capture 14bit? Are those a thing on any Android phone yet?
 
They use that as the reason, but 4 GB is probably enough to do it. My 10-year-old desktop can do it with 8 GB; I'm sure modern, efficient processors can handle this... it is all marketing.

Flapdoodle. Your 10-year-old desktop cannot take ProRAW photos.

All you’ve proven is that you don’t know what you’re talking about. It’s all trolling.
 
This is a fair point (and the diagram incidentally explains dynamic range quite well).

How close are we to getting phone-sized sensors that can capture 14bit? Are those a thing on any Android phone yet?
I would say sensor capability on cellphones is physically limited by surface area. So even given the same technology that's present in a larger camera's sensor, including 14-bit capture and processing, you would probably still face far too much noise at the darkest end of most normal scenes, and when all you are capturing is noise, it really doesn't help to capture more of it.

In fact, on full-frame DSLRs/mirrorless cameras it has been argued that even 12-bit is enough for spontaneous photography, and that only in a studio setting where every bit of light is controlled, or in a naturally high-dynamic-range scene such as a landscape with a bright sun, can you still recover extra shadow detail at 14-bit. Some medium-format cameras now also shoot 16-bit RAW, which is likewise debated to be a waste of file size against similar arguments.

With Apple's expertise in software, and its freedom in creating the ProRAW specification, I would venture to guess they knew what they were doing when choosing 12-bit as a base. As long as ProRAW can only be generated on Apple's products, its capabilities only need to be as good as the sensors used in them.
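One way to sanity-check the "more bits would mostly digitize noise" point: photon shot noise grows as the square root of the collected signal, so the deep shadows of a tiny sensor drown in noise long before quantization becomes the limit. A back-of-the-envelope sketch (the full-well capacities are illustrative guesses, not measured values):

```swift
import Foundation

// Rough shot-noise model: noise ≈ sqrt(signal), in electrons.
// Full-well capacities below are illustrative guesses, not measurements.
func shadowSNR(fullWellElectrons: Double, stopsBelowClip: Double) -> Double {
    let signal = fullWellElectrons / pow(2.0, stopsBelowClip)
    return signal / sqrt(signal)       // SNR = sqrt(signal) for shot noise
}

print(shadowSNR(fullWellElectrons: 4_000, stopsBelowClip: 12))   // ~1.0: pure noise (phone-sized pixel)
print(shadowSNR(fullWellElectrons: 60_000, stopsBelowClip: 12))  // ~3.8: barely usable (full-frame pixel)
```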
 
So the current-generation iPad Pro with 6 GB of RAM can't do ProRAW? Such a shame for the Pro moniker and the latest Apple Pro iPad.

This is a money grab, folks, plain and simple... the fanboys and stockholders love it.

I am sorry - were you expecting me to be upset?
 
Flapdoodle. Your 10-year-old desktop cannot take ProRAW photos.

All you’ve proven is that you don’t know what you’re talking about. It’s all trolling.


you are correct it can’t, my wording was off. what I was trying to get across is that my 10 year old intel i7 that has plenty of horsepower to process them. call it trolling if you will since it went over your head.
 
No. Your 10-year-old i7 is just one part of the equation... the CPU... but even then, I am not sure the A14's CPU is less powerful. Even a discrete GPU from 10 years ago is likely not as powerful as the A14's GPU. Add in the Neural Engine needed to process and route the data from the camera sensor, which your 10-year-old chip cannot do. Add in the image signal processor as well.

Your argument is pretty bad any way you try to spin it. It shows a lack of... thought?

Again, ProRAW and computational photography involve more than just ONE part of the whole system: CPU, GPU, ISP, Neural Engine, and RAM. I am sure there are other minor components involved too.
 
Here you are again. I am talking about horsepower: if my desktop had a camera and the required sensor, it could do it, maybe a bit slower. Speaking of spinning things: if you looked at who I was replying to, it was a post saying the reason the iPhone 12 can't do ProRAW is that it only has 4 GB of RAM.

Lack of thought? Maybe think outside the box rather than thinking what Apple wants you to think. You are wearing your Apple armor well this morning.
 
You are correct, it can't; my wording was off. What I was trying to get across is that my 10-year-old Intel i7 has plenty of horsepower to process them.

If the CPU in the A14 barely manages to process an image in multiple seconds, it's completely impractical on a 10-year-old machine, unless you're OK with waiting a minute for each image. Which you're not.

I'm also not sure what kind of camera rig you're planning to build that includes an i7. Good luck fitting that in a phone body.

Call it trolling if you will, since it went over your head.

It's funny how you keep telling others that "it's going over their head" and they "don't get a lot of things, including my thoughts". Maybe that's more of a you problem.
 
Here you are again. I am talking about horsepower: if my desktop had a camera and the required sensor, it could do it, maybe a bit slower. Speaking of spinning things: if you looked at who I was replying to, it was a post saying the reason the iPhone 12 can't do ProRAW is that it only has 4 GB of RAM.

Lack of thought? Maybe think outside the box rather than thinking what Apple wants you to think. You are wearing your Apple armor well this morning.
Horsepower? ProRAW does not run only on the CPU. That was my point. Geez.

ProRAW involves the whole spectrum of hardware. Not just the CPU. Not just RAM. Not just the GPU.

I think a wall would be more logical.
 

If the CPU in the A14 barely manages to process an image in multiple seconds, it's completely impractical on a 10-year-old machine, unless you're OK with waiting a minute for each image. Which you're not.

I'm also not sure what kind of camera rig you're planning to build that includes an i7. Good luck fitting that in a phone body.

It's funny how you keep telling others that "it's going over their head" and they "don't get a lot of things, including my thoughts". Maybe that's more of a you problem.
You are the one who quoted "my thoughts"; re-read your post. Who is the problem? And learn a bit more about how desktop architecture works compared to an SoC design... even a common modern-day desktop CPU still uses about 45 watts of power just for the CPU, while the A14 consumes about 7 watts. Is that desktop power just wasted as heat, or is it capable of doing much more work at a given time than the mobile processor? Watt for watt the A14 may be as fast or faster, but it is comparing two different breeds.

Again, where did I say I'm planning to put an i7 desktop in a phone body? Right over your head again, and twisting my words...
 
And your thoughts are… "Apple is terrible because the feature I'm not going to use anyway doesn't exist on my phone, nor in an iPad with a much weaker sensor and CPU, but does exist in this particular phone, but also I don't care because the pictures are already good enough regardless"? Do I have that right?

Where did I say Apple is terrible? Pure hyperbolic nonsense, but thanks for telling me what my thoughts are.
 
Folks who know only digital post-processing have no idea of the work that masters like Adams had to go through to get their final results. You can do in moments what used to take hours in a darkroom, and without having to deal with the chemicals required. I have spent many, many hours in the darkroom working on getting that perfect image. I don't miss those days at all.

Well, most of those days anyway. There were those days when dating my wife...
I scan film, and there is nothing stopping me from processing the scanned film in Lightroom now lol
 
I was wondering about that 12-bit number as well. On a modern DSLR/mirrorless camera a typical RAW is 14-bit, which roughly corresponds to the dynamic range capability of the sensors in these cameras. But if iPhones can capture 2 or more shots at the exact same moment with different exposures, then the resulting computational composite can end up with more than 12 bits of depth (like bracketing 2 shots a few EV apart at the same time and ending up with an HDR composite).

This diagram illustrates how you can effectively capture 2 more stops in the shadows with 14-bit vs 12-bit:
[attached diagram]
Your diagram and the statement "you can effectively capture 2 more stops in the shadows with 14-bit vs 12-bit" are misleading.

RAW bit depth says nothing about captured dynamic range. DR is a characteristic of the sensor, or of the scene if we are talking about HDR; RAW is just storage for that data. Think of bits and DR like a staircase: DR is the height of the staircase, while bits are the number of steps in it.

In other words, you can "pack" 20 EV into a 12- to 16-bit RAW file, but a 24-bit RAW file would be much better for that purpose, as it would preserve all the gradations of a whopping 20-EV-capable sensor.

That's why most mirrorless cameras have 14-bit RAW files while their sensors actually deliver only 12-13 stops of DR. Even the most advanced medium-format cameras like Hasselblad deliver almost 14 EV of DR (according to tests like DxO's) while having 16-bit RAW files.

Another example: shooting video with a log gamma curve lets you preserve up to 12-13 EV, while codecs like HEVC use only 10 bits to store that DR. Some Hollywood-grade codecs like Apple ProRes use up to 12 bits, while most cinema cameras deliver 15+ EV of DR. Even more: some semi-pro cameras like the Sony A7 III use only 8-bit codecs for the same 12-13 EV of DR latitude (which is a nightmare for post-production). Think about that.

Bits are about preserving details and gradations, and say nothing about capturing more or fewer EV.

BTW, I still prefer to shoot HDR DNG with the iOS Lightroom camera, which delivers wider DR than Apple ProRAW on my 12 Pro Max. They both produce DNG files, but Lightroom is way better at dealing with high-DR scenes.
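The staircase analogy is easy to put into numbers (a quick sketch of my own): in a linear RAW file, half of all code values land in the brightest stop, half of the remainder in the next, and so on. Bit depth sets the step count per stop; the range itself stays fixed:

```swift
// Code values landing in the Nth stop below clipping for a given bit
// depth. Bit depth changes the steps per stop, not the DR itself.
func codesInStop(bits: Int, stopsBelowClip: Int) -> Int {
    // The top stop holds the upper half of all codes, the next stop
    // half of the remainder, and so on down the staircase.
    (1 << bits) >> (stopsBelowClip + 1)
}

print(codesInStop(bits: 12, stopsBelowClip: 0))   // 2048 codes in the brightest stop
print(codesInStop(bits: 12, stopsBelowClip: 10))  // 2 codes left, 10 stops down
print(codesInStop(bits: 14, stopsBelowClip: 10))  // 8 codes there: finer steps, same range
```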
 
Thanks for giving a clear explanation.

I am no optical engineer, just a semi-pro photographer who sometimes struggles to recover shadows in actual landscape photography. It seems to reflect what I see while dealing with shadows: in a 14-bit RAW file (the camera is a Nikon D850, capturing at base ISO 64), it is clear that I can recover much more detail compared to shooting the same scene in 12-bit RAW. This led me to believe it extended my DR at the darker end of the spectrum.

According to what you describe, my thinking above is false. The fact that I could discern more detail was due to the "higher number of steps" within that darker range: when raised to a brighter range, the smooth gradation and lack of noise are still preserved. I was just re-mapping the tones to different, more visible values.
 