
ronm99

macrumors 6502
Original poster
Jan 13, 2012
The Deep Fusion segment of the Apple keynote did a poor job of explaining what it actually does. Phil just said how amazing it was, but didn’t say why, and the photo they showed did not have a with / without to see.

I’ve seen a lot of articles implying that it is Apple’s answer to the Pixel’s “Night Sight”, but Apple had a separate section on their own night mode which makes me think that this is different. Apple’s night mode is also on a different release schedule.

My guess is that it is an attempt to make all photos more sharp like you get with good glass on an SLR. It will probably help night shots, but hopefully will help other photos as well. But this is just my guess and could be completely wrong.
 
Deep Fusion is essentially image stacking, which results in much less noise and much more detail, but done in an even more advanced way, with an additional long-exposure photo and AI.

Not trying to get everyone's hopes up or pretend I've seen the results, but I've done simple image stacking/long exposure on my Xs Max and it gives almost DSLR-like clarity. It's that good. Imagine that but even better.
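Since the post above describes Deep Fusion as advanced image stacking, here is a minimal sketch of the basic idea, plain frame averaging, in Python with NumPy. This is only an illustration of why stacking a burst reduces noise; it is not Apple's actual pipeline, and the simulated "burst" below is made-up data.

```python
import numpy as np

def stack_frames(frames):
    """Average a burst of already-aligned frames of the same scene.

    Averaging N frames cuts random, zero-mean sensor noise by
    roughly sqrt(N) while preserving real detail -- the basic idea
    behind burst/stacked computational photography. Real pipelines
    (reportedly including Deep Fusion) also align frames and merge
    them per-pixel with learned weights; this is the naive version.
    """
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    return stack.mean(axis=0)

# Simulate a burst: one "true" scene plus independent sensor noise
# on each frame.
rng = np.random.default_rng(0)
scene = rng.uniform(0, 255, size=(48, 64))
burst = [scene + rng.normal(0, 10, scene.shape) for _ in range(9)]

merged = stack_frames(burst)
noise_single = np.std(burst[0] - scene)  # roughly 10
noise_merged = np.std(merged - scene)    # roughly 10 / sqrt(9)
```

With a 9-frame burst the residual noise drops to about a third of a single exposure's, which is why stacked smartphone shots look so much cleaner than one frame from the same tiny sensor.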
 
Deep Fusion is essentially image stacking, which results in much less noise and much more detail, but done in an even more advanced way, with an additional long-exposure photo and AI.

Not trying to get everyone's hopes up or pretend I've seen the results, but I've done simple image stacking/long exposure on my Xs Max and it gives almost DSLR-like clarity. It's that good. Imagine that but even better.

Thanks for the clarification. Sounds exciting! I’ll probably sell my DSLR / lenses soon. I rarely use it any more.
 
The Deep Fusion segment of the Apple keynote did a poor job of explaining what it actually does. Phil just said how amazing it was, but didn’t say why, and the photo they showed did not have a with / without to see.

I’ve seen a lot of articles implying that it is Apple’s answer to the Pixel’s “Night Sight”, but Apple had a separate section on their own night mode which makes me think that this is different. Apple’s night mode is also on a different release schedule.

My guess is that it is an attempt to make all photos more sharp like you get with good glass on an SLR. It will probably help night shots, but hopefully will help other photos as well. But this is just my guess and could be completely wrong.
Deep Fusion is a marketing term. Steve Jobs could get away with saying it's Magic. Next year they will call it something else that's equally vague. Deep Fusion = Enhanced Synergy.

It’s like Chevron with Techron
 
It means a deep neural net will be computing the pixels in the resulting photo you see, so complex and deep that nobody really knows exactly whether or how the results relate to any raw pixel data from the camera sensors. The images will be a completely made-up fiction.
 
Thanks for the clarification. Sounds exciting! I’ll probably sell my DSLR / lenses soon. I rarely use it any more.
You will love setting yourself free from your DSLR. I have only used mine 3-4 times since buying my Pixel 2 XL last spring.

Photography before entailed the following:
1. Getting the DSLR out.
2. Shooting 5-20 photos of everything, just in case some are blurry, out of focus or otherwise unusable.
3. Taking the SD card out, popping it into the MacBook and editing photos, which can take a few hours for holiday photos.
4. Exporting photos to share with friends or family.

After buying the Pixel 2 XL:
1. Grab the phone one-handed, double-tap the volume button to open the camera, and press again to shoot.
2. Share photos with friends or family.


I still rate DSLRs highly, but they lend themselves best to shooting still objects and people, or to occasions when you have time to do "photography". Otherwise, computational photography on smartphones is good enough for 95% of occasions. For scenery or landscapes my phone always gets dynamic range, white balance and colour right. Even on Instagram I share my photos without editing. Occasionally I might press the auto-edit button in Google Photos if I want a bit more pop.
 
Deep Fusion is a marketing term. Steve Jobs could get away with saying it's Magic. Next year they will call it something else that's equally vague. Deep Fusion = Enhanced Synergy.

It’s like Chevron with Techron


The way people keep digging, it's almost like Apple is the only company on the planet that uses marketing terms.

Not that I'm digging at you.

Why would they not give it a name? I don't see the issue, tbh.
 
My question is: does Deep Fusion kick in on every photo you shoot on the iPhone, or does it only apply in certain situations it 'sees', like landscape, portrait mode or Night Mode?

Does Deep Fusion apply to older phones as well?

Is there a technical site or documentation that elaborates on what goes on behind the scenes?
 
I may need to listen to the Verge podcast again, but I thought they said Deep Fusion wasn't ready and it will not roll out day one.
 
I may need to listen to the Verge podcast again, but I thought they said Deep Fusion wasn't ready and it will not roll out day one.
Due sometime in October.

I expect they are waiting to see what Google does with the new Pixel, as the Google event was announced yesterday for October 15. Then they'll make some last-minute changes and roll it out at roughly the same time to compete.
 
The way people keep digging, it's almost like Apple is the only company on the planet that uses marketing terms.

Not that I'm digging at you.

Why would they not give it a name? I don't see the issue, tbh.
Actually, Apple uses BS marketing terms way more than any other company. Only Apple can say they use a "Liquid Retina display" (known as an LCD to everyone else).
 
I hope deep fusion will actually be good.

Google is still way ahead when it comes to camera software. My Pixel 2, which is 2 years old, takes noticeably better photos than my partner's iPhone XR.

Google's dynamic-range post-processing is plainly awesome, to the extent that I haven't used my DSLR since buying the phone, or had to edit photos.
 
You will love setting yourself free from your DSLR. I have only used mine 3-4 times since buying my Pixel 2 XL last spring.

Photography before entailed the following:
1. Getting the DSLR out.
2. Shooting 5-20 photos of everything, just in case some are blurry, out of focus or otherwise unusable.
3. Taking the SD card out, popping it into the MacBook and editing photos, which can take a few hours for holiday photos.
4. Exporting photos to share with friends or family.

After buying the Pixel 2 XL:
1. Grab the phone one-handed, double-tap the volume button to open the camera, and press again to shoot.
2. Share photos with friends or family.


I still rate DSLRs highly, but they lend themselves best to shooting still objects and people, or to occasions when you have time to do "photography". Otherwise, computational photography on smartphones is good enough for 95% of occasions. For scenery or landscapes my phone always gets dynamic range, white balance and colour right. Even on Instagram I share my photos without editing. Occasionally I might press the auto-edit button in Google Photos if I want a bit more pop.

Late-model smartphones certainly do offer excellent image quality in such a small package, and I am glad to see ultrawide getting some love from Apple on the new iPhones. Interchangeable lenses are the main reason I still shoot with dedicated camera gear. I use ultrawide rectilinear and fisheye lenses a lot, but also long telephoto. I did get rid of my DSLR gear, but only so I could buy more Olympus m4/3 gear. I can shoot RAW+JPEG on my E-M1 Mark II, and easily share photos quickly using the Olympus app on my iPhone to connect to the camera via Wi-Fi. Oly’s JPEG engine is excellent, but I can still mess with the RAW files later if I have the time and desire.

The nice thing is that smartphones can now be a capable part of any photography kit. I have used my iPhone X alongside my Olympus on many occasions... especially if I have my 40-150mm f/2.8 on my Oly. I can pull the iPhone out for a quick wide shot without changing lenses. That said, for extended shooting I will always prefer a camera with a viewfinder and external controls. The dual CD/PD autofocus on the Olympus is also much faster... especially in fading light.
 
Late-model smartphones certainly do offer excellent image quality in such a small package, and I am glad to see ultrawide getting some love from Apple on the new iPhones. Interchangeable lenses are the main reason I still shoot with dedicated camera gear. I use ultrawide rectilinear and fisheye lenses a lot, but also long telephoto. I did get rid of my DSLR gear, but only so I could buy more Olympus m4/3 gear. I can shoot RAW+JPEG on my E-M1 Mark II, and easily share photos quickly using the Olympus app on my iPhone to connect to the camera via Wi-Fi. Oly’s JPEG engine is excellent, but I can still mess with the RAW files later if I have the time and desire.

The nice thing is that smartphones can now be a capable part of any photography kit. I have used my iPhone X alongside my Olympus on many occasions... especially if I have my 40-150mm f/2.8 on my Oly. I can pull the iPhone out for a quick wide shot without changing lenses. That said, for extended shooting I will always prefer a camera with a viewfinder and external controls. The dual CD/PD autofocus on the Olympus is also much faster... especially in fading light.
Yes. DSLRs can be bulky and it can be cumbersome to change lenses but I love the satisfaction of admiring the quality of the photos on my laptop or iPad.

Last week I printed an A2-sized close-up of my son's face. It was shot with my DSLR, and the clarity and immersion are unmatched by any phone. However, 95% of the time my phone is faster.

I look forward to shooting with my 11 Pro Max. I am excited about filming in 4k 60fps with HDR. Currently on my Pixel 2 you have to choose between 4k 30fps or 1080p 60fps.

My other cameras are Lumix LX3 and Fujifilm X10. Both are brilliant but you need to shoot in manual to get the most out of them. My Pixel 2XL is faster, has a larger viewfinder and superior image processing.

Pocket cameras are pretty much redundant unless you are a vlogger or a professional photographer.
 
Yes. DSLRs can be bulky and it can be cumbersome to change lenses but I love the satisfaction of admiring the quality of the photos on my laptop or iPad.

This!

Sometimes I do admire how amazing some of my shots are when I took photos on my iPhone. I will be like "Wow, did I take this shot with my phone?"
 
Actually Apple is using BS marketing terms more way more than any other company. Only Apple can say that they use "liquid retina display" (also known as LCD for everyone else).


People who are technically savvy can see through the marketing, and I'd wager most people don't even care. My missus, for example: she'll get the new iPhone 11 and won't care about any of the tech stuff at all. Some people just like shiny new things every two or three years. When Apple says Liquid Retina, I just smile and sometimes laugh. I know they're reinventing the wheel, but God damn it, they do shiny better than most.
 
Some of my favourite photos are poor-quality mid-2000s phone-camera shots; the best camera is the one you have with you.

Some other favourites are DSLR shots with long lenses: candid stuff with the family that I wouldn't have got with a wider phone camera, and likewise the background separation/compression from longer lenses.

Quality? I'm sure I could live without a DSLR. Convenience? 110%. But lens flexibility, especially longer lenses, is something I can't give up yet.
 
My question is: does Deep Fusion kick in on every photo you shoot on the iPhone, or does it only apply in certain situations it 'sees', like landscape, portrait mode or Night Mode?
I heard it does it on every photo on the Pro; I played with the phone. Not sure about the 11.
 
Actually, Apple uses BS marketing terms way more than any other company. Only Apple can say they use a "Liquid Retina display" (known as an LCD to everyone else).
Ya sure about that? ;)
(I think by “way more” you meant “about the same as,” or in Samsung’s case “much less than.”)

SAMMY:
[image] “Infinity-O”

[image] “Cinematic Infinity”

HUAWEI:
[image] “Double 3D Dewdrop Display”

MOTO:
[image] “Max Vision Display”

XIAOMI:
[image] “Breaking boundaries for an unlimited view” [Ugh, finally!]

And as for Google Pixel? Well, the past 2 years they’d rather just NOT bring up the topic of displays.
:oops:

In other words, the marketing department's job is to sell the product. Every company does it, and honestly it makes more sense to come up with catchy terms than to spout specs and technology at folks who couldn't care less about pixel shape and density, nits, contrast ratios, etc.
 
There's a feature on the Internet called "search". If you type in "Phil Schiller deep fusion" several pages come up that explain what it is. Try it.
 
Deep Fusion is essentially image stacking, which results in much less noise and much more detail, but done in an even more advanced way, with an additional long-exposure photo and AI.

Not trying to get everyone's hopes up or pretend I've seen the results, but I've done simple image stacking/long exposure on my Xs Max and it gives almost DSLR-like clarity. It's that good. Imagine that but even better.

Outstanding post. Even better, you broke it down simply enough for everyone. Nice job.
 