It's interesting that the technology works by blending IR images with standard photos. Current smartphone and camera sensors all have an IR cut filter to prevent the red cast that comes from near-IR light; pure IR is blocked by these filters as well. I have a camera modified to shoot IR, and you really do get more detail in certain natural subjects.
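Spectral Edge hasn't published its actual algorithm, but the basic idea of the article — capture a registered near-IR frame and fold its extra detail into the visible-light photo — can be illustrated with a toy NumPy sketch. Everything here (the function name, the box-blur high-pass, the 0.3 strength) is an assumption for illustration, not the real pipeline:

```python
import numpy as np

def blend_ir_detail(rgb, ir, strength=0.3):
    """Toy illustration: inject high-frequency detail from a near-IR
    frame into an RGB image's luminance, leaving color ratios intact.

    rgb: float array (H, W, 3) in [0, 1]
    ir:  float array (H, W) in [0, 1], assumed registered to the RGB frame
    """
    # Approximate luminance (Rec. 601 weights)
    luma = rgb @ np.array([0.299, 0.587, 0.114])

    # Crude high-pass of the IR frame: subtract a 5x5 box blur
    k = 5
    pad = np.pad(ir, k // 2, mode="edge")
    blurred = np.zeros_like(ir)
    for dy in range(k):
        for dx in range(k):
            blurred += pad[dy:dy + ir.shape[0], dx:dx + ir.shape[1]]
    blurred /= k * k
    ir_detail = ir - blurred

    # Add IR detail to luminance, then rescale each RGB channel so
    # hue/saturation are preserved while sharpness improves
    new_luma = np.clip(luma + strength * ir_detail, 1e-6, 1.0)
    scale = (new_luma / np.maximum(luma, 1e-6))[..., None]
    return np.clip(rgb * scale, 0.0, 1.0)
```

A real system would use proper image registration and an edge-aware fusion filter rather than a box blur, but the luminance-only injection is why this kind of fusion can sharpen a photo without shifting its colors.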
Nature has shown it is most definitely possible to capture great images without multiple massive lenses. Eyeballs prove there is room for improvement.

Those eyeballs also have massive processing power behind them.
 
Nature has shown it is most definitely possible to capture great images without multiple massive lenses. Eyeballs prove there is room for improvement.
True, but to be fair, the standard-issue human eyeball is way larger/deeper than the iPhone cameras, and the "machine learning" in the drivers for that has several million years of training to fine-tune the software.
 
Why are people so into improving the camera quality? The iPhone 5 used to take great pictures for your Instagram posts. When the camera phone was introduced there were huge improvements; now they use lab equipment just to prove that the new camera is better than the old camera.
 
Why are people so into improving the camera quality?
Because they can. Because iPhone cameras have not yet caught up to DSLRs, but continue to make progress. Because not everyone uses their iPhone camera just for Instagram posts.

Every pic I've taken of my nieces growing up has been with an iPhone camera, because that's what I had with me at the time (I don't generally carry around a larger camera day-to-day). They'll never be that age again, so I'd like the best pics of them possible. And the pics from my iPhone 8 now are objectively better than the ones my iPhone 6 used to take (which were better than the 5 and 4 before it).

The first digital cameras took images we wouldn't want to use for thumbnails these days, but they were the best available at the time. Someday we'll look back at the images we're taking today and we'll think, "that was a great time, but those images are so coarse and grainy, I wish we had better pictures of that magical moment". So, the constant striving for better camera quality.

(And, yeah, a huge percentage of images taken are throwaways that don't need super quality - but you never know until some time later which were the ones that really matter.)
 
That was literally my immediate fear upon seeing this article... Since it's supposedly software-based, I would love this to come to my 11 Pro via update, but I'm not holding my breath.
Apple will need to add an IR camera first:
“Spectral Edge's technology captures an infrared shot and then blends it with a standard photo to make images crisper and to improve color accuracy.”
 
I am really looking forward to Apple's camera improvements in the upcoming years. The iPhone 11 Pro series really knocked it out of the park this year, after some good-but-not-great cameras in its predecessors. If the 11 Pro's pictures are already great, one can only imagine how great the pictures of the iPhone 12/13/14 will be!
 
Interesting what can be done with infrared cameras in low light!

 
True, but to be fair, the standard-issue human eyeball is way larger/deeper than the iPhone cameras, and the "machine learning" in the drivers for that has several million years of training to fine-tune the software.
Yes, and our brains are much, much larger than the A12 chip. There's still a chasm of difference. My point is simply that those who think in terms of past equipment, with roots in mechanical and physical input/output, are myopic about what may be possible when the technology takes a drastic turn.
 