Discussion in 'MacRumors.com News Discussion' started by MacRumors, Jan 4, 2014.
My first thought when I saw this article..
Hopefully Apple didn't base their purchase on these screenshots only.
I rarely use the camera on my iPhone. Since I do photography as a hobby, I prefer a proper, separate, powerful digital camera with a large sensor (16.2MP / 16x optical zoom / 24mm lens). Cameras in phones are just an added "feature", primarily for convenience, and as with all convenience, there are always negatives.
Good to see them showing RESPECT towards the original developers too. Apple has 'copied' and basically bankrupted a few devs in the past. Widgets are a big example...
BUT in recent years it has been Google copying everybody and releasing web-based solutions for 'free' (earning billions through adsense/adwords in the process).
Big companies should lead by buying out small creative developers (who they COULD kill using power/litigation). A few million is nothing to big companies and a LOT to an independent developer. Credit where due - well done Apple on not being 'evil' like some other companies.
A growing number, actually, who have jumped on the mirrorless bandwagon. Or who shoot film.
Another advantage of DSLRs is that they aren't built with freaking DUST UNDER THE LENS! Some people here should know what I'm talking about.
It looks like they used assembly language to tune the JPEG compression perfectly for the iPhone processor. Even C, as I have found from Googling, sometimes can't match the fine-grained control (and speed) of hand-written assembly.
'Cause ew, assembly language is like so not object-oriented. Translation: So few people actually care about efficiency anymore, and for some reason they only know about object-oriented programming.
Uh yeah, I meant a "real" camera, not specifically a digital single lens reflex. The point is no one on earth has a studio lighting setup intended for use with a camera phone. Duh. Are you trolling or just dumb?
Correct me if I'm wrong, but I think Apple bans assembly from the App Store (or did). That was (is) a reason why many devs publish their ARM-assembly-based emulators through Cydia instead.
I believe it's the theory behind this code rather than the language that is important.
The dev reinvented the way in which .jpg files are encoded/saved using pure scientific theory at a low level. The speed gained by doing so allowed the software to save 20 images per second.
FWIW my DSLR can't do that!!!! I'm sure Nikon/Canon/Sony...etc would love the technology as DSLRs could use it for MASSIVE photos (and cameras that may or may not have the grunt of a smartphone).
This is awesome stuff... and it's made by an Aussie
Aussie Pride MF!!!
30fps? Does that mean Apple will come out and say "we're the first to shoot video at 4K resolution" when the next iPhone releases?
I thought the app saved '20 photos per second'. Maybe the 5S boosted that to 30?
Either way I think the advantage is that quad-core Android phones won't be able to save (and therefore capture) photos anywhere near as quickly as iPhones.
You're right... once resolutions climb, this could possibly be a big-ticket item. While fandroids with 16 cores, 20,000 mAh batteries, 12" screens and 40MP cameras are taking a single high-res photo... then loading... then taking another, our sleek, dual-core iPhones with 2,000 mAh batteries (that last 10 times as long because the phone is so small/thin you hardly notice the weight) will be different. We'll be able to hold down the camera button and take 30+ high-res images per second.
Note: This isn't just some kid with a cute idea/fad. This is an academic with a serious algorithm.
Actually, both the Note 3 and the 10.1 2014 already shoot true 4K video - and they're pretty good at it.
Assembly (asm) is not banned from the App Store, or even frowned upon. Many developers use it to write highly optimized code for the ARM architecture. There are no rules against that; it's totally in the clear.
What you're thinking of is JIT (Just In Time) recompilation. This is a technique used by a lot of emulators to dynamically recompile machine code from one architecture to another, then run it on the host CPU natively. It's how emulators like Dolphin (Mac & Win) and PPSSPP (Mac, iOS, Android, Blackberry, and Windows) work.
Unfortunately, JIT requires the ability to mark regions of memory as executable, which iOS does not allow. That would effectively break the chain of security Apple has set up, since it could no longer ensure that all code running in memory was cryptographically signed and approved by the App Store team.
Of course, this doesn't stop jailbreakers from running emulators that do JIT anyway. It's not that iOS isn't capable of running a JIT-based emulator; it's just that the App Store submission process won't approve an application that makes use of one.
Yes, the 5S has significantly faster hardware. This is why, for example, my full sensor upsampling tweak is able to record 4:3 1664*1248 video at a steady 30p under good lighting, unlike previous, Full HD-capable iPhone models (4S, 5, 5c). The latter can only record at around 17-18 fps.
More info on my upsampler: http://forums.macrumors.com/showthread.php?t=1538193
Well, both mirrorless and film cameras are far, far closer to DSLRs than to iPhones. Practically, there's no difference WRT the ability to get the same DoF with similarly bright and/or long lenses and similarly large sensors.
Also factor in that Apple isn't a charity. They want you to purchase their next model, and they try not to make their "old" models significantly more capable than they were when originally shipped. This is why they refuse to add a lot of functionality to older models that would otherwise be perfectly capable of it. They even take away existing, perfectly working functionality to boost sales of new models - think of, say, 60p slow-motion recording on the iPhone 4S. Do the opposite? Almost never - no wonder SnappyCam couldn't stay in the App Store...
Of course, this is great for their shareholders, just not for us.
Shhhh! You're giving the game away!
16x zoom, starting at 24mm? It's not a "proper separate powerful digital camera", then, "only" a small-sensor P&S or a superzoom, which I'm absolutely sure doesn't deliver (significantly) better image quality than the iPhone. Heck, if it's, say, a Sony HX-series P&S camera or some of the earlier Pana TZ / ZS models (for example, the ZS10 or the ZS20, both having pretty bad IQ), its IQ is actually way worse than that of the iPhone.
Anyone with basic photography knowledge knows the dog photo just couldn't be shot with any small-sensor camera. Given that it uses a very shallow Depth-of-Field, I'd say it's a full-frame shot with f/2.8 (or even larger), at 50mm equiv or even higher.
Not true. There are still a lot of areas where many P&S cameras, especially high-image-quality ones like the Sony RX100 (Mk II), are significantly better than the iPhone:
- all of them have zoom lenses starting at a wider FoV (24...28mm) than even the 30mm equiv iPhone 5s, let alone older iPhone models with 33...35mm.
- many of them have RAW support, even the cheaper, but still excellent ones like the Nikon P330.
- many of them have significantly better high-speed, hi-res video; for example, the RX100 (MkII), the Pana ZS series etc. and their 1080p60 recording.
- many of them have significantly better extra high-speed, standard-res video
- many of them have xenon flash, stereo audio recording
- some of them have a viewfinder (added stabilization & better image composition on sunny days)
- many have manual modes - even really cheap ones like the P330 - which are plainly unavailable on the iPhone (not even as of iOS 7)
- and, of course, most have optical image stabilization; compared to real OIS-based systems, even the iPhone 5s's software-based stabilization is a pretty pathetic gimmick.
You don't need an expensive camera or lens to take that image. You'll probably need a lens with a focal length over 100mm, and any old sensor behind it will do. As long as the timing is right, you're good as gold. Timing is key, as is focal length. You don't even need that fast a lens: f/4 at 135mm on APS-C throws the background out of focus probably about as much as in this image, and an f/4 135mm lens can be had for under $100.
Pick up a used D70 or D100 or Canon Kiss from 2003 and you're golden. Done for $200 - as long as you can time and plan that shot.
Exactly - unless you go for a used, but preferably still full-frame, DSLR and/or longer but not very slow primes, which significantly drive the price down.
Nevertheless, the app is (was) still REALLY good. Back in August, when I reviewed the app for my Action Shooting bible, I tested it very thoroughly. (And I also talked to John over Skype. Lolz, back then I too recommended that he try to sell his tech to tech companies, it being so revolutionary.)
That picture of the baby requires a $20,000 hospital bill, one very sore vagina, and lots of visits from in-laws.
Thanks for clearing that up mate. Here's pretty well what you said (and more detail about the programming used) from the dev's mouth:
First we studied the fast discrete cosine transform (DCT) algorithms.
We then extended some of that research to create a new algorithm that's a good fit for the ARM NEON SIMD co-processor instruction set architecture. The final implementation comprises nearly 10,000 lines of hand-tuned assembly code and over 20,000 lines of low-level C code. (In comparison, the SnappyCam app comprises almost 50,000 lines of Objective-C code.)
You forgot to mention that a dog and a frisbee would also come in handy for a pic like that...
Read up on innovation in maturing companies and you will see how ignorant your comment is.
Guys! Breaking news! I just got this exclusive comment from Apple!
"Apple buys smaller technology companies from time to time, and we generally do not discuss our purpose or plans."
Wow. What a scoop!
We still use our P&S cameras as they take materially better pictures than our iPhone 4S and 5. We rarely use our DSLR due to its size, but the image quality is still far superior to a P&S, even on an "ancient" EOS 300; the various lenses alone make a huge difference.
I can see how phones are squeezing P&S cameras, and you always carry your phone, but the fact is the phone is inferior, especially for lenses, and I don't see that changing for a good while.
While I agree the first image could never come from an iPhone, you certainly don't need full frame or a long lens to get such shallow depth of field. Pop a Canon 50mm f/1.2 on an entry-level crop body and you can get very shallow DOF.
Yep or the dude who made the app...
Many would be surprised by the quality of photos taken on phone cameras. You literally can't tell if it is a DSLR or a phone camera if the shot is good (or taken with perfect lighting and then photoshopped).
I recently went to Vietnam and took some panoramas... since I'm pretty handy with my DSLR, people on bumbook were asking me about my camera gear. Admittedly the DSLR has more lenses/settings so usually takes a better photo, BUT the iPhone can take 30 shots a second... which increases the odds of one photo being at the right time/place... which is all it takes for a good photo.
I could definitely produce the two photos shown on my iPhone without using any non-apple apps/accessories. But... I find I am generally more talented than most 'experts' who talk $h!zzle online, so go figure.
Regardless, the point of the technology is that it's farking fast guys!!