So excited for this. I'm over here with Google Photos and can't even manually tag a face or un-tag one that's wrong. Yet another example of Apple's cleverness: yes, they come out with some things after the others, but based on this post and past experience, they get it right.

BTW - Come on, Google, get your stuff together and add manual tagging!
 
I am more interested in what they have done under the hood with regard to RAW processing, lens correction, etc. It seemed to do some of it when I tried with a few files.
 
>> Users who have been testing the first beta of iOS 10 last week mentioned the impressive search parameters of Photos, which intelligently detects and tags every picture for the scenes, objects, and landmarks captured within. <<

This could turn out to be very useful.

I wish Photos would also allow me to tag my photos and write descriptions right on my phone after I take a picture, and have it save to a sidecar XMP file. Is that in the cards?
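For what it's worth, an XMP sidecar is just a small XML file that sits next to the photo, so even though Photos doesn't write one today, the format itself is well defined. Here's a minimal sketch in Swift of what such a feature could produce; the writeSidecar helper and the paths are purely illustrative, while dc:description and dc:subject are the standard Dublin Core fields used in .xmp sidecars:

import Foundation

// Sketch only: write a minimal XMP sidecar next to a photo.
// writeSidecar(for:description:tags:) is a hypothetical helper;
// dc:description and dc:subject are standard Dublin Core fields.
func writeSidecar(for photoURL: URL, description: String, tags: [String]) throws {
    let tagItems = tags.map { "<rdf:li>\($0)</rdf:li>" }.joined()
    let xmp = """
        <x:xmpmeta xmlns:x="adobe:ns:meta/">
          <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
            <rdf:Description rdf:about="">
              <dc:description>
                <rdf:Alt><rdf:li xml:lang="x-default">\(description)</rdf:li></rdf:Alt>
              </dc:description>
              <dc:subject>
                <rdf:Bag>\(tagItems)</rdf:Bag>
              </dc:subject>
            </rdf:Description>
          </rdf:RDF>
        </x:xmpmeta>
        """
    // IMG_0042.jpg -> IMG_0042.xmp, alongside the original file.
    let sidecarURL = photoURL.deletingPathExtension().appendingPathExtension("xmp")
    try xmp.write(to: sidecarURL, atomically: true, encoding: .utf8)
}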

Ahhhh, finally we get tagging back. iPhoto had tagging long ago, and it was taken away (with loud protests at the time) during the prior decade. I'll take it if it has it.

I'm still using the last version of iPhoto (which works fine in El Cap) because it lets me get my work done quicker. I'm trying Photos again, since that is where the future is going, but adding tags back would be a big help.

Still waiting on Apple to relaunch iWeb... ;)
 
It hasn't been confirmed yet, but they are believed to be distant cousins of Bashful, Doc, Dopey, Grumpy, Happy, Sleepy and Sneezy. Software engineers at Cupertino are also rumoured to be working on adding magic mirrors to the list of objects that the Photos app can detect, so users can sort faces into a list by the 'fairest of them all' category, but have strongly denied there is any such thing as a 'poisoned apple' to recognise, despite some early reports that images of such an item had involuntarily put some users' computers to sleep.
Makes sense, Apple has a history of using Disney animated characters in its marketing.
 
Two questions:

1. Can it identify RBFs?
2. Can it identify canine emotions? (Much harder, since interpretation requires analysis of the entire body.)
 
Thought it was interesting to share these findings; I put the whole thing together in a hurry around midnight.

Within the Photos app.

No.
But when I put "Face" in the search field in the Photos app, it returns zero results even though I have plenty of photos with faces.
 
This is a disgusting invasion of my privacy, I can't even take a picture with out Google spying on me...oh wait it's Apple...never mind it's magical and a brilliant innovation.
Well, I think they were pretty clear that this was all handled locally, precisely in order to AVOID infringing on your privacy. That's why it's all pretty genius, if you ask me.
Hopefully, Apple won't make the same mistake as Google, whose system tagged African Americans as gorillas.
Well, they did have problems with tattoos etc. with the Apple Watch... clearly they don't quite think of everything! (Almost everything, but not quite.)
 
The main thing I want to see is the Photos app picking up QR codes, URLs, and phone numbers. Maybe event dates etc. too. The number of times I take a photo to quickly make a record of some info dwarfs the number of times I'll search for 'cake' in my photo library.
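Half of that is actually within reach already: Core Image has shipped a QR code detector (CIDetectorTypeQRCode) since iOS 8, and NSDataDetector can pull URLs and phone numbers out of the decoded payload. Reading text printed in the photo itself would need OCR, which is a separate problem. A rough sketch, with a placeholder file path:

import CoreImage
import Foundation

// Sketch: find QR codes in a photo, then mine each payload for
// links and phone numbers. The file path is a placeholder.
let photoURL = URL(fileURLWithPath: "/tmp/photo.jpg")
guard let image = CIImage(contentsOf: photoURL) else { fatalError("could not load image") }

let qrDetector = CIDetector(ofType: CIDetectorTypeQRCode, context: nil, options: nil)
for case let qr as CIQRCodeFeature in qrDetector?.features(in: image) ?? [] {
    guard let payload = qr.messageString else { continue }
    print("QR payload: \(payload)")

    // NSDataDetector recognises links and phone numbers in plain text.
    let types: NSTextCheckingResult.CheckingType = [.link, .phoneNumber]
    guard let dataDetector = try? NSDataDetector(types: types.rawValue) else { continue }
    let range = NSRange(payload.startIndex..., in: payload)
    for match in dataDetector.matches(in: payload, options: [], range: range) {
        if let url = match.url { print("URL: \(url)") }
        if let phone = match.phoneNumber { print("Phone: \(phone)") }
    }
}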
 
You get it! Lol. Although Google has had these features in Photos, Google's machine learning is in the cloud, so you have to upload your pictures. Apple's is in the Photos application on your computer, so there is more privacy. Not sure what happens when the picture gets uploaded to iCloud, though. Maybe there's a database file on your computer that can match the pics uploaded to the learning it has already done.
I highly doubt that Apple does it locally; if anything, that wouldn't be very practical for updating the engine to keep improving recognition speed and accuracy.

It's done in the cloud, just like Google.
 
This is a disgusting invasion of my privacy, I can't even take a picture with out Google spying on me...oh wait it's Apple...never mind it's magical and a brilliant innovation.

How is your privacy being invaded? Be specific.
 
I highly doubt that Apple does it locally; if anything, that wouldn't be very practical for updating the engine to keep improving recognition speed and accuracy.

It's done in the cloud, just like Google.

As mentioned on The Talk Show, the load is shared across tens of millions of devices (each device handling its own load). I can't see how this is inferior to server-side, where they'd be handling tens of millions of simultaneous uploads…
 
But will iPhoto still work with the new macOS Sierra? I am sticking with it. I have my 62,000+ images already grouped by time and events, and that is what I want my photos app to do. I will stay with the current OS if iPhoto no longer works.
 
I highly doubt that Apple does it locally; if anything, that wouldn't be very practical for updating the engine to keep improving recognition speed and accuracy.

It's done in the cloud, just like Google.

Disconnect your iOS device or Mac from Wi-Fi and any cellular connection. Drag a photo into the Photos app, or take a fresh photo.

Analysis is performed instantaneously. No reliance on cloud servers.

The learning and modelling of objects and scenes are already done, and your computer has all the code/resources to perform the analysis.
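You can sanity-check that claim in code, too: Core Image's face detector has run entirely on-device since well before iOS 10, with no network access needed. A minimal sketch (the file path is a placeholder):

import CoreImage
import Foundation

// Sketch: on-device face detection with Core Image. This works with
// Wi-Fi and cellular disabled; nothing leaves the machine.
let photoURL = URL(fileURLWithPath: "/tmp/photo.jpg")
guard let image = CIImage(contentsOf: photoURL) else { fatalError("could not load image") }

let detector = CIDetector(ofType: CIDetectorTypeFace,
                          context: nil,
                          options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])

// CIDetectorSmile asks the detector to classify expressions as well.
let faces = detector?.features(in: image, options: [CIDetectorSmile: true]) ?? []
for case let face as CIFaceFeature in faces {
    print("Face at \(face.bounds), smiling: \(face.hasSmile)")
}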
 
I see real-world challenges that will make this feature hit-or-miss. For one, the average person doesn't have an eclectic collection of experiences: most people repeat the same experiences (vacations, holidays, and celebrations) in the same locations with the same people, so there's little uniqueness between events to form distinct memories. Secondly, people's vocabularies are poor; they won't be able to discern between those facial expressions without a pictogram.
 
I highly doubt that Apple does it locally; if anything, that wouldn't be very practical for updating the engine to keep improving recognition speed and accuracy.

It's done in the cloud, just like Google.
FWIW, here are three recaps from the keynote stating that it's done locally, on the device. There will be servers, and there will be metadata uploaded from customers' devices, but not the actual photos themselves.

The Verge said:
However, Apple took the time to reiterate that these features happen locally on devices, meaning no personal data is being sent back to Apple. Photos will be released with iOS 10 and macOS Sierra this fall.
http://www.theverge.com/2016/6/13/11922626/apple-photos-update-announced-new-features-wwdc-2016

Forbes said:
Apple applied advanced deep learning techniques for facial recognition on iOS — all done locally within the device.
http://www.forbes.com/sites/amitchowdhry/2016/06/14/ios-10-features/#466ece303a81

Boy Genius Report said:
Amazingly, this is all done locally on your device so Apple doesn’t need to send all of your private data to its own servers for analysis like Google does with similar features in the Google Photos app.
http://bgr.com/2016/06/13/best-ios-10-features-iphone-ipad/
So what happens when someone backs up all their photos to iCloud....
The same thing that has happened for the last several years when people have backed up all of their photos to iCloud... nothing.
 
I see real-world challenges that will make this feature hit-or-miss. For one, the average person doesn't have an eclectic collection of experiences: most people repeat the same experiences (vacations, holidays, and celebrations) in the same locations with the same people, so there's little uniqueness between events to form distinct memories. Secondly, people's vocabularies are poor; they won't be able to discern between those facial expressions without a pictogram.

It will be useful for autistic people who cannot recognize facial expressions on their own.

Most people have no trouble with facial expressions, so this feature is a gimmick. That's why I'd rather see canine expression analysis; it would be a highly useful learning tool for dog owners. Even as someone who's had at least one dog for decades, dog emotions are not always clear to me. Or maybe I'm just autistic with respect to canines, lol.

Too bad Apple didn't spend as much effort on improving the stability of iOS and macOS.
 
It will be useful for autistic people who cannot recognize facial expressions on their own.

For most people it is a gimmick. Too bad Apple didn't spend as much effort on improving the stability of iOS and macOS.
Who said they didn't? Are you judging the new OS by its first closed beta?
 