This is a disgusting invasion of my privacy, I can't even take a picture without Google spying on me...oh wait, it's Apple...never mind, it's magical and a brilliant innovation.

The biggest flaw in your unintelligent rant is that Apple doesn't track you or use your information and location to spam you with ads to make a dollar. Would you like me to point out the huge differences between Apple's privacy policies and Google's?
 
I'm still having trouble understanding why you would want an album with only people smiling. Isn't the point of an album to capture a range of emotions and experiences?

You'd think an algorithm wouldn't be needed to filter for smiling persons in photos, especially photos taken of personal acquaintances. However, these days, smiling is rare. (WARNING: OLD MAN SPEAKING) People mostly take selfies and like to appear sexy or unamused. And when you photograph others who aren't posing, they usually look bored. In fact, they're likely looking at their phones.
 
I don't care about facial expressions or objects (although great for you that need it). I need Photos to detect DUPLICATES. They promised it server side, but it has never happened.
This is a disgusting invasion of my privacy, I can't even take a picture without Google spying on me...oh wait, it's Apple...never mind, it's magical and a brilliant innovation.

YOU choose to upload your photos and it is an invasion of YOUR privacy? How does that work? And what part is Apple invading if it detects a smile or a ball on a photo? Isn't it more an invasion of privacy to know WHERE a photo was made (a feature that has been present for years)? Did you actually think this through?
 
According to the list, Photos won't search for anything more X-rated than a "cockatoo". I certainly understand why Apple wants to keep search terms family friendly, but by doing so Photos is useless at what I imagine is a significant use case for a lot of people. If the internet has taught me anything, it's that a lot of selfies are anything but G-rated.

There's also a lack of regional terms for some items, as well as trademarked names. This sort of technology is only really good if it can search for anything a human can think of.

For example, how about searching for all vacation photos containing Mickey Mouse? Apple could exploit their close relationship with Disney at least.
 
Siri, analyze this:
Zoolander.jpg
 
iMood: the next great innovation from Apple.

Probably lots of folks will be entertained by this feature, and that's cool.

As for myself, I'd rather see Apple re-focus on killer new Mac hardware. I know, I know, the software and hardware bits of Apple are separate...but still.
 
I highly doubt that Apple does it locally; if anything, that wouldn't be very practical for updating the engine to keep improving the recognition speed and accuracy.

It's done in the cloud, just like Google.

Care to provide evidence to back your statement?

I saw others already backed me up, but you can hear it from Craig directly at 1m 33sec:


I mean, I prefer Google Photos because I love the cloud backup and browser accessibility, but I'm giving props to Apple for this software.
 
Man this tech is pretty impressive. Can't wait to hear my iMac fans scream as it works its way through the 150GB of photos and vids!
My iMac happily played XCOM 2 at acceptable FPS while performing the machine learning (as well as indexing Spotlight and other post-install optimisations). It took less than an hour to analyse my 14.5K-photo library. YMMV.
 
Man this tech is pretty impressive. Can't wait to hear my iMac fans scream as it works its way through the 150GB of photos and vids!

Yup. But fortunately ML classification is an extremely cheap process relative to ML training.
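A back-of-the-envelope sketch of why (toy layer sizes and counts invented for illustration, nothing to do with Apple's actual network): counting multiply-accumulates, classifying a whole library is one forward pass per image, while training repeats roughly a forward-plus-backward pass over the data for many epochs.

```python
# Back-of-the-envelope cost comparison: classifying a photo library
# (one forward pass per image) vs. training the classifier (forward +
# backward passes, repeated for many epochs). All numbers are invented.

def forward_macs(layers):
    """Multiply-accumulate count for one forward pass through dense layers."""
    return sum(n_in * n_out for n_in, n_out in zip(layers, layers[1:]))

layers = [4096, 1024, 256, 10]   # toy network: features -> hidden -> classes
images = 30_000                  # photos in the library
epochs = 50                      # passes over a labeled training set

classify_cost = images * forward_macs(layers)
# Backprop roughly doubles per-image work, so figure ~3x a forward pass
# per image per epoch.
train_cost = epochs * images * 3 * forward_macs(layers)

print(f"classify library: {classify_cost:.2e} MACs")
print(f"train classifier: {train_cost:.2e} MACs")
print(f"training is ~{train_cost // classify_cost}x the classification cost")
```

With these toy numbers training works out to 150x the classification cost, and real gaps are typically far larger, since training sets and epoch counts dwarf a single inference pass. That's what makes shipping a pre-trained model that only runs inference on-device plausible.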
For me, Photos has been the biggest disappointment on the Mac out of all the recent changes Apple has made. I've sent them feedback countless times about peculiar issues that haven't gotten fixed. My biggest gripe is that Photos won't play nice with other apps, even Apple's own. There's no way to drag an image from Photos into another app. And when I open Final Cut and click on the Photos tab, I get the bizarre error message, "Open Photos to see your photos". So there's no longer a way to import iPhone photos and videos into FC without opening the Photos package on the hard drive and navigating to the buried folder I want. It's insane. Plus, if I want to copy and paste content directly from Photos into a different folder, for some odd reason it takes an extremely long time (whereas I can copy from the packaged folder nearly instantly). Photos is still a hot mess. Facial recognition and automatic organization gimmicks aren't going to make me like it more at this point.

Drag photos to the desktop first; it's a quirk of macOS rather than Photos itself.

Your gripe is that it says open Photos to see photos? Really? That's your issue?
 
Drag photos to the desktop first; it's a quirk of macOS rather than Photos itself.

No, it has nothing to do with OS X. iPhoto used to allow me to drag and drop directly from within the app.

Your gripe is that it says open Photos to see photos? Really? That's your issue?
Yes it is. FCPX has a tab specifically for importing media from iOS devices. It used to work seamlessly until Photos broke that functionality. Apple is now making me jump through hoops to accomplish something that used to be easy.
 
2010 iMac?
2014, but still ;)
No, it has nothing to do with OS X. iPhoto used to allow me to drag and drop directly from within the app.


Yes it is. FCPX has a tab specifically for importing media from iOS devices. It used to work seamlessly until Photos broke that functionality. Apple is now making me jump through hoops to accomplish something that used to be easy.
I agree. Photos has fundamentally broken drag and drop.
 
One of Apple's first party apps that's getting a makeover in iOS 10 and macOS Sierra is Photos, bringing intensive new facial recognition and "Siri intelligence" features to the picture accumulating app. Over the weekend, a Reddit user discovered a few lines of code within the framework of Apple's beta of the macOS Sierra Photos app, possibly detailing both the specific facial expressions that the app recognizes and every single searchable object users can find in both Sierra and iOS 10.

In a more detailed Medium post, Redditor vista980622 explained that Photos will be able to "recognize and distinguish" 7 total facial expressions after the app scans a user's library and forms a "faceprint" for each individual in a picture. The expressions include greedy, disgust, neutral, scream, smiling, surprise, and suspicious.

[Image: Memories in the Photos app on macOS Sierra]

One of the biggest new additions to Photos in iOS 10 and Sierra is "Memories," a new tab that aggregates a user's pictures into logical, organized folders based on the app's new facial and object recognition abilities. vista980622 discovered every category of Memories as well, whose names are said to be "automatically generated using metadata from the photos and tags from analysis of photos."

The category names are as follows:
Users who tested the first beta of iOS 10 last week mentioned the impressive search parameters of Photos, which intelligently detects and tags every picture for the scenes, objects, and landmarks captured within. In total, there are 432 of these items that can be searched for by the user, including everyday phrases like "Apartment" and "Birthday Cake," and somewhat obscure inquiries like "Diadem" and "Gastropod."
The full list of 432 searchable objects and scenes can be found in the Medium post shared on Reddit. Since it was discovered within the first beta of iOS 10 and macOS Sierra, the comprehensive list is far from confirmed as accurate. All the same, many of its terms do match up with another Redditor's successfully executed searches, as well as the words they claimed failed to generate any concrete results.

Previous Coverage: See iOS 10's New Photos App in Action

Update: The list of searchable scenes and objects appears to account for around 4,432 items, as opposed to the 432 mentioned in this article and the original Medium post. It's unclear whether vista980622 made a typo in the blog post, or if groups of scenes and items -- like Art, Artistic Creation, Artistic Creations, Artistries, Artistry, Arts, Artwork, Artworks -- were grouped together to result in a smaller number.

Update 2: The author of the Medium post has acknowledged that the originally quoted 432 scenes and objects was a typo and the correct number should indeed be 4,432.

Article Link: New Photos App Detects 4,432 Total Searchable Objects and 7 Facial Expressions [Updated]
Facial recognition isn't a high priority for me; squashing bugs related to RAW files is.
 
Apple doesn't see your photo; only your phone sees your photo.


So you're telling me this awesome new feature is only available for photos that are physically on my phone, but once they're uploaded to iCloud, I can no longer utilize it? So to take advantage of photo recognition, I should keep my photos on my phone at all times, filling up my precious storage space?
 
Machines now outperform humans on the top image recognition benchmarks, so yeah, I'd likely trust the machine to save me the time to scan 30,000 images.

If there were a particularly good image that stood out from the rest, presumably I would have recognized it and flagged it when I imported it.
They can beat us in some areas and not in others. Google Photos makes plenty of mistakes that I'd never make: combining pictures of Greece and Las Vegas and saying they're from the same place, or calling things birthday parties that very often weren't. And what is the main subject of the picture: is it a picture of a clock, or is there simply a clock in the picture? I'm sure their test could be structured to allow the human to easily win. I'm not saying computers won't get there, but not yet.
 
"Hey Siri, show me videos of people lying." should be an interesting query someday.
AGREED...that's exactly what I was thinking when I first saw this post...and I think we're not that far from that point.

Now I'm thinking about how that's going to impact our current society, where we all lie at some point.

Even small things like: "Son, eat your food and I'll take you to the play land this weekend." o_O
 
They can beat us in some areas and not in others. Google Photos makes plenty of mistakes that I'd never make: combining pictures of Greece and Las Vegas and saying they're from the same place, or calling things birthday parties that very often weren't. And what is the main subject of the picture: is it a picture of a clock, or is there simply a clock in the picture? I'm sure their test could be structured to allow the human to easily win. I'm not saying computers won't get there, but not yet.
Yeah, that's why I made the point that machines do well on benchmarks. ImageNet is a pretty comprehensive one as these things go, but that doesn't mean there aren't blind spots.

In particular, the examples you give are probably at least partly relying on your memory to add context to the search results and the machine wouldn't have access to that from the image data alone. Give it access to geotag data and your calendar history and it would get closer.

Then, of course, there's the inverse of the question: how many times would the machine find matches that a human would miss, particularly at speed? We're not perfect either, but we find our flaws more acceptable than flaws in machines.

My point though was just that computer vision is already a huge benefit for searching through a large library of images and filtering them down. It doesn't need 100% reliability to be useful any more than Spotlight does with natural language. If it can be reasonably accurate with more false positives than missed detections but run a few orders of magnitude faster than I can manually, it's a win.
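As a rough illustration of that tradeoff (toy numbers, not modeled on Apple's or anyone's actual classifier): even a tagger that leans toward false positives turns a 30,000-photo manual scan into a hand-review of a much smaller shortlist.

```python
import random

random.seed(42)

# Toy library: 30,000 photos, roughly 1% of which actually contain the
# thing we're searching for. True means "really a match".
library = [random.random() < 0.01 for _ in range(30_000)]

def machine_tagger(is_match):
    """Imperfect classifier: high recall but a 5% false-positive rate.
    Both rates are invented for illustration."""
    if is_match:
        return random.random() < 0.95   # real matches almost always found
    return random.random() < 0.05       # occasionally tags junk

# The machine's shortlist: it misses a few real hits and lets some
# junk through, but it filters out the vast majority of the library.
candidates = [photo for photo in library if machine_tagger(photo)]

print(f"shortlist size: {len(candidates)} of {len(library)}")
```

The shortlist is dominated by false positives, yet skimming it by eye takes minutes instead of the hours a full manual pass would: exactly the "more false positives than missed detections, but orders of magnitude faster" win described above.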
 
Yeah, that's why I made the point that machines do well on benchmarks. ImageNet is a pretty comprehensive one as these things go, but that doesn't mean there aren't blind spots.

In particular, the examples you give are probably at least partly relying on your memory to add context to the search results and the machine wouldn't have access to that from the image data alone. Give it access to geotag data and your calendar history and it would get closer.

Then, of course, there's the inverse of the question: how many times would the machine find matches that a human would miss, particularly at speed? We're not perfect either, but we find our flaws more acceptable than flaws in machines.

My point though was just that computer vision is already a huge benefit for searching through a large library of images and filtering them down. It doesn't need 100% reliability to be useful any more than Spotlight does with natural language. If it can be reasonably accurate with more false positives than missed detections but run a few orders of magnitude faster than I can manually, it's a win.

I agree - this will be very useful - I'm getting tired of manually organizing my pictures. I'm anticipating a huge group complaining about the mistakes, which will happen, and, as you said, it really doesn't need to be 100% to be useful.
 
AGREED...that's exactly what I was thinking when I first saw this post...and I think we're not that far from that point.

Now I'm thinking about how that's going to impact our current society, where we all lie at some point.

Even small things like: "Son, eat your food and I'll take you to the play land this weekend." o_O

Yeah, it will be a bigger deal when that can happen in real-time just with an app open. But while recording video, I'll just try to talk as little as possible, lol. What's going to suck is all the videos that already exist before that point. I mean, I don't often lie, but like you said there are so many little white lies that everyone uses to get through the day to avoid offending people, etc. If we were all completely honest we'd just be screaming "Shut up, I don't care about your stupid pets and their diseased feet!" at work and that wouldn't be good for our professional development, especially if it's our boss. But it's not like a boss would likely record casual conversations to see if we were lying about trivial things. I know when people at work ask how my weekend went and I complain about a bunch of crap related to our moving houses right now that they don't really care that much about all the stupid details but I rant anyway. We all rant to each other because it makes us feel better. But eventually that sort of tech will be available on wearables (even tiny things like contact lenses) and like you said, that could really change societal dynamics.
 