Any information about these Live Photos-related features in iOS 10 would be helpful (a rough PhotoKit sketch follows the list):

  1. Live Photos stabilization
  2. Live Photos editing
  3. Live Filters for Live Photos
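For context, here is a minimal Swift sketch of what PhotoKit already exposes for Live Photos today (fetching the newest one and playing it back). This is illustrative only and assumes photo-library authorization; it is not Apple's new stabilization, editing, or filter API, which is what the three questions above are about.

```swift
import Photos
import PhotosUI
import UIKit

// Sketch: fetch the newest Live Photo in the library and play it in a PHLivePhotoView.
// Assumes photo-library access has already been granted.
func showNewestLivePhoto(in view: UIView) {
    let options = PHFetchOptions()
    // Only assets whose media subtypes include the Live Photo flag.
    options.predicate = NSPredicate(format: "(mediaSubtypes & %d) != 0",
                                    Int(PHAssetMediaSubtype.photoLive.rawValue))
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    guard let asset = PHAsset.fetchAssets(with: .image, options: options).firstObject else { return }

    let livePhotoView = PHLivePhotoView(frame: view.bounds)
    view.addSubview(livePhotoView)

    PHImageManager.default().requestLivePhoto(for: asset,
                                              targetSize: view.bounds.size,
                                              contentMode: .aspectFit,
                                              options: nil) { livePhoto, _ in
        livePhotoView.livePhoto = livePhoto
        livePhotoView.startPlayback(with: .full)
    }
}
```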
 
Did some digging over the weekend into this new iOS photo functionality ... (going by what's been published - I have not installed the iOS 10 beta)
One big question I did not see answered in either the keynote or the interview: just how effective can this learning really be? On Google's side they have extensive access to both photos and data, with server-side processing to really understand objects in photos and their relationships. Apple doesn't seem to use that, instead relying on some basic data and recognition algorithms like facial mapping.

I'll want to see how effective this functionality is before I agree that Apple has found a better alternative.
 
I think choice is wonderful. However, transparency is necessary. We should also be careful not to confuse our "wants" and "needs." :)

I'm not sure what's been obscured. Apple's descriptions of how they're doing things seem pretty clear to me. But if you don't trust what Apple says... I can't help you there.

And you clearly missed my point about wants and needs. It's not about a person confusing desire with need. Some people want to be organized. Other people, who are not organized, clearly need to be organized.
 
I’m surprised none of the reviews have addressed this yet… Would someone please check and see whether you can disable the image analysis? I appreciate that they do it locally, but I don’t want them to do it at all.

Thanks.
 
I'm not sure what's been obscured. Apple's descriptions of how they're doing things seem pretty clear to me. But if you don't trust what Apple says... I can't help you there.

And you clearly missed my point about wants and needs. It's not about a person confusing desire with need. Some people want to be organized. Other people, who are not organized, clearly need to be organized.

I have good news. You can help! When Apple is not forthcoming, we can all speak out and bring matters to their attention. For example: Apple touts privacy (which I appreciate), but sometimes sets defaults that aren't consistent with privacy. We can speak up in this forum and others. :)

I'm not sure that I missed your point, as much as I disagree with it. I view needs and wants in a different way. I want to be more organized. I need water. :)
 
I tried it with the "Best of 3 months" setting. It played my trip to PHX, my puppies, family pics, nudes, screenshots of conversations on Tinder, and some food and memes. Not exactly what I was expecting my friends to see lol.

It's not magic, so it can only use the images you supplied it with. Sounds like everything that showed up was taken within 3 months, thus it worked correctly. As it analyzes more objects in your photos, it will create more categories for you.
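That lines up with how a date-windowed fetch works in PhotoKit. A minimal sketch, assuming the "Best of 3 months" memory simply restricts itself to assets created in that window (the actual selection and ranking logic is Apple's and not public):

```swift
import Photos

// Sketch: fetch every photo taken in the last three months, newest first.
// This only illustrates the date window; which of these end up in a Memory is Apple's call.
func photosFromLastThreeMonths() -> PHFetchResult<PHAsset> {
    let cutoff = Calendar.current.date(byAdding: .month, value: -3, to: Date())!
    let options = PHFetchOptions()
    options.predicate = NSPredicate(format: "creationDate >= %@", cutoff as NSDate)
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    return PHAsset.fetchAssets(with: .image, options: options)
}
```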
Did some digging over the weekend into this new iOS photo functionality ... (going by what's been published - I have not installed the iOS 10 beta)
One big question I did not see answered in either the keynote or the interview: just how effective can this learning really be? On Google's side they have extensive access to both photos and data, with server-side processing to really understand objects in photos and their relationships. Apple doesn't seem to use that, instead relying on some basic data and recognition algorithms like facial mapping.

I'll want to see how effective this functionality is before I agree that Apple has found a better alternative.

Apple is doing the same type of analysis on photos as Google, but without Apple knowing what photos you have.
I've been disappointed by features like this in the past. I'd rather make a video myself. But I'll give it a try in the public beta and see if it works for me. If it makes it easier for me to find a picture of the Treaty Oak that I took in 1987 without my having to add metadata manually, I'll use it as much as its usefulness deserves.


Hmm, which Treaty Oak? The one in Jacksonville or the one in Texas? Unless you have location data attached to your photo, it may be difficult for the Photos app to tell it's anything other than a plain old tree. Maybe if other users have photos of the same tree, then over time Apple's machine learning could figure out that it's a specific tree. I can see this being easier with more iconic objects and scenes.
 
...

Apple is doing the same type of analysis on photos as Google, but without Apple knowing what photos you have.

...

I agree; however, that is the question: where is Apple getting the data?

I take a series of photos.
Google: knows me via my account interactions (email, photos, GNow, Google Search, Google+, social media, etc. - note: my Contacts do not have actual photos). Uses location (if I have it on - usually I do not). Uses algorithms to identify objects and build links / associations. Understands what I like (demographic analysis). I can currently ask it for all kinds of things and get pretty much spot-on results.

Apple: is only going to use the data on my device. If I ask it the same queries I do in Google Photos, assuming it can do the same type of analysis, where is Apple going to get the data to analyze my photos and present me with relevant results? It is likely it can identify common classes of objects (public ones).

Question: this is the big aspect I do not understand. I keep hearing "it will," but I am seeing no real information on "how."
I can ask Google to show me all photos that contain my mom and dad, my dog, my car, or the Atlantic City trip, and get them. I can see Apple getting some of these. Maybe.

Color me confused. :confused:
 
I agree; however, that is the question: where is Apple getting the data?

I take a series of photos.
Google: knows me via my account interactions (email, photos, GNow, Google Search, Google+, social media, etc. - note: my Contacts do not have actual photos). Uses location (if I have it on - usually I do not). Uses algorithms to identify objects and build links / associations. Understands what I like (demographic analysis). I can currently ask it for all kinds of things and get pretty much spot-on results.

Apple: is only going to use the data on my device. If I ask it the same queries I do in Google Photos, assuming it can do the same type of analysis, where is Apple going to get the data to analyze my photos and present me with relevant results? It is likely it can identify common classes of objects (public ones).

Question: this is the big aspect I do not understand. I keep hearing "it will," but I am seeing no real information on "how."
I can ask Google to show me all photos that contain my mom and dad, my dog, my car, or the Atlantic City trip, and get them. I can see Apple getting some of these. Maybe.

Color me confused. :confused:
It doesn't know who people are without some picture that identifies them. It pulls them all together and then gives you the ability to add a name. It does pull location when it is available, but it also knows what common things look like. I have deer and mountain pictures that I imported from a camera that doesn't have location info and it knows what they are.

With Google and Photos, I get the same type of false positives. Both services think some dog pics are cats or bears. It is easy to see how they can make mistakes, but neither service is close to perfect. Google even thinks one of my dog pics is a flower. That being said, it is better than having to tag everything myself, so I like the option.
 
Hmm, which Treaty Oak? The one in Jacksonville or the one in Texas? Unless you have location data attached to your photo, it may be difficult for the Photos app to tell it's anything other than a plain old tree. Maybe if other users have photos of the same tree, then over time Apple's machine learning could figure out that it's a specific tree. I can see this being easier with more iconic objects and scenes.
Texas:
[Attached photo: Marco Treaty Oak 1988 1.jpg]

Doesn't look like this any more, though.

Jacksonville's Treaty Oak is very impressive, BTW. I haven't seen it in person. I love majestic trees.
 
It doesn't know who people are without some picture that identifies them. It pulls them all together and then gives you the ability to add a name. It does pull location when it is available, but it also knows what common things look like. I have deer and mountain pictures that I imported from a camera that doesn't have location info and it knows what they are.

With Google and Photos, I get the same type of false positives. Both services think some dog pics are cats or bears. It is easy to see how they can make mistakes, but neither service is close to perfect. Google even thinks one of my dog pics is a flower. That being said, it is better than having to tag everything myself, so I like the option.

Good to know. I generally get great results using GPhotos. After reading all the hype about Apple's new photo functionality, your info is appreciated.
 
Mine seems to be stuck getting Memories set up. I've had it plugged in and locked for about 8 hours and it still says "Scanning..."

The face detection scan also ran, but that one tells you how many photos it has left (it took about 3 hours on my 6S Plus).
 
Will any new photos cause a rescan of all your photos?




The Photos app in iOS 10 has been updated with what Apple calls "Siri intelligence," which essentially equates to new deep learning techniques and advanced facial and object recognition algorithms.

Using these tools, Photos is able to scan a user's entire photo library, intelligently detecting people, animals, places, and objects and grouping photos together in a logical way based on those parameters. As can be seen in the video below, this enables powerful searching capabilities, allowing users to search for "cats" to bring up their images of cats, or "mountains" to find all images taken of mountains.
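The Photos search index itself is not exposed to third-party apps in iOS 10, so as an illustration only, here is the same on-device classification idea using Apple's Vision framework from later iOS releases (not the iOS 10 Photos implementation); the confidence threshold is a placeholder:

```swift
import Vision
import UIKit

// Illustration only: shows on-device image classification in the same spirit as
// the Photos scanning described above. The image never leaves the device.
func classify(_ image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { return completion([]) }
    let request = VNClassifyImageRequest { request, _ in
        let labels = (request.results as? [VNClassificationObservation])?
            .filter { $0.confidence > 0.5 }   // keep only reasonably confident labels
            .map { $0.identifier } ?? []
        completion(labels)
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```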


New to Photos on iOS is a "People" album, housing all of a user's images featuring people, grouped based on facial recognition, and there's a world map that shows the physical location where each of a user's photos was taken.
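The People grouping itself isn't queryable by third-party code in iOS 10, but the face detection half of it has long been possible on-device. A minimal Core Image sketch of that kind of local analysis (detection only, no identity grouping):

```swift
import CoreImage
import UIKit

// Sketch: on-device face detection with Core Image (available well before iOS 10).
// Returns the bounding box of each detected face; grouping faces into "People" is Apple's own layer.
func faceRectangles(in image: UIImage) -> [CGRect] {
    guard let ciImage = CIImage(image: image) else { return [] }
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
    let features = detector?.features(in: ciImage) ?? []
    return features.compactMap { ($0 as? CIFaceFeature)?.bounds }
}
```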

Perhaps the best new feature in Photos is a "Memories" tab that uses all of the image recognition, date, and location information to aggregate photos based around certain days, vacation trips, family events, and more, so your photos can be revisited on a regular basis. With Memories, there are options to watch quick video montages of photos, which are set to music.

Also new in the iOS 10 Photos app are Live Filters that work with Live Photos and new Markup tools for annotating photos.
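Apple hasn't detailed the Live Filters pipeline, but they behave like the Core Image adjustments used on stills. A minimal, hedged Core Image example applying one such filter to a single frame (the sepia filter and intensity are placeholders, and a real Live Photo also has video frames to process):

```swift
import CoreImage
import UIKit

// Sketch: apply a built-in Core Image filter to a single image.
// A Live Filter would need to apply the same adjustment to the still and every video frame.
func sepia(_ image: UIImage, intensity: Float = 0.8) -> UIImage? {
    guard let input = CIImage(image: image),
          let filter = CIFilter(name: "CISepiaTone") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(intensity, forKey: kCIInputIntensityKey)
    guard let output = filter.outputImage else { return nil }
    let context = CIContext()
    guard let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}
```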

The new features in Photos are powered by a device's GPU with all learning done on a device-by-device basis to ensure full privacy. Apple has made it clear that it does not see images or image metadata. When using the new Photos features, each device with a photo library will need to scan images independently -- there is no iCloud link yet.

In case you missed them, make sure to check out our seven-minute WWDC 2016 video, which features a quick rundown of all of the new iOS, macOS Sierra, tvOS, and watchOS features Apple introduced this week, and our video highlighting iOS 10's overhauled Lock screen. Stay tuned to MacRumors for more in-depth software videos.

Article Link: See iOS 10's New Photos App in Action
 
I have a soft spot for Live Photos. Could be big if Apple improves upon these.
 