While that's a valid way of looking at things, it's not the only way; far too many people want it and need it.
I think choice is wonderful. However, transparency is necessary. We should also be careful not to confuse our "wants" and "needs."
I'm not sure what's been obscured. Apple's descriptions of how they're doing things seem pretty clear to me. But if you don't trust what Apple says... I can't help you there.
And you clearly missed my point about wants and needs. It's not about a person confusing desire with need. Some people want to be organized. Other people, who are not organized, clearly need to be organized.
I tried it with the "Best of 3 months" setting. It played my trip to PHX, my puppies, family pics, nudes, screenshots of Tinder conversations, and some food and memes. Not exactly what I was expecting my friends to see, lol.
Did some digging over the weekend into this new iOS photo functionality ... (based on published information - I have not installed the iOS 10 beta)
One big question/answer that I did not see in either the keynote or the interview: just how effective can this learning really be? On Google's side, they have extensive access to both photos and data, with server-side processing to really understand the objects in photos and their relationships. Apple doesn't seem to use that, instead relying on some basic data and recognition algorithms like facial mapping.
I'll be really interested to see how effective this functionality is before I agree that Apple has found a better alternative.
I've been disappointed by features like this in the past. I'd rather make a video myself. But I'll give it a try in the public beta and see if it works for me. If it makes it easier for me to find a picture of the Treaty Oak that I took in 1987 without my having to add metadata manually, I'll use it as much as its usefulness deserves.
...
Apple is doing the same type of analysis of photos as Google, but without Apple knowing what photos you have.
...
It doesn't know who people are without some picture that identifies them. It pulls them all together and then gives you the ability to add a name. It does pull location when it is available, but it also knows what common things look like. I have deer and mountain pictures that I imported from a camera that doesn't have location info, and it knows what they are. I agree, however, that is the question: where is Apple getting the data from?
I take a series of photos.
Google: knows me via my account interaction (email, photos, Google Now, Google Search, Google+, social media, etc. Note: my Contacts do not have actual photos). Uses location (if I have it on; I usually do not). Uses algorithms to identify objects and build links/associations. Understands what I like (demographic analysis). I can currently ask it for all kinds of things and get pretty much spot-on results.
Apple: is only going to use the data on my device. If I ask it the same queries I ask Google Photos, assuming it can do the same type of analysis, where is Apple going to get the data to analyze my photos and present me with relevant results? It is likely it can identify common classes of objects (public).
Question: this is the big aspect I do not understand. I keep hearing "it will," but I am seeing no real information on "how" (see the sketch after this post for the general idea).
I can ask Google to show me all photos that contain my mom and dad, my dog, my car, the Atlantic City trip and get them. I can see Apple getting some of these. Maybe.
Color me confused.
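Apple hasn't published the exact pipeline Photos uses, so the following is only a rough sketch of what on-device classification of a single image can look like, written against the Vision framework's built-in classifier (a public API Apple added in later iOS releases, not necessarily what Photos itself runs). The labels(for:) helper name and the 0.3 confidence cutoff are illustrative assumptions; the point is simply that the model ships with the OS and the photo never leaves the device.

```swift
import UIKit
import Vision

// Sketch: return the scene/object labels a built-in, on-device classifier
// assigns to one image. No server call is involved; the model is part of the OS.
func labels(for image: UIImage, minimumConfidence: Float = 0.3) -> [String] {
    guard let cgImage = image.cgImage else { return [] }

    let request = VNClassifyImageRequest()                     // built-in classifier
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])                            // runs locally (GPU/CPU)

    let observations = request.results as? [VNClassificationObservation] ?? []
    return observations
        .filter { $0.confidence >= minimumConfidence }         // drop weak guesses
        .map { $0.identifier }                                 // e.g. "dog", "mountain"
}
```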
Hmm, which Treaty Oak? The one in Jacksonville or the one in Texas? Unless you have location data attached to your photo, it may be difficult for the Photos app to tell it's anything other than a plain old tree. Maybe if other users have photos of the same tree, over time Apple's machine learning would figure out it's a specific tree. I can see this being easier with more iconic objects and scenes.
It doesn't know who people are without some picture that identifies them. It pulls them all together and then gives you the ability to add a name. It does pull location when it is available, but it also knows what common things look like. I have deer and mountain pictures that I imported from a camera that doesn't have location info and it knows what they are.
With Google and Photos, I get the same type of false positives. Both services think some dog pics are cats or bears. It is easy to see how they can make mistakes, but neither service is close to perfect. Google even thinks one of my dog pics is a flower. That being said, it is better than having to tag everything myself, so I like the option.
The Photos app in iOS 10 has been updated with what Apple calls "Siri intelligence," which essentially equates to new deep learning techniques and advanced facial and object recognition algorithms.
Using these tools, Photos is able to scan a user's entire photo library, intelligently detecting people, animals, places, and objects and grouping photos together in a logical way based on those parameters. As can be seen in the video below, this enables powerful searching capabilities, allowing users to search for "cats" to bring up their images of cats, or "mountains" to find all images taken of mountains.
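Apple does not expose the index Photos builds, but as a sketch of how keyword search over a local library can work, here is one way an app could build its own label-to-photo index with PhotoKit, assuming the hypothetical labels(for:) classifier sketched earlier in the thread. Loading every asset synchronously is only to keep the example short; a real app would do this incrementally in the background and would need photo library permission.

```swift
import Photos
import UIKit

// Sketch: map each label ("cat", "mountain", ...) to the identifiers of the
// photos that received that label, so a search is just a dictionary lookup.
func buildSearchIndex() -> [String: [String]] {
    var index: [String: [String]] = [:]
    let manager = PHImageManager.default()

    let options = PHImageRequestOptions()
    options.isSynchronous = true                        // simplification for the sketch

    PHAsset.fetchAssets(with: .image, options: nil).enumerateObjects { asset, _, _ in
        let thumbnail = CGSize(width: 299, height: 299) // a small image is enough to classify
        manager.requestImage(for: asset, targetSize: thumbnail,
                             contentMode: .aspectFill, options: options) { image, _ in
            guard let image = image else { return }
            for label in labels(for: image) {           // hypothetical classifier from above
                index[label, default: []].append(asset.localIdentifier)
            }
        }
    }
    return index
}

// A search for "cats" then reduces to:
// let catPhotoIDs = buildSearchIndex()["cat"] ?? []
```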
New to Photos on iOS is a "People" album, housing all of a user's images featuring people, grouped based on facial recognition, and there's a world map that shows the physical location where each of a user's photos was taken.
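The grouping of faces into named people relies on Apple's private on-device clustering, but the first step, detecting that a face is present at all, has long been available to any app through Core Image. A minimal sketch, just counting faces in a single image:

```swift
import CoreImage
import UIKit

// Sketch: count the faces Core Image's built-in detector finds in one image.
// Grouping those faces into "the same person" is the part Photos does privately.
func faceCount(in image: UIImage) -> Int {
    guard let ciImage = CIImage(image: image) else { return 0 }

    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])

    return detector?.features(in: ciImage).count ?? 0   // one CIFaceFeature per face
}
```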
Perhaps the best new feature in Photos is a "Memories" tab that uses all of the image recognition, date, and location information to aggregate photos based around certain days, vacation trips, family events, and more, so your photos can be revisited on a regular basis. With Memories, there are options to watch quick video montages of photos, which are set to music.
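Apple has not documented how Memories chooses its groupings; as a rough approximation of the date-and-location part, here is a sketch that buckets the library by calendar day using metadata PhotoKit already exposes (creation date and, when present, location). The image-recognition signals Memories also uses would layer on top of this.

```swift
import Photos

// Sketch: group the photo library by the calendar day each photo was taken.
// A Memories-style feature could then pick interesting buckets (trips, events).
func assetsGroupedByDay() -> [Date: [PHAsset]] {
    var groups: [Date: [PHAsset]] = [:]
    let calendar = Calendar.current

    let options = PHFetchOptions()
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]

    PHAsset.fetchAssets(with: .image, options: options).enumerateObjects { asset, _, _ in
        guard let created = asset.creationDate else { return }
        let day = calendar.startOfDay(for: created)       // bucket key: midnight of that day
        groups[day, default: []].append(asset)
        // asset.location (a CLLocation?) could further split a day into separate places.
    }
    return groups
}
```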
Also new in the iOS 10 Photos app are Live Filters that work with Live Photos and new Markup tools for annotating photos.
The new features in Photos are powered by a device's GPU with all learning done on a device-by-device basis to ensure full privacy. Apple has made it clear that it does not see images or image metadata. When using the new Photos features, each device with a photo library will need to scan images independently -- there is no iCloud link yet.
In case you missed them, make sure to check out our seven-minute WWDC 2016 video, which features a quick rundown of all of the new iOS, macOS Sierra, tvOS, and watchOS features Apple introduced this week, and our video highlighting iOS 10's overhauled Lock screen. Stay tuned to MacRumors for more in-depth software videos.
Article Link: See iOS 10's New Photos App in Action