Discussion in 'iOS 11' started by PeLaNo, Jun 6, 2017.
Is machine learning improved in the Photos app and Siri?
Yes, and developers can use it in their apps too. For example (stolen from the Keynote), if you search for travel destinations in Safari, you may see relevant content in News that pertains to those travel destinations.
Read more: https://developer.apple.com/machine-learning/
Is this why Apple will build their own machine learning chip?
What do you think?
Most likely. Some of the processes that Core ML enables may be resource-intensive and will have to run frequently in the background in order to collect a good amount of data.
There are two types of ML mentioned here that are getting conflated. Being able to use ML in your app is very distinct from the use of search data from one app in another app. The latter feature is Siri knowledge about the user, pulled from the user's data and interactions in first-party Apple apps. It's shared in a private knowledge graph generated on-device and synced across devices with end-to-end encryption. It's Apple's way of combating the cloud-based knowledge graphs that expose private user data.
The Core ML feature, OTOH, just makes it easy for third-party developers to integrate things like machine vision, pattern/speech recognition, and other basic ML capabilities.
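To give a sense of how little code that integration takes, here's a rough sketch of classifying an image with a Core ML model through the Vision framework (iOS 11 APIs). The `MobileNet` model name is a placeholder for whatever .mlmodel file you drop into your project, and the function signature is just illustrative:

```swift
import CoreML
import Vision
import UIKit

// Sketch: run a bundled Core ML image classifier on a UIImage.
// "MobileNet" is a placeholder model -- Xcode generates a class like
// this for any .mlmodel you add to the project.
func classify(_ image: UIImage) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: MobileNet().model) else { return }

    // Vision wraps the Core ML model and handles image scaling/cropping.
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let top = request.results?.first as? VNClassificationObservation else { return }
        print("Top label: \(top.identifier), confidence: \(top.confidence)")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage)
    try? handler.perform([request])
}
```

The heavy lifting (model loading, preprocessing, running on the GPU/Neural Engine where available) is handled by the frameworks, which is the point being made above: the developer mostly just supplies a model and reads back observations.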