
GorillaPaws

macrumors 6502a
Original poster
Oct 26, 2003
932
8
Richmond, VA
Obviously this is speculative (this is a rumor site), but is it likely that Siri is using the LSM framework (or some variant thereof) under the hood? If so, what are the implications? It's my understanding that as the number of discrete categories increases, the reliability of the technology decreases.

Does this mean that we are unlikely to see this technology (or something related) available to 3rd party OS X developers for a very long time, or could we be seeing APIs in a year or two?

Other than accessibility benefits, what would be some of the clever ways a 3rd party OS X app could take advantage of access to such technology? Would we see a revolution in interface design for 3rd party apps on the desktop? I'm trying to envision how an accounting app like QuickBooks might work in the future (e.g. "what is the total spent in June on vehicle expenses").
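
To make that last example a bit more concrete: however the sentence gets recognized, it would presumably have to boil down to an ordinary structured query in the end. A rough sketch of what "total spent in June on vehicle expenses" might reduce to (the Expense entity and its amount/date/category attributes are purely made up for illustration):

Code:
import CoreData

// Hypothetical sketch only: the "Expense" entity and its "amount", "date" and
// "category" attributes are invented. The point is that the spoken question
// would ultimately have to become a structured fetch like this one.
func totalSpent(onCategory category: String,
                from start: Date, to end: Date,
                in context: NSManagedObjectContext) throws -> NSNumber? {
    let request = NSFetchRequest<NSDictionary>(entityName: "Expense")
    request.resultType = .dictionaryResultType
    request.predicate = NSPredicate(format: "category == %@ AND date >= %@ AND date < %@",
                                    category, start as NSDate, end as NSDate)

    // Ask Core Data to sum the "amount" attribute instead of returning every row.
    let sum = NSExpressionDescription()
    sum.name = "total"
    sum.expression = NSExpression(forFunction: "sum:",
                                  arguments: [NSExpression(forKeyPath: "amount")])
    sum.expressionResultType = .decimalAttributeType
    request.propertiesToFetch = [sum]

    return try context.fetch(request).first?["total"] as? NSNumber
}

// e.g. totalSpent(onCategory: "vehicle expenses", from: june1, to: july1, in: context)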
 

subsonix

macrumors 68040
Feb 2, 2008
3,551
79
I don't think so. In 2010 Apple acquired Siri, Inc., who had already made a commercial version of it. From Wikipedia:

In 2007, SRI spun off Siri, Inc. Siri was born from SRI's work on the DARPA-funded CALO project, described by SRI as the largest artificial intelligence project ever launched.[15] Siri was acquired by Apple in 2010.

Edit: here is a bit more about CALO I found, also from Wikipedia:

CALO was an Artificial Intelligence project funded by the Defense Advanced Research Projects Agency (DARPA)[1] under its Personalized Assistant that Learns (PAL) program. Its five-year contract brought together 300+ researchers from 25 of the top university and commercial research institutions, with the goal of building a new generation of cognitive assistants that can reason, learn from experience, be told what to do, explain what they are doing, reflect on their experience, and respond robustly to surprise. SRI International was the lead integrator responsible for coordinating the effort to produce an assistant that can live with and learn from its users, provide value to them, and then pass a yearly evaluation that measures how well the system has learned to do its job.
 

GorillaPaws

macrumors 6502a
Original poster
Oct 26, 2003
932
8
Richmond, VA
@subsonix thanks for the quotes.

If such a hypothetical API were ever released, how do you think it would work? Could there be Siri key values that you could associate with your objects? In the above example, I would think the API could detect uses of NSDate objects and infer the appropriate relationships there, but for the ExpenseCategory objects perhaps there would be a key corresponding to a discrete category group that Siri could understand as a type of collection. Maybe it would be integrated with Core Data so that it could make inferences based on relationships between objects and the various settings for properties (delete rules, inverse relationships, limits on values, etc.).

I just think it's an interesting thought experiment to imagine how these things could possibly work in the future.
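
Just to make the thought experiment concrete, something vaguely like the sketch below is what I have in mind. To be clear, none of this exists in any Apple SDK; SiriQueryable, SiriVocabulary, registerCategory and the rest are invented names:

Code:
import Foundation

// Purely imaginary API, just to make the thought experiment concrete.
protocol SiriQueryable {
    static var siriCategoryKey: String { get }   // spoken name for the collection
    static var siriDateKeyPath: String { get }   // key path to a Date property
    static var siriAmountKeyPath: String { get }  // key path to a summable value
}

struct ExpenseCategory: SiriQueryable {
    static let siriCategoryKey = "expense category"
    static let siriDateKeyPath = "date"
    static let siriAmountKeyPath = "amount"
}

// The app registers its discrete category names up front, so "vehicle expenses"
// could be recognized as a member of a known collection rather than free text.
final class SiriVocabulary {
    static let shared = SiriVocabulary()
    private var members: [String: [String]] = [:]

    func registerCategory(_ key: String, members names: [String]) {
        members[key] = names
    }
}

// SiriVocabulary.shared.registerCategory(ExpenseCategory.siriCategoryKey,
//                                        members: ["vehicle expenses", "meals", "travel"])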
 

jiminaus

macrumors 65816
Dec 16, 2010
1,449
1
Sydney
@subsonix thanks for the quotes.

If such a hypothetical API were ever released, how do you think it would work? Could there be Siri key values that you could associate with your objects? In the above example, I would think the API could detect uses of NSDate objects and infer the appropriate relationships there, but for the ExpenseCategory objects perhaps there would be a key corresponding to a discrete category group that Siri could understand. Maybe it would be integrated with Core Data so that it could make inferences based on relationships between objects and the various settings for properties (delete rules, inverse relationships, limits on values, etc.).

I just think it's an interesting thought experiment to imagine how these things could possibly work in the future.

I think Core Data would be too low-level. It might work better against an exposed AppleScript dictionary.
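
For example, if the app already exposed a scriptable "total expenses" command, the Cocoa-side handler could look something like this sketch. NSScriptCommand and evaluatedArguments are the real Cocoa scripting hooks; the command name, its category/month arguments, and the matching .sdef entry (not shown) are hypothetical:

Code:
import Foundation

// Sketch of the AppleScript-dictionary route: a Siri-like layer could drive a
// scriptable command the same way Script Editor does.
class TotalExpensesCommand: NSScriptCommand {
    override func performDefaultImplementation() -> Any? {
        let args = evaluatedArguments ?? [:]
        let category = args["category"] as? String ?? ""
        let month = args["month"] as? String ?? ""

        // A real app would query its model layer here; return a placeholder.
        NSLog("Totalling %@ for %@", category, month)
        return 0.0
    }
}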
 

admanimal

macrumors 68040
Apr 22, 2005
3,531
2
My understanding is that the recognition and natural language processing takes place remotely, which could pose some serious challenges to allowing 3rd party integration. This would mean that Apple would need to incorporate the information necessary for your app to work in their main recognition model. It's hard to say how difficult this would be without knowing exactly what kind of models they are using, but in general it is not an easy thing to do and in many cases can require rebuilding the entire model. This doesn't even consider the fact that tens of thousands of apps would probably want to use the technology, which would make acceptably accurate performance almost impossible.

I guess the way to look at this in a more positive light is that, since the stuff is running in the cloud, the necessary information and compute power to update the model do exist. It probably couldn't be done on the phone itself.
 

GorillaPaws

macrumors 6502a
Original poster
Oct 26, 2003
932
8
Richmond, VA
My understanding is that the recognition takes place remotely, which could pose some serious challenges to allowing 3rd party integration.

I was assuming that this would work locally on desktops/laptops, but maybe there are limitations that would make that unrealistic (huge data sets, for example)?
 

subsonix

macrumors 68040
Feb 2, 2008
3,551
79
@GorillaPaws It's indeed interesting. I don't know much about it, I'm afraid; it's stuff I only recently found out about. I don't have much knowledge of NLP, AI, machine learning, etc., beyond a conceptual understanding.
 

JoshDC

macrumors regular
Apr 8, 2009
115
0
I don't think LSM is really the right technology to use, and from the WWDC sessions on it, it sounded like LSM was developed at Apple.

There's another API, introduced in Lion (and now in iOS 5), that's much more appropriate. NSLinguisticTagger can do part-of-speech tagging, essentially answering "where are the nouns, adjectives, verbs, etc. in this string?". This should make identifying the actions in a sentence much easier by highlighting the most important words. For example, tagging only the nouns in this sentence:

How's the weather in York?

yields just "weather" and "York", and the intent is clear from those two words alone. I assume it's used to some extent in Siri to "clean up" conversational sentences.

I wrote a simple application a while ago to understand the API a little better, and you may find it interesting to see how conversational sentences can/can't retain their meaning with limited parts of speech.
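
For anyone who doesn't want to download the attachment, a minimal sketch of noun-only tagging with NSLinguisticTagger (using the lexical-class scheme) looks roughly like this:

Code:
import Foundation

let sentence = "How's the weather in York?"

let tagger = NSLinguisticTagger(tagSchemes: [.lexicalClass], options: 0)
tagger.string = sentence

let range = NSRange(location: 0, length: sentence.utf16.count)
let options: NSLinguisticTagger.Options = [.omitWhitespace, .omitPunctuation, .omitOther]

// Walk the sentence word by word and keep only the nouns.
tagger.enumerateTags(in: range, unit: .word, scheme: .lexicalClass, options: options) { tag, tokenRange, _ in
    if tag == .noun {
        print((sentence as NSString).substring(with: tokenRange))  // "weather", "York"
    }
}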
 

Attachments

  • Linguistic Tagger.app.zip
    32.1 KB · Views: 81