Is it likely that Siri is running the LSM framework under the hood?

Discussion in 'Mac Programming' started by GorillaPaws, Oct 4, 2011.

  1. GorillaPaws, Oct 4, 2011
    Last edited: Oct 4, 2011

    GorillaPaws macrumors 6502a

    GorillaPaws

    Joined:
    Oct 26, 2003
    Location:
    Richmond, VA
    #1
    Obviously this is speculative (this is a rumor site), but is it likely that Siri is using the LSM framework (or some variant thereof) under the hood? If so, what are the implications of this? It is my understanding that as the number of discrete categories increases, the reliability of the technology decreases.

    Does this mean that we are unlikely to see this technology (or something related) available to 3rd-party OS X developers for a very long time, or could we be seeing APIs in a year or two?

    Other than accessibility benefits, what would be some of the clever ways that a 3rd-party OS X app could take advantage of access to such technology? Would we see a revolution in interface design for 3rd-party apps on the desktop? I'm trying to envision how an accounting app like QuickBooks might work in the future (e.g. "what is the total spent in June on vehicle expenses").
     
  2. subsonix, Oct 4, 2011
    Last edited: Oct 4, 2011

    subsonix macrumors 68040

    Joined:
    Feb 2, 2008
    #2
    I don't think so. Apple acquired Siri Inc., who made a commercial version of it in 2010. From Wikipedia:

    Edit: here is a bit more about CALO I found, also from Wikipedia:

     
  3. GorillaPaws thread starter macrumors 6502a

    GorillaPaws

    Joined:
    Oct 26, 2003
    Location:
    Richmond, VA
    #3
    @subsonix thanks for the quotes.

    If such a hypothetical API were ever released, how do you think it would work? Could there be Siri key values that you could associate your objects with? So in the above example, I would think the API could detect uses of NSDate objects and infer the appropriate relationships there, but for the ExpenseCategory objects, perhaps there would be a key that corresponded to a discrete category group that Siri could understand as a type of collection? Maybe it would be integrated with Core Data so that it could make inferences based on relationships between objects and the various settings for properties (delete rules, inverse relationships, limits on values, etc.).
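
    Purely as a thought experiment (none of this exists; every class and method name below is invented), declaring that kind of vocabulary might look something like this, assuming a Core Data model with Expense and ExpenseCategory entities:

    // Entirely hypothetical sketch -- no such framework exists. It only
    // illustrates the "Siri key values" idea above. SIRIVocabulary and its
    // register... methods are invented names; the Expense/ExpenseCategory
    // entities are assumptions about the app's Core Data model.

    // Declare that ExpenseCategory.name is a discrete category group, so a
    // phrase like "vehicle expenses" could be matched against it.
    [[SIRIVocabulary sharedVocabulary] registerCategoryGroup:@"expense category"
                                                   forEntity:@"ExpenseCategory"
                                                     keyPath:@"name"];

    // Declare that Expense.date is the relevant date attribute, so "in June"
    // could be turned into a date-range predicate automatically.
    [[SIRIVocabulary sharedVocabulary] registerDateKeyPath:@"date"
                                                 forEntity:@"Expense"];

    // A spoken query like "what is the total spent in June on vehicle
    // expenses" might then come back to the app as an NSPredicate plus an
    // aggregate to evaluate against its Core Data store.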

    I just think it's an interesting thought experiment to imagine how these things could possibly work in the future.
     
  4. jiminaus macrumors 65816

    jiminaus

    Joined:
    Dec 16, 2010
    Location:
    Sydney
    #4
    I think Core Data would be too low-level. It might work better against an exposed AppleScript dictionary.
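
    To make that concrete (just a sketch of the existing Cocoa scripting machinery, nothing Siri-specific; the "total expenses" command and its arguments are made up), an app that exposes a verb in its .sdef scripting dictionary handles it with an NSScriptCommand subclass, which a higher-level service could in principle drive:

    // Sketch only: handler for a hypothetical "total expenses" command
    // declared in the app's .sdef. NSScriptCommand itself is a real
    // Cocoa scripting class; the command and argument names are invented.
    #import <Foundation/Foundation.h>

    @interface TotalExpensesCommand : NSScriptCommand
    @end

    @implementation TotalExpensesCommand

    - (id)performDefaultImplementation
    {
        // Arguments declared in the sdef show up in -evaluatedArguments.
        NSDictionary *args = [self evaluatedArguments];
        NSString *category = [args objectForKey:@"category"]; // e.g. "vehicle expenses"
        NSDate *from = [args objectForKey:@"from"];
        NSDate *to = [args objectForKey:@"to"];

        NSLog(@"Totaling %@ between %@ and %@", category, from, to);

        // ...query the app's data for matching expenses and sum them...
        double total = 0.0; // placeholder

        return [NSNumber numberWithDouble:total];
    }

    @end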
     
  5. admanimal macrumors 68040

    Joined:
    Apr 22, 2005
    #5
    My understanding is that the recognition and natural language processing take place remotely, which could pose some serious challenges for allowing 3rd-party integration. This would mean that Apple would need to incorporate the information necessary for your app to work into their main recognition model. It's hard to say how difficult this would be without knowing exactly what kind of models they are using, but in general it is not an easy thing to do, and in many cases it can require rebuilding the entire model. This doesn't even consider the fact that tens of thousands of apps would probably want to use the technology, which would make acceptably accurate performance almost impossible.

    I guess the way to look at this in a more positive light is that, since the stuff is running in the cloud, the necessary information and compute power to update the model do exist. It probably couldn't be done on the phone itself.
     
  6. GorillaPaws thread starter macrumors 6502a

    GorillaPaws

    Joined:
    Oct 26, 2003
    Location:
    Richmond, VA
    #6
    I was assuming that this would work locally on desktops/laptops, but maybe there are limitations that would make this unrealistic (huge data sets, perhaps)?
     
  7. subsonix macrumors 68040

    Joined:
    Feb 2, 2008
    #7
    @GorillaPaws It's indeed interesting, but I'm afraid I don't know much about it; it's stuff I only recently found out about. I don't have much knowledge of NLP, AI, machine learning, etc., beyond a conceptual understanding of it.
     
  8. JoshDC macrumors regular

    Joined:
    Apr 8, 2009
    #8
    I don't think LSM is really the right technology to use, and from the WWDC sessions on it, it sounded like LSM was developed at Apple.

    There's another API, introduced in Lion (and now in iOS 5), that's much more appropriate. NSLinguisticTagger can do part-of-speech tagging, essentially "where are the nouns, adjectives, verbs, etc. in this string?". This should make identifying the actions in a sentence much easier by highlighting the most important words. For example, this sentence tags only nouns:

    And the intent is clear from just those two words. I assume it's used to some extent in Siri to "clean up" conversational sentences.

    I wrote a simple application a while ago to understand the API a little better, and you may find it interesting to see how conversational sentences can/can't retain their meaning with limited parts of speech.
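
    If you want to play with it yourself, the basic call looks roughly like this (a minimal sketch; the sample sentence is just an example and error handling is omitted):

    // Minimal NSLinguisticTagger example: print only the nouns in a sentence.
    // Requires Lion / iOS 5.
    #import <Foundation/Foundation.h>

    int main(int argc, char *argv[])
    {
        @autoreleasepool {
            NSString *sentence = @"What is the total spent in June on vehicle expenses?";

            NSLinguisticTagger *tagger = [[NSLinguisticTagger alloc]
                initWithTagSchemes:[NSArray arrayWithObject:NSLinguisticTagSchemeLexicalClass]
                           options:0];
            [tagger setString:sentence];

            [tagger enumerateTagsInRange:NSMakeRange(0, [sentence length])
                                  scheme:NSLinguisticTagSchemeLexicalClass
                                 options:NSLinguisticTaggerOmitWhitespace | NSLinguisticTaggerOmitPunctuation
                              usingBlock:^(NSString *tag, NSRange tokenRange, NSRange sentenceRange, BOOL *stop) {
                // Keep only the nouns; other lexical classes are ignored.
                if ([tag isEqualToString:NSLinguisticTagNoun]) {
                    NSLog(@"Noun: %@", [sentence substringWithRange:tokenRange]);
                }
            }];
        }
        return 0;
    }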
     
