This article is absolutely incorrect.
Nuance technology is a front-end to Siri, not a backbone. A backbone implies a deeper connection to a given structure, and that's not how Siri is designed.
Vlingo, Siri, etc. all use Nuance Recognizer technology; the NLU engine within Recognizer is what allows these products to perform "intelligently" with speech recognition. Recognizer understands the user and returns an "intent" to the calling application (i.e. Siri), with all the data nicely parsed. Siri merely needs to submit that data to its backend services to perform the action, whether it's search, calendar updates, etc.
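To make the division of labor concrete, here is a minimal sketch of that hand-off. None of these names come from Nuance's actual SDK; `recognize` and `dispatch` are hypothetical stand-ins that only illustrate the shape of the flow: the recognizer/NLU layer returns a parsed intent, and the calling app does plain procedural routing to its backend services.

```python
def recognize(utterance: str) -> dict:
    """Stand-in for the recognizer/NLU engine (the hard part):
    understand the utterance and return a parsed intent."""
    # Toy keyword logic -- a real NLU engine does far more.
    if "meeting" in utterance:
        return {"intent": "calendar.create",
                "slots": {"title": "meeting", "time": "3pm"}}
    return {"intent": "web.search", "slots": {"query": utterance}}

def dispatch(intent: dict) -> str:
    """Stand-in for the calling app (e.g. Siri): route the parsed
    intent to the right backend service -- simple procedural code."""
    handlers = {
        "calendar.create":
            lambda s: f"created event '{s['title']}' at {s['time']}",
        "web.search":
            lambda s: f"searched for '{s['query']}'",
    }
    return handlers[intent["intent"]](intent["slots"])

# The app never parses speech itself; it just consumes the intent.
print(dispatch(recognize("set up a meeting at 3pm")))
```

Note that all the linguistic difficulty lives inside `recognize`; once the intent arrives pre-parsed, `dispatch` is a trivial lookup table, which is the point the comment above is making.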
We also don't know how much professional-services work Nuance did for Apple; for all we know, the entire grammar set, tuning, and other significant services could have been implemented by Nuance.
In short, Recognizer performs roughly 60 to 75% of the work (listening, parsing, and returning the intent), which is the difficult part; the rest is relatively "simple" procedural programming.