No, but if you cooked a pie using an apple pie recipe, I might be tempted to think you baked an apple pie.
FYI, I never claimed Siri was owned by Nuance. FYI, Siri used Nuance technology. FYI, Apple is licensing that same technology from Nuance. The only thing they have in common is that both use Nuance technology.
As for your description of Nuance's app, I am guessing you have never used Dragon Go. It does far more than "basically just sent you a text back of what you said and was glorified text messaging". There is a reason Apple is going to use Nuance tech to support all of this.
The iPhone absolutely does multitasking. It is UNIX, FFS. Userland apps are limited to single-tasking, for good reason. An OS process, such as Assistant, would already be able to run in the background, just like Phone.app and Mail.app. Lots of OS-level tasks are running in the background at all times, many with real-time access to your contacts, calendar, etc.
The assistant itself running in the background would consume more resources. Being location-aware at all times would require more resources. The real processor-intensive stuff will be the voice recognition. Language analysis, intent analysis, pattern analysis, etc. are all very CPU-heavy. That's why the Nuance API, which Apple is going to use, offloads the work to the network.
Good bet on developers being given access. None of that sounds particularly processor-intensive. If they are going to be doing much of the voice analysis locally, that would require lots of CPU and memory. That is exactly why Nuance offloads it to the network: there is orders of magnitude more compute power in a data centre than on the device. But that also introduces delay, which may be enough reason for Apple to do much of it locally.
And that's really my question: how much are they going to try to do locally that would require an A5 over an A4? The A5 will be a big improvement over the A4, no doubt, but the level of voice recognition they are going to be throwing at it would be best served from a data centre for accuracy.
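Just to put the offload idea in code: this is a rough sketch of what sending speech to a server for recognition could look like, not Nuance's actual API. The endpoint URL, headers, and response format here are made up for illustration; the point is that only audio goes up and only text comes back, with the heavy lifting done in the data centre.

```swift
import Foundation

// Hypothetical response shape from a speech-to-text service.
struct TranscriptResponse: Decodable {
    let transcript: String
}

func transcribe(audioData: Data,
                completion: @escaping (Result<String, Error>) -> Void) {
    // Hypothetical recognition endpoint; the CPU-heavy analysis runs server-side.
    var request = URLRequest(url: URL(string: "https://speech.example.com/v1/recognize")!)
    request.httpMethod = "POST"
    request.setValue("audio/wav", forHTTPHeaderField: "Content-Type")

    URLSession.shared.uploadTask(with: request, from: audioData) { data, _, error in
        if let error = error {
            completion(.failure(error))
            return
        }
        do {
            let result = try JSONDecoder().decode(TranscriptResponse.self, from: data ?? Data())
            completion(.success(result.transcript)) // only the text comes back over the network
        } catch {
            completion(.failure(error))
        }
    }.resume()
}
```

The trade-off is exactly the one mentioned above: the round trip adds delay, which is the argument for doing some of it on the device instead.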
The way you wrote it, you had Siri lumped in with Nuance. I just wanted to clarify.
Nuance's desktop software is amazing, but I've not found the iPhone app anything to write home about. And yes, it is basically a text message: the server transcribes your speech and texts it back to you. The desktop version has a recognition engine built in. They work slightly differently by nature; a desktop also has a lot more resources and storage capacity.
Back to the multi-tasking discussion, I understand you better this time.
However, it is not fair to say the iPhone has true multitasking. It's fair to say it's capable of doing so, but not that it does so. Every phone, right down to basic, featureless phones, has had to run all those same processes you mentioned ever since cell phones did anything beyond making a call. What we want to call apps really aren't... they're functions of the phone, like the Phone app. Every phone has the phone feature... or it wouldn't be a phone. Every phone has had to search out signals and run background processes for decades now.
Also, the Mail app is not running in the background "all the time." If you enable push notifications, it will check for new messages at intervals and alert you. "Pushing" data is very different from running all the time. Actually, it's pushing "notifications," not data. The app doesn't update again until you re-open it. In most cases, it's like getting a text message that tells you to open your app. If it were true multitasking, when I opened, say, Words With Friends, the app would already be showing me that it's now my turn. Instead, the app refreshes. Sometimes there can be several minutes of delay before the notification even comes, when the app would have been able to update almost instantly if it were actually running.
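A rough sketch of that "text message that tells you to open your app" pattern, using the modern iOS delegate method (so a present-day illustration, not how a 2011 app was written); refreshGameState() is a hypothetical helper:

```swift
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {

    // Called when a push arrives; the payload only signals that something changed.
    func application(_ application: UIApplication,
                     didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                     fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        // The app still has to go fetch the actual content itself -
        // nothing was running or updating in the background before this.
        refreshGameState {
            completionHandler(.newData)
        }
    }

    private func refreshGameState(completion: @escaping () -> Void) {
        // ...fetch the latest turn/messages from the server, then:
        completion()
    }
}
```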
In the same vein, the Assistant app is probably not running in the background all the time either, nor is the new "Lists" feature. If you have the app set to remind you to buy eggs the next time you go to the grocery store, it's not feverishly checking to see where you are. It probably updates itself when it changes cell towers or notices you've left the Wi-Fi network. I'm really curious to see this feature in action, because it seems too good to be true.
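For what it's worth, iOS does expose a way to do exactly that without the app running constantly: the system monitors a geofence and only wakes the app when the boundary is crossed. A minimal sketch (the coordinates and identifier are made up):

```swift
import CoreLocation

class GroceryReminder: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.requestAlwaysAuthorization()

        // Hypothetical grocery-store location; 100 m radius around it.
        let store = CLCircularRegion(
            center: CLLocationCoordinate2D(latitude: 37.3318, longitude: -122.0312),
            radius: 100,
            identifier: "grocery-store")
        store.notifyOnEntry = true
        store.notifyOnExit = false

        // The OS does the watching (cell towers, Wi-Fi, GPS as needed), not the app.
        manager.startMonitoring(for: store)
    }

    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        // Fire the "buy eggs" reminder here.
    }
}
```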
It will be interesting to see how much of the speech recognition engine is on board versus rendered server-side. (They did build some pretty big data centers.) If it's not baked in, then the feature is useless if you're without a data connection, and that doesn't sound very Apple-like. It would also eat a big chunk of your device's storage space, I'd think.
Regardless of how they implement that, if it is being done on the phone, it's definitely a good call for a memory boost. Even if it's not, while iOS home-grown apps are integrated (contacts in the Phone app, etc.), they're just designed to access one another's databases. You don't have all these apps running in the background at once. (No need, either.) The Phone app isn't running the Contacts app simultaneously, just sharing a database. They're sort of an unneeded redundancy. Once that info is indexed, it just sits there, much like Spotlight. (Why do we have a Contacts app, actually? It's baked into Mail, Phone, and on its own.)
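That "shared database" point in code form, roughly: Phone and Mail don't launch the Contacts app, they just query the same contacts store through a framework. This uses the current Contacts framework as the illustration (the 2011-era equivalent was AddressBook):

```swift
import Contacts

// Look up phone numbers for a given name from the shared contacts store.
// Any app with permission reads the same database; no other app has to be running.
func lookUpPhoneNumbers(matching name: String) throws -> [String] {
    let store = CNContactStore()
    let request = CNContactFetchRequest(keysToFetch: [
        CNContactGivenNameKey as CNKeyDescriptor,
        CNContactPhoneNumbersKey as CNKeyDescriptor
    ])
    request.predicate = CNContact.predicateForContacts(matchingName: name)

    var numbers: [String] = []
    try store.enumerateContacts(with: request) { contact, _ in
        numbers.append(contentsOf: contact.phoneNumbers.map { $0.value.stringValue })
    }
    return numbers
}
```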
You're assuming that because the iPhone handles processes the way it does now, it would continue to do the same with Assistant, which is logical. It's simply updating and accessing a universal database like all the other home-grown apps. It's when you get into third-party support that things change, in theory.
Using Facebook as an example again, while it can access your contacts or camera roll, the FB app still has to run at some point even though the Assistant app is technically the one doing the work. Or take the OpenTable app: I want Assistant to make dinner reservations. It's got to do all the stuff it does on its own with home-grown apps, but now it has to run processes through another app. It's simple to add it to my calendar as an event, but it still has to use OpenTable to book the reservation. This is why I really think third-party access would be coming, because it's where I can see the need for more memory. If it was OS-specific, then maybe not.
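Purely hypothetical, but third-party access could look something like this: Assistant hands a structured request to a handler registered by another app, rather than launching and driving the whole app. None of these types are a real Apple API; they're just to make the idea concrete:

```swift
import Foundation

// Hypothetical structured request that Assistant would build from what you said.
struct ReservationRequest {
    let restaurant: String
    let partySize: Int
    let date: Date
}

// Hypothetical hook a third-party app would implement for Assistant to call.
protocol AssistantActionHandler {
    func handle(_ request: ReservationRequest, completion: @escaping (Bool) -> Void)
}

// An OpenTable-style app would register something like this.
struct OpenTableHandler: AssistantActionHandler {
    func handle(_ request: ReservationRequest, completion: @escaping (Bool) -> Void) {
        // Call the booking service, then report success so Assistant
        // can drop the reservation into the calendar as an event.
        completion(true)
    }
}
```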
Currently, iOS suspends and un-suspends apps. We've all seen what happens when you switch back into an app like Facebook or anything with dynamic content: that app updates itself and checks for new posts or changes, etc. If Assistant had to suspend and un-suspend apps back and forth while it processed something, it would really slow the process down each time it switched. If it could temporarily enable multiple processes across multiple apps, it would be a faster and smoother process, but it would require more memory to pull off.
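That refresh-on-resume behavior is easy to see in code; a typical view controller does something like this (a generic sketch, not any particular app's implementation), which is exactly where the visible delay comes from when you switch back in:

```swift
import UIKit

class FeedViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Refresh whenever the app comes back to the foreground;
        // nothing was updating while it sat suspended.
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(refreshFeed),
            name: UIApplication.willEnterForegroundNotification,
            object: nil)
    }

    @objc private func refreshFeed() {
        // Re-fetch posts/changes here.
    }
}
```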
I personally think the iPhone 4 should have had 1 GB of memory. At a certain point, the hardware puts limitations on app developers. Your phone also gets to a point where it can't run things anymore. If I buy a phone and a year later it can't run an app without choking, I have an issue. Phones are disposable, so I don't expect my device to be future-proofed for 5 years or anything ridiculous.
In any case, does the phone need that much memory all the time? Doubtful. Does it probably need it to make a feature work well and not choke on itself? That would be my guess. It also makes the device more capable and lets developers put out richer apps. That's a win-win.
Again, this is all fun speculation and fun conversation. This rumor could be a big bag of burning poo. It might not be coming with iOS 5, and it might not exclude the iPhone 4. (It better not