I doubt that. The Neural Engine has been part of Apple's ARM chips since the A11 Bionic (9/12/2017). When you consider longer text processing taking advantage of that silicon across multiple OSes, a newer iPhone isn't the only basis for using it. It's already used for Face ID, Animoji, and other machine learning tasks.

Yeah... all of a sudden every Apple chip before the A17 Pro will become incapable of running Siri version 2. They'll give some new name to a tiny area of the chip and make it a requirement for running Siri. And just as Google said "AI" 140 times, Siri will get an hour-long exclusive presentation.
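For what it's worth, that same Neural Engine has been exposed to third-party apps through Core ML for years, so the hardware in older chips is hardly locked away. A minimal Swift sketch of asking Core ML to use it (the model path here is hypothetical, just for illustration):

```swift
import CoreML
import Foundation

// Minimal sketch: any Core ML model can be asked to run on the Neural Engine.
// Core ML decides the actual dispatch and falls back to GPU/CPU as needed.
// "SomeModel.mlmodelc" is a placeholder for a compiled model, not a real asset.
let config = MLModelConfiguration()
config.computeUnits = .all  // allow Neural Engine, GPU, and CPU
// On iOS 16+, .cpuAndNeuralEngine can be used to skip the GPU entirely.

let modelURL = URL(fileURLWithPath: "SomeModel.mlmodelc")
do {
    let model = try MLModel(contentsOf: modelURL, configuration: config)
    print("Loaded: \(model.modelDescription)")
} catch {
    print("Could not load model: \(error)")
}
```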
Reference this older article:

The iPhone 15 Opts for Intuitive AI, Not Generative AI
Apple ignored the tech industry's obsession with generative AI at the new iPhone launch, offering subtler AI features that make everyday tasks like photography and phone calls better.
A new voice-isolation feature for the iPhone 15, for example, uses machine learning to recognize and home in on the sound of your voice, quieting background noise on phone calls. As usual for iPhone launches, yesterday’s event spent ample time on the power of the new phone’s camera and image-enhancing software. Those features lean on AI too, including automatic detection of people, dogs, or cats in a photo frame to collect depth information to help turn any photo into a portrait after the fact.
iOS 18 will just take this further than iOS 17, with more intuitive AI that can be directed at virtual-assistant-style tasks, which is how Siri will work in iOS 18.

Additional AI-powered services are also coming to newer iPhone models via the new iOS 17 operating system, due out next week. They include automated transcription of voicemails, so a person can see who's calling before picking up a phone call, and more extensive predictive text recommendations from the iPhone keyboard. Neither is as flashy as a know-it-all chatbot. But by making life easier, they just might convince people to spend more time with their phones, pushing up usage of Apple's services.