Yes, they do. Large LLMs will not be sustainable, because they will be trained on so much slop that their results will become terrible. And the automation they offer is something a smaller, local LLM could do just as well, maybe even better.
The future, in my mind, will be small, personalised LLMs and AI tools that can communicate with other LLMs to fetch the information they need.
For example, you ask Apple Intelligence for a recipe. Its own LLM does not have it yet, so Apple Intelligence asks another LLM that does contain the recipe and passes the answer back to you.
We will not have large, monolithic LLMs like ChatGPT, but rather smaller, specialised LLMs with specific information. Maybe with a subscription model, so that they can be paid for the information they provide.
Maybe you're doing research for school and need information about a certain period of history. A library with a specialised LLM will have that information, but it is not freely available. You can add that subscription to Apple Intelligence, which can then access the information and store it temporarily for your research.
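The delegation model described above can be sketched in code. This is a hypothetical illustration, not any real Apple Intelligence API: all class and method names are invented. It shows a small "personal" assistant that answers from its own knowledge when it can, and otherwise forwards the query to specialised providers, skipping any paywalled provider the user has not subscribed to, and caching delegated answers temporarily for research.

```python
# Hypothetical sketch of the "small local LLM delegates to specialised
# LLMs" idea. Every name here is invented for illustration; real models
# would exchange text over an API, not look answers up in a dict.

from dataclasses import dataclass, field


@dataclass
class SpecialisedModel:
    """A remote, domain-specific model (e.g. a library's history LLM)."""
    domain: str
    requires_subscription: bool
    knowledge: dict  # query -> answer (stand-in for a real model)

    def answer(self, query: str):
        return self.knowledge.get(query)


@dataclass
class PersonalAssistant:
    """A small local model that routes queries it cannot answer itself."""
    local_knowledge: dict = field(default_factory=dict)
    providers: list = field(default_factory=list)
    subscriptions: set = field(default_factory=set)
    cache: dict = field(default_factory=dict)  # temporary research store

    def ask(self, query: str) -> str:
        # 1. Try the small local model first.
        if query in self.local_knowledge:
            return self.local_knowledge[query]
        # 2. Reuse answers already fetched for this research session.
        if query in self.cache:
            return self.cache[query]
        # 3. Delegate to specialised providers the user can access.
        for provider in self.providers:
            if provider.requires_subscription and provider.domain not in self.subscriptions:
                continue  # paywalled, and the user has no subscription
            answer = provider.answer(query)
            if answer is not None:
                self.cache[query] = answer  # keep temporarily for research
                return answer
        return "No source available for this query."
```

A usage example: without the "history" subscription the paywalled library is skipped; after adding it, the same question is delegated and the answer is cached locally.

```python
library = SpecialisedModel("history", True,
                           {"Treaty of Utrecht?": "Signed in 1713."})
assistant = PersonalAssistant(providers=[library])
assistant.ask("Treaty of Utrecht?")   # blocked: no subscription
assistant.subscriptions.add("history")
assistant.ask("Treaty of Utrecht?")   # delegated, then cached
```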
I do not believe in the LLMs we have now, except maybe Gemini, because Google can use its search engine as a source. ChatGPT will end because it is only a black box with nothing to refer to. Its use case is small, and Apple Intelligence will be able to do everything else in the future.