While that comes across as "AI is here and now," it's also indicative that it isn't. It seems we keep advancing the use of the term, but it's not really AI. All the "AI is the future" talk needs to be looked at as another form of data mining by Google and others.

Now, as far as separate graphics cards go, the last thing I saw was Apple making their own external cards rather than partnering with Nvidia, because of how they are advancing Metal 3 to be their graphics engine. Supposedly we expect macOS 14 to add to that. The comparison of what Metal 3 provides vs. what it lacks is best seen with the Resident Evil Village playback demos. Still a WIP. As far as Apple Silicon Macs being the same as traditional PC workstations, that is still not known, because we haven't had any technical discussions representing a Mac Pro, just extended conjecture on what might be used.
First of all, I think you are conflating two different things: AI and AGI. An example of an AI application is large language models (LLMs) - things like ChatGPT, built from stacked layers of transformer blocks that form a neural network.
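To make the "stacked transformer layers" part concrete, here is a minimal sketch, assuming PyTorch - the sizes are toy numbers for illustration, nothing like what ChatGPT actually uses:

```python
import torch
import torch.nn as nn

# Toy language model: token embeddings -> stacked transformer layers -> next-token scores.
# All sizes are illustrative; real LLMs are vastly larger and decoder-only.
vocab_size, d_model, n_heads, n_layers = 1000, 128, 4, 2

embed = nn.Embedding(vocab_size, d_model)
layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads, batch_first=True)
transformer = nn.TransformerEncoder(layer, num_layers=n_layers)
lm_head = nn.Linear(d_model, vocab_size)

tokens = torch.randint(0, vocab_size, (1, 16))   # one 16-token sequence
hidden = transformer(embed(tokens))              # shape (1, 16, d_model)
logits = lm_head(hidden)                         # a score for every vocabulary word at each position
print(logits.shape)                              # torch.Size([1, 16, 1000])
```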
Artificial General Intelligence is being developed in parallel, with completely different and much loftier goals: creating, well, general intelligence. And maybe even consciousness.
LLMs may play a role in the development of AGI, as Ben Goertzel suggests, while some people, like Sam Altman (CEO of OpenAI), think LLMs and neural networks alone may lead to AGI - although no one knows for sure.
AI is developing much faster than anyone anticipated, so you're wrong on that one - unless you know better than all the biggest AGI/AI specialists in the field.

LLMs are scraping the internet for training data, of course, but if anything it's Microsoft, not Google - they're the ones heavily invested in OpenAI now.
Since no one really knows what is going on inside the neural network, one can argue whether there are glimpses of actual understanding of the world in GPT-4, or whether it's a purely probabilistic machine with no understanding of the text it's creating. This is debatable. Also, no one knows if AGI can be born purely out of LLM development - though unlikely, it is possible.
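For what "purely probabilistic" means in practice: at every step the model just assigns a probability to each possible next token and one gets sampled. A rough sketch with made-up numbers and a made-up five-word vocabulary (not real model output):

```python
import torch

# Pretend the model has produced scores (logits) for a tiny 5-word vocabulary
# after reading some prompt. Higher score = more likely next word.
vocab = ["fast", "dead", "expensive", "banana", "upgradeable"]
logits = torch.tensor([2.0, 1.5, 2.5, -3.0, 0.5])

probs = torch.softmax(logits, dim=0)          # turn scores into probabilities
next_id = torch.multinomial(probs, 1).item()  # sample one word according to those probabilities
print(dict(zip(vocab, probs.tolist())))
print("next word:", vocab[next_id])
```

Whether that mechanism amounts to "understanding" once it's scaled up to billions of parameters is exactly the open question.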
As for AGI development, it's highly probable that a true AGI will exist within 5 years - which sounds crazy. And a bit scary.
I don't know for sure, but my guess is you didn't dig deep into the subject - I totally recommend it, it's pretty interesting (and, as I said, scary).
As for AI applications in software - well, there are things that require Nvidia chips, and that's it. There's even very cool proprietary Nvidia software for generative image creation, something like Photoshop - you take a few colors and a brush, and you generate an environment/background image in a matter of seconds. They have software perfect for game devs like that, and these tools are being developed with Nvidia chips in mind - although they could obviously run on different hardware.
Software developers will go that route - there will be more and more applications, plug-ins and tools using AI to help with all sorts of tasks: increasing the resolution of images, motion picture tools, TONS of AI tools for 3D graphic designers. All of these run mostly on GPUs - and Apple's M chips can't compete with any modern dedicated graphics card. So either Apple does something about it, or we can say goodbye to a real "pro" Apple desktop. Even without all the AI tools that will come - how can you create a "pro" workstation that can't be used for any serious 3D application?
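For context on the "runs mostly on GPUs" point: many of these tools sit on frameworks like PyTorch, which will use an Nvidia card via CUDA, Apple's GPU via Metal (the MPS backend), or fall back to the CPU - so the gap is in performance and Nvidia-only features, not whether the code runs at all. A minimal sketch of how that device selection typically looks (the resize stands in for a real AI upscaler, which would run a neural network instead):

```python
import torch
import torch.nn.functional as F

# Pick the best available accelerator: Nvidia GPU (CUDA), then Apple GPU (Metal/MPS), then CPU.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif torch.backends.mps.is_available():
    device = torch.device("mps")
else:
    device = torch.device("cpu")
print("running on:", device)

# Toy stand-in for an image-upscaling workload: a plain 4x bilinear resize of a dummy image.
image = torch.rand(1, 3, 256, 256, device=device)
upscaled = F.interpolate(image, scale_factor=4, mode="bilinear", align_corners=False)
print(upscaled.shape)  # torch.Size([1, 3, 1024, 1024])
```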

"Our Pro machines encode 4k and 8k in a blink of an eye, but please don't use any 3D software"

It's pathetic.