I realize what's being discussed here is on a whole different level, but Adobe has been adding more and more tools they describe as "Powered by AI" that really are phenomenal without doing anything crazy.
As an example, not too long ago Lightroom replaced the noise reduction sliders with a new "Denoise" function that, again, they claim is AI-powered. The old noise reduction had some parameters to play with, but it was basically a high-tech Gaussian blur and always lost detail (to the point that I rarely used it, or would do so with a very light touch). The new one does an amazing job of preserving detail, and the only setting is a single 0-100% slider for the strength of the reduction; I find the default 50% a great general-purpose setting.
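To be concrete about what I mean by "high-tech Gaussian blur": a toy sketch along these lines (purely my own illustration, not Adobe's actual implementation) is basically blurring the image and blending the result back in, with a single strength knob. Detail inevitably gets smeared along with the noise, which is exactly what the new ML-based Denoise avoids.

```python
# Toy blur-and-blend noise reduction (my own sketch, nothing to do with
# Lightroom's actual code): strength in [0, 1] controls both the blur
# radius and how much of the blurred copy is mixed back in.
import numpy as np
from scipy.ndimage import gaussian_filter

def naive_denoise(image: np.ndarray, strength: float) -> np.ndarray:
    """image: H x W x 3 float array; returns the blurred/blended result."""
    sigma = 0.5 + 2.5 * strength                       # heavier blur at higher strength
    blurred = gaussian_filter(image, sigma=(sigma, sigma, 0))  # spatial blur per channel
    return (1.0 - strength) * image + strength * blurred       # blend toward the blur
```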
It sort of amazed me the first time I tried it. The only downside is that it's computationally heavy. My 2019 iMac that I use for photo editing handles it decently, especially with an eGPU (RX580; the internal GPU is an RX570). I also have a 2015 15" MBP with only an integrated GPU that I sometimes use for editing; my first attempt there took 12 minutes on a 40 MP image from my Fuji X-T5. That dropped to ~45 seconds once I managed to get an eGPU working on that machine (not an easy job, since it doesn't officially support one), which actually surprised me.