Its recommendations probably wouldn't work for me, even if I wanted to use it.

My top song (pretty much) was listened to 3,275 times last year - 25,228 minutes in total. The rest barely hit 100 plays at most.
o_O (laughs at self)

 
I built a full Apple Music playlist conversationally - not just “songs like X,” but sequencing, mood, era-bridging. Went back and forth refining the order, swapped tracks, talked through why something worked or didn’t, and ended up with a playlist that feels genuinely intentional instead of algorithmic.
 
Or Apple could have a decent way to discover music in their app.
This should just work from Apple Music. This is 2026, ffs.
iOS doesn’t have basic features like image collages, and Apple took several years before offering the most basic password manager. No reason to make the effort when they can have someone else do it, especially where 30% is taken.
 
Ok... the marketing department/agency at it again:
Let's revive old functions as new by slapping the label "AI" on them (because everyone is doing it).

Really, there's nothing new here. There have been personal suggestions and automatic playlists in iTunes for two decades, and Apple Music for a decade. I've had an Apple Music subscription and have been using automatic playlists since then, and even more so once I got HomePod minis.

In the past I found that even the "shuffle" mode in iTunes 10 wasn't fully random. Remember the Party Shuffle mode of iTunes? It was later renamed iTunes DJ, and now AutoMix. Party Shuffle seemed to follow BPM, genre, style, how often tracks were played... back in the day, when I had the day off, I used to let iTunes play in this mode all day to a set of speakers on an AirPort Express.
 
How about using it with HomePod mini?
It would be awesome. But I don’t think Apple will think about it.
 
*presses voice command button on steering wheel*

ChatGPT, create me a nice 1980s Dark Wave playlist and start playing in Apple Music.

— I can’t do that while you’re in the car.

OK super useless then.
 
Will be fun to see what ChatGPT recommends. Glad that privacy is respected. Think I will check it out.
 
I tried it, but after it makes the playlist, ALL the songs get added to my library! Maybe that's an Apple Music limitation, but a playlist might have a bunch of songs I don't like if I'm just asking for a genre.
 
Or Apple could have a decent way to discover music in their app.

Y'all just don't get it. AI is very, very new and in its infancy. The AI capability that ChatGPT and similar tools offer isn't ready yet to be built directly into apps like Apple Music. Maybe in the future. Tying them together is how we do it for now.
 
I'm always amazed how different I am on stuff like music.

I absolutely hate random playlists of stuff I've never heard, which ends up being a new chore of skipping songs all the time.
 
Can it make a playlist for running for example, with specific BPM and specific genre?
 
*presses voice command button on steering wheel*

ChatGPT, create me a nice 1980s Dark Wave playlist and start playing in Apple Music.

— I can’t do that while you’re in the car.

OK super useless then.

hehe...at first glance I read the line as "— I can’t do that while I'm driving your car."
 
So it’s worth boiling the oceans for this very slight convenience? Seriously, all this AI stuff seems to do is introduce minuscule amounts of “improvement” that saves very little time at exceptionally high electricity costs and fresh water use.

I get using machine learning to solve complex scientific or engineering challenges, and I get using it to assist in certain menial tasks, but unless you can do it on-device, without consuming vast resources, I want nothing to do with it.

It depends largely on the model that's being used. I asked grok (ducking) to compare energy usage of web queries, inference/usage of frontier models, "conventional ML", and such, and here is what it reported:

Conventional Google query: 40–300 mWh
Frontier LLM inference: 240–340 mWh
Voice assistant (Siri, Alexa, Google Assistant): 0.5–5 mWh
Traditional ML: 1–100 mWh
Efficient/smaller LLM: 10–200 mWh

Training the models--which is going to happen anyway--is measured in GWh.

Grok's summary:

Real-World Context

  • A simple LLM query (~0.3 Wh) is like running a microwave for ~1 second or a 10W LED bulb for ~2 minutes.
  • 1,000 LLM queries (~0.3 kWh) ≈ 1 hour of HD video streaming (~0.1–0.5 kWh).
  • But at scale (billions of queries/day), LLM inference dominates AI energy use — often 80–90% of total lifecycle.
Bottom line (early 2026): Frontier LLM inference is 3–10× more energy-intensive than a web search and vastly more than voice assistants or traditional ML, but efficiency gains are closing the gap fast for optimized/smaller models. The real impact comes from massive usage volume, not per-query costs.
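The bullet-point comparisons above are just unit conversions, and they can be sanity-checked. A minimal sketch, using the thread's own ballpark figures plus an assumed ~1,100 W microwave and a 10 W LED bulb (those wattages are assumptions, not from the quoted summary):

```python
# Sanity-check the per-query energy comparisons quoted above.
# All query figures are the thread's ballpark numbers, not measurements.

WH_PER_LLM_QUERY = 0.3   # frontier LLM inference, ~300 mWh per query
MICROWAVE_W = 1100       # assumed typical microwave power draw
LED_BULB_W = 10          # assumed LED bulb power draw

# Seconds of microwave use equal in energy to one LLM query (~1 s):
microwave_s = WH_PER_LLM_QUERY * 3600 / MICROWAVE_W

# Minutes a 10 W LED runs on the same energy (~2 min):
led_min = WH_PER_LLM_QUERY * 60 / LED_BULB_W

# 1,000 queries in kWh, vs ~0.1-0.5 kWh for an hour of HD streaming:
thousand_queries_kwh = 1000 * WH_PER_LLM_QUERY / 1000

print(f"1 query = {microwave_s:.1f} s of microwave, "
      f"{led_min:.1f} min of LED; "
      f"1,000 queries = {thousand_queries_kwh:.1f} kWh")
```

The numbers line up with the summary: roughly one second of microwave time or about two minutes of LED light per query, and 0.3 kWh per thousand queries, which is indeed in the same range as an hour of HD streaming.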
 