Long awaited? I thought everyone canceled Spotify by now.

Let’s establish a fact first: Spotify is the music streaming platform with the largest number of subscribers by a large margin.

Taking that into consideration, you are either ignorant of Spotify’s market share, trying to crack a joke, or quite possibly attempting to elicit a reaction by making a deliberately provocative statement (there is a word for that).

Which one is it?
 
The problem is that Spotify never listens to their subscribers. A simple, basic feature like “play next” has been requested for years and they simply ignore it. Instead, they added useless options like “go to queue” or “hide” high in the menu, where you might accidentally tap them. It’s embarrassing. They have these voting lists… where people have been raising the same issues for years, and no one seems to look into them and fix them… Apple Music is by far superior.
 

Attachments: IMG_9948.jpeg, IMG_9947.jpeg
Do people actually use Dolby Atmos? Wow
I listen to Apple Music on a variety of devices and IMHO most tracks encoded with Dolby Atmos sound substantially better on all of them, to varying degrees. The improved quality is most noticeable to me in two cases: at home using the AM app on an ATV 4K connected to a Bose 900 7.1.2 surround setup, and on the road using the Rivian AM app with spatial audio enabled in my R1S. Of course, YMMV depending on where and how you listen as well as what you’re listening to.
 
They were the first streaming service I can recall outside of internet radio services. My first payment to them was in January 2009.

I've switched between various services through the years, but these days I have access to both Apple Music (launched mid-2015) and Spotify:

The Spotify app is massively superior both on my mobile devices and the desktop, to the point that I probably use it 80% of the time. Apple Music is superior when the music has good Dolby Atmos support, and on the Apple Watch when used alone, e.g. on runs.

The difference in sound quality between the highest Spotify encoding and lossless is impossible to hear, even on good gear - like my Sennheiser HD800s and my B&W CM9 setup - and even if you know what to listen for. It's been years since blind tests could reliably find a difference, when audio levels are matched and normalization is off. Matching audio levels is critical, since louder otherwise sounds better.
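To make the level-matching point concrete, here's a minimal sketch (my own illustration, not from any streaming app) of matching the RMS level of two decoded clips before an A/B comparison, assuming both are already float PCM arrays:

```python
import numpy as np

def match_rms(reference: np.ndarray, test: np.ndarray) -> np.ndarray:
    """Scale `test` so its RMS level equals that of `reference`.

    Inputs are float PCM arrays (samples in -1.0..1.0). Matching
    levels before A/B listening matters because the louder clip
    otherwise tends to be judged as "better".
    """
    rms_ref = np.sqrt(np.mean(reference ** 2))
    rms_test = np.sqrt(np.mean(test ** 2))
    return test * (rms_ref / rms_test)
```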
i think this might depend on your hearing a lot.

I've noticed HD Tracks seem to be more open and warmer/less tiring.

Over the years I've tried a lot of different MP3 bit rates (even down to 56 kbps to fit heaps of tracks on one CD) and while 256 and 320 sound a lot better, lossless FLAC at CD or higher resolution usually sounds superior.

On bluetooth speakers I get the same feeling when using LDAC codec over SBC or AAC.
I wish Apple would add LDAC support - it's the main reason I keep an Android phone around to stream music to some Sony speakers... thankfully I can use Apple Music on the Android device ;)
 
44.1kHz isn’t high enough resolution.
44.1 kHz is more than twice the frequency that someone with perfect hearing can hear. It is a sufficient sampling rate to encode audio frequencies at the maximum range of human hearing (20 kHz) with headroom for a low pass filter. It is the same frequency used by CDs.

Bearing in mind that higher sampling frequencies require more storage and higher data rates, I’d be interested to know how high you need it to be, and why?
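For anyone who wants to see the arithmetic, here's a minimal sketch of the Nyquist argument in Python (numbers only, nothing service-specific):

```python
import numpy as np

fs = 44_100        # CD sampling rate in Hz
f_max = 20_000     # upper limit of human hearing in Hz

# The sampling theorem: any signal containing only frequencies
# below fs/2 is captured unambiguously by the samples.
nyquist = fs / 2                 # 22,050 Hz
headroom = nyquist - f_max       # ~2 kHz left for the anti-aliasing low-pass filter
print(nyquist, headroom)         # 22050.0 2050.0

# One second of a 20 kHz tone, fully determined by its 44,100 samples:
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * f_max * t)
```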
 
Still waiting for Spotify to introduce a shuffle mechanism that doesn’t screw up tracks that are supposed to be played back to back… à la the Abbey Road medley.
 
44.1 kHz is more than twice the frequency that someone with perfect hearing can hear. It is a sufficient sampling rate to encode audio frequencies at the maximum range of human hearing (20 kHz) with headroom for a low pass filter. It is the same frequency used by CDs.

Bearing in mind that higher sampling frequencies require more storage and higher data rates, I’d be interested to know how high you need it to be, and why?
But higher frequencies can still modulate and affect how lower, audible frequencies are perceived. This is similar to the way low-frequency or high-frequency content can affect loudness/headroom and distortion in mixes. Harmonics are created in a series, and the lack of higher frequencies can be noticeable if you have good ears.
 
Reminder

not compressed with a lossy perceptual encoder ≠ “lossless”


If you’re not being served the original bits, there is loss. Full stop.
44.1 kHz is more than twice the frequency that someone with perfect hearing can hear. It is a sufficient sampling rate to encode audio frequencies at the maximum range of human hearing (20 kHz) with headroom for a low pass filter. It is the same frequency used by CDs.

Bearing in mind that higher sampling frequencies require more storage and higher data rates, I’d be interested to know how high you need it to be, and why?
It needs to support high enough sampling rates to be able to play back the files at their native sampling rate. If things are being converted and downsampled to meet a 24-bit/44.1 kHz limit, they are no longer lossless. Maybe not being passed through a lossy perceptual encoder, but definitely not identical to the master either.
 
I read “on iPhone”, so in what way could I benefit from that?
Honest question… iPhones don’t have 3.5mm jacks anymore… and I don’t think there’s really a lossless codec for Bluetooth?
You need to use wired headphones via USB-C / Lightning or an audio cable via a DAC.
 
And we'll stream this lossless audio on AirPlay 1? Idiots.
Same here. Better than Apple Music and Spotify put together. Love the Friday "My Weekly Q".
Also, it is truly lossless. Apple Music is lying, and if this Spotify thing goes through, Spotify will be too.
24-bit at 44.1 kHz is not even close to lossless.
 
Speaking of better sound quality, I wonder whether Apple has thought about doing a 320 kbps VBR data rate for Apple Music instead of the current 256 kbps. 320 kbps VBR could mean improved sound quality without the complications of using a lossless format like Apple Lossless.
 
24-bit at 44.1 kHz isn't lossless.
You gotta go to 24-bit 192 kHz to get lossless.
This 44.1kHz doesn't even put them in the Hi-Res category.
Just lying. Using words that have a meaning and making them meaningless.
 
Personally I only listen to Spotify at the gym to drown out their awful music... and audio quality is not a big selling point in that environment. Also I'm not sitting in front of a computer plugged into lossless headphones to enjoy my music... there isn't a lossless home audio system that supports Spotify... so I'm not sure who their market is except AirPods users on their couch.
 
I honestly can’t tell the difference between the lossless music on my music server and the sound quality from Spotify’s current subscription tier. Perhaps I’m too old (in my fifties) or don’t have good enough quality equipment (that said, I’m running a Sonos head unit through a pretty decent NAD amp and Monitor Audio speakers 🤷🏻‍♂️). So I won’t be paying extra for this.

My main concern is that they’ll lower the quality of the current top tier in order to encourage users to pay for the new top tier 😕

They were the first streaming service I can recall outside of internet radio services. My first payment to them was in January 2009.

I've switched between various services through the years, but these days I have access to both Apple Music (launched mid-2015) and Spotify:

The Spotify app is massively superior both on my mobile devices and the desktop, to the point that I probably use it 80% of the time. Apple Music is superior when the music has good Dolby Atmos support, and on the Apple Watch when used alone, e.g. on runs.

The difference in sound quality between the highest Spotify encoding and lossless is impossible to hear, even on good gear - like my Sennheiser HD800s and my B&W CM9 setup - and even if you know what to listen for. It's been years since blind tests could reliably find a difference, when audio levels are matched and normalization is off. Matching audio levels is critical, since louder otherwise sounds better.

Probably 99% of people can’t tell the difference between whatever Spotify’s current maximum bit rate is and CD-quality lossless. I haven’t seen any evidence that anyone can tell the difference between CD quality and higher than CD quality (only considering stereo audio).

44.1 kHz is more than twice the frequency that someone with perfect hearing can hear. It is a sufficient sampling rate to encode audio frequencies at the maximum range of human hearing (20 kHz) with headroom for a low pass filter. It is the same frequency used by CDs.

Bearing in mind that higher sampling frequencies require more storage and higher data rates, I’d be interested to know how high you need it to be, and why?

But higher frequencies can still modulate and affect how lower, audible frequencies are perceived. This is similar to the way low-frequency or high-frequency content can affect loudness/headroom and distortion in mixes. Harmonics are created in a series, and the lack of higher frequencies can be noticeable if you have good ears.
masterhiggins is absolutely correct. Ultimately, whether or not lossless audio can be perceived depends on the material and how it's mastered. But all things being equal, anyone with decent hearing and a decent sound system should be able to hear the difference between lossy and lossless audio, and between lossless and hi-res audio, if they know what to listen for. Just sitting the average listener in a chair and asking them to tell you the difference won't work. They'll hear guitar, bass (mind you, most people couldn't even tell you what a bass sounds like, much less what it's supposed to sound like...), and drums, and say, "yeah, I hear them in both recordings, so they sound the same." Now isolate those instruments and ask them which one sounds more real, and you're likely to get very different answers. Especially with acoustic instruments recorded in a real space.

What lossless provides is accurate timbre and decay. Lossy audio lacks both and sounds unnatural. But most people don't care because it's what they're used to. They're programmed to expect and settle for less. Why should they care if something sounds natural when they've never heard realistic audio?

The thing is, some of us can hear the difference. And now that we're able to have natural sounding audio at no extra cost to the providers, we should demand it. Or at the very least be given the option.

24-bit at 44.1 kHz isn't lossless.
You gotta go to 24-bit 192 kHz to get lossless.
This 44.1kHz doesn't even put them in the Hi-Res category.
Just lying. Using words that have a meaning and making them meaningless.

Depends on what they're comparing it to. 44.1kHz is CD quality, so technically you're not losing any quality vs the CD. But you're right, 44.1kHz doesn't equal master quality.

Also I'm not sure you have to go all the way up to 192 kHz to be lossless, especially if the master was only recorded at 96 kHz.
 
But somehow they are number one. Just goes to show you what people really care about. I myself prefer Apple Music and have had both Tidal and Qobuz. Music selection on average was better on Apple Music, but the other two at the time offered hi-res audio. I even went out to get a DAC to play MQA streams and the hi-res audio from Qobuz.

Can you tell the difference between them? If you know what you're listening for, you certainly can. It also makes a difference when you turn it up, or even when listening at low volumes. And of course, it depends on what type of music you listen to. It's not going to make a world of difference if you listen to rap/hip-hop/techno/club/house/trance, as long as it's not too low quality that is. But if you're listening to acoustic or jazz, classical, or 60s/70s and some 80s pop/rock, you certainly can hear it and, more importantly, feel it differently than in low quality. Not necessarily just in the bass "feel": the whole song can feel different. And it's not always good either. Some music was rather poorly recorded, and you may not like it in hi-res. It can be "too" revealing.
I totally agree with your opinions. I have an ATV hooked up to a modest 5.1 theater system with tower main speakers. I turned on lossless on the ATV; at low levels it was not that noticeable, but turn the volume up to about 60-65 dB (according to my Apple Watch) and it was quite noticeable in the genres that you mentioned. I also have the original Lightning AirPods Max and was disappointed when only the USB-C AirPods Max got lossless while tethered to the iPhone by USB-C cable. But I did get a Beats Pro for Christmas, and the claim was that it also could play lossless over a USB-C cable; I tried it out and could not tell the difference from wireless.
 
Do you believe Apple is paying artists more than they are required to by the music licensing companies? Or do you believe Apple is worse at negotiating than Spotify?
This isn’t like app stores, where they can set whatever rate they want to.
You make it sound like Apple directly negotiates with artists or licensing companies what they should pay, which is not at all the case. All music streaming services pay artists per stream a share of the revenue divided by the payment pool. E.g. if Apple makes $100,000 in revenue and pays out 50% of that revenue to artists, and your music gets streamed 1,000 times and those streams represent 1% of total streams, you get paid $500 = $0.50 per stream. That's an oversimplification with much smaller numbers (there's a small numeric sketch of this below). Apple consistently pays artists 3x more than Spotify despite having lower market share because:
  • Their revenue cut is larger than Spotify's,
  • Their revenues are similar to Spotify's, as they don't have an ad-supported plan,
  • Spotify has frequently tried to find ways to pay artists less, like bundling podcasts and audiobooks so they can legally say they're a "bundle subscription" and therefore get away with counting Premium subscribers' revenue as lower.
Also, Apple is consistently one of the only streaming providers that follows demands from songwriter organisations and indie labels for royalty sharing, so they do pay more than they're usually expected to.
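As a sanity check on the arithmetic in that post, here's the pro-rata model as a tiny sketch (simplified, like the post's own example; real royalty accounting has many more terms):

```python
def pro_rata_payout(total_revenue: float, payout_share: float,
                    artist_streams: int, total_streams: int) -> float:
    """Pro-rata royalties: the artist's cut of the payout pool is
    proportional to their share of total streams."""
    pool = total_revenue * payout_share
    return pool * (artist_streams / total_streams)

# The post's numbers: $100,000 revenue, 50% paid out, and 1,000
# streams that represent 1% of all streams (so 100,000 total).
payout = pro_rata_payout(100_000, 0.50, 1_000, 100_000)
print(payout, payout / 1_000)   # 500.0 total, 0.50 per stream
```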
 
Same here. Better than Apple Music and Spotify put together. Love the Friday "My Weekly Q".
Also, it is truly lossless. Apple Music is lying, and if this Spotify thing goes through, Spotify will be too.
24-bit at 44.1 kHz is not even close to lossless.
Um, if it is a bit-by-bit copy of the file from a CD, then it is lossless if sent to your device that way.
How you then play that file is open to loss...

And 24-bit is bigger than the CD's standard 16-bit.

Now, the CD might not have the original recording's bit-for-bit encoding...

But then older non-digital recordings have other issues due to the age of the storage method.

Slippery slope of what counts as the best source material. Enough older recordings have been remastered over the years, many arguably better than what we think we remember...
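"Lossless" in that bit-by-bit sense is something you can verify yourself. A minimal sketch, assuming the soundfile library and two hypothetical files (a WAV ripped from CD and a FLAC encode of that rip):

```python
import hashlib
import numpy as np
import soundfile as sf  # pip install soundfile

def pcm_digest(path: str) -> str:
    """Decode a file to raw 16-bit PCM and hash the samples.

    Identical digests mean the audio is bit-identical, regardless
    of container or compression (that's what lossless promises).
    """
    samples, _ = sf.read(path, dtype="int16")
    return hashlib.sha256(np.ascontiguousarray(samples).tobytes()).hexdigest()

# Hypothetical filenames for illustration:
print(pcm_digest("rip.wav") == pcm_digest("rip.flac"))  # True if truly lossless
```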
 
But higher frequencies can still modulate and affect how lower, audible frequencies are perceived. This is similar to the way low-frequency or high-frequency content can affect loudness/headroom and distortion in mixes. Harmonics are created in a series, and the lack of higher frequencies can be noticeable if you have good ears.

24-bit at 44.1 kHz isn't lossless.

Whilst I understand what you are trying to say, strictly speaking, by the technical definition of the term “lossless” - which refers to the compression algorithm rather than the sampling rate and resolution - 24 bit at 44.1 kHz is lossless if, for example, encoded in FLAC or uncompressed.


You gotta go to 24-bit 192 kHz to get lossless.
This 44.1kHz doesn't even put them in the Hi-Res category.

So, you are saying that you have to sample at 192 kHz?! That allows encoding of almost 100 kHz waveforms (i.e. ultrasonic). How much money would you have to spend on hi-fi gear that can play back those frequencies? (Last time I checked, even high-end consumer gear rolls off at around 20 kHz.)

(It seems that some people assume that you need to play back audio with similar resolution and sampling frequency to the recording process for true fidelity. It’s been a while since I studied audio engineering, but my recollection is that the reason that recording studios use high sampling rates and precision is to maintain fidelity during the digital mixing and mastering stages. It’s not required for playback: the quality of which is limited by what speakers and amplifiers are physically capable of reproducing.

What you can hear is further limited by your ears. My biology is even more rusty than my audio electronics, but at 20 kHz, all you can detect is the magnitude of the sound, not the shape of the waveform. So it doesn’t matter how well the 20 kHz waveform is represented in digital form: once converted to physical sound waves you can barely perceive it, and you can’t hear the shape of it (i.e. its harmonics). Here’s an experiment you can do yourself: find a waveform generator, set it to 20 kHz, then change the waveform shape from sine to sawtooth and see if you can hear a difference.)
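You can also run a version of that experiment purely in software. A minimal numpy sketch (my own illustration): build a 20 kHz "sawtooth" as its harmonic series, band-limit both waveforms to the audible range, and compare. Every harmonic above the fundamental lands beyond ~22 kHz, so after the low-pass the two signals are the same:

```python
import numpy as np

fs = 192_000               # high rate so the harmonics can be represented at all
t = np.arange(fs) / fs     # one second
f0 = 20_000

sine = np.sin(2 * np.pi * f0 * t)

# Harmonic-series "sawtooth": partials at f0, 2*f0, 3*f0, ... below fs/2.
saw = sum(np.sin(2 * np.pi * k * f0 * t) / k
          for k in range(1, int(fs // 2 // f0) + 1))

def lowpass(x, cutoff_hz):
    """Brick-wall low-pass via FFT: zero every bin above the cutoff."""
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    spectrum[freqs > cutoff_hz] = 0
    return np.fft.irfft(spectrum, n=len(x))

# Band-limited to ~22 kHz (what a 44.1 kHz release carries), the
# "sawtooth" collapses to its fundamental - a plain 20 kHz sine.
diff = np.max(np.abs(lowpass(saw, 22_000) - lowpass(sine, 22_000)))
print(diff)   # ~0
```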
 