Qualcomm Claims New X85 Modem Creates 'Huge Delta' in Performance Versus Apple

Peak upload and download of multi-Gbps... Fine, but the infrastructure around here offers 50-70 Mbps; if I go into the city, I might get 150-200 Mbps. I mean, what resolution Apple TV+ series are you going to be watching to need those speeds? 😂
 
This is kinda sorta turning into the whole "Intel vs. Apple silicon" thing, where one side throws out perf numbers with no consideration of thermals or power usage, and the other has taken a direction where performance per watt is the goal (vs. raw performance).
 
12.5 Gbps is an impressive theoretical speed. My local carriers might bring it to real life around the time 6G-A is built.

Also, I believe most of the things they attributed to AI could be described with a simpler word: "algorithms".
 
It may be high performance, but I hate to imagine the battery drain running in 3GPP NR 5G mode. Apple's upcoming C2 will likely use a lot less power, since it's specifically optimized for iOS and macOS rather than being a "universal" radio modem chip like the Snapdragon X85.
 
This reminds me of 20-25 years ago, when Intel would have some kind of announcement showing that their new CPU was light years ahead of what AMD was doing, and essentially implying that AMD would never catch up.

Apple may never catch Qualcomm, but they don't exactly have to: if they aren't buying chips from Qualcomm, that's already a loss for Qualcomm. Look at it another way: Qualcomm may still reign supreme in cellular chips for years, but it has lost, or is losing, a huge chunk of future business.
 
Now we just need a use case, as long as the modem is still in production.
I saw a video sometime last year arguing that we need extremely fast 5G modems for other things, such as self-driving cars, drones, AI features, etc. A normal user with a phone might have no use for these speeds, but other applications can.
Sure, but that's about latency, not bandwidth. With 4 Gbps you could easily stream 500 HD videos from a car simultaneously, which obviously no car hardware could compress. A typical F1 car generates about 1 TB of telemetry data per weekend (6h on track), which translates to 0.37 Gb/s (if it only generates data while driving, which it probably doesn't).
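The back-of-the-envelope figure above checks out; here's the arithmetic as a quick sketch (assuming, as the post does, roughly 1 TB of telemetry per race weekend and 6 hours of track time):

```python
# Sanity check of the F1 telemetry figure: ~1 TB per weekend over ~6 hours
# of running. Both numbers are the post's assumptions, not official specs.
TB_BITS = 1e12 * 8          # 1 terabyte (decimal) expressed in bits
track_seconds = 6 * 3600    # 6 hours of track time in seconds

avg_rate_gbps = TB_BITS / track_seconds / 1e9
print(f"{avg_rate_gbps:.2f} Gb/s")  # → 0.37 Gb/s
```

So even a data firehose like an F1 car averages well under a tenth of the X85's claimed 4+ Gbps.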
 
The replies are already hilarious

If this was reversed and it was Apple's modem, it would be a DUNK FEST

C'mon guys... let's all slightly de-tint the Apple-colored shades
Nah, I wrote the same about the 16e only having 4 Gbps - that's still way faster than any use case over the lifetime of the phone. It doesn't even make sense to tether it at USB 2 speeds of theoretically 480 Mbps (or Wi-Fi at 1.2 Gbps). It's just a ...-measuring contest by now.
 

While I agree with your point, we all know it would in fact be a "yay Apple!" .. "BOOM .. Apple wins again!" dunk fest if it were reversed, even if it were totally pointless, as it often is
 
In my humble opinion, when it comes to cellular, it's not about maximum speeds anymore, but rather good coverage, reliable signal, and efficiency.
 
Sounds like Qualcomm is butt hurt over Apple leaving the fold. Gotta start comparing and trashing Apple, just like Samsung always does when it feels the heat from Apple. Intel is still butt hurt years later. Everyone is touting how they're superior to Apple. What does that say about Apple's influence on the market?
 
AI in a cellular chip that improves connectivity for weak signals. What does it do - guess the data in the air?

No, it probably guesses, for the next second: what will signal quality be like? How much will the user use the device? Etc. Based on that, it can then throttle the data rate, set the power draw, and so on.

Using environmental data (location, user movement, distance from cell towers, etc.) to predict that could be done with ML.
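To make the idea concrete, here's a toy sketch of what such prediction-driven link management might look like: extrapolate recent signal-quality samples one step ahead, then pick a power policy from the forecast. The thresholds, the RSRP feature, and the linear fit are all illustrative assumptions, not Qualcomm's actual algorithm.

```python
# Toy "predict, then act" link-adaptation sketch. Assumed inputs: a short
# history of RSRP samples in dBm; assumed thresholds are made up.

def predict_next(samples: list[float]) -> float:
    """Least-squares linear fit over recent samples, extrapolated one step."""
    n = len(samples)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(samples) / n
    denom = sum((x - x_mean) ** 2 for x in xs) or 1.0
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, samples)) / denom
    return y_mean + slope * (n - x_mean)  # value at the next time step

def power_policy(predicted_rsrp_dbm: float) -> str:
    # Back off power when the link looks strong, boost when it's fading.
    if predicted_rsrp_dbm > -90:
        return "reduce tx power"
    if predicted_rsrp_dbm > -110:
        return "hold"
    return "boost tx power"

history = [-95.0, -97.0, -100.0, -104.0, -109.0]  # signal degrading
forecast = predict_next(history)
print(f"forecast {forecast:.1f} dBm -> {power_policy(forecast)}")
```

A real modem would feed far richer features (location, speed, cell geometry) into a trained model, but the shape of the loop - forecast, then adjust power/rate - is the same.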
 
What the heck is an "AI-powered" modem?! That term has been buzzed out of relevance by now.

You put in that magic word and investors pour money into your company.

Nobody gets excited hearing "software" - so it becomes Machine Learning, Artificial Intelligence, Quantum Computing, etc.
 
The "problem" here is that we're in diminishing returns when it comes to cellular performance. Weak signal still sucks, but assuming you have good signal, most consumers won't know the difference between 100 Mbps and 1 Gbps.
 
Just like computer specs: if the info comes in at half the time of an eye blink, how will we perceive it differently if it comes in at a quarter of the time of an eye blink?

Chasing lower power use rather than speed is what I want to see, so iDevices can last longer when on battery power.

Apple heard and implemented that thought when designing the C1.
 
I'm not sure any of us normal consumers need AI of any kind
It can be very handy if the modem picks the best band for performance, or connects automatically without complex trial and error. Just plug and play, like we used to get with Apple.

AI can be very useful in some places and could save users lots of time and frustration.

I understand that the word "AI" causes allergic reactions. And there are also a lot of useless examples, like Genmoji and censored image playgrounds.

But there are also very good examples that work pretty well, like repairing photos with accidentally captured objects, or a group photo where one person has their eyes closed - automatically recognizing the content and helping you deal with it.

Time-consuming things and technical things you don't have to care about.

I've worked in a hospital, and every scan was also interpreted by AI in addition to a professional's read. Very often the AI detected irregularities that the doctor or surgeon had overlooked.

There are lots of good examples everywhere. Just because Apple hasn't shown us good examples yet doesn't mean AI is bad.

I understand the term AI gets on your nerves. Apple used Apple Intelligence as a marketing phrase while failing to deliver good AI (large language models).

😊
 

I should have been more clear, I was mostly referring to Apple Intelligence (AI)
👍
 
Well yeah, gotta keep trashing Apple.
Who does this on purpose? Most YouTubers profile their audience to keep making content it likes. If they're an Android-focused channel (e.g. TechSpurt), then it fits their image, whereas MKBHD very rarely says anything bad about Apple because he's at the top of that social media food chain.
 
It can be very handy if the modem picks the best band for performance, or connects automatically without complex trial and error. Just plug and play, like we used to get with Apple.
I don't really know if the end user will see a difference. It's the whole "Android specs make it better" mentality that's been pervasive for years.
AI can be very useful in some places and could save users lots of time and frustration.
Sure generally speaking AI can be useful.
I understand that the word "AI" causes allergic reactions. And there are also a lot of useless examples, like Genmoji and censored image playgrounds.
It does, right?
But there are also very good examples that work pretty well, like repairing photos with accidentally captured objects, or a group photo where one person has their eyes closed - automatically recognizing the content and helping you deal with it.
Sure, niche cases for sure. It can replace Photoshop, but if repairing photos is the best use case that can be documented, I think it's going in the wrong direction.
Time consuming things and technical things you don’t have to care about.

I've worked in a hospital, and every scan was also interpreted by AI in addition to a professional's read. Very often the AI detected irregularities that the doctor or surgeon had overlooked.
Now that is useful.
There are lots of good examples everywhere. Just because Apple hasn't shown us good examples yet doesn't mean AI is bad.
Android hasn't shown us it's good, either.
I understand the term AI gets on your nerves. Apple used Apple Intelligence as a marketing phrase while failing to deliver good AI (large language models).

😊
AI gets under people's skin because it sounds nefarious. But people laud Android because the AI on Android devices can repair photos? There is a long, long way to go.
 