The likelihood that someone clever enough to figure out they can scan an AirTag with their phone would also enter personal information on some fake website is pretty low...

Still, it's pretty much a beginner's error on Apple's side to let code injection slip through AirTag design and QA!
 
The likelihood that someone clever enough to figure out they can scan an AirTag with their phone would also enter personal information on some fake website is pretty low...
Yeah, add to that, the likelihood of someone finding a lost AirTag, when there physically just aren’t many lost ones out there, is even lower.

“Security Researchers” ran out of REAL things to report a long time ago. There was a time when a security researcher would report something that did not require physical contact with the device and did not require interaction from the user. In those days, they felt important because the things they were talking about were of immediate interest.

These days, the sky is no longer falling (and they’ve got to put in REAL work to find exploits as serious as the ones in the past), so they have to crow about whatever they can to whomever will listen!

Security Researcher: Hey, you’ve got glasses in your cabinet.
Normal person: Uh, yeah. I use them. For drinking.
Security Researcher: Did you know that if you open that cabinet, take out the glass and throw it on the floor, THEN take off your shoes and step on the broken pieces, you could start bleeding?
Normal person: Sure, that makes sense.
Security Researcher: You should do something about that.
Normal person: Umm… ok. Sure. I’ll get right on that.
Security Researcher: YOU’RE NOT CONCERNED ABOUT SPONTANEOUS BLEEDING?
Normal person: I am, but I’ll just be careful when I use my glasses.
Security Researcher: YOUR HOUSE IS A BLEEDING DEATHTRAP AND I’M GOING TO TELL EVERYONE!

Perhaps if AirTags were so common that people on this forum were coming across 10-15 lost ones a day, this could be a low level concern. BUT you can be assured that anyone wanting to acquire an average person’s login details is not going to be using an AirTag to do it. They can simply send them an email stating “You’ve won a free AirTag, log in here at our prize collection totally not phishing site.”
 
Why are some security researchers such drama queens?
Apple set up a bounty program that pays up to $40k per report.

For some, this is their job: discover issues, develop a proof of concept, report the issue, Apple patches the issue within 90 days, the dev collects a paycheck and receives recognition when the issue is made public.

Apple is failing to do all of these. Apple is failing to provide payment (wouldn't that make you mad if your employer stopped paying you?).

Then, when researchers call Apple out for not receiving recognition, Apple gaslights them by saying they'll give credit in the next update. The next update comes and still nothing.

Again, Apple is failing in all areas that THEY promised. I'd be pissed if I was part of Apple's bug program and getting shafted like this.
 
Yeah, add to that, the likelihood of someone finding a lost AirTag, when there physically just aren’t many lost ones out there, is even lower.

“Security Researchers” ran out of REAL things to report a long time ago. There was a time when a security researcher would report something that did not require physical contact with the device and did not require interaction from the user. In those days, they felt important because the things they were talking about were of immediate interest.

These days, the sky is no longer falling (and they’ve got to put in REAL work to find exploits as serious as the ones in the past), so they have to crow about whatever they can to whomever will listen!

Security Researcher: Hey, you’ve got glasses in your cabinet.
Normal person: Uh, yeah. I use them. For drinking.
Security Researcher: Did you know that if you open that cabinet, take out the glass and throw it on the floor, THEN take off your shoes and step on the broken pieces, you could start bleeding?
Normal person: Sure, that makes sense.
Security Researcher: You should do something about that.
Normal person: Umm… ok. Sure. I’ll get right on that.
Security Researcher: YOU’RE NOT CONCERNED ABOUT SPONTANEOUS BLEEDING?
Normal person: I am, but I’ll just be careful when I use my glasses.
Security Researcher: YOUR HOUSE IS A BLEEDING DEATHTRAP AND I’M GOING TO TELL EVERYONE!

Perhaps if AirTags were so common that people on this forum were coming across 10-15 lost ones a day, this could be a low level concern. BUT you can be assured that anyone wanting to acquire an average person’s login details is not going to be using an AirTag to do it. They can simply send them an email stating “You’ve won a free AirTag, log in here at our prize collection totally not phishing site.”

This one seems pretty serious to me for a few reasons.

One, it's a grammar school coding fail. Validate your input. Sanitize anything you're putting into an execution environment. How did a <script> tag make it into a phone number field and then through the entire system unmolested, turning a trusted Apple domain into a serious security threat?
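To be concrete about what "sanitize" means here, a minimal sketch in Python (a generic illustration, not Apple's actual backend; render_lost_mode_banner is a hypothetical helper): escape any user-supplied value before interpolating it into HTML served from a trusted domain.

```
# Minimal sketch, not Apple's actual code: escape user-supplied values before
# interpolating them into a page served from a trusted domain.
import html

def render_lost_mode_banner(phone_number: str) -> str:  # hypothetical helper
    # html.escape turns <, >, & and quotes into entities, so injected markup
    # is rendered as inert text instead of being executed by the browser.
    safe_number = html.escape(phone_number, quote=True)
    return f"<p>Contact the owner at: {safe_number}</p>"

print(render_lost_mode_banner("<script>alert(1)</script>"))
# -> <p>Contact the owner at: &lt;script&gt;alert(1)&lt;/script&gt;</p>
```

One line of output encoding is the difference between a phone number and a phishing page.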

This flaw allows an AirTag to direct your phone to run arbitrary code in your browser, including but not limited to redirecting you to any website. That qualifies to me as no physical contact, no interaction. The attacker doesn't need access to your hardware, and the user doesn't need to interact with their device in a way different from what the product would typically require.

If a Safari exploit is found, this is a vector for triggering it.

As others have mentioned, this isn't in the firmware for the tags, it's in the backend web service, so this isn't esoteric in any way. The world is awash with competent web-service engineers, and there are plenty of well-established processes and automated tools for testing this kind of code. Why isn't Apple applying any of those to a product like this?
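To illustrate, here's a rough sketch of the sort of automated regression test those pipelines run (unittest here, any framework works; render_lost_mode_banner is the same hypothetical helper as in the sketch above):

```
# Sketch of an automated check: injected markup must never survive into the
# rendered page. The helper below is a hypothetical stand-in for whatever
# template code builds the "lost AirTag" page.
import html
import unittest

def render_lost_mode_banner(phone_number: str) -> str:  # hypothetical helper
    return f"<p>Contact the owner at: {html.escape(phone_number, quote=True)}</p>"

class TestLostModeEscaping(unittest.TestCase):
    def test_injected_markup_is_neutralized(self):
        for payload in ("<script>alert(1)</script>", "<img src=x onerror=alert(1)>"):
            rendered = render_lost_mode_banner(payload)
            self.assertNotIn("<script", rendered)  # escaped to &lt;script
            self.assertNotIn("<img", rendered)     # escaped to &lt;img

if __name__ == "__main__":
    unittest.main()
```

One assertion like that in a CI pipeline catches this whole class of bug before it ships.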

If they let an error like this slip through, what else aren't they paying attention to?

And finally, a patch could have been issued within hours on this: it's a phone number, so validate that it only includes characters acceptable for a phone number and limit the field length to what a phone number needs. Why isn't Apple more proactive in addressing these kinds of flaws, and why aren't they more receptive to the researchers who identify them as a service to Apple? Are the researchers hoping to get paid? Sure. I hope to get paid for my work too. Nothing wrong with that.
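Something like this sketch is all that validation amounts to; the allowed character set and the 20-character cap here are illustrative assumptions, not Apple's actual rules:

```
# Minimal sketch of the fix described above: whitelist phone-number characters
# and cap the field length. Pattern and limits are assumptions for illustration.
import re

PHONE_PATTERN = re.compile(r"\+?[0-9\s().-]{3,20}")

def is_valid_phone_number(raw: str) -> bool:
    # fullmatch: the entire field must consist of allowed characters.
    return PHONE_PATTERN.fullmatch(raw) is not None

print(is_valid_phone_number("+1 (408) 555-0123"))          # True
print(is_valid_phone_number("<script>alert(1)</script>"))  # False: rejected on input
```

Reject at the input boundary and escape at the output boundary; either one would have blocked this, and doing both is cheap.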

This is a problem, and it's not the first time Apple has released products with easy-to-exploit flaws and been slow to take corrective action. Apple has a cultural blind spot to security issues that undermines what I believe is a sincere effort to strengthen user privacy. They can't have the latter without the former, and they need to push hard on security and trumpet it as loudly as they do their privacy message.

These forums are full of Chicken Littles on all manner of trivial nonsense but, on matters of security, the response seems far too subdued. Security is foundational.
 
BUT you can be assured that anyone wanting to acquire an average person’s login details is not going to be using an AirTag to do it. They can simply send them an email stating “You’ve won a free AirTag, log in here at our prize collection totally not phishing site.”
Lol, yes! 😂

Btw, I actually lost an AirTag today, or rather, my dog lost his collar. I didn't even notice it until hours later. Found it a couple of meters down the road from where I live. Someone apparently found it and put it on a small fence, assuming the owner would come back to look for it. I guess around a hundred people must have walked past it; some might have even noticed it, but nobody scanned it and gave me a call. The fun part was actually finding it down the street, with my iPhone picking up a weak signal, enabling the beeper, and seeing the screen light up in the Find My app when I got closer. That was kind of cool. I only hope that when the collar gets lost while my dog is still wearing it, someone takes the time to look at the tag, scan it and call me. Or just look at the metal tag with his name and my phone number, probably easier ;)
 
Why is apple so lazy and incompetent when dealing with security researchers?
1. Wall Street. Apple focuses on pushing out new product, not making current product solid. Unless this issue appears to seriously impact their “valuation”, this will continue. In that regard, they’re just like every other computer industry corporation.

2. Apple’s corporate culture is built on secrecy, departmental silos (related to secrecy, but not entirely), and therefore poor communication...

3. ...and a huge dose of basic arrogance.
 
Guessing he’s not getting a bounty now then 🤣

In all seriousness though, companies take a long time to patch bugs; just look at Intel, who kept their huge CPU bugs secret for over six months with help from Microsoft and Apple. At least this guy and others like him are reporting bugs to Apple and not selling the info on the dark web. Apple and others should really be thankful and act on the reports, providing feedback and timescales etc. where required.
 
Any negative privacy news about Apple slowly chips away at their credibility
Still, their credibility has to be chipped away SO MUCH before it has any meaningful impact on them. Critical mass. The AirTag would need a few dozen more scandals like this one before people stop buying AirTags.
 
Why is apple so lazy and incompetent when dealing with security researchers?
Because Apple is oblivious to its users, developers, and security researchers. I have filed several bug reports, and after several years they are still in the "open" state. Why????? 😐🤷‍♂️
 
Then it indicated wrong, and common sense also said it's wrong.

It's called Activation Lock. The iPhone has it as well. Nobody can grab your iPhone while the screen is turned on, reset it, and use it without knowing your iCloud password.

Actually, the AirTag is harder, since you also have to find THE iPhone it's set up with in order to reset it.
Most of your reply is about the phone’s Activation Lock, but you’re telling me something I’ve known for years and years and claiming the AirTag approaches it identically.

But my tag is not my phone.

Please find me the same citation for an AirTag and you might be more convincing.

EDIT: I’ve gone back to test the tag for AL but came across Apple’s explainer on the topic.

It (now?) indicates the tags are locked to the Apple ID, and that the 5x battery-reinstall reset is only necessary if “Remove Item” from Find My Items is done out of Bluetooth range of the tag.

This contradicts earlier reporting on the reset process, but it is now the state of things, so I stand corrected. Thanks!
 
Apple is a bunch of people working from home with distractions & babies crying in the background. COVID has seriously impacted getting things done at all. They are running on fumes, and it’s surprising that they get some stuff done. What’s not surprising is all the messed-up little details that need fixing, but communication is broken and people in authority are unavailable. Decisions that need making are lost in the ongoing chaos of COVID workforces.
There is probably no company better positioned for exigent work from home than Apple.
 
Most of your reply is about the phone’s Activation Lock, but you’re telling me something I’ve known for years and years and claiming the AirTag approaches it identically.

But my tag is not my phone.

Please find me the same citation for an AirTag and you might be more convincing.
1. I'm not sure what makes you presume that a company like Apple, who make products like the iPhone AND the Mac that can't be stolen and used without the owner's password (try it yourself), would not do the same for the AirTag...

2. You have an AirTag, so it's very easy to test. Let someone else hardware-reset it and see if they can pair it to their phone. There's your answer.

3.
However, keep in mind that unless the original owner has removed the AirTag from their own Find My app, you’ll be blocked from finishing the set up process.
 
Apple is failing to do all of these. Apple is failing to provide payment (wouldn't that make you mad if your employer stopped paying you?).
If MY EMPLOYER stopped paying me, I’d be pissed. If I did a LOT of work, for free, HOPING to get paid, only to have someone else submit that work BEFORE me such that they get paid and I don’t…

Well, that was MY fault for working for free in the first place :)
 
If MY EMPLOYER stopped paying me, I’d be pissed. If I did a LOT of work, for free, HOPING to get paid, only to have someone else submit that work BEFORE me such that they get paid and I don’t…

Well, that was MY fault for working for free in the first place :)
You don't understand the issue. This isn't about one person getting paid and another not.
 
If a Safari exploit is found, this is a vector for triggering it.
Yes, a vector. Now, how many vertices does this vector have?
Purchasing an AirTag -> acquiring an Apple ID -> registering the AirTag -> deploying the site / putting the link to the site on the AirTag -> depositing the AirTag in some public location -> hoping someone picks it up -> hoping they know that they can hold their phone to it to find out how to contact the person who lost it -> hoping they know THAT while at the same time NOT knowing that the information is provided without requiring their own Apple ID -> and finally, hoping that they actually enter their information and submit it.

That’s 9 vertices in that vector. No matter how much money a company has, they do not have the luxury of infinite developers. Sounds surprising to a lot of folks here, I know, but true! So, if they have a backlog of far more serious issues, whether reported internally OR by security researchers NOT trying to make a name for themselves on social media, they’re going to prioritize those over this EVERY time. I know I don’t mind this being left unresolved while some critical Bluetooth stack exploit gets resolved.

With all these steps required to perform… and ALSO considering the myriad other ways that any enterprising attacker could save time by using any number of known successful methods to get the information from thousands or MILLIONS of people (instead of, at most, one or maybe two for each effort), it’s clear why this is at the bottom of the “fix” pile. There’s a pretty big gap between “What’s possible” and “What an attacker would spend time trying to do.”

And finally, a patch could have been issued within hours on this: it's a phone number, so validate that it only includes characters acceptable for a phone number and limit the field length to what a phone number needs. Why isn't Apple more proactive in addressing these kinds of flaws, and why aren't they more receptive to the researchers who identify them as a service to Apple? Are the researchers hoping to get paid? Sure. I hope to get paid for my work too. Nothing wrong with that.
I know VERY few companies that are interested in making any changes to their PRODUCTION environment in hours, especially when the exploit in question requires as much setup as this one does. Perhaps if there were an exploit that required no interaction from the user; but, even in those cases, they’d likely disable the feature while they test the fix over a day or so to ensure no unexpected ill effects.

Regarding them not being paid: well, everyone making a living based on hoping to be paid has made a decision that they WANT to make a living hoping to be paid. They accept that, sometimes, they make more money in a week than all their friends combined make in a year. Sometimes, their hours and hours of hard work go unpaid. If this is NOT a situation they prefer or desire, I’m not in a position to tell them how to fix that. I AM aware that there ARE positions that pay more reliably, on a set schedule and for a more reliably set amount, though. And 99% of those, likely more, don’t require waiting for a check from Apple.

What they can do in the interim is make YouTube videos… though, that’s also a business where you’re hoping to be paid…

These forums are full of Chicken Littles on all manner of trivial nonsense but, on matters of security, the response seems far too subdued. Security is foundational.
This one (and, really, any exploit that requires user intervention) IS trivial nonsense, though. We can look over the exploits closed at any time in the last year, and all of them will pass muster as more serious than this one. I can’t think of one I’d choose to leave open and unpatched in favor of getting this one fixed.

Those exploits that require no interaction from the user deserve all the ire that the commentariat feels worthy to righteously muster! Well, I mean, the commentariat should feel free to throw their ire in whatever direction suits them, or whatever way the wind is blowing. :) If anyone wants to have a “bandwagon effect” where other folks jump on board and join in? Well, an exploit that requires interaction with a fairly rare device (that you’re TRYING to make look like it was lost) in the wild is not likely to be the one that gets legs ta jumpin’!
 
You don't understand the issue. This isn't about one person getting paid and another not.
You’re right, the issue is someone doing a lot of work, all along hoping that, at the end of that work, they get paid. As long as the way they EXPECT to make a living is doing a lot of work WITHOUT getting paid and then HOPING to get paid once the work is done, then each day and each week their bills are at risk of not being paid.
 
I'd be pissed if I was part of Apple's bug program and getting shafted like this.
Gotta say, I wouldn’t be a part of anyone’s bug program. Doing free work HOPING to get paid? And if I were, I’d be leaving the program the very next day. BUT…

Like many things Apple…

I doubt they’ll actually put their money where their mouth is and just stop looking for Apple’s bugs. :)
 
Although I certainly don't condone Apple hesitating in addressing security issues I'm also starting to view "security researchers" as petty people who put themselves over the security of us all. "Apple didn't commit to recognizing that I found out I can inject some HTML into the AirTags message so now I'm going to go tell the world how to break this," isn't a mature response.
The Security Researchers are just doing their job and would like to be compensated for it. Also, Apple should pay because this guy could just as easily have taken this and sold it on the black market, letting it be exploited without anyone knowing. Instead, he wanted to do the right thing and sent it to Apple. Even after Apple did nothing about it, rather than selling it on the black market for probably more than Apple would have paid, he put the exploit out knowing that now Apple will have to do something and it will be fixed, while also realizing that by doing a public disclosure he might not receive ANY money. So he gave the mature response. Hell, Google does the same thing to other companies when they find a bug and it isn't fixed in time.
 
The Security Researchers are just doing their job and would like to be compensated for it. Also, Apple should pay because this guy could just as easily have taken this and sold it on the black market, letting it be exploited without anyone knowing.
I know, right? This one time, I was just doing my job writing and sending Hollywood my screenplay. They didn’t ask me to do it, no, but I’m done with it, I sent it and I WANT MY MONEY.

And this exploit? On the black market? Where, on any given day, buyers have access to ACTUAL exploits that require FAR less user interaction (and don’t require you to buy any additional physical devices) than this one? He’d be lucky to get $2 on the black market for something like this.
 
1. I'm not sure what makes you presume that a company like Apple, who make products like the iPhone AND the Mac that can't be stolen and used without the owner's password (try it yourself), would not do the same for the AirTag...

2. You have an AirTag, so it's very easy to test. Let someone else hardware-reset it and see if they can pair it to their phone. There's your answer.

3.

Please see my edit/additional comment in my initial reply to your feedback note.

PS: I didn’t presume anything. My comments were based on the early reporting on the topic. Perhaps that reporting was in error, or perhaps Apple added Activation Lock after launch. I don’t know. But it’s not important now that it’s in place. Thanks for pointing out the change!
 