Apple's garden is the most secure around. (Small print: outside the garden, good luck!) They are just like everyone else: Google Play Store, HMS, the Samsung app store, and so on. Smoke and mirrors. The average consumer has no idea what happens behind the scenes. They hear Apple is the most secure, they see everyone else following, and so they follow too. A sucker is born every minute, sadly.
 
Obviously, it's difficult for Apple to do an app-by-app check with people, but why, after all these years, haven't they found a way to automate testing of apps?
 
Obviously, it's difficult for Apple to do an app-by-app check with people, but why, after all these years, haven't they found a way to automate testing of apps?
They do automated testing of apps.

Some things are very difficult to test. Say an app sends encrypted information back to a server, and the app's privacy label says the information sent to the company includes only A and B.

How is Apple supposed to know that the data in the encrypted packets also includes C and D, just by automated testing?
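The point about encrypted packets can be illustrated with a toy sketch (the XOR "cipher" below is for illustration only and is not secure; a real app would use TLS or an AEAD cipher): once the payload is encrypted, an outside observer, including an automated review tool, sees only opaque bytes, whether the plaintext contained just the declared fields A and B or also the undeclared C and D. The field names here are placeholders from the post above, not anything real.

```python
import json
import secrets

def xor_stream(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' used only to illustrate opacity; NOT secure."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The key is held by the app and its server, not by the reviewer.
key = secrets.token_bytes(32)

# What the label declares vs. what the app actually sends.
declared = json.dumps({"A": 1, "B": 2}).encode()
actual = json.dumps({"A": 1, "B": 2, "C": "contacts", "D": "location"}).encode()

ciphertext = xor_stream(actual, key)

# Only someone holding the key can see that C and D are in the payload;
# to everyone else the ciphertext is just opaque bytes.
recovered = json.loads(xor_stream(ciphertext, key))
print(sorted(recovered))  # ['A', 'B', 'C', 'D']
```

The reviewer's automated tooling is in the "everyone else" position here: without the key, the two payloads are indistinguishable opaque blobs, which is the cat-and-mouse problem described in this thread.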
 
People seem to think that because it's the honor system, it's worthless. The point of the system is that it holds developers to a new standard. It is now against the rules to misrepresent what your app does. All Apple has to do is start banning any offenders they happen to find, and the rest, not wanting to be banned, will start to fall in line.
 
With a 15% cut of app revenue, Apple should be verifying every app. No honor system; they have many millions of dollars from the App Store and they are doing things cheap and half-buttcrack.
 
As an app developer and privacy advocate myself, I think it's worth the effort to be transparent. It requires checking all downstream activity as well, for example when using cloud services like AWS or GCP, and making sure those services don't track users without their knowledge. It's tricky to know for sure if you're not fully technical. The best way to avoid it is to not send the user's data at all. In my apps, my own usage is probably tracked on the backend (as the host using an API), but I never send them any data that isn't raw and stripped of user information. That way the tracking shows that an API was used (for billing), but not which user used it, and no analytical data.
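The approach this developer describes can be sketched in a few lines. This is a minimal illustration, not their actual code; the event structure and field names (`user_id`, `email`, and so on) are hypothetical. The idea is to strip every user-identifying field from a telemetry event before it leaves the device, so the backend can count API usage for billing without being able to link it to a person.

```python
# Hypothetical identifying fields that must never leave the device.
ID_FIELDS = {"user_id", "email", "device_id", "ip_address"}

def scrub(event: dict) -> dict:
    """Return a copy of the event with all identifying fields removed."""
    return {k: v for k, v in event.items() if k not in ID_FIELDS}

event = {
    "api": "image_resize",     # which API was called (needed for billing)
    "duration_ms": 42,         # raw, non-identifying measurement
    "user_id": "u-123",        # identifying: dropped before sending
    "email": "a@example.com",  # identifying: dropped before sending
}

payload = scrub(event)
print(payload)  # {'api': 'image_resize', 'duration_ms': 42}
```

With an allowlist of non-identifying fields instead of a blocklist, this would fail safe: a newly added field would be dropped by default rather than leaked.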
 
With a 15% cut of app revenue, Apple should be verifying every app. No honor system; they have many millions of dollars from the App Store and they are doing things cheap and half-buttcrack.
No amount of money will ever fix this problem. Apple can't do much about it. They CANNOT see what data moves around inside each app or what leaves it.

I can give you a perfectly secure-looking app and you'll find nothing wrong; it'll pass every review process on the planet.

I'll be able to push any content I want into it and copy anything I want. No one will ever know until I get caught doing it.

That's why you first need to make this illegal (it currently is not, so the best you can do is call it a ToS violation), make developers confirm they're not doing it, and then ban/block them once they're caught. Apple can't catch them in real time; it is always going to be a cat-and-mouse game.
 
They do automated testing of apps.

Some things are very difficult to test. Say an app sends encrypted information back to a server, and the app's privacy label says the information sent to the company includes only A and B.

How is Apple supposed to know that the data in the encrypted packets also includes C and D, just by automated testing?
Years ago, Apple missed bugs in their software because they didn't even do crude testing for them. It wasn't difficult, and they were shown how to detect them. They eventually started doing the easiest of that kind of testing.

I would bet that their automated testing is the easiest possible kind and doesn't discover anything but problems a blind person could see. Still, they test more than Google, Microsoft, or Adobe do.
 
But that allows the buck to be passed. Someone who didn't write the code can claim they didn't realize it. It's no different from holding the pharmacist responsible if opioids go missing, or a doctor who gives the thumbs-up for a kid to play football again. Privacy is a serious issue and we need to take it seriously. That means holding someone individually responsible.

Honest developers should support this, as it increases the risk they carry and therefore the salary they can demand.
You think every developer knows every bit of the software? How do you expect a developer to know EVERYTHING that is being tracked? They might not have even touched the code that does some of the tracking.
 
Years ago, Apple missed bugs in their software because they didn't even do crude testing for them. It wasn't difficult, and they were shown how to detect them. They eventually started doing the easiest of that kind of testing.

I would bet that their automated testing is the easiest possible kind and doesn't discover anything but problems a blind person could see. Still, they test more than Google, Microsoft, or Adobe do.

As a developer, I can tell you that Apple catches lots of issues. I get lots of emails from them about all sorts of things.
 
With Apple having architected the system from top to bottom, I'm kinda surprised these labels weren't automatically generated.
It's not possible to automatically generate a label.

The developer has to declare on the label when certain types of data are collected and used for specific purposes.

Simply using an API, such as the location API to get a GPS position, does not by itself mean that anything needs to be declared on the label (e.g. if the location data is only used offline, on the device).
 
They do automated testing of apps.

Some things are very difficult to test. Say an app sends encrypted information back to a server, and the app's privacy label says the information sent to the company includes only A and B.

How is Apple supposed to know that the data in the encrypted packets also includes C and D, just by automated testing?
Maybe spend some of those trillions and hire more real people to test apps.
 
With Apple having architected the system from top to bottom, I'm kinda surprised these labels weren't automatically generated.

I am not surprised, however, that people weren't honest with the system.

Yes! They could analyze the code and generate labels automatically. That would be a reason apps need to go through review at all, not just to check whether they're bypassing the in-app purchase system.
 
You are missing the point. No number of people is going to tell them what is in encrypted packets.
Guess I don't know too much about this stuff. So even if they looked at the app, this would still be missed? Thanks for the info.
 
Guess I don't know too much about this stuff. So even if they looked at the app, this would still be missed? Thanks for the info.
Correct, and that's why it's up to the developer to make sure they follow correct practice. When a dev marks their app as "no data collected" and it does collect data, there can be consequences. My guess: a warning email, removal from the store until it's fixed or labeled correctly, etc.

I see it this way: I care about my users' privacy and want them to have an accurate picture of what they consent to. I collect nothing in my apps. I don't need or want their data. If someone else does, they, hopefully, tell you.
 
You are missing the point. No number of people is going to tell them what is in encrypted packets.
If random people can see the problem, people who work for Apple should be able to tackle it to a greater extent than they're doing.

They need to develop more ways to test, not just focus on the easy ones.
 
The headline here is misleading, suggesting a limited issue, as if only a handful of apps, a dozen or so, are involved...

Fowler, however, reported this:

”To be clear, I don’t know exactly how widespread the falsehoods are on Apple’s privacy labels. My sample wasn’t necessarily representative:
There are about 2 million apps, and some big companies, like Google, have yet to even post labels. (They’re only required to do so with new updates.) About 1 in 3 of the apps I checked that claimed they took no data appeared to be inaccurate.”

That's the real concern.
If that percentage holds up, it's a dismal finding.


Hope Apple permanently bans any company that misrepresents what its app is doing vis-à-vis privacy.
 