One of the downfalls of the video game industry in the '80s was the lack of quality control and the mass production of absolute crap that made it onto shelves.

The App Store was amazing, but now I rarely look through it. It has been an endless sea of crap for the past few years. It wouldn't be as bad if Apple implemented a more functional and advanced search feature, but they haven't up to this point. Since they are so controlling of their environment, why don't they implement quality control like Nintendo did with the NES "Seal of Quality," which companies put on their games to show they were properly licensed and checked by Nintendo?
 
I wish Apple would block or blacklist those developers from publishing another app for the foreseeable future when something like this happens.
 
It isn't. In Objective-C it's possible to construct API calls at runtime, so there's no easy way to discover them using static code analysis. And you can implement a "timebomb" to make sure the calls aren't made while the app is being run in the review process.

Hmmm. I wonder if they plan on removing this ability once Swift gains traction...
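For what it's worth, the quoted point about runtime call construction can be sketched in Python (used here only as a stand-in so the idea is runnable; the Objective-C equivalent builds a selector with NSSelectorFromString and invokes it via performSelector:). The module and function names are ordinary stdlib ones, chosen purely for illustration:

```python
import importlib

# The call target is assembled from string fragments at runtime,
# so no intact API name ever appears in the code for a static
# scanner to match against.
module_name = "ma" + "th"
func_name = "sq" + "rt"

mod = importlib.import_module(module_name)  # resolve module by string
fn = getattr(mod, func_name)                # resolve function by string

result = fn(16.0)  # → 4.0
print(result)
```

Because the full name only exists in memory at runtime, scanning the shipped code for a forbidden symbol finds nothing; that's the core of why review-time static analysis alone can't catch this.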
 
These apps should be banned, but this doesn't sound too serious. Google likely collects more data ;)
Sounds like the kind of stuff Google gets up to normally :)

A smiley face doesn't change the ridiculousness of these comments.

Neither Apple nor Google sells the personal information they collect, unlike these app ad collection systems.

It isn't. In Objective C it's possible to construct API calls at runtime, so there's no easy way to discover them using static code analysis. And you can implement various methods to try and avoid making the calls while the app is in the review process.

Exactly. Almost everyone mistakenly thinks that Apple can safely vet apps for malware. But malware can easily set a timer to only enable itself after App Store testing is over with.

This is why the best safety comes when third party groups constantly monitor apps for malware after they're released to an app store.

Personally, I think that every smartphone should have a log of where each app goes, so that the user can check for such unadvertised connections.
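A minimal sketch of the "timebomb" idea from the comment above, in Python with invented dates, just to show how trivially the gate works:

```python
from datetime import date

# Hypothetical activation date, chosen so the payload stays dormant
# during a (assumed) review window and wakes up afterward.
ACTIVATION_DATE = date(2015, 11, 1)

def payload_enabled(today):
    """Return True only once the assumed review period has passed."""
    return today >= ACTIVATION_DATE

# During review the app looks completely clean...
print(payload_enabled(date(2015, 10, 15)))  # → False
# ...and the bad behavior switches on later, when nobody is watching.
print(payload_enabled(date(2015, 12, 1)))   # → True
```

Nothing about this gate is detectable by simply running the app during review, which is the commenter's point about post-release monitoring.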
 
Neither Apple nor Google sells the personal information they collect, unlike these app ad collection systems.
I never said Google sells them.
 
um ok, so when do we get the full list of apps affected?

and another thing, when is Apple going to stop enabling and facilitating the ease of obtaining fast food? WWSJD?
 
As a rule, I've avoided apps whose developers are Asian, primarily because Asian developers tend to generate a lot of copycat apps of poor quality. Their efforts are rarely serious; their apps don't evolve beyond version 1.x. Now I have another reason to boycott them.
 
"Its report claims most of the developers who used the SDK are located in China"

Didn't the last malware report mainly affect developers in China?
 
Exactly. Almost everyone mistakenly thinks that Apple can safely vet apps for malware. But malware can easily set a timer to only enable itself after App Store testing is over with.

This is why the best safety comes when third party groups constantly monitor apps for malware after they're released to an app store.

In the situation you just described, Apple could instead search for code that only enables itself after a set time.

People are right to think this is a complex arms-race between the malware creators and Apple, but there is nothing fundamental that can stop Apple from keeping up.

The worst malware is actually something no one has seen before: a whole new, fresh exploit that goes unnoticed long enough to cause serious damage, not some 'perfect loophole' that would be undetectable forever.

But there is no fundamental problem with detecting malware (nor malware that enables itself after a set time, etc.).
The static code analysis in recent Xcode versions points the way to how Apple is probably handling this.
It is complicated, and by its nature secretive; personally, I assume that part of the bitcode push in recent months could, at least in part, be meant to address this problem by gathering more information about the program before it is compiled into native instructions.
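To make the arms race concrete, here is a toy example (Python, with invented code snippets) of why a naive string-matching check is easy to defeat, and why deeper analysis of the sort described above matters:

```python
import re

# A naive static check of the kind a reviewer's tooling might run:
# does the source mention a forbidden API name anywhere? (Purely
# illustrative; real review tooling is not public.)
def mentions_api(source, api_name):
    return re.search(re.escape(api_name), source) is not None

clean = 'fn = lookup("uniqueIdentifier")'
obfuscated = 'fn = lookup("unique" + "Identifier")'

print(mentions_api(clean, "uniqueIdentifier"))       # → True: caught
print(mentions_api(obfuscated, "uniqueIdentifier"))  # → False: slips past
```

Splitting the string defeats the grep, so the checker has to model how strings are built, and the malware author then obfuscates one level deeper, which is exactly the arms race the comment describes.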
 
I won't download any app developed in China from now on. You guys just screwed yourselves out of a great opportunity with Apple.
 
It isn't. In Objective C it's possible to construct API calls at runtime, so there's no easy way to discover them using static code analysis. And you can implement various methods to try and avoid making the calls while the app is in the review process.

There needs to be a version of Little Snitch for iOS. Or, if they are testing these apps in the simulator, a copy of Little Snitch on that machine should be able to point out exactly what's going on. The list of things we find out Apple doesn't check keeps getting longer.
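A Little Snitch-style per-app connection log could be sketched like this (Python, with hypothetical app and host names; a real iOS version would need OS-level network hooks that third parties don't have):

```python
from collections import defaultdict

# Record every host each app contacts, so a user can later audit
# an app for unadvertised connections.
class ConnectionLog:
    def __init__(self):
        self._log = defaultdict(set)

    def record(self, app, host):
        """Note that `app` opened a connection to `host`."""
        self._log[app].add(host)

    def hosts_for(self, app):
        """All distinct hosts this app has ever contacted, sorted."""
        return sorted(self._log[app])

log = ConnectionLog()
log.record("FlashlightApp", "api.example-ads.cn")  # hypothetical names
log.record("FlashlightApp", "cdn.example.com")
print(log.hosts_for("FlashlightApp"))
```

An ad SDK phoning home to an unfamiliar host would stand out in such a list even if the app's advertised behavior looked fine.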
 
Apple should start banning these developers and their apps permanently. They need to have a zero tolerance policy for this stuff. Same goes for the XcodeGhost developers.
 
The way I look at it, it's impossible to stop everyone... at least at this point. But it's good to see Apple trying to stay on top of this stuff and keep iDevice users as safe as possible.
 
How are these apps getting the approval of Apple's curators? They had full-blown malware apps a few weeks ago, and now hundreds of apps infringing on user privacy.
So there were 250 apps out of 1.5 million. Ok
 
Why does Apple allow these private APIs to begin with? Is it not something they can disable to avoid this problem in the future? I mean, the reality is that you don't need the SDK to leverage the APIs; if you are an app developer, you could write code to call them directly. How is Apple monitoring for this?

Any set of functions you put together yourself can be called an API.

For example, you make a function that calculates how much wood a woodchuck could chuck if a woodchuck could chuck wood. You put this function in an API called WoodchuckStuff.

Now, others use your WoodchuckStuff API because they really need that woodchuck function.

Meanwhile, unbeknownst to any developers, your woodchuck function, in addition to giving out the wood count, also figures out the user's Apple ID and sends it back to the WoodchuckStuff API developer.

Apple's previous review processes never saw this banned functionality inside the API. Maybe they just looked at your code and never scrutinized third-party APIs as closely as they should have? Or maybe the API developer hid these changes very well. (Maybe the code requested a URL, downloaded code from there, and ran it after everything was compiled?)

Well, Apple found out, and now it's banned this API.
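The WoodchuckStuff scenario above, condensed into a runnable Python sketch (all names are invented, and plain lists stand in for device state and the vendor's server):

```python
# Stand-ins for private device state and the SDK vendor's collector.
DEVICE = {"apple_id": "user@example.com"}  # hypothetical identifier
COLLECTED = []                             # "server" the SDK reports to

def woodchuck_chuck(wood_density):
    """Advertised behavior: how much wood the woodchuck chucks."""
    count = int(wood_density * 42)
    # Unadvertised side effect: read an identifier the caller never
    # passed in and ship it off to the SDK vendor.
    COLLECTED.append(DEVICE["apple_id"])
    return count

print(woodchuck_chuck(1.5))  # → 63: the caller sees only the wood count
print(COLLECTED)             # meanwhile, the identifier has leaked
```

The app developer calling `woodchuck_chuck` sees nothing suspicious in their own code; the leak lives entirely inside the third-party function.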
 