
MacRumors

macrumors bot
Original poster
Apr 12, 2001


Apple and other smartphone manufacturers are resisting an Indian government proposal that would require them to hand over source code for security review, reports Reuters.

[Image: apple-india.jpg]

The proposal is included in a package of 83 security standards that India is considering as legal requirements, as part of Prime Minister Narendra Modi's efforts to boost security of user data following increases in online fraud and data breaches in the country.

Beyond routine measures like notifying the government of major updates and storing security audit logs, the standards would force manufacturers to hand over source code to government-designated labs to check for vulnerabilities.

Apple, Google, Samsung, Xiaomi, and industry group MAIT have all reportedly objected, citing a lack of global precedent and concerns about revealing proprietary details.

The standards were originally drafted in 2023 but are only now under government consideration. Tech company executives are expected to meet Tuesday to discuss the matter.

IT secretary S. Krishnan told Reuters the government will address legitimate concerns "with an open mind," adding it was "premature to read more" into the proposals.

The country's IT ministry also said it "refutes the statement" that it is considering seeking source code from smartphone makers, despite the requirement appearing in the government documents reviewed by Reuters.

A ministry spokesperson told the news organization it could not comment further due to ongoing consultation with tech companies on the proposals.

Apple in December resisted an Indian government directive that would require all iPhones sold in the country to ship with a preinstalled state-run security app. The government ultimately decided not to make the pre-installation mandatory for manufacturers after protests from privacy advocates, political opposition, and industry pushback.

Article Link: Apple Opposes India's Plan to Access iOS Source Code
 
"the standards would force manufacturers to hand over source code to government-designated labs to check for vulnerabilities".

Why would staff at a government-designated lab, coming in cold to a huge code base, be more likely to find vulnerabilities than Apple engineers - engineers working in an environment where security/privacy is one of the cornerstones of how Apple positions itself in the marketplace?

I suppose one argument against that could be that people can miss things when reviewing their own work - don't be the only one proofreading your own copy - but when I was in the industry, the places I worked at had mechanisms (group code reviews) to get other eyes on each developer's new code before it was committed. I confess I've no idea whether that is still standard practice in OS development teams, but I'd hope that it is.

Apple also offers "potential maximum rewards of over $5M" (https://security.apple.com/bounty/) for people identifying vulnerabilities. That, plus Apple's access to its general bug-reporting databases and the analytics data coming off phones, means Apple has far more input data than some government-designated lab that presumably isn't paying big bucks for experts to hunt for vulnerabilities, and isn't tracking device issues across the entire user base (at least for those users who have opted into analytics) in near real time.
 
Why would staff at a government-designated lab, coming in cold to a huge code base, be more likely to find vulnerabilities than Apple engineers - engineers working in an environment where security/privacy is one of the cornerstones of how Apple positions itself in the marketplace?
Are we really going to pretend third parties don't find security holes in software, even without having access to the source code? Come on now...

For those wondering why Microsoft isn't against this: Microsoft has had its Government Security Program for decades now, which allows governments a look at the source code of Windows, Office, Exchange, SQL Server, etc. Which, yeah, seems like a fair thing for a government to ask: "can we trust the software from companies outside our borders?"
 
The country that houses the world’s most cybercriminals, thieves and pirates wants to pirate and steal software because they can’t innovate their own good software. China must be laughing at India.

The US should slap even bigger tariffs and sanctions on India until they pay back every pensioner who was robbed by fake tech support.
 
This sounds like a gross overreach of what government is supposed to do. I really don't understand why India acts like this instead of trying to cooperate in the global economy.

The failing non-innovating EU set the example. Other nations such as UK, Russia and India followed.

China surprisingly has been very respectful in comparison because they know how to innovate and make their own software. Instead of trying to destroy and fine US companies, they build their own solutions. The worst they do is just not let companies like Meta run amok.
 
Is this the strategy where they just keep asking for worse and worse things, then after they've all been rejected, they go back to the terrible thing they originally asked for which now doesn't seem as bad anymore?
 
The country that houses the world’s most cybercriminals, thieves and pirates wants to pirate and steal software because they can’t innovate their own good software. China must be laughing at India.

The US should slap even bigger tariffs and sanctions on India until they pay back every pensioner who was robbed by fake tech support.
No. Tariffs hurt everybody and shouldn't be white-knighted because these requests (which India has a history of going back on) are so dumb.
 
"the standards would force manufacturers to hand over source code to government-designated labs to check for vulnerabilities".

Why would staff at a government-designated lab, coming in cold to a huge code base, be more likely to find vulnerabilities than Apple engineers - engineers working in an environment where security/privacy is one of the cornerstones of how Apple positions itself in the marketplace?

I suppose one argument against that could be that people can miss things when reviewing their own work - don't be the only one proofreading your own copy - but when I was in the industry, the places I worked at had mechanisms (group code reviews) to get other eyes on each developer's new code before it was committed. I confess I've no idea whether that is still standard practice in OS development teams, but I'd hope that it is.

Apple also offers "potential maximum rewards of over $5M" (https://security.apple.com/bounty/) for people identifying vulnerabilities. That, plus Apple's access to its general bug-reporting databases and the analytics data coming off phones, means Apple has far more input data than some government-designated lab that presumably isn't paying big bucks for experts to hunt for vulnerabilities, and isn't tracking device issues across the entire user base (at least for those users who have opted into analytics) in near real time.

I would also add that the kernel itself is open source, and Apple already has special access programs that give qualified security researchers access to rooted devices. There are of course tons of third-party audits and auditors they can submit themselves to, and already do.

I could see this as part of a carefully constructed partnership to review critical code, but it sounds a lot more like governments looking for/creating backdoors.
 