
India’s proposal to require smartphone manufacturers to share portions of device source code with government authorities has triggered a sharp debate over security, innovation, and market access. The draft requirement, framed as a measure to strengthen national cyber security and regulatory oversight, would apply to major global manufacturers operating in India, including Apple and Samsung Electronics.
Android’s Open-Source Reality
Explaining this, Yashodhan Sawardekar, a cyber-security analyst and PhD researcher at National Forensic Sciences University, says, “Nearly all mobile devices today run Android or iOS. Think of Android like a car design that’s widely available in every public library—the base source code called AOSP (Android Open Source Project) is already publicly available online. Phone manufacturers like Samsung, Xiaomi and OnePlus partner with Google to customise this code and add their own proprietary changes.”
These manufacturers develop exclusive UI skins (Samsung’s One UI, Xiaomi’s MIUI), custom camera algorithms, performance optimisations, GPU drivers, and unique gesture controls that give them a competitive edge in the market. These innovations represent years of engineering work and are closely guarded trade secrets, states Sawardekar.
Trade Secrets at Risk
What makes the government’s request problematic, according to him, is that Android’s security is already scrutinised by thousands of expert cyber-security researchers worldwide. Companies like Qualcomm and MediaTek work with Google’s security teams and with independent bug hunters who compete for cash rewards to find vulnerabilities. “This collaborative ecosystem has been working for over a decade. It’s like having the world’s best mechanics continuously testing and improving the car design that’s publicly available,” points out Sawardekar.
The government is essentially asking manufacturers to apply 83 security standards and hand over both public code and proprietary innovations to agencies that lack the technical expertise to meaningfully audit such complex systems. Worse, granting access creates new risks of code leakage from government systems. It is akin to asking Ferrari to share its engine designs with a local RTO office for safety verification, he opines.
Privacy and Global Concerns
However, this approach may make more sense for Apple’s iOS, which is fully closed-source. Unlike Android, iOS code cannot be independently examined by security researchers, so compelling Apple to open up could improve transparency, says Sawardekar.
Another Goan, Mangesh Sakordekar, who holds bachelor’s and master’s degrees in computer science, feels the move is a distraction tactic. “No company will agree to this. Closed-source code helps companies provide a unique user experience. If the code is open source, anyone can copy it and offer a clone,” he says.
Experts argue that companies invest heavily in software development and that ownership of such code remains proprietary. Sharing it with government agencies exposes it to multiple officials, increasing the risk of leaks or misuse. There are also concerns that this could lead to user data exposure, raising questions about privacy and data protection. “This poses a huge risk to the privacy of consumers,” notes an IT expert.
Others warn that the proposal sets a dangerous precedent. “Honestly, the government is pushing policies beyond its expertise,” says another expert, pointing out that no country—including China—mandates such requirements.