SMITH BRAIN TRUST – App makers delayed updating to a more secure version of Google’s Android platform just so they could continue to harvest and sell users’ personal data, finds new research from the University of Maryland’s Robert H. Smith School of Business. The research also reveals the tactic backfired for the devious app developers, who suffered hits to their reputations and revenue.
Siva Viswanathan, Dean's Professor of Information Systems and Digital Innovation, and Maryland Smith PhD candidate Raveesh Mayya looked at what happened when Google rolled out its Android 6.0 version with tighter privacy rules. Before the 6.0 version – dubbed “Marshmallow” – app developers could use blanket permissions that would give them access to sensitive personal information like location data, contact lists and photos.
“They sell the data and you never know who it’s being sold to and what it’s being used for,” Viswanathan says. “The key thing is, until very recently, with mobile apps you didn’t even have a choice. The only way to download and use any app was to accept all the permissions it sought.”
And users often weren’t reading that entire list of disclosures, Mayya says. “They were used to just scrolling to the bottom and clicking ‘accept.’” That gave app makers access to more information than consumers might be comfortable with.
But in recent years, as consumers have become much more sensitive to privacy issues (let’s not forget the Cambridge Analytica-Facebook firestorm), platforms have tightened access to user data. Google upgraded privacy with Android 6.0, which requires app makers to ask for individual permissions to sensitive data. Now users pick and choose what they allow apps to have access to.
Google gave Android app developers a three-year window to upgrade, offering the researchers an interesting opportunity to observe when app makers decided to upgrade, how they changed their privacy settings, and what happened afterward.
To do so, Mayya and Viswanathan installed over 13,600 apps and developed their own app that tracked all the other Android apps to see which ones changed their permissions each month and what they were asking for before and after upgrading.
The first thing the researchers noticed was that not all the apps were quick to upgrade, prompting them to figure out why.
“We didn’t expect apps to have a strategic intent to delay upgrading,” Viswanathan says. “We thought it would be more about operational issues or things like that. But when we see that apps of a certain type – the ones asking a lot of nonessential permissions to sell user data or display personalized advertisements – are the ones delaying, then you begin to see that there’s a strategic intent.”
Then the researchers looked at what happened to the apps after they finally upgraded to the tighter privacy rules.
They found that delaying the upgrade didn’t pay off in the long run. With the eventual upgrade, those apps couldn’t keep asking for unnecessary data to sell. And the fact that they took so long to upgrade didn’t sit well with consumers.
“Users are less likely to give you those permissions, but more importantly, they give you lower ratings,” Mayya says. “They become more sensitive.”
The apps that delayed upgrades ended up receiving more negative reviews from consumers, which pushed them further down the rankings in the Google Play Store, leading to fewer downloads of their apps.
“The apps that delayed upgrading paid a price,” Mayya says. “Their bad behavior didn’t pay off in the long run.”
Read more: “Delaying Informed Consent: An Empirical Investigation of Mobile Apps’ Upgrade” is a working paper.