3rd Party Mobile SDKs Make Dev Life Easy but Erode User Trust

A recent iOS scandal demonstrated how invasive a malicious SDK can be, and how much damage it can do to user privacy. This can happen without the user, or even the app developer, knowing or agreeing to it. We don't use third-party SDKs, and here's why.

When you build an app, 3rd party SDKs are incredibly attractive. Import a Google library into your project, add a few lines of code, and things just start working. It's that easy, and the utility is HUGE for the developer. Install Twitter's Fabric.io and you get an email every time your app crashes on someone's device, with all the details you need to fix it. Throw in Yahoo's Flurry and see how people use the app in real time: which screens they like, where interest drops off, how and when they use the app, etc. If you're marketing your app, use Facebook's developer SDK to track ad clicks all the way through the app store to app install, and even pay only when the app is installed. All of these SDKs are bundled with SaaS platforms that store the data, do the processing, and visualize it to make it instantly actionable.

But notice a pattern? Look at who is buying up app analytics startups. It's the huge names in tech, but not SaaS providers like SAP, IBM or Salesforce. It's data companies whose value lies in the insight their content provides them. These ad/analytics/tracking SDKs give them eyeballs into the pockets of the end user: your users, or you. Your users don't know what you've just put on their phone, and how would they, when it's probably disclosed only in the terms of use (hopefully)? When the developer writes 3 lines of code, the SDK gains all the permissions of the app itself. And the data it collects sits on the 3rd party's servers, not yours. For all intents and purposes, it now belongs to them.
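To make the "all the permissions of the app itself" point concrete, here is a minimal sketch in plain Java. `VendorAnalytics`, its fields, and the data it grabs are all hypothetical, and a desktop JVM stands in for the phone; the point is that a library initialized with one line runs in the same process as your app and can read anything your app can read, without being handed any data explicitly.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical third-party "analytics" library. The host app never passes it
// any data; because it runs in-process, it can simply take what it wants.
class VendorAnalytics {
    static final List<String> uploadedToVendor = new ArrayList<>();

    // The one line the developer is asked to call.
    static void init() {
        // Nothing stops the library from harvesting process-wide state:
        uploadedToVendor.add("user.name=" + System.getProperty("user.name"));
        uploadedToVendor.add("os=" + System.getProperty("os.name"));
        // On a phone this could just as easily be contacts, email addresses,
        // device IDs, or any file the app has permission to read.
    }
}

public class HostApp {
    public static void main(String[] args) {
        VendorAnalytics.init(); // the "few lines of code" in practice
        System.out.println("Now on the vendor's side: "
                + VendorAnalytics.uploadedToVendor);
    }
}
```

The app's sandbox and permission model protect the user from *other apps*, not from code the developer compiled in; once linked, the SDK is indistinguishable from first-party code.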

Get Facebook’s SDK up and running with 3 lines of code.

Recently, Apple blocked over 150 apps from the App Store at once. They all had an SDK in common, from a Chinese ad network named Youmi. That SDK was accessing sensitive information, such as user emails and device IDs, and reporting it back to Youmi. Apple usually has stringent app-review checks that catch this type of behavior; however, it appears Youmi was able to fool the examiners. I suspect that, much as VW diesels sensed they were in a test environment and reduced their emissions, Youmi's SDK sensed Apple's testing environment and shut down its malicious activity. That's just conjecture. What is not conjecture is that Youmi stole extremely sensitive user data without users, or even the developers, knowing it was happening. In general, as a user, the only way to find out which SDKs you've "opted in" to is to read the privacy agreement, terms of use, EULA, etc. of every app you have installed.

While Youmi was obviously not a reputable partner, its actions are bringing the behavior of other, more reputable SDKs into the spotlight. Since it is now clear that we don't know exactly what they are doing, it is also clear that we shouldn't necessarily trust them. Avoiding them, though, makes life extremely difficult for developers, because the alternatives are not nearly as refined. ACRA, for example, lets you catch crashes and run analytics on your own servers, but can take a good bit of tooling to get running. We searched for paid SaaS solutions that would give us the agility and insight of Google, Twitter or Yahoo while keeping the data in our own silo, and came up empty-handed. If you build a privacy-aware SaaS app-analytics platform, we'll be your first customers. Call us! Perhaps the Youmi scandal means we won't be the only ones. In the end, it will be the users of those 150 kicked apps who decide the consequences of using sketchy SDKs.
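The self-hosted approach ACRA takes can be sketched in a few lines of plain Java. This is not ACRA's actual API (real ACRA setup involves an annotated Application class and more tooling, which is the point about effort above); it is just the core idea: install an uncaught-exception handler and ship the crash report to a server you control, so the data never leaves your silo. The `Consumer<String>` sender is a stand-in for whatever transport you choose.

```java
import java.io.PrintWriter;
import java.io.StringWriter;
import java.util.function.Consumer;

// Minimal sketch of self-hosted crash reporting: catch every uncaught
// exception and hand the report to a sender *you* control (e.g. an HTTP
// POST to your own endpoint), instead of a third party's SDK backend.
public class CrashReporter implements Thread.UncaughtExceptionHandler {
    private final Consumer<String> sender;

    public CrashReporter(Consumer<String> sender) {
        this.sender = sender;
    }

    // Turn a crash into a plain-text report: thread name plus stack trace.
    public static String format(Thread t, Throwable e) {
        StringWriter sw = new StringWriter();
        e.printStackTrace(new PrintWriter(sw));
        return "thread=" + t.getName() + "\n" + sw;
    }

    @Override
    public void uncaughtException(Thread t, Throwable e) {
        sender.accept(format(t, e)); // the report stays in your own silo
    }

    // Call once at startup, before any crash can occur.
    public static void install(Consumer<String> sender) {
        Thread.setDefaultUncaughtExceptionHandler(new CrashReporter(sender));
    }
}
```

Swapping the sender for a real HTTP client gives you the crash-email workflow of a hosted SDK, minus the third party holding your users' data; what you give up is the dashboards, aggregation and symbolication those platforms bundle in.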

-dawud

@d4wud

Original post: http://www.blog.twosense-labs.com/sdks-bring-easy-utility-to-apps-at-the-cost-of-privacy-and-trust/