By Stuart Conway and Olga Spaic
Developers are natural-born innovators who aim to build cool, useful applications that users love. This often means packing a mobile application with every feature and bit of functionality the platform makes available. Accelerometer? Absolutely! Camera? Definitely! GPS? Yes, please! It's easy to forget that users must reveal personal information to use these features, so we must stay aware of the privacy implications our choices create for them. Given the attention Congress continues to focus on this issue, privacy must be a preeminent concern for developers.
So what's a developer to do? Design with efficiency in mind: ask only for the data that's absolutely necessary, and avoid slowing down your app with extraneous information. Your end users may not know it, but you will also be protecting their privacy. Efficient design will also help you steer clear of the privacy-legislation debate, which is fueled by a steady stream of security breaches.
In the wake of last year's discovery that Apple iPhones were collecting and storing location data in unsecured, easily hacked files (this security hole has since been fixed in iOS), Senators Al Franken (D-Minn.) and Richard Blumenthal (D-Conn.) introduced the Location Privacy Protection Act of 2011, which would require companies to obtain consent before collecting and sharing a user's location data. This year, Representative Ed Markey (D-Mass.) drafted a mobile-device privacy bill in response to the revelation that Carrier IQ software on smartphones could track users' keystrokes without their knowledge or consent. Several other initiatives specific to online privacy and targeted advertising have also been introduced, though the advertising industry is attempting to self-police via the Digital Advertising Alliance's (DAA) Self-Regulatory Program for Online Behavioral Advertising.
As developers, we must build checks into our software so that when users are about to compromise their privacy, we alert them to the risks. With the advent of social media, however, people have grown less sensitive about what they post or the information they allow our apps to share. And, as everyone knows, each subsequent click of "Allow" makes privacy warnings easier to ignore.
Even though the onus remains on users to protect their personal data, apps and websites have found numerous covert ways to access that data--ways users can't reasonably be expected to notice. Path's app, for example, was called out for tapping into users' personal address books without disclosing that it was doing so. This was in the apparently innocent interest of "adding friends" with whom users would share information via the app, but the real rub was that Path wasn't just accessing the data--it was storing it without user consent. When Path was called out, it apologized for the breach of privacy, promised to delete users' address book info--and was lauded for it after the fact!
Transparency is the better model. Take the now-defunct IE P3P policy files, for example. These guidelines tell the user, in plain, non-legalese language, what data is being collected, why it is being collected, how the data will be used, how long it will be retained and how it will be destroyed after the retention period. You should also include a Contact section where users can request to view the data the hosting entity has collected about them and ask for it to be destroyed.
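A disclosure like that can live in your app as structured data, so the same source drives both your privacy screen and your retention process. The sketch below is illustrative only--the field names, email address and retention period are assumptions, not part of P3P or any other standard.

```python
# A minimal, machine-readable privacy disclosure in the spirit of the
# checklist above. All field names and values here are illustrative
# assumptions, not part of any standard.
PRIVACY_DISCLOSURE = {
    "data_collected": ["email address", "approximate location"],
    "purpose": "Personalize nearby-friend suggestions.",
    "usage": "Processed on our servers; never sold to third parties.",
    "retention_days": 90,
    "destruction": "Records are permanently deleted after the retention period.",
    "contact": "privacy@example.com",  # where users can view or delete their data
}

def render_disclosure(d):
    """Render the disclosure as plain, non-legalese text for an in-app screen."""
    lines = [
        "We collect: " + ", ".join(d["data_collected"]),
        "Why: " + d["purpose"],
        "How it's used: " + d["usage"],
        "How long we keep it: {} days".format(d["retention_days"]),
        "What happens after that: " + d["destruction"],
        "Questions or deletion requests: " + d["contact"],
    ]
    return "\n".join(lines)

print(render_disclosure(PRIVACY_DISCLOSURE))
```

Keeping the disclosure in one structure means the text users see can never drift out of sync with what your retention job actually enforces.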
Developers should collect the minimum amount of data possible to implement the features of their application. It's tempting to collect everything available in anticipation of future product enhancements--even simple utility apps that ask for full network and GPS access. Keep it simple and collect only what you need.
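One way to keep yourself honest about data minimization is to derive your permission requests from the features a user has actually enabled, rather than requesting everything up front. This is a sketch under assumed names--the feature and permission strings below are hypothetical, not any platform's real identifiers.

```python
# Data minimization sketch: permissions are derived from enabled features,
# so an app never asks for more than its active functionality needs.
# Feature and permission names are hypothetical.
FEATURE_PERMISSIONS = {
    "photo_upload": {"camera"},
    "nearby_friends": {"coarse_location"},
    "turn_by_turn": {"fine_location", "network"},
}

def required_permissions(enabled_features):
    """Return only the permissions the enabled features actually need."""
    needed = set()
    for feature in enabled_features:
        needed |= FEATURE_PERMISSIONS.get(feature, set())
    return needed

# A simple utility app with one feature asks for one permission--not full
# network and GPS access "just in case."
print(sorted(required_permissions(["photo_upload"])))  # ['camera']
```

When a future enhancement ships, you add its entry to the table and the new permission request arrives with the feature that justifies it--not months before.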
Here are five simple steps app developers can take to protect consumer privacy:
- Have a compelling reason for collecting user data.
- Let users know what data you are collecting and what you are doing with it.
- Allow users to opt out of having their personal information collected, even if it means limiting functionality.
- Draw a line between PII (personally identifiable information) and generic user information.
- Have a plan. Where is sensitive information stored? Who has access to it? What is your destruction policy?
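Steps two and three above amount to a consent gate in code: no collection without a recorded opt-in, and a graceful fallback when the user opts out. The sketch below shows one way that might look; the class, purpose strings and placeholder location reading are all illustrative assumptions.

```python
# Consent-gate sketch: collection happens only after an explicit opt-in,
# and opting out degrades the feature instead of breaking the app.
# All names here are illustrative.
class ConsentStore:
    def __init__(self):
        self._grants = {}

    def record(self, purpose, granted):
        """Persist the user's explicit choice for a collection purpose."""
        self._grants[purpose] = granted

    def allows(self, purpose):
        # Default deny: no recorded choice means no collection.
        return self._grants.get(purpose, False)

def collect_location(consent):
    """Return a location reading only if the user has opted in."""
    if not consent.allows("location"):
        return None  # opt-out: the feature degrades, the app keeps working
    return (47.6, -122.3)  # placeholder reading for the sketch

consent = ConsentStore()
print(collect_location(consent))  # None until the user opts in
consent.record("location", True)
print(collect_location(consent))
```

The important design choice is the default-deny in `allows`: a user who has never seen the prompt is treated as having opted out, which is exactly the behavior steps two and three call for.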
Stuart Conway is the development lead and Olga Spaic is manager of analytics at Metia/Seattle--the North American headquarters of global agency Metia Group. Visit www.Metia.com and www.twitter.com/metiasea.