Selling Digital Fear

The crowded building’s not on fire. After an exhaustive investigation of the top 100 Facebook apps, the Wall Street Journal didn’t find any serious privacy violations. While sensationalizing online privacy dangers sure drives page views and ad revenue, it also impedes innovation and harms the businesses of honest software developers.

Reality has yet to stop media outlets from yelling about privacy, and because the WSJ writers were on assignment, they wrote the “Selling You On Facebook” hit piece despite thin findings. These kinds of articles can make mainstream users so worried about the worst-case scenario for their data that they don’t see the value they get in exchange for it.

“Selling You On Facebook” does bring up the important topic of how apps can utilize personal data granted to them by their users, but it overstates the risks. Yes, the business models of Facebook and the apps on its platform depend on your personal information, but so do the services they provide. That means each user needs to decide what information to grant to whom, and Facebook has spent years making the terms of this value exchange as clear as possible.

The sub-headline for the WSJ’s article is “Many popular Facebook apps are obtaining sensitive information about users—and users’ friends—so don’t be surprised if details about your religious, political and even sexual preferences start popping up in unexpected places,” but it’s not until the 10th paragraph that it mentions that apps “obtain” this information through a detailed data permissions process. Also, Facebook apps have been able to ask for this data since 2007. Wouldn’t it already be “popping up in unexpected places” by now if that were actually true?

When you go to install an app, you’re first shown a description of what the app does and what data it needs, such as your email address or your friends’ photos, and you’re able to select who can see your in-app activity. If the app requires more than biographical information and the option to publish to your wall, it has to show a second screen listing every type of data or ability it needs. It’s probably the most privacy-sensitive process for granting access to personal data on the Internet.
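For readers curious about the mechanics, here’s a minimal sketch of how a 2012-era app might build the OAuth dialog URL that triggers those permission screens. The app ID, redirect URI, and the specific permission names in the scope parameter are illustrative assumptions for that era, not taken from any particular app.

```python
# Rough sketch (not Facebook's SDK code) of constructing the permissions
# dialog URL. Anything beyond basic info in `scope` forces the second,
# itemized permissions screen described above.
from urllib.parse import urlencode

APP_ID = "YOUR_APP_ID"                          # hypothetical placeholder
REDIRECT_URI = "https://example.com/callback"   # hypothetical placeholder

params = {
    "client_id": APP_ID,
    "redirect_uri": REDIRECT_URI,
    # Example extended permissions (names are era-specific assumptions).
    "scope": "email,user_location,friends_birthday",
}

dialog_url = "https://www.facebook.com/dialog/oauth?" + urlencode(params)
print(dialog_url)  # the user reviews and approves permissions at this URL
```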

You can’t have apps that show you local concerts or let you send birthday cards to friends if the app doesn’t know your location or when your friends’ birthdays are. The WSJ and other media often harp that apps ask for more user data than they need. That’s actually against Facebook’s policy, and it hurts developers anyway, as each additional permission they ask for reduces install rates by 3%. Users are not breezing past long lists of permission requests. They notice, and in fact are clearly turned off by apps that appear greedy for data.
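To put that 3% figure in perspective, here’s a back-of-the-envelope sketch. Whether the reduction compounds per permission or is roughly additive isn’t specified, so the multiplicative model and the 40% baseline below are assumptions purely for illustration.

```python
# Illustrative model: each extra permission multiplies installs by (1 - 3%).
def expected_install_rate(base_rate: float, extra_permissions: int,
                          drop_per_permission: float = 0.03) -> float:
    """Estimated install rate after requesting `extra_permissions` more items."""
    return base_rate * (1 - drop_per_permission) ** extra_permissions

base = 0.40  # hypothetical baseline: 40% of users who see the dialog install
for n in (0, 1, 3, 5):
    print(f"{n} extra permissions -> {expected_install_rate(base, n):.1%}")
# Five extra permissions cut the example rate from 40% to about 34.3%.
```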

Indeed, friends can provide your data to apps without you being notified. But you can limit or shut that off if you want, and most people wouldn’t want a constant barrage of notifications about a friend who can already see your data accessing it through an app.

After its commendably deep analysis, all the WSJ could come up with was that a few small startups use unapproved ad networks to monetize their apps, and don’t properly spell out their privacy policy in writing. Facebook set the bar high by voluntarily establishing the approved ad network program and privacy policy requirements to provide users with additional protections.

This makes enforcement a challenge, but Facebook roots out offending apps through a three-tiered system of automated detection, a human review team, and the ability for users to flag violations. It’s not that Facebook “occasionally isn’t enforcing its own rules,” as the WSJ says; it’s that with 7 million constantly changing apps on its platform, it can’t always be immediately aware of more benign infractions like those the WSJ found.

There are real data privacy concerns out there, but most stem from users making too much data about themselves publicly available. I think it’s irresponsible for Facebook to default new users’ privacy to public for everything from photos and status updates to current city and work history. Controversial mobile apps like “Girls Around Me” can use this public data to enable some shady activity, but that’s different from the apps we’re discussing, where we and our friends provide our private data.

The fact is that businesses have been requesting personal information from their customers for hundreds of years. How do you think the Wall Street Journal knows where to deliver your newspaper? It asks for your home address. Facebook actually prohibits apps from selling this type of data to other companies — something print publishers have long been known to do. That’s why your home mailbox is full of junk.

By drumming up privacy concerns, mainstream media is holding back the future and the companies trying to bring it to us. Yes, change can be scary, but it can also be beneficial. We’re seeing cumbersome necessities of offline business, like filling out contact information cards by hand, get streamlined into a few clicks as they move online. But forget that. An accompanying WSJ article walks you through deleting all your Facebook apps and turning off the platform entirely so you never have to be bothered with useful services again.

Here’s a concrete example of what we’re missing out on because of this fear. Last year, Facebook wanted to allow applications to ask for your mobile phone number and home address. With those permissions, apps could have notified you by text message when friends were nearby, or let you instantly fill out shipping information for e-commerce purchases. But instead, the media went crazy, sure it would lead to waves of SMS spam and home invasions. Politicians started bleating against the options too, and Facebook had to retreat.

Now we still can’t choose whether to grant Facebook apps information that online and offline marketers ask us for all the time. So instead of the human connection and faster shopping that could help the economy, fear trumped innovation and we got no improvements.

It’s time we start thinking critically about what makes us uncomfortable. If there are serious dangers that aren’t being policed, let’s shut them down. But if something scares us just because it’s new, let’s weigh the risks against the benefits, and put down the pitchforks and torches.

[Image Credit: Luke Romyn]