By Mark Jamison, Director and Gunter Professor, Public Utility Research Center
This post is part of a series on data portability and interoperability.
Last month, lawmakers demanded answers from Google about its practice of letting outside developers scan the inboxes of millions of Gmail users who signed up for email services using other apps. That same day, UK regulators said they intended to hit Facebook with the maximum fine of 500,000 pounds for allowing the political consulting firm Cambridge Analytica to harvest information from millions of people without their consent. Meanwhile, the Federal Trade Commission has at various points taken action against both Google and Facebook, along with Twitter and Microsoft, for allegedly violating privacy rules. Around the world, Congress and parliaments have held hearings on the topic, with more coming.
If users of these products feel like Truman Burbank, the protagonist of the film The Truman Show who discovers his whole life has been captured by hidden cameras, it’s for good reason. Too many social media companies have allowed people to remain ignorant of how they’re being watched and who’s doing the watching.
Amid this environment, some social media companies are examining whether, and under what conditions, they should share data between one another or with other software developers. In industry terminology, their question is: should our systems be more open or closed? Such decisions are critical because they determine how massive amounts of personal information flow through our economy and under what conditions. For the companies, it’s fundamental to their business models.
The more open a system, the easier it is for outside app developers to access another platform’s data. Companies like Yelp, for example, have weighed in favoring openness requirements, arguing that Google should use companies like Yelp for some searches and that Facebook should share more data with smaller companies.
Having a more closed system, on the other hand, means less access for third parties. It has the benefit of better safeguarding people’s information by ensuring it remains only with the platform they use. This is one of the key public policy arguments for closed systems: they put more control in the hands of users, so their information isn’t shared in ways they didn’t approve or realize. A sense of privacy takes precedence.
But there is another argument for closed systems that often gets overlooked amid the focus on privacy. In many cases, it may actually be better for businesses too, especially start-ups.
That isn’t the conventional wisdom. Indeed, many businesses feel like Yelp – they want the information spigot open. The theory is that it lowers costs for rival companies, enabling them to create innovations that might benefit consumers. Plus, it spurs entrepreneurs to create more companies, also potentially good for consumers.
It’s true that open systems can lower costs for a startup because they give companies with few resources data sets that rivals have built.
But does that always translate into more and better innovation? Is it the best means of creating competition for the tech giants and more choice for consumers?
Not necessarily so.
Suppose you are launching a new restaurant review app as an open system. All of the information you gain from users leaks seamlessly to other apps, and eventually to your competitors – such as Yelp. As a result, the only effective difference between your offering and that of your competitors, small and large, is the attractiveness of your user interface.
If users prefer your site, you will draw a larger audience and be able to sell more ads. But your open systems approach means that you cannot target ads any better than your rivals – because they have the same customer data you have.
Now suppose that you instead decide to launch a more closed platform. Now your user interface, knowledge, and ability to serve restaurant needs are all unique. Not only do you sell more ads, you can use that money to innovate and improve the experience for users.
Of course, as a startup, you would love to have access to the data of your larger competitors. But it isn’t essential if you build a better product. Facebook didn’t get access to Myspace’s data: Facebook outdistanced Myspace by focusing first on college students and including features that allowed them to interact. Likewise, Google didn’t have access to the data of Yahoo!, the one-time victor of the search wars. Instead, Google developed search results that customers valued and quickly surpassed all other search engines. As startups, Facebook and Google largely kept rivals from accessing their data, enabling the two former fledglings to quickly flourish with advertising capabilities never seen before.
In other words, it was the closed systems that led to more variation among products, and more choice for consumers.
Should people be able to take their data with them?
One aspect of the ‘open or closed’ debate involves how portable a user’s information should be. An open system is more apt to allow users to transfer their information to another site or app, or simply remove it completely. Some pundits believe portability should be required of all social media companies in the name of consumer rights and enhanced competition.
Users often say they like portability too. They feel less captive to one platform, and they think it’s only fair that they be able to take their posts and photos with them to a new product.
If customers like portability and companies want to provide it, then it’s a win-win. But legislating portability in the name of enhancing competition is problematic and far more complicated than policymakers may realize.
One issue is defining what data can be ported. People advocating mandatory portability sometimes gloss over the tough details by simply saying users should be able to port “their” data. But is a user’s data limited to what the user has posted, or does it include what other users have said about the user? And what impact will the porting of one user’s data have on the privacy of other users? Does one user have the right to take photos with other people in them who didn’t agree to have their images ported off?
And does what’s ported include information a social media site has observed about the user? Some sites, for example, track users’ web browsing and location histories. Is that part of the package?
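The line-drawing problem described above can be made concrete with a small sketch. The categories and field names below are purely illustrative assumptions for this post, not any platform’s actual export schema:

```python
# Hypothetical sketch of the data categories a portability mandate
# would have to draw lines between. Illustrative only.
from dataclasses import dataclass, field


@dataclass
class PortabilityExport:
    # Data the user created directly: posts, photos, profile fields.
    user_created: list[str] = field(default_factory=list)
    # Data other users created about this user: comments, tags,
    # group photos -- porting these raises other users' privacy questions.
    about_user_by_others: list[str] = field(default_factory=list)
    # Data the platform observed or inferred: browsing history,
    # location trails, ad-interest profiles.
    platform_observed: list[str] = field(default_factory=list)

    def clearly_portable(self) -> list[str]:
        # Only the first category unambiguously belongs to the user
        # alone; the other two are exactly the contested cases.
        return list(self.user_created)


export = PortabilityExport(
    user_created=["my_review.txt", "profile_photo.jpg"],
    about_user_by_others=["friend_tagged_photo.jpg"],
    platform_observed=["location_history.json"],
)
print(export.clearly_portable())
# → ['my_review.txt', 'profile_photo.jpg']
```

Even this toy model shows the issue: a mandate that says users may port “their” data must decide which of the three buckets count, and only one of them is uncontroversial.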
And it is far from certain that portability empowers smaller rivals relative to larger incumbents. The machine learning that social media companies use for targeting content isn’t unlearned when data exit the system.
As is often the case when policymakers suggest mandating a business practice, the advocates don’t think through the effects.
Facing up to the disillusioned
Unfortunately, it may be too late to avoid some of the regulatory and legislative fallout. But the options being considered – open vs. closed systems and data portability – seem to miss the impetus for the user outcries. Users are upset because they’re surprised and dismayed by revelations about how tech companies use people’s data. They feel they lost something that was rightfully theirs.
Social media companies should meet this disillusionment head on. Accept the public outcry as a mandate for change before governments overstep. Engage with users openly. Tell them what has been going on and what is currently going on in ways that respect their time and intelligence. And remind people about the true relationship between the tech company and user. Notice I don’t call the users “customers.” They are not. They’re critical suppliers to companies that would not exist without them and they should be treated with respect.
Jamison is a Visiting Scholar with the American Enterprise Institute (AEI). He is also the Gunter Professor of the Public Utility Research Center (PURC) at the University of Florida. Disclosure statement: He provided consulting for Google in 2012 regarding whether the search engine is a public utility.