FTC Agreement Brings Rigorous New Standards for Protecting Your Privacy

By Colin Stretch

Update on July 24, 2019 at 12:16PM PT: Mark Zuckerberg discussed these changes at a company-wide event this morning. Here are some of his remarks.

Originally published on July 24, 2019 at 5:44AM PT:

After months of negotiations, we’ve reached an agreement with the Federal Trade Commission that provides a comprehensive new framework for protecting people’s privacy and the information they give us.

The agreement will require a fundamental shift in the way we approach our work, and it will place additional responsibility on people building our products at every level of the company. It will mark a sharper turn toward privacy, on a different scale than anything we’ve done in the past.

The accountability required by this agreement surpasses current US law and, we hope, will be a model for the industry. It introduces more stringent processes to identify privacy risks, more documentation of those risks, and more sweeping measures to ensure that we meet these new requirements. Going forward, our approach to privacy controls will parallel our approach to financial controls, with a rigorous design process and individual certifications intended to ensure that our controls are working, and that we find and fix issues when they are not.

In reaching this settlement, we have also agreed to pay a $5 billion penalty — multiple times what any previous company has paid the FTC — in order to resolve allegations that we violated our 2012 consent order.

The FTC’s investigation was initiated after the events around Cambridge Analytica last year. Our handling of this matter was a breach of trust between Facebook and the people who depend on us to protect their data. This agreement is not only about regulators; it’s about rebuilding trust with people.

Over the past year we’ve made significant strides on privacy. We’ve given people more control over their data, closed down apps, and applied more resources to protecting people’s information.

But even measured against these changes, the privacy program we are building will be a step change in terms of how we handle data. We will be more robust in ensuring that we identify, assess and mitigate privacy risk. We will adopt new approaches to more thoroughly document the decisions we make and monitor their impact. And we will introduce more technical controls to better automate privacy safeguards.

As part of this effort, we will be undertaking a review of our systems. We expect this process will surface issues — that’s part of its purpose. When it does, we will work swiftly to address them.

Just this month, and in response to the FTC investigation, we discovered that shortcomings in our systems allowed some partners to continue accessing data in order to provide Facebook features on their products. While we found no abuse, the new agreement will help guard against such risks going forward. We will also be more diligent in how we monitor for abuse, and we’ll require developers to be accountable for how they use data and to comply with our policies.

Transparency and accountability will be two driving concepts. We will have quarterly certifications to verify that our privacy controls are working. And where we find problems, we will make sure they’re fixed. The process stops at the desk of our CEO, who will sign his name to verify that we did what we said we would.

We will also have a new level of board oversight. A committee of Facebook’s board of directors will meet quarterly to ensure we’re living up to our commitments. The committee will be informed by an independent privacy assessor whose job will be to review the privacy program on an ongoing basis and report to the board when they see opportunities for improvement.

These efforts will occur under the watchful eye of the FTC and the US Department of Justice. The order imposes a number of reporting obligations to the Commission, which ensure that the FTC and the Justice Department will have clear lines of sight at any given point into how effectively we’re meeting our responsibilities.

Even with these new measures in place, we know we can’t fix all these challenging issues by ourselves. To address this, we will formalize and expand our efforts to gain input from experts outside the company.

Today, we also resolved an ongoing investigation by the Securities and Exchange Commission. The SEC alleged that we should have had better processes in place to ensure disclosure to investors of data abuse like what occurred with Cambridge Analytica. The SEC also alleged that, after we learned in late 2015 that a developer had transferred data to Cambridge Analytica in violation of our policies, we should have said more about this abuse in our investor disclosures. We share the SEC’s interest in ensuring that we are transparent with our investors about the material risks we face, and we have already updated our disclosures and controls in this area. As part of the settlement with the SEC, we agreed to pay a $100 million penalty.

We have heard that words and apologies are not enough and that we need to show action. By resolving both the SEC and the FTC investigations, we hope to close this chapter and turn our focus and resources toward the future.

Billions of people around the world use our products to make their lives richer and to help their organizations thrive. That makes it especially important that the people who use our platform can trust that their information is protected. This agreement is an unambiguous commitment to do that.
