Parmy Olson

How Frances Haugen Left Mark Zuckerberg Speechless

The woman behind Facebook’s most damning-ever leak of internal documents has a name: Frances Haugen.

On Monday, ahead of Facebook’s worst site-wide outage in years, details about Haugen emerged. She was a product manager on the company’s “civic integrity team,” where she systematically copied tens of thousands of internal documents to share with the US Securities and Exchange Commission, members of Congress and the Wall Street Journal before leaving in May. It could turn out to be the most consequential act in Facebook’s corporate history.

This was no impulsive act. Haugen, 37, armed herself with lawyers and provided reams of documents for a WSJ series about Facebook’s harms. In her first television interview, on Sunday, she explained succinctly why Facebook’s algorithms are harmful. In another interview, on The Journal podcast, posted Monday, she gave clear prescriptions for what could be done: don’t break up Facebook, but do hire more people to audit and guide the content the company shows to more than 1.6 billion people every day.

Haugen no doubt has a tsunami of legal and corporate blowback headed her way. But Facebook will struggle to discredit someone who not only speaks well, but holds a Harvard MBA and is so well-versed in how algorithms are built that she has patents to her name.

Haugen’s document dump revealed what many suspected but couldn’t prove: that Facebook created secret, more lenient rules for elite users, that Instagram made body-image issues worse for one in three teen girls, and that Facebook knowingly amped up outrage on its main site through a 2018 algorithm change, potentially contributing to the Jan. 6 storming of the US Capitol building.

Regulators have so far been at a loss over how to deal with Facebook, but Haugen’s cool-headed suggestions, coupled with internal details of how Facebook’s systems are set up, could provide a clearer way forward. She stresses that breaking up Facebook would be a mistake because it would starve the individual parts of the conglomerate of the resources needed to stem harmful content. Instead, the company needs far more people to audit and guide content across the platform.

While Facebook claims it’s putting real resources into exactly that kind of policing, her account suggests the opposite. Her civic integrity unit, a team of 200, was woefully under-resourced and was eventually dissolved by Facebook management, she says.

Haugen’s assertion that the algorithms are underperforming is a well-rehearsed argument (including here), but she has an enormous cache of documentation to back it up. And these aren’t just Facebook’s problems, she notes, but problems with “engagement-based ranking” in general.

Her biggest wish, she says, is for real transparency. Imagine if Facebook published daily data feeds on its most viral content, she says. “You’d have YouTubers analyzing this data and explaining it to people.” That point should add fuel to upcoming regulations like the European Union’s proposed AI Act, which is designed to force companies to unpick the code underpinning their AI algorithms for regulators.

While the 2018 revelations about Cambridge Analytica resulted in a fine, regulators ultimately left the social media giant alone, and its shares climbed steadily. This time is likely to be different, not least because of the changes in the White House and Congress since then. US lawmakers recently introduced five antitrust bills targeting the outsized power of Big Tech. In addition to her trove of documents, Haugen offers lawmakers and regulators deep insider knowledge.

She describes herself as an algorithm-ranking specialist who has worked at four social networks, including Alphabet Inc.’s Google and Pinterest Inc., and understands the intricacies of how computer code chooses what content people see. Her whistleblowing is all the more powerful for both her background and the sober approach she took. Going first to a newspaper known for even-handed corporate reporting insulates her from charges that she’s on an ideological mission.

At Facebook, Haugen says she attended regular meetings where staff would share their struggles to stop viral posts that showed beheadings, or posts that compared certain ethnic groups to insects. She ultimately concluded that underinvestment in safety was baked in at Facebook and virtually impossible to change.

Bloomberg