Ezra Klein

TikTok May Be More Dangerous Than It Looks

At the core of the frenzied interest in Elon Musk’s acquisition of Twitter is an intuition that I think is right: The major social media platforms are, in some hard-to-define way, essential to modern life. Call them town squares. Call them infrastructure. They exist in some nether region between public utility and private concern. They are too important to entrust to billionaires and businesses, but that makes them too dangerous to hand over to governments. We have not yet found a satisfying answer to the problem of their ownership and governance. But some arrangements are more worrying than others. There are fates worse than Musk.

TikTok, as we know it today, is only a few years old. But its growth is like nothing we’ve seen before. In 2021, it had more active users than Twitter, more US watch minutes than YouTube, more app downloads than Facebook, more site visits than Google. The app is best known for viral dance trends, but there was a time when Twitter was 140-character updates about lunch orders and Facebook was restricted to elite universities. Things change. Perhaps they have already changed. A few weeks ago, I gave a lecture at a Presbyterian college in South Carolina, and asked some of the students where they liked to get their news. Almost every one said TikTok.

TikTok is owned by ByteDance, a Chinese company. And Chinese companies are vulnerable to the whims and the will of the Chinese government. There is no possible ambiguity on this point: The Chinese Communist Party spent much of the last year cracking down on its tech sector. They made a particular example out of Jack Ma, the high-flying founder of Alibaba. The message was unmistakable: Chief executives will act in accordance with party wishes or see their lives upended and their companies dismembered.

In August 2020, President Donald Trump signed an executive order insisting that TikTok sell itself to an American firm or be banned in the United States. By the fall, ByteDance was looking for a buyer, with Oracle and Walmart the likeliest suitors, but then Joe Biden won the election and the sale was shelved.

In June, Biden replaced Trump’s executive order, which was sloppily written and being successfully challenged in court, with one of his own. The problem, as Biden’s order defines it, is that apps like TikTok “can access and capture vast swaths of information from users, including United States persons’ personal information and proprietary business information. This data collection threatens to provide foreign adversaries with access to that information.”

Let’s call this the data espionage problem. Apps like TikTok collect data from users. That data could be valuable to foreign governments. That’s why the Army and Navy banned TikTok from soldiers’ work phones, and why Senator Josh Hawley wrote a bill to ban it on all government devices.

TikTok is working on an answer: “Project Texas,” a plan to host data for US customers on US servers, and somehow restrict access by its parent company. But as Emily Baker-White of BuzzFeed News writes in an excellent report, “Project Texas appears to be primarily an exercise in geography, one that seems well positioned to address concerns about the Chinese government accessing Americans’ personal information. But it does not address other ways that China could weaponize the platform, like tweaking TikTok algorithms to increase exposure to divisive content, or adjusting the platform to seed or encourage disinformation campaigns.”

Let’s call this the manipulation problem. TikTok’s real power isn’t over our data. It’s over what users watch and create. It’s over the opaque algorithm that governs what gets seen and what doesn’t.

TikTok has been thick with videos backing the Russian narrative on the war in Ukraine. Media Matters, for instance, tracked an apparently coordinated campaign driven by 186 Russian TikTok influencers who normally post beauty tips, prank videos and fluff. And we know that China has been amplifying Russian propaganda worldwide. How comfortable are we with not knowing whether the Chinese Communist Party decided to weigh in on how the algorithm treats these videos? How comfortable will we be with a similar situation in five years, when TikTok is even more entrenched in the lives of Americans, and the company feels freer than it does today to operate as it pleases?

Imagine a world in which the United States has a contested presidential election, as it did in 2020 (to say nothing of 2000). If one candidate were friendlier to Chinese interests, might the Chinese Communist Party insist that ByteDance give a nudge to content favoring that candidate? Or if it wanted to weaken America rather than shape the outcome, maybe TikTok begins serving up more and more videos with election conspiracies, sowing chaos at a moment when the country is near fracture.

None of this is far-fetched. We know that TikTok’s content moderation guidelines clamped down on videos and topics at the Chinese government’s behest, though it says its rules have changed since then. We know that other foreign countries — Russia comes to mind — have used American social networks to drive division and doubt.

It is telling that China sees such dangers as obvious enough to have built a firewall against them internally: They’ve banned Facebook and Google and Twitter and, yes, TikTok. ByteDance has had to manage a different version of the app, known as Douyin, for Chinese audiences, one that abides by the rules of Chinese censors. China has long seen these platforms as potential weapons. As China’s authoritarian turn continues, and as relations between our countries worsen, it is not far-fetched to suspect they might do unto us what they have always feared we would do unto them.

“No analogies are perfect, but the closest analogy I can think of is to imagine if the Brezhnev-era Soviet Union had decided to plow some of its oil export profits into buying up broadcast television stations across the US,” my former colleague Matthew Yglesias wrote in his newsletter, Slow Boring. “The F.C.C. wouldn’t have let them. And if the F.C.C. for some reason did let them, the Commerce Department would have blocked it. And if a judge said the Commerce Department was wrong and control over the information ecosystem didn’t meet the relevant national security standard, Congress would have passed a new law.”

As analogies go, I think that’s a good starting point. But if the Soviet Union had bought up local television stations across the nation, we’d know they had done it, and there’d be an understanding of what those stations were, and what they were attempting, just as was true with Russia Today. The propaganda would be known as propaganda.

TikTok’s billion users don’t think they’re looking at a Chinese government propaganda operation because, for the most part, they’re not. They’re watching makeup tutorials and recipes and lip sync videos and funny dances. But that would make it all the more powerful a propaganda outlet, if deployed. And because each TikTok feed is different, we have no real way of knowing what people are seeing. It would be trivially easy to use it to shape or distort public opinion, and to do so quietly, perhaps untraceably.

In all of this, I’m suggesting a simple principle, albeit one that will not be simple to apply: Our collective attention is important. Whoever (or whatever) controls our attention controls, to a large degree, our future. The social media platforms that hold and shape our attention need to be governed in the public interest. That means knowing who’s truly running them and how they’re running them.

I’m not sure which of the social network owners currently clear that bar. But I’m certain ByteDance doesn’t. On this, Donald Trump was right, and the Biden administration should finish what he started.

The New York Times