Cathy O'Neil

Reputation Scores on Facebook?

Facebook just came up with a really creepy idea: To stem the flow of disinformation, the social network has started assigning reputation scores to users. If you post stuff that the company deems suspicious, your score will fall and your posts will be filtered out of other people's news feeds.

Dystopic as this might sound, I'm for it. But I think people should have the power to decide how heavily the filtering applies to their own feeds.

Yes, there are valid complaints. Facebook is far from the ideal arbiter of truth. Its scoring system is so opaque that we won’t know exactly what it means by “suspicious.” The bad actors that the company wants to suppress will also be the most capable of gaming the system.

Still, I maintain that regular, non-predatory users should want this system implemented, and pronto. We’re sick of terrible behavior and low-quality content. We don’t want to be trolled or manipulated. This might help.

Plus, there’s a simple way for Facebook — and Twitter — to limit the downside and give users some control: Just include a knob that allows them to turn the filter up or down. If it’s set at zero, they’ll see everything. At ten, posts from “suspicious” users will disappear completely. The default level could be five. On days when we feel like eating trolls for breakfast, we’ll turn our knobs all the way down. On other days we’ll hang out with our friends, thanks anyway.
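To make the knob concrete, here is a minimal sketch of the kind of threshold filter it implies, written in Python. Everything in it, from the per-author reputation score to the 0-to-10 mapping, is my own hypothetical illustration; Facebook has not said how its scores actually work.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    author_reputation: float  # hypothetical score in [0.0, 1.0]; lower means more "suspicious"

def filter_feed(posts, knob):
    """Hide posts whose author falls below the knob's threshold.

    knob runs from 0 (show everything) to 10 (hide anything short of a
    spotless reputation). A setting of k hides authors scoring below k / 10.
    """
    threshold = knob / 10.0
    return [p for p in posts if p.author_reputation >= threshold]

feed = [
    Post("friend", "Vacation photos!", 1.0),
    Post("clickbait", "You won't BELIEVE this...", 0.55),
    Post("troll_farm", "Totally real news", 0.10),
]

print([p.author for p in filter_feed(feed, knob=0)])   # ['friend', 'clickbait', 'troll_farm']
print([p.author for p in filter_feed(feed, knob=5)])   # ['friend', 'clickbait'] (default setting)
print([p.author for p in filter_feed(feed, knob=10)])  # ['friend'] (troll-free day)
```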

The knob would also offer some insight into how the system works, because users would see who disappears when they turn it up. This would allow people to report false positives: posts or comments that were flagged as suspicious but that they think are actually OK.

Reputation scores are nothing new. Credit-card companies constantly scan our purchases for attributes that look suspicious, and they deny or at least pause the ones they deem potentially fraudulent. Facebook is just applying that same kind of screening to social-media posts. Spam filters aim to weed the junk out of email accounts, and they work quite well. Imagine having to look at your unfiltered inbox, and you'll get a sense of how we might one day look back on this era of unfiltered social media. What garbage!

Just to be clear, the drawbacks of filtering are real. Although the knob would alleviate the creep factor, false positives and false negatives will still happen: Some bad stuff will get through, and some good stuff will get filtered out. It will never be perfect. But we have to compare it with the status quo. An imperfectly filtered information spigot is still better than one controlled by professional liars.

We’re in an arms race of disinformation, and the bad guys are winning handily and cheaply. Let’s make things harder for them.

Bloomberg