Unattended algorithms

Neal Stephenson has written a new book. In an interview with PC Magazine, he talks about the problems with social media:

We’ve turned over our perception of what’s real to algorithmically driven systems that are designed not to have humans in the loop, because if humans are in the loop they’re not scalable and if they’re not scalable they can’t make tons and tons of money.

I think more social networks should do things that don’t scale, prioritizing safety over profit. For example, in Micro.blog the featured posts in Discover are curated by humans instead of algorithms.

Significant parts of Facebook and Twitter run unattended. There was more pushback against Facebook last week after they refused to remove the edited Nancy Pelosi video, choosing instead to try to educate users that the video was fake. Monika Bickert, Facebook’s vice president of global policy management, went on CNN to defend their decision. She described how Facebook works with fact-checking organizations to independently confirm whether something is accurate:

As soon as we get a rating from them that content is false, then we dramatically reduce the distribution of that content, and we let people know that it is false so they can make an informed choice.

I signed in to Facebook to try to understand what they had done. I actually had trouble finding the video at first, maybe because none of my friends on Facebook had shared it. Searching for Nancy Pelosi did turn up Facebook groups such as “Nancy Pelosi is Insane” and “Americans Against Nancy Pelosi”, featured prominently in the results. I finally found the video, but there was no callout that it was fake. (The version I saw was captured with a camera pointed at the video playing elsewhere, likely confusing Facebook’s algorithm for finding an exact copy of the video.)

Facebook is trying to solve this problem, but it’s a band-aid on a system that is working as designed to surface “relevant” content for more ad views. It’s not enough.
