Twitter made an announcement today about stopping abusive accounts and hiding low-quality tweets. Filtering search results in particular strikes me as a step in the right direction:
We’re also working on ‘safe search’ which removes Tweets that contain potentially sensitive content and Tweets from blocked and muted accounts from search results. While this type of content will be discoverable if you want to find it, it won’t clutter search results any longer.
As I work on Micro.blog, I’ve tried to be mindful of where users can stumble upon posts that they don’t want to see. Replies is a big one, and I’ll be focusing most of my attention on that. But search, trends, and hashtags are also a problem, because they let anyone’s posts bubble up to a much wider audience. I’m launching Micro.blog without them.
I’ve been following Seth Godin and reading his books for many years, but recently two of his statements caught my attention. The first is an older video episode with Gary Vaynerchuk, where Seth talks about why he has no presence on social media except automatic cross-posting of his blog posts.
The second is equally relevant to what I’ve been thinking about with Micro.blog. Seth says that we’ve surrendered control over how our software works to algorithms instead of human decision-makers who can take responsibility for mistakes. It’s too easy to blame the computer:
That person who just got stopped on her way to an airplane—the woman who gets stopped every time she flies—the TSA says it’s the algorithm doing it. But someone wrote that code.
Algorithms are a shortcut. They should give us more leverage to go further, faster, not dictate where we go.
The social web is now permeated with algorithms. Today, Twitter again promoted what’s trending higher up in their app. That may be a step in the wrong direction. Trends can sometimes surface the better parts of Twitter, but they’re also an invitation to view the worst possible tweets you’ll ever see.
Let’s not be afraid to add curation by humans. That’s not an admission of failure. It’s an acknowledgement that algorithms are imperfect.
Software has consequences. How it’s designed informs what behavior it encourages. If it’s built without thought to these consequences, it will succeed only by accident. For 2017, one of my goals is to slow down and be more deliberate about features that can have this kind of impact.