Dark Forest of the Web Jeremy Keith follows up on fighting AI bots, quoting a couple things I’ve said.
He closes with, quote, I agree with these statements in isolation.
Maybe what we disagree on is whether AI is inherently destructive to the web, so all AI bots should be stopped, or whether we can more narrowly minimize AI slop from spreading.
Even without AI, Google refers to blogs as awesome and going down, with Nilay Patel arguing that we are heading to Google Zero.
In other words, Google is already taking more from the web than they’re giving back.
The solution to that is Google alternatives that get us back to the style old-school search engines, 10 blue links, with a focus on real blogs and news sites, weeding out content farms and other spam shenanigans.
We have spammers creating accounts on micro.blog every day, trying to pollute the open web.
It’s depressing.
I want to create more tools that highlight human-generated content, like the audio narration we added.
Jeremy didn’t quote one of my responses about trying to insert text in the post to confuse bots, so I’ll add it here for completeness.
I replied with, quote, There’s got to be a better way to address this.
End quote.
I viewed Source to see how Jeremy is handling this on his blog.
His technique doesn’t appear to be causing any problems with micro.blog’s bookmarking, which saves a copy of the text in a blog post for reading later, because the prompt injection is outside the article tag and H entry for the post.
But it’s not hard to imagine a well-behaved, non-AI bot getting tripped up by this.
I don’t think technological determinism is an appropriate summary of my thoughts.
There are a bunch of questions to resolve around generative AI for sure, including robots, but there’s a lot of potential good, too.