Assuming the crawler operators don't just patch this with a simple depth limit, this is a good way to waste your own server resources and money while hurting site performance for everyone else, ensuring that the only visitors left on your site are the bots. So far, all of these "anti-AI" projects have been either nothing-burgers or self-imposed malware.
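And the patch really is that trivial: a depth counter and one comparison. A minimal sketch, assuming a toy recursive crawler (nothing here is any real crawler's code, and the names are made up):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

MAX_DEPTH = 5  # the whole "patch": one constant and one comparison

class LinkParser(HTMLParser):
    """Collect every href on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [v for k, v in attrs if k == "href" and v]

def crawl(url: str, depth: int = 0, seen: set = None):
    seen = set() if seen is None else seen
    if depth > MAX_DEPTH or url in seen:
        return  # a tarpit can offer infinite links; this line shrugs them off
    seen.add(url)
    parser = LinkParser()
    try:
        parser.feed(urlopen(url, timeout=10).read().decode("utf-8", "replace"))
    except OSError:
        return
    for link in parser.links:
        crawl(urljoin(url, link), depth + 1, seen)
```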
@remixtures That's great. A few days ago I saw people on r/selfhosting discussing how to stop AI crawlers that don't respect robots.txt (so, all of them), and a lot of them were basically reinventing the tarpit idea, so having a dedicated tool for it is great. Combined with simple logging of every IP range that falls for it, so they can be blacklisted, we might get a fighting chance. Some people were even serving zip bombs to AI bots, but I don't believe the bots would bother to open them.
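For anyone who wants to experiment, here's a minimal sketch of that combination: a tarpit endpoint that drips out endless links and appends each trapped client's IP to a blocklist file. This is a toy, not Nepenthes; the path, filename, and timings are all hypothetical:

```python
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

BLOCKLIST = "trapped_ips.txt"  # hypothetical filename; feed it to your firewall later

class TarpitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Only the trap path responds; link to it invisibly so humans never land here.
        if not self.path.startswith("/trap/"):
            self.send_response(404)
            self.end_headers()
            return
        with open(BLOCKLIST, "a") as f:
            f.write(self.client_address[0] + "\n")  # log the bot for blacklisting
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        try:
            # Drip out an endless page of links that lead back into the trap.
            for i in range(10_000):
                self.wfile.write(f'<a href="/trap/{i}">more</a>\n'.encode())
                self.wfile.flush()
                time.sleep(2)  # the "tar" in tarpit: make each request cost minutes
        except OSError:
            pass  # bot gave up and disconnected

if __name__ == "__main__":
    HTTPServer(("", 8080), TarpitHandler).serve_forever()
```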
Nepenthe! Nepenthe! And forget this lost Lenore
Quoth the raven,
I guess just adding something like a link depth limit would already counter that. I'm not sure whether that would noticeably reduce the information gathered from legitimate sites, but I don't think so.
Yeah, this sounds like something I tackled twenty years ago when mirroring webcomics. Dynamic webpages with a "Next" button are not new.
The interesting part is detecting the AI crawlers and selectively feeding them Markov-chain nonsense.
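Roughly, those two halves could look like the sketch below. The user-agent substrings are real AI-crawler identifiers; everything else is an illustrative assumption, not how Nepenthes actually does it:

```python
import random
from collections import defaultdict

# Substrings from real AI-crawler User-Agent headers.
BOT_MARKERS = ("GPTBot", "CCBot", "ClaudeBot", "Bytespider")

def is_ai_crawler(user_agent: str) -> bool:
    """Crude detection on User-Agent alone; real setups also check IP ranges and behaviour."""
    return any(marker in user_agent for marker in BOT_MARKERS)

def build_chain(corpus: str) -> dict:
    """Map each word to the list of words observed to follow it."""
    chain = defaultdict(list)
    words = corpus.split()
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def babble(chain: dict, length: int = 50) -> str:
    """Random-walk the chain: locally plausible text, globally meaningless."""
    word = random.choice(list(chain))
    out = [word]
    for _ in range(length - 1):
        word = random.choice(chain[word]) if chain[word] else random.choice(list(chain))
        out.append(word)
    return " ".join(out)

if __name__ == "__main__":
    chain = build_chain(open("corpus.txt").read())  # any large plain-text file
    if is_ai_crawler("Mozilla/5.0 (compatible; GPTBot/1.0)"):
        print(babble(chain))
```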
@[email protected] they seem to repeatedly and endlessly hammer certain pages on sites, too, for no reason. Some of the stories on here are horrendous: OpenAI etc. effectively DDoSing entire sites!
I'm betting this, alongside rampant advertising, is a big part of why the Internet seems so much slower than it did a decade and a half ago, despite home Internet speeds being many times what they were then.
@laurelraven We had cable internet 30 years ago in Preston: 30 Mbps, and it felt far faster than the 750 Mbps we have now!
A few years ago, the average webpage was already larger than the whole of Doom. What it is now, who even knows?
People used to do something similar to email-harvesting bots.
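The same trick, sketched: seed pages with machine-generated mailto: links so a harvester's address list fills with garbage. The function names are hypothetical, and the domains are the reserved example.* ones so no real mailbox gets spammed:

```python
import random
import string

def fake_email() -> str:
    """A random address at a reserved example domain, so the spam goes nowhere."""
    user = "".join(random.choices(string.ascii_lowercase, k=random.randint(6, 12)))
    return f"{user}@{random.choice(['example.com', 'example.org', 'example.net'])}"

def poison_page(n: int = 100) -> str:
    """An HTML page of bogus mailto: links for harvesters to choke on."""
    links = "\n".join(f'<a href="mailto:{e}">{e}</a>' for e in (fake_email() for _ in range(n)))
    return f"<html><body>\n{links}\n</body></html>"

if __name__ == "__main__":
    print(poison_page(10))
```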
@[email protected] Interesting; the Hacker News thread linked in that article has someone talking about similar tools:
https://news.ycombinator.com/item?id=42726426