It does not stop them, but it does make scraping more expensive and slower for the attacker.
This is a bit of a misconception about what Anubis does. It uses PoW to enforce a full browser environment, but the PoW only runs about once a week (or when something suspicious is detected). The PoW result is then used to auto-generate a kind of password that is stored in the browser's cookies, and to generate this "password" you can't use the simple HTTP clients that are currently being used at scale to scrape (and practically DDoS) the open internet.
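To make the mechanism concrete, here is a minimal sketch of hashcash-style proof of work of the kind Anubis uses: the server hands the browser a random challenge string, the browser burns CPU searching for a nonce whose hash meets a difficulty target, and the server verifies the result with a single hash before issuing the cookie. The function names and the hex-prefix difficulty encoding here are illustrative, not Anubis's actual API.

```python
import hashlib

def solve_pow(challenge: str, difficulty: int) -> int:
    """Client side: brute-force a nonce so that sha256(challenge + nonce)
    starts with `difficulty` hex zeroes. Cost grows ~16x per difficulty step."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify_pow(challenge: str, nonce: int, difficulty: int) -> bool:
    """Server side: a single hash is enough to check the client's work."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

# The asymmetry is the point: solving takes thousands of hashes,
# verifying takes one, so the cost lands on the scraper's fleet.
nonce = solve_pow("example-challenge", 3)
assert verify_pow("example-challenge", nonce, 3)
```

A headless scraper fleet can still solve these, but each cheap worker now needs real CPU time (and a JS-capable environment) per cookie, which is exactly the cost the first comment describes.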
The main problem is with complex websites like git forges, where these AI scrapers hit all the computationally expensive deep endpoints and practically force them offline by overloading the CPU.
Since I was forced to implement Anubis for my Forgejo instance, I also experimented with it on Lemmy. So far the results show that while Lemmy isn't as badly affected by this AI scraping, there is still quite a bit of it happening. After adding Anubis, overall traffic on our instance dropped by about a third, and it prevents the regular traffic spikes we previously saw and had no real explanation for.
But we also ran into some strange issues with it. Most likely Anubis is flagging mobile connections with changing IP addresses as possible scrapers (scrapers are known to first access pages from a more complete server to collect cookies and so on, then switch to a cheaper server on a different IP to do the actual scraping). We are still figuring out how to replicate those issues, and they may have been fixed in the latest Anubis update we applied yesterday.