this post was submitted on 21 Apr 2025
Sysadmin

8802 readers
34 users here now

A community dedicated to the profession of IT Systems Administration

No generic Lemmy issue posts please! Posts about Lemmy belong in one of these communities:
[email protected]
[email protected]
[email protected]
[email protected]

founded 2 years ago
MODERATORS
 

Projects like Anubis use a web-based proof of work to slow down, and potentially stop, bot traffic. The idea is that the proof of work forces the client to spend some compute resources to access the page, so that abusing public websites at scale becomes computationally expensive.
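To make the cost asymmetry concrete, here is a minimal hashcash-style sketch (an assumed scheme for illustration, not Anubis's actual algorithm): the client brute-forces a nonce until the hash of challenge-plus-nonce falls below a difficulty threshold, while the server verifies with a single hash.

```python
import hashlib

def solve(challenge: str, difficulty_bits: int) -> int:
    """Brute-force a nonce; expected cost grows as ~2^difficulty_bits."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int, difficulty_bits: int) -> bool:
    """Verification is one hash -- cheap for the server, unlike solving."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))
```

The server tunes `difficulty_bits` so that the work is negligible for one human visitor but adds up fast for a scraper fetching millions of pages.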

However, doing all of this as a web service seems inefficient, since it adds a performance penalty to every page load. My idea is that there could be a special HTTP protocol extension that requires the client to do a proof of work. Doing it at the browser/scraper level would be much more efficient, since the developer of the browser could tailor the code to the platform. It would also make it possible for bots to comply, which would still allow scraping, but in a way that is less demanding on the server.
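The proposed extension might look something like the following sketch. Everything here is invented for illustration: the header names (`X-PoW-Challenge`, `X-PoW-Response`), the use of status 402, and the difficulty setting are assumptions, not part of any standard. The server challenges unproven requests, and the client (browser or well-behaved bot) solves the challenge and retries.

```python
import hashlib
import secrets

DIFFICULTY = 12  # leading zero bits required; a server would tune this per load

def server_handle(headers: dict) -> tuple[int, dict]:
    """Return (status, response_headers) for an incoming request.

    A real implementation would also track issued challenges so a
    solution cannot be replayed; that bookkeeping is omitted here.
    """
    response = headers.get("X-PoW-Response")
    if response:
        digest = hashlib.sha256(response.encode()).digest()
        if int.from_bytes(digest, "big") < 1 << (256 - DIFFICULTY):
            return 200, {}  # valid proof: serve the page
    # No (or invalid) proof: challenge the client instead of serving.
    return 402, {"X-PoW-Challenge": secrets.token_hex(8)}

def client_solve(challenge: str) -> str:
    """What the browser would do natively, ideally in platform-tuned code."""
    nonce = 0
    while True:
        attempt = f"{challenge}:{nonce}"
        digest = hashlib.sha256(attempt.encode()).digest()
        if int.from_bytes(digest, "big") < 1 << (256 - DIFFICULTY):
            return attempt
        nonce += 1
```

Because the handshake lives in headers, a native browser implementation could solve it off the main thread with optimized code, which is the efficiency argument over running the same work as page JavaScript.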

CameronDev | 3 points | 3 days ago

It's never just "a little extra code". Each browser would have to implement it itself (although possibly it could land in Chromium and everyone downstream inherits it by default), and each would run the feature through the standard debates around support, necessity, correctness, side-channel security issues, etc. Firefox might drag its feet, Chrome might implement it differently, Edge might strip it out because it hurts their scraper. Five years later it might get useful.