mirror of https://github.com/TecharoHQ/anubis.git, synced 2026-05-17 20:09:55 +00:00
b57508afcd
Currently the honeypotting feature has no limits or delays anywhere, and its hits feed an internal greylist of IP networks. This can cause issues such as #1613, where Claude's crawler seemed to pick up on it and egressed data at over one megabit per second until the administrator noticed and blocked the address range.

This change takes a different approach, inspired by how the classic #xkcd IRC bot Robot9000 works. The first time a given IPv4 /24 or IPv6 /48 visits a honeypot page, Anubis sleeps for one millisecond. The second time, it sleeps for two milliseconds; the third, four milliseconds; and so on. The goal is to make the scraping inherently self-limiting, so that scrapers go off into their own corner where they won't really hurt anyone. Let's see if this works out according to keikaku.

Ref: https://github.com/TecharoHQ/anubis/issues/1613

Signed-off-by: Xe Iaso <me@xeiaso.net>