fix(honeypot/naive): apply robot9001 style delays
Currently the honeypotting feature has no limits or delays anywhere; it
simply feeds hits into an internal greylist of IP networks. This can
cause issues such as in #1613, where Claude's crawler seemed to pick up
on it and egress data at over one megabit per second until the
administrator noticed and blocked the address range.

This takes a different approach, inspired by how the classic #xkcd IRC
bot Robot9000 works. The first time a given IPv4 /24 or IPv6 /48 visits
a honeypot page, Anubis sleeps for one millisecond. The second time, it
sleeps for two milliseconds; the third, four milliseconds; and so on.
The goal is to make the scraping inherently self-limiting, so that the
scrapers go off in their own corner where they won't really hurt anyone.
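
A minimal sketch of the mechanism in Go, assuming a per-prefix hit
counter; the names (delayTracker, prefixKey, Delay) are hypothetical,
not Anubis internals:

    package main

    import (
        "fmt"
        "net/netip"
        "sync"
        "time"
    )

    // prefixKey collapses an address to the granularity the greylist
    // tracks: an IPv4 /24 or an IPv6 /48.
    func prefixKey(addr netip.Addr) netip.Prefix {
        bits := 24
        if addr.Is6() {
            bits = 48
        }
        p, _ := addr.Prefix(bits)
        return p
    }

    // delayTracker counts honeypot hits per network prefix.
    type delayTracker struct {
        mu   sync.Mutex
        hits map[netip.Prefix]uint
    }

    // Delay returns 1ms for a prefix's first honeypot hit, doubling on
    // each subsequent hit, capped so the shift cannot overflow.
    func (t *delayTracker) Delay(addr netip.Addr) time.Duration {
        t.mu.Lock()
        defer t.mu.Unlock()
        if t.hits == nil {
            t.hits = make(map[netip.Prefix]uint)
        }
        k := prefixKey(addr)
        n := t.hits[k]
        t.hits[k] = n + 1
        if n > 30 { // 2^30 ms is already ~12 days of sleep
            n = 30
        }
        return time.Millisecond << n
    }

    func main() {
        var t delayTracker
        addr := netip.MustParseAddr("203.0.113.7")
        for i := 0; i < 4; i++ {
            fmt.Println(t.Delay(addr)) // 1ms, 2ms, 4ms, 8ms
        }
    }

Keying on the /24 or /48 rather than the single address means a scraper
rotating through a block still accumulates one shared, ever-doubling
delay.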

Let's see if this works out according to keikaku.

Ref: https://github.com/TecharoHQ/anubis/issues/1613
Signed-off-by: Xe Iaso <me@xeiaso.net>

Website

This website is built using Docusaurus, a modern static website generator.

Installation

$ yarn

Local Development

$ yarn start

This command starts a local development server and opens up a browser window. Most changes are reflected live without having to restart the server.

Build

$ yarn build

This command generates static content into the build directory and can be served using any static content hosting service.
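
If the project keeps the standard Docusaurus serve script (an assumption; check package.json), the built output can be previewed locally before hosting it:

$ yarn serve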

Deployment

Using SSH:

$ USE_SSH=true yarn deploy

Not using SSH:

$ GIT_USER=<Your GitHub username> yarn deploy

If you are using GitHub Pages for hosting, this command is a convenient way to build the website and push it to the gh-pages branch.