cmd/anubis: configurable difficulty per-bot rule (#53)

Closes #30

Introduces the "challenge" field in bot rule definitions:

```json
{
  "name": "generic-bot-catchall",
  "user_agent_regex": "(?i:bot|crawler)",
  "action": "CHALLENGE",
  "challenge": {
    "difficulty": 16,
    "report_as": 4,
    "algorithm": "slow"
  }
}
```

This makes Anubis return a challenge page for every user agent containing
"bot" or "crawler" (case-insensitively), with difficulty 16 using the old
"slow" algorithm, but reporting it to the client as difficulty 4.

This is useful when you want to make certain clients in particular
suffer.

Additional validation and testing logic has been added to make sure
that users do not define "impossible" challenge settings.

If no algorithm is specified, Anubis defaults to the "fast" algorithm.

Signed-off-by: Xe Iaso <me@xeiaso.net>
Author: Xe Iaso
Date: 2025-03-21 13:48:00 -04:00 (committed by GitHub)
Commit: d3e509517c (parent 90049001e9)
17 changed files with 311 additions and 46 deletions


@@ -0,0 +1,12 @@
---
title: Proof-of-Work Algorithm Selection
---
Anubis offers two proof-of-work algorithms:
- `"fast"`: highly optimized JavaScript that will run as fast as your computer lets it
- `"slow"`: intentionally slow JavaScript that will waste time and memory
The fast algorithm is used by default to limit impacts on users' computers. Administrators may configure individual bot policy rules to use the slow algorithm in order to make known malicious clients waitloop and do nothing useful.
Generally, you should use the fast algorithm unless you have a good reason not to.
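A policy rule that applies the slow algorithm to known-bad clients might look like this (the rule name and regex are illustrative; the `"challenge"` schema matches the one introduced in this commit):

```json
{
  "name": "punish-known-bad-bots",
  "user_agent_regex": "(?i:bot|crawler)",
  "action": "CHALLENGE",
  "challenge": {
    "difficulty": 16,
    "report_as": 4,
    "algorithm": "slow"
  }
}
```

Omitting the `"algorithm"` key (or the whole `"challenge"` object) falls back to the default `"fast"` algorithm.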