---
title: robots2policy CLI Tool
sidebar_position: 50
---

The `robots2policy` tool converts robots.txt files into Anubis challenge policies. It reads robots.txt rules and generates equivalent CEL expressions for path matching and user-agent filtering.
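
For instance, a `Disallow: /admin/` rule under `User-agent: *` maps to a CEL check on the request path (this mirrors the full example at the end of this page):

```yaml
# robots.txt directive:  Disallow: /admin/
# resulting expression:
expression:
  single: path.startsWith("/admin/")
```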
## Installation

Install directly with Go:

```bash
go install github.com/TecharoHQ/anubis/cmd/robots2policy@latest
```
## Usage

Basic conversion from a URL:

```bash
robots2policy -input https://www.example.com/robots.txt
```

Convert a local file to YAML:

```bash
robots2policy -input robots.txt -output policy.yaml
```

Convert with custom settings:

```bash
robots2policy -input robots.txt -action DENY -format json
```
## Options

| Flag                  | Description                                                        | Default             |
| --------------------- | ------------------------------------------------------------------ | ------------------- |
| `-input`              | robots.txt file path or URL (use `-` for stdin)                    | _required_          |
| `-output`             | Output file (use `-` for stdout)                                   | stdout              |
| `-format`             | Output format: `yaml` or `json`                                    | `yaml`              |
| `-action`             | Action for disallowed paths: `ALLOW`, `DENY`, `CHALLENGE`, `WEIGH` | `CHALLENGE`         |
| `-name`               | Policy name prefix                                                 | `robots-txt-policy` |
| `-crawl-delay-weight` | Weight adjustment for crawl-delay rules                            | `3`                 |
| `-deny-user-agents`   | Action for blacklisted user agents                                 | `DENY`              |
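
Because `-input -` reads from standard input, the tool composes with anything that can fetch a robots.txt file. For example, a sketch assuming `curl` is available:

```bash
# Fetch a live robots.txt and print the resulting policy as JSON on stdout
curl -sL https://www.example.com/robots.txt | robots2policy -input - -format json
```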
## Example

Input robots.txt:

```txt
User-agent: *
Disallow: /admin/
Disallow: /private

User-agent: BadBot
Disallow: /
```

Generated policy:

```yaml
- name: robots-txt-policy-disallow-1
  action: CHALLENGE
  expression:
    single: path.startsWith("/admin/")
- name: robots-txt-policy-disallow-2
  action: CHALLENGE
  expression:
    single: path.startsWith("/private")
- name: robots-txt-policy-blacklist-3
  action: DENY
  expression:
    single: userAgent.contains("BadBot")
```
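
If the input also contains a `Crawl-delay` directive, the tool emits a weight-adjustment rule controlled by `-crawl-delay-weight`. The sketch below is purely illustrative (the rule name and user agent are hypothetical, and the exact fields emitted may differ), assuming Anubis's `WEIGH` action with a `weight.adjust` value:

```yaml
# Hypothetical: "Crawl-delay: 10" under "User-agent: SlowBot" with the
# default -crawl-delay-weight of 3 could produce a rule shaped like this.
- name: robots-txt-policy-crawl-delay-4
  action: WEIGH
  expression:
    single: userAgent.contains("SlowBot")
  weight:
    adjust: 3
```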

## Using the Generated Policy

Save the output and import it from your main policy file:

```yaml
bots:
  - import: "./robots-policy.yaml"
```
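
Putting it together, a minimal end-to-end flow could look like the following sketch (the file names are placeholders, and `POLICY_FNAME` assumes you configure Anubis through its environment variables):

```bash
# Generate a policy file from a site's robots.txt
robots2policy -input https://www.example.com/robots.txt -output robots-policy.yaml

# Start Anubis with the main policy file that imports it
# (other required Anubis configuration omitted)
POLICY_FNAME=./botPolicies.yaml anubis
```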

The tool handles wildcard patterns, user-agent-specific rules, and blacklisted bots automatically.