Compare commits


7 Commits

Author SHA1 Message Date
Xe Iaso c460169047 feat(web): waste headless chrome bandwidth
Most of the worst of the worst scrapers run Headless Chrome. Headless
Chrome is difficult for Anubis to combat because it follows all the
rules that browsers do. The worst of the worst scrapers also use
residential proxy services. Those residential proxy services charge
upwards of $1 per GB of data egressed or ingressed. The Prompt API makes
Chrome download a 4Gi or 16Gi machine learning model. When you ask it to
start downloading, it will _continue_ downloading even when you leave
the Anubis challenge page.

This will make the local model answer "why is the sky blue?" in an
absurd amount of detail, which wastes both bandwidth and scraper CPU
(some scraping companies charge via Chrome CPU too).

Signed-off-by: Xe Iaso <me@xeiaso.net>
2026-05-06 17:27:16 -04:00
Timon de Groot d3a00da448 feat: Log weight when issuing challenge (#1611)
This can come in handy when analyzing the logs

Signed-off-by: Timon de Groot <tdegroot96@gmail.com>
2026-05-05 16:57:45 +00:00
lillian-b 7e037b65e8 feat: add ASN data from Thoth to logs/metrics (#1608)
Assisted-by: Claude Sonnet 4.6 via Claude Code

Signed-off-by: Lillian Berry <lillian@star-ark.net>
Co-authored-by: Lillian Berry <lillian@star-ark.net>
2026-05-02 11:53:00 -04:00
Xe Iaso ebf9a30878 fix(metrics): bind to the right network/bindhost (#1606)
Whoops!

Closes: #1605

Signed-off-by: Xe Iaso <me@xeiaso.net>
2026-04-30 18:18:01 -04:00
Lenny f8605bcd3c fix: Thoth geoip compare (#1564)
Co-authored-by: Jason Cameron <git@jasoncameron.dev>
2026-04-24 14:37:19 +00:00
Xe Iaso 1d700a0370 fix(honeypot): remove DoS vector (#1581)
Using the User-Agent as a filtering vector for the honeypot maze was a
decent idea, however in practice it can become a DoS vector by a
malicious client adding a lot of points to Google Chrome's User-Agent
string. In practice it also seems that the worst offenders use vanilla
Google Chrome User-Agent strings as well, meaning that this backfires
horribly.

Gotta crack a few eggs to make omelettes.

Closes: #1580

Signed-off-by: Xe Iaso <me@xeiaso.net>
2026-04-23 09:08:34 -04:00
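The failure mode described in that commit message can be reduced to a small sketch. This is hypothetical stand-in code, not the Anubis implementation: a plain map replaces Anubis's expiring store, and `incrementUA` with a threshold of 25 mirrors the removed maze logic. Because the counter is keyed only by the (hashed) User-Agent string, one attacker replaying a stock Chrome User-Agent poisons the score for every real Chrome user:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// weights is a stand-in for Anubis's expiring store; keys are SHA-256
// sums of the User-Agent string, as in the removed maze code.
var weights = map[string]int{}

func sha256sum(s string) string {
	sum := sha256.Sum256([]byte(s))
	return hex.EncodeToString(sum[:])
}

// incrementUA bumps the shared weight for a User-Agent string.
func incrementUA(userAgent string) int {
	key := sha256sum(userAgent)
	weights[key]++
	return weights[key]
}

func main() {
	const chromeUA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/125.0.0.0"

	// A single malicious client replays a stock Chrome User-Agent
	// until the shared counter crosses the block threshold (25)...
	for i := 0; i < 30; i++ {
		incrementUA(chromeUA)
	}

	// ...and now every legitimate Chrome user presenting the same
	// string is penalized, which is why the check was removed.
	fmt.Println(weights[sha256sum(chromeUA)] >= 25)
}
```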
Xe Iaso 681c2cc2ed feat(metrics): basic auth support (#1579)
* feat(internal): add basic auth HTTP middleware

Signed-off-by: Xe Iaso <me@xeiaso.net>

* feat(config): add HTTP basic auth for metrics

Signed-off-by: Xe Iaso <me@xeiaso.net>

* feat(metrics): wire up basic auth

Signed-off-by: Xe Iaso <me@xeiaso.net>

* doc: document HTTP basic auth for metrics server

Signed-off-by: Xe Iaso <me@xeiaso.net>

* chore: spelling

Signed-off-by: Xe Iaso <me@xeiaso.net>

* docs(admin/policies): give people a python command

Signed-off-by: Xe Iaso <me@xeiaso.net>

---------

Signed-off-by: Xe Iaso <me@xeiaso.net>
2026-04-23 00:17:09 -04:00
11 changed files with 135 additions and 55 deletions
+5
@@ -20,9 +20,14 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - Fixed mixed tab/space indentation in Caddy documentation code block
 - Improve error messages and fix broken REDIRECT_DOMAINS link in docs ([#1193](https://github.com/TecharoHQ/anubis/issues/1193))
 - Add Bulgarian locale ([#1394](https://github.com/TecharoHQ/anubis/pull/1394))
+- Fixed case-sensitivity mismatch in geoipchecker.go
 - Fix CEL internal errors when iterating `headers`/`query` map wrappers by implementing map iterators for `HTTPHeaders` and `URLValues` ([#1465](https://github.com/TecharoHQ/anubis/pull/1465)).
 - Enable [metrics serving via TLS](./admin/policies.mdx#tls), including [mutual TLS (mTLS)](./admin/policies.mdx#mtls).
 - Enable [HTTP basic auth](./admin/policies.mdx#http-basic-authentication) for the metrics server.
+- Fix a bug in the dataset poisoning maze that could allow denial of service ([#1580](https://github.com/TecharoHQ/anubis/issues/1580)).
+- Add config option to add ASN to logs/metrics.
+- Log weight when issuing challenge.
+- Waste bandwidth for headless chrome using the [Prompt API](https://developer.chrome.com/docs/ai/prompt-api).
 
 ## v1.25.0: Necron
+1
@@ -411,6 +411,7 @@ Anubis exposes the following logging settings in the policy file:
 | `level` | [log level](#log-levels) | `info` | The logging level threshold. Any logs that are at or above this threshold will be drained to the sink. Any other logs will be discarded. |
 | `sink` | string | `stdio`, `file` | The sink where the logs drain to as they are being recorded in Anubis. |
 | `parameters` | object | | Parameters for the given logging sink. This will vary based on the logging sink of choice. See below for more information. |
+| `asn` | bool | `true`, `false` | Add ASN information to logs/metrics. (Requires a Thoth client configured) |
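Enabling this option in the policy file might look like the following fragment. This is a sketch: the `asn` key matches the struct tags added in this change, but the surrounding `logging` block layout is an assumption based on the table above, not copied from the Anubis docs:

```yaml
logging:
  level: info
  sink: stdio
  asn: true # requires a configured Thoth client; otherwise ASN enrichment is skipped
```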
 Anubis supports the following logging sinks:
+13 -22
@@ -76,13 +76,6 @@ type Impl struct {
 	affirmation, body, title spintax.Spintax
 }
 
-func (i *Impl) incrementUA(ctx context.Context, userAgent string) int {
-	result, _ := i.uaWeight.Get(ctx, internal.SHA256sum(userAgent))
-	result++
-	i.uaWeight.Set(ctx, internal.SHA256sum(userAgent), result, time.Hour)
-	return result
-}
-
 func (i *Impl) incrementNetwork(ctx context.Context, network string) int {
 	result, _ := i.networkWeight.Get(ctx, internal.SHA256sum(network))
 	result++
@@ -90,20 +83,19 @@ func (i *Impl) incrementNetwork(ctx context.Context, network string) int {
 	return result
 }
 
-func (i *Impl) CheckUA() checker.Impl {
-	return checker.Func(func(r *http.Request) (bool, error) {
-		result, _ := i.uaWeight.Get(r.Context(), internal.SHA256sum(r.UserAgent()))
-		if result >= 25 {
-			return true, nil
-		}
-		return false, nil
-	})
-}
-
 func (i *Impl) CheckNetwork() checker.Impl {
 	return checker.Func(func(r *http.Request) (bool, error) {
-		result, _ := i.uaWeight.Get(r.Context(), internal.SHA256sum(r.UserAgent()))
+		realIP, _ := internal.RealIP(r)
+		if !realIP.IsValid() {
+			realIP = netip.MustParseAddr(r.Header.Get("X-Real-Ip"))
+		}
+		network, ok := internal.ClampIP(realIP)
+		if !ok {
+			return false, nil
+		}
+		result, _ := i.networkWeight.Get(r.Context(), internal.SHA256sum(network.String()))
 		if result >= 25 {
 			return true, nil
 		}
@@ -164,7 +156,6 @@ func (i *Impl) ServeHTTP(w http.ResponseWriter, r *http.Request) {
 	}
 
 	networkCount := i.incrementNetwork(r.Context(), network.String())
-	uaCount := i.incrementUA(r.Context(), r.UserAgent())
 
 	stage := r.PathValue("stage")
@@ -172,8 +163,8 @@ func (i *Impl) ServeHTTP(w http.ResponseWriter, r *http.Request) {
 		lg.Debug("found new entrance point", "id", id, "stage", stage, "userAgent", r.UserAgent(), "clampedIP", network)
 	} else {
 		switch {
-		case networkCount%256 == 0, uaCount%256 == 0:
-			lg.Warn("found possible crawler", "id", id, "network", network)
+		case networkCount%256 == 0:
+			lg.Warn("found possible crawler", "id", id, "network", network, "userAgent", r.UserAgent())
 		}
 	}
+72 -15
@@ -11,6 +11,7 @@ import (
 	"net"
 	"net/http"
 	"net/url"
+	"strconv"
 	"strings"
 	"time"
@@ -32,6 +33,7 @@ import (
 	"github.com/TecharoHQ/anubis/lib/policy"
 	"github.com/TecharoHQ/anubis/lib/policy/checker"
 	"github.com/TecharoHQ/anubis/lib/store"
+	iptoasnv1 "github.com/TecharoHQ/thoth-proto/gen/techaro/thoth/iptoasn/v1"
 
 	// challenge implementations
 	_ "github.com/TecharoHQ/anubis/lib/challenge/metarefresh"
@@ -39,31 +41,52 @@ import (
 	_ "github.com/TecharoHQ/anubis/lib/challenge/proofofwork"
 )
 
+type contextKey int
+
+const asnContextKey contextKey = iota
+
+type asnInfo struct {
+	ASN         string
+	Description string
+}
+
+func asnFromContext(ctx context.Context) (string, string) {
+	if v, ok := ctx.Value(asnContextKey).(asnInfo); ok {
+		return v.ASN, v.Description
+	}
+	return "", ""
+}
+
 var (
 	challengesIssued = promauto.NewCounterVec(prometheus.CounterOpts{
 		Name: "anubis_challenges_issued",
 		Help: "The total number of challenges issued",
-	}, []string{"method"})
+	}, []string{"method", "asn", "asn_description"})
 
 	challengesValidated = promauto.NewCounterVec(prometheus.CounterOpts{
 		Name: "anubis_challenges_validated",
 		Help: "The total number of challenges validated",
-	}, []string{"method"})
+	}, []string{"method", "asn", "asn_description"})
 
 	droneBLHits = promauto.NewCounterVec(prometheus.CounterOpts{
 		Name: "anubis_dronebl_hits",
 		Help: "The total number of hits from DroneBL",
-	}, []string{"status"})
+	}, []string{"status", "asn", "asn_description"})
 
 	failedValidations = promauto.NewCounterVec(prometheus.CounterOpts{
 		Name: "anubis_failed_validations",
 		Help: "The total number of failed validations",
-	}, []string{"method"})
+	}, []string{"method", "asn", "asn_description"})
 
 	requestsProxied = promauto.NewCounterVec(prometheus.CounterOpts{
 		Name: "anubis_proxied_requests_total",
 		Help: "Number of requests proxied through Anubis to upstream targets",
-	}, []string{"host"})
+	}, []string{"host", "asn", "asn_description"})
+
+	requestsByASN = promauto.NewCounterVec(prometheus.CounterOpts{
+		Name: "anubis_requests_by_asn_total",
+		Help: "Number of requests by ASN",
+	}, []string{"asn", "asn_description"})
 )
 
 type Server struct {
@@ -78,6 +101,28 @@ type Server struct {
 	hs512Secret []byte
 }
 
+func (s *Server) getRequestLogger(r *http.Request) (*slog.Logger, *http.Request) {
+	lg := internal.GetRequestLogger(s.logger, r)
+	if s.policy.LogASN && s.policy.ThothClient != nil {
+		ctx, cancel := context.WithTimeout(r.Context(), 500*time.Millisecond)
+		defer cancel()
+		ip := r.Header.Get("X-Real-Ip")
+		if info, err := s.policy.ThothClient.IPToASN.Lookup(ctx, &iptoasnv1.LookupRequest{IpAddress: ip}); err == nil && info.GetAnnounced() {
+			asn := strconv.FormatUint(uint64(info.GetAsNumber()), 10)
+			lg = lg.With("asn", info.GetAsNumber(), "asn_description", info.GetDescription())
+			requestsByASN.WithLabelValues(asn, info.GetDescription()).Inc()
+			r = r.WithContext(context.WithValue(r.Context(), asnContextKey, asnInfo{
+				ASN:         asn,
+				Description: info.GetDescription(),
+			}))
+		}
+	}
+	return lg, r
+}
+
 func (s *Server) getTokenKeyfunc() jwt.Keyfunc {
 	// return ED25519 key if HS512 is not set
 	if len(s.hs512Secret) == 0 {
@@ -141,7 +186,7 @@ func (s *Server) issueChallenge(ctx context.Context, r *http.Request, lg *slog.L
 		return nil, err
 	}
 
-	lg.Info("new challenge issued", "challenge", id.String())
+	lg.Info("new challenge issued", "challenge", id.String(), "weight", cr.Weight)
 
 	return &chall, err
 }
@@ -193,7 +238,7 @@ func (s *Server) maybeReverseProxyOrPage(w http.ResponseWriter, r *http.Request)
 }
 
 func (s *Server) maybeReverseProxy(w http.ResponseWriter, r *http.Request, httpStatusOnly bool) {
-	lg := internal.GetRequestLogger(s.logger, r)
+	lg, r := s.getRequestLogger(r)
 
 	if val, _ := s.store.Get(r.Context(), fmt.Sprintf("ogtags:allow:%s%s", r.Host, r.URL.String())); val != nil {
 		lg.Debug("serving opengraph tag asset")
@@ -218,7 +263,10 @@
 	r.Header.Add("X-Anubis-Rule", cr.Name)
 	r.Header.Add("X-Anubis-Action", string(cr.Rule))
 	lg = lg.With("check_result", cr)
-	policy.Applications.WithLabelValues(cr.Name, string(cr.Rule)).Add(1)
+	{
+		asn, asnDesc := asnFromContext(r.Context())
+		policy.Applications.WithLabelValues(cr.Name, string(cr.Rule), asn, asnDesc).Add(1)
+	}
 
 	ip := r.Header.Get("X-Real-Ip")
@@ -348,7 +396,8 @@ func (s *Server) handleDNSBL(w http.ResponseWriter, r *http.Request, ip string,
 			lg.Error("can't look up ip in dnsbl", "err", err)
 		}
 
 		db.Set(r.Context(), ip, resp, 24*time.Hour)
-		droneBLHits.WithLabelValues(resp.String()).Inc()
+		asn, asnDesc := asnFromContext(r.Context())
+		droneBLHits.WithLabelValues(resp.String(), asn, asnDesc).Inc()
 	}
 
 	if resp != dnsbl.AllGood {
@@ -366,7 +415,7 @@
 }
 
 func (s *Server) MakeChallenge(w http.ResponseWriter, r *http.Request) {
-	lg := internal.GetRequestLogger(s.logger, r)
+	lg, r := s.getRequestLogger(r)
 	localizer := localization.GetLocalizer(r)
 
 	redir := r.FormValue("redir")
@@ -435,11 +484,14 @@
 		return
 	}
 	lg.Debug("made challenge", "challenge", chall, "rules", rule.Challenge, "cr", cr)
-	challengesIssued.WithLabelValues("api").Inc()
+	{
+		asn, asnDesc := asnFromContext(r.Context())
+		challengesIssued.WithLabelValues("api", asn, asnDesc).Inc()
+	}
 }
 
 func (s *Server) PassChallenge(w http.ResponseWriter, r *http.Request) {
-	lg := internal.GetRequestLogger(s.logger, r)
+	lg, r := s.getRequestLogger(r)
 	localizer := localization.GetLocalizer(r)
 
 	redir := r.FormValue("redir")
@@ -530,7 +582,8 @@
 	}
 
 	if err := impl.Validate(r, lg, in); err != nil {
-		failedValidations.WithLabelValues(rule.Challenge.Algorithm).Inc()
+		asn, asnDesc := asnFromContext(r.Context())
+		failedValidations.WithLabelValues(rule.Challenge.Algorithm, asn, asnDesc).Inc()
 		var cerr *challenge.Error
 		s.ClearCookie(w, CookieOpts{Path: cookiePath, Host: r.Host})
 		lg.Debug("challenge validate call failed", "err", err)
@@ -590,7 +643,10 @@
 		lg.Debug("can't update information about challenge", "err", err)
 	}
 
-	challengesValidated.WithLabelValues(rule.Challenge.Algorithm).Inc()
+	{
+		asn, asnDesc := asnFromContext(r.Context())
+		challengesValidated.WithLabelValues(rule.Challenge.Algorithm, asn, asnDesc).Inc()
+	}
 
 	lg.Debug("challenge passed, redirecting to app")
 	http.Redirect(w, r, redir, http.StatusFound)
 }
@@ -629,7 +685,8 @@ func (s *Server) check(r *http.Request, lg *slog.Logger) (policy.CheckResult, *p
 			return cr("bot/"+b.Name, b.Action, weight), &b, nil
 		case config.RuleWeigh:
 			lg.Debug("adjusting weight", "name", b.Name, "delta", b.Weight.Adjust)
-			policy.Applications.WithLabelValues("bot/"+b.Name, "WEIGH").Add(1)
+			asn, asnDesc := asnFromContext(r.Context())
+			policy.Applications.WithLabelValues("bot/"+b.Name, "WEIGH", asn, asnDesc).Add(1)
 			weight += b.Weight.Adjust
 		}
 	}
-8
@@ -190,14 +190,6 @@ func New(opts Options) (*Server, error) {
 				},
 				Name: "honeypot/network",
 			},
-			policy.Bot{
-				Rules:  mazeGen.CheckUA(),
-				Action: config.RuleWeigh,
-				Weight: &config.Weight{
-					Adjust: 30,
-				},
-				Name: "honeypot/user-agent",
-			},
 		)
 	} else {
 		result.logger.Error("can't init honeypot subsystem", "err", err)
+1
@@ -17,6 +17,7 @@ type Logging struct {
 	Sink       string             `json:"sink"`       // Logging sink, either "stdio" or "file"
 	Level      *slog.Level        `json:"level"`      // Log level, if set supersedes the level in flags
 	Parameters *LoggingFileConfig `json:"parameters"` // Logging parameters, to be dynamic in the future
+	LogASN     bool               `json:"asn" yaml:"asn"`
 }
 
 const (
+11 -7
@@ -207,7 +207,7 @@ func (s *Server) RenderIndex(w http.ResponseWriter, r *http.Request, cr policy.C
 		return
 	}
 
-	lg := internal.GetRequestLogger(s.logger, r)
+	lg, r := s.getRequestLogger(r)
 
 	if !strings.Contains(r.Header.Get("Accept-Encoding"), "gzip") && randomChance(64) {
 		lg.Error("client was given a challenge but does not in fact support gzip compression")
@@ -215,7 +215,10 @@
 		return
 	}
 
-	challengesIssued.WithLabelValues("embedded").Add(1)
+	{
+		asn, asnDesc := asnFromContext(r.Context())
+		challengesIssued.WithLabelValues("embedded", asn, asnDesc).Add(1)
+	}
 
 	chall, err := s.issueChallenge(r.Context(), r, lg, cr, rule)
 	if err != nil {
 		lg.Error("can't get challenge", "err", err)
@@ -306,14 +309,14 @@ func (s *Server) constructRedirectURL(r *http.Request) (string, error) {
 	case "http", "https":
 		// allowed
 	default:
-		lg := internal.GetRequestLogger(s.logger, r)
+		lg, _ := s.getRequestLogger(r)
 		lg.Warn("invalid protocol in X-Forwarded-Proto", "proto", proto)
 		return "", errors.New(localizer.T("invalid_redirect"))
 	}
 
 	// Check if host is allowed in RedirectDomains (supports '*' via glob)
 	if len(s.opts.RedirectDomains) > 0 && !matchRedirectDomain(s.opts.RedirectDomains, host) {
-		lg := internal.GetRequestLogger(s.logger, r)
+		lg, _ := s.getRequestLogger(r)
 		lg.Debug("domain not allowed", "domain", host)
 		return "", errors.New(localizer.T("redirect_domain_not_allowed"))
 	}
@@ -415,7 +418,7 @@ func (s *Server) ServeHTTPNext(w http.ResponseWriter, r *http.Request) {
 	case "", "http", "https":
 		// allowed: empty scheme means relative URL
 	default:
-		lg := internal.GetRequestLogger(s.logger, r)
+		lg, _ := s.getRequestLogger(r)
 		lg.Warn("XSS attempt blocked, invalid redirect scheme", "scheme", urlParsed.Scheme, "redir", redir)
 		s.respondWithStatus(w, r, localizer.T("invalid_redirect"), "", http.StatusBadRequest)
 		return
@@ -427,7 +430,7 @@
 	hostMismatch := r.URL.Host != "" && urlParsed.Host != "" && urlParsed.Host != r.URL.Host
 
 	if hostNotAllowed || hostMismatch {
-		lg := internal.GetRequestLogger(s.logger, r)
+		lg, _ := s.getRequestLogger(r)
 		lg.Debug("domain not allowed", "domain", urlParsed.Host)
 		s.respondWithStatus(w, r, localizer.T("redirect_domain_not_allowed"), makeCode(err), http.StatusBadRequest)
 		return
@@ -442,7 +445,8 @@
 			web.Base(localizer.T("you_are_not_a_bot"), web.StaticHappy(localizer), s.policy.Impressum, localizer),
 		).ServeHTTP(w, r)
 	} else {
-		requestsProxied.WithLabelValues(r.Host).Inc()
+		asn, asnDesc := asnFromContext(r.Context())
+		requestsProxied.WithLabelValues(r.Host, asn, asnDesc).Inc()
 		r = s.stripBasePrefixFromRequest(r)
 		s.next.ServeHTTP(w, r)
 	}
+1 -1
@@ -64,7 +64,7 @@ func (s *Server) run(ctx context.Context, lg *slog.Logger) error {
 		ErrorLog: internal.GetFilteredHTTPLogger(),
 	}
 
-	ln, metricsURL, err := internal.SetupListener(s.Config.Bind, s.Config.Network, s.Config.SocketMode)
+	ln, metricsURL, err := internal.SetupListener(s.Config.Network, s.Config.Bind, s.Config.SocketMode)
 	if err != nil {
 		return fmt.Errorf("can't setup listener: %w", err)
 	}
+11 -1
@@ -27,7 +27,7 @@ var (
 	Applications = promauto.NewCounterVec(prometheus.CounterOpts{
 		Name: "anubis_policy_results",
 		Help: "The results of each policy rule",
-	}, []string{"rule", "action"})
+	}, []string{"rule", "action", "asn", "asn_description"})
 
 	ErrChallengeRuleHasWrongAlgorithm = errors.New("config.Bot.ChallengeRules: algorithm is invalid")
 	warnedAboutThresholds             = &atomic.Bool{}
@@ -47,6 +47,8 @@ type ParsedConfig struct {
 	Dns     *dns.Dns
 	Logger  *slog.Logger
 	Metrics *config.Metrics
+	ThothClient *thoth.Client
+	LogASN      bool
 }
 
 func newParsedConfig(orig *config.Config) *ParsedConfig {
@@ -70,6 +72,10 @@ func ParseConfig(ctx context.Context, fin io.Reader, fname string, defaultDiffic
 	result := newParsedConfig(c)
 	result.DefaultDifficulty = defaultDifficulty
+	result.LogASN = c.Logging.LogASN
+	if hasThothClient {
+		result.ThothClient = tc
+	}
 
 	if c.Logging.Level != nil {
 		logLevel = c.Logging.Level.String()
@@ -94,6 +100,10 @@ func ParseConfig(ctx context.Context, fin io.Reader, fname string, defaultDiffic
 	lg := result.Logger.With("at", "config-validate")
 
+	if result.LogASN && !hasThothClient {
+		lg.Warn("logging.asn is enabled but no Thoth client is configured; ASN logging and metrics will be skipped. Please read https://anubis.techaro.lol/docs/admin/thoth for more information")
+	}
+
 	stFac, ok := store.Get(c.Store.Backend)
 	switch ok {
 	case true:
+1 -1
@@ -18,7 +18,7 @@ func (c *Client) GeoIPCheckerFor(countries []string) checker.Impl {
 	var sb strings.Builder
 	fmt.Fprintln(&sb, "GeoIPChecker")
 
 	for _, cc := range countries {
-		countryMap[cc] = struct{}{}
+		countryMap[strings.ToLower(cc)] = struct{}{}
 		fmt.Fprintln(&sb, cc)
 	}
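The one-line fix above normalizes configured country codes to lower case at insertion time. A minimal sketch of the bug it closes (stand-in code, not the Anubis implementation; it assumes lookups elsewhere use lower-cased codes, e.g. as returned by a GeoIP database):

```go
package main

import (
	"fmt"
	"strings"
)

// buildCountrySet lower-cases configured country codes before storing
// them, mirroring the strings.ToLower fix in the diff above.
func buildCountrySet(countries []string) map[string]struct{} {
	set := make(map[string]struct{}, len(countries))
	for _, cc := range countries {
		set[strings.ToLower(cc)] = struct{}{}
	}
	return set
}

func main() {
	// Operators typically write upper-case ISO codes in config.
	set := buildCountrySet([]string{"DE", "FR"})

	// Before the fix, a lower-cased lookup key like "de" would miss
	// the "DE" entry and the country rule would silently never match.
	_, ok := set["de"]
	fmt.Println(ok)
}
```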
+19
@@ -93,12 +93,31 @@ const initTranslations = async () => {
 	translations = await loadTranslations(currentLang);
 };
 
+const wasteHeadlessChromeDisk = async () => {
+	if (window.LanguageModel !== undefined) {
+		const session = await window.LanguageModel.create({
+			initialPrompts: [
+				{
+					role: "system",
+					content: "You are a helpful assistant that responds in as many words as possible. Be verbose and answer questions fully with as much detail as possible.",
+				},
+				{
+					role: "user",
+					content: "Why is the sky blue?",
+				},
+			],
+		})
+	}
+};
+
 const t = (key) => translations[`js_${key}`] || translations[key] || key;
 
 (async () => {
 	// Initialize translations first
 	await initTranslations();
+	wasteHeadlessChromeDisk();
 
 	const dependencies = [
 		{
 			name: "Web Workers",