March 2026
Anubis in Front of GitHub Pages
GitHub Pages serves static files. Anubis is a filter proxy. Those are different jobs, and pretending otherwise is how you end up debugging a proxy loop at 1:30am.
The working shape is boring and correct: GitHub Pages stays the origin, your custom domain points at infrastructure you control, and that infrastructure runs the reverse proxy plus Anubis.
Architecture
browser
-> caddy or nginx on your server
-> anubis
-> github pages origin
example:
stayfresh.dev
-> vps
-> anubis
-> https://evoke4350.github.io/stayfresh.dev/
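Before any proxy exists, it is worth confirming the origin end of that chain answers on its own. A minimal sketch, using the example username and repo from the diagram above; the helper function is hypothetical, and curl plus network access are assumed:

```shell
# origin_url builds the GitHub Pages origin URL for a project site.
# Arguments are the example values from the diagram above.
origin_url() {
  printf 'https://%s.github.io/%s/' "$1" "$2"
}

# The origin should answer before anything sits in front of it.
# Expect a 200, or a 301 toward the canonical path.
curl -sI "$(origin_url evoke4350 stayfresh.dev)" 2>/dev/null | head -n 1
```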
The Non-Negotiables
- GitHub Pages cannot execute Anubis. It can only host the site files.
- Your public DNS for the custom domain must point to your proxy layer, not directly to GitHub Pages.
- The Anubis target should be the GitHub Pages origin URL, not your public custom domain; otherwise you proxy traffic back into yourself.
- If you want search engines or feed readers to survive, write explicit allow rules. Anubis defaults to being heavy-handed.
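The DNS rule in particular is easy to check from the outside. A hedged sketch: the IP is a placeholder from the documentation range, and `dig` is assumed to be installed:

```shell
# Placeholder address for your proxy box (TEST-NET-3 documentation range).
VPS_IP="203.0.113.10"

# Where does the custom domain actually resolve?
resolved="$(dig +short stayfresh.dev A 2>/dev/null | head -n 1)"

if [ "$resolved" = "$VPS_IP" ]; then
  echo "DNS points at the proxy layer"
else
  # GitHub Pages apex IPs live in 185.199.108.0/22; seeing one of those
  # here means traffic is bypassing Anubis entirely.
  echo "DNS points elsewhere: ${resolved:-<no answer>}"
fi
```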
Minimal Docker Compose
This is the shortest useful version: Caddy terminates TLS, Anubis sits behind it, and GitHub Pages is the upstream.
services:
  caddy:
    image: caddy:2
    ports:
      - "80:80"
      - "443:443"
      - "443:443/udp"
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile:ro
      - caddy_data:/data
      - caddy_config:/config
    depends_on:
      - anubis
  anubis:
    image: ghcr.io/techarohq/anubis:latest
    pull_policy: always
    environment:
      BIND: ":8923"
      TARGET: "https://evoke4350.github.io/stayfresh.dev/"
      TARGET_HOST: "evoke4350.github.io"
      DIFFICULTY: "4"
      SERVE_ROBOTS_TXT: "true"
      POLICY_FNAME: "/data/cfg/botPolicy.yaml"
    volumes:
      - ./botPolicy.yaml:/data/cfg/botPolicy.yaml:ro
volumes:
  caddy_data:
  caddy_config:
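A possible bring-up sequence for the stack above, assuming Docker Compose v2 and both files in the working directory; the guards keep it harmless on machines without Docker:

```shell
if command -v docker >/dev/null 2>&1; then
  # Offline syntax check; fails fast on indentation mistakes.
  docker compose config -q && echo "compose file parses" || echo "compose file invalid"
  docker compose up -d || echo "could not start the stack"
  # Anubis logs its TARGET on startup; worth eyeballing before DNS moves.
  docker compose logs --tail 20 anubis || true
else
  echo "docker not installed on this box"
fi
```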
Minimal Caddyfile
stayfresh.dev, www.stayfresh.dev {
    encode gzip zstd
    reverse_proxy http://anubis:8923 {
        header_up X-Real-Ip {remote_host}
        header_up X-Http-Version {http.request.proto}
    }
}
Minimal Policy File
Start small. Allow the obvious utility paths, then challenge browser-looking traffic.
bots:
  - name: well-known
    path_regex: ^/\.well-known/.*$
    action: ALLOW
  - name: favicon
    path_regex: ^/favicon\.ico$
    action: ALLOW
  - name: robots
    path_regex: ^/robots\.txt$
    action: ALLOW
  - name: feeds
    path_regex: ^/(rss\.xml|atom\.xml|feed\.json|sitemap\.xml)$
    action: ALLOW
  - name: generic-browser
    user_agent_regex: Mozilla
    action: CHALLENGE
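These `path_regex` values are regular expressions, not globs, so dots should be written `\.` to mean a literal dot. A quick offline sanity check, using `grep -E` as a rough stand-in for Anubis's matcher (an assumption: Anubis itself uses Go's regexp engine, which agrees on these simple patterns):

```shell
# matches PATTERN PATH -> exit 0 if PATH matches PATTERN
matches() { printf '%s' "$2" | grep -Eq "$1"; }

matches '^/\.well-known/.*$' '/.well-known/security.txt' && echo "well-known ok"
matches '^/favicon\.ico$'    '/favicon.ico'              && echo "favicon ok"
matches '^/(rss\.xml|atom\.xml|feed\.json|sitemap\.xml)$' '/rss.xml' && echo "feeds ok"

# Why the backslash matters: an unescaped dot matches any character, so a
# sloppy ^/robots.txt$ would also allow a path like /robotsxtxt.
matches '^/robots\.txt$' '/robotsxtxt' || echo "literal dot enforced"
```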
Rollout Order
- Leave GitHub Pages publishing the site exactly as it does now.
- Bring up Caddy and Anubis on a VPS or similar box.
- Target the GitHub Pages origin URL from Anubis.
- Move the custom domain DNS to the VPS.
- Test the challenge flow, feed URLs, and a direct content fetch with JavaScript enabled.
- Add allow rules for anything you actually want to keep working.
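The test step above can be scripted. A rough smoke test, assuming the cutover has already happened and the domain resolves to the proxy; the challenge-marker check assumes the stock Anubis challenge page mentions "anubis" in its HTML, and everything degrades to printed diagnostics rather than hard failures when run offline:

```shell
DOMAIN="stayfresh.dev"

# Allow-listed paths should return content without a challenge.
for path in /robots.txt /rss.xml /favicon.ico; do
  code="$(curl -s -o /dev/null -w '%{http_code}' "https://${DOMAIN}${path}" 2>/dev/null || true)"
  echo "${path}: HTTP ${code:-unreachable}"
done

# A cookieless, browser-like User-Agent should hit the challenge page first.
body="$(curl -s -A 'Mozilla/5.0' "https://${DOMAIN}/" 2>/dev/null || true)"
case "$body" in
  *[Aa]nubis*) echo "challenge served" ;;
  *)           echo "no challenge marker in response (check the policy)" ;;
esac
```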
Why This Beats Faking It
Because it respects the deployment boundary. Pages stays static and cheap. The proxy layer does the dynamic work. That split is not glamorous, but it is solid.