StayFresh

Static archive of workflow research and patterns

March 2026

Anubis in Front of GitHub Pages

GitHub Pages serves static files. Anubis is a filter proxy. Those are different jobs, and pretending otherwise is how you end up debugging a redirect loop at 1:30am.

The working shape is boring and correct: GitHub Pages stays the origin, your custom domain points at infrastructure you control, and that infrastructure runs the reverse proxy plus Anubis.

Architecture

browser
  -> caddy or nginx on your server
  -> anubis
  -> github pages origin

example:
  stayfresh.dev
    -> vps
    -> anubis
    -> https://evoke4350.github.io/stayfresh.dev/

The Non-Negotiables

  - GitHub Pages keeps publishing the site exactly as it does now; the build does not change.
  - The custom domain's DNS points at infrastructure you control, not at GitHub.
  - Anubis proxies to the Pages origin URL and sends the origin's Host header.
  - Utility paths (robots.txt, feeds, .well-known) stay reachable without a challenge.

Minimal Docker Compose

This is the shortest useful version: Caddy terminates TLS, Anubis sits behind it, and GitHub Pages is the upstream.

services:
  caddy:
    image: caddy:2
    ports:
      - "80:80"
      - "443:443"
      - "443:443/udp"
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile:ro
      - caddy_data:/data
      - caddy_config:/config
    depends_on:
      - anubis

  anubis:
    image: ghcr.io/techarohq/anubis:latest
    pull_policy: always
    environment:
      BIND: ":8923"
      TARGET: "https://evoke4350.github.io/stayfresh.dev/"
      TARGET_HOST: "evoke4350.github.io"
      DIFFICULTY: "4"
      SERVE_ROBOTS_TXT: "true"
      POLICY_FNAME: "/data/cfg/botPolicy.yaml"
    volumes:
      - ./botPolicy.yaml:/data/cfg/botPolicy.yaml:ro

volumes:
  caddy_data:
  caddy_config:
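The two services talk over the default network Compose creates for the project, which is why the Caddyfile below can reach Anubis at http://anubis:8923 by service name. If you prefer that wiring to be explicit, an optional sketch (adding `networks: [edge]` under each service is assumed to go with it):

```yaml
# optional: name the shared network instead of relying on
# the project-default network Compose creates
networks:
  edge: {}
```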

Minimal Caddyfile

stayfresh.dev, www.stayfresh.dev {
  encode gzip zstd

  reverse_proxy http://anubis:8923 {
    header_up X-Real-Ip {remote_host}
    header_up X-Http-Version {http.request.proto}
  }
}
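While you are still testing the cutover, Caddy's automatic HTTPS can burn through Let's Encrypt rate limits on repeated failed issuance attempts. One way to avoid that, assuming you can tolerate untrusted certificates during the test window, is a global options block pointing at the staging CA:

```caddyfile
{
  # Let's Encrypt staging CA: relaxed rate limits, untrusted certs.
  # Remove this block before the real DNS cutover.
  acme_ca https://acme-staging-v02.api.letsencrypt.org/directory
}
```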

Minimal Policy File

Start small. Allow the obvious utility paths, then challenge browser-looking traffic.

bots:
  - name: well-known
    path_regex: ^/\.well-known/.*$
    action: ALLOW
  - name: favicon
    path_regex: ^/favicon\.ico$
    action: ALLOW
  - name: robots
    path_regex: ^/robots\.txt$
    action: ALLOW
  - name: feeds
    path_regex: ^/(rss\.xml|atom\.xml|feed\.json|sitemap\.xml)$
    action: ALLOW
  - name: generic-browser
    user_agent_regex: Mozilla
    action: CHALLENGE
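A quick way to sanity-check the ALLOW rules before shipping them is to exercise the same patterns locally. A sketch in Python, mirroring the path patterns above with the dots escaped so `.` matches literally (the `is_allowed` helper is illustrative, not part of Anubis):

```python
import re

# Path patterns mirrored from botPolicy.yaml, dots escaped so "." is literal
ALLOW_PATTERNS = [
    r"^/\.well-known/.*$",
    r"^/favicon\.ico$",
    r"^/robots\.txt$",
    r"^/(rss\.xml|atom\.xml|feed\.json|sitemap\.xml)$",
]

def is_allowed(path: str) -> bool:
    """Return True if any ALLOW rule matches the request path."""
    return any(re.match(p, path) for p in ALLOW_PATTERNS)

# Utility paths pass; ordinary pages fall through to the CHALLENGE rule.
assert is_allowed("/robots.txt")
assert is_allowed("/.well-known/security.txt")
assert not is_allowed("/index.html")
assert not is_allowed("/faviconXico")  # escaped dot: "." no longer matches any char
```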

Rollout Order

  1. Leave GitHub Pages publishing the site exactly as it does now.
  2. Bring up Caddy and Anubis on a VPS or similar box.
  3. Target the GitHub Pages origin URL from Anubis.
  4. Move the custom domain DNS to the VPS.
  5. Test the challenge flow in a browser with JavaScript enabled, and confirm the allowed feed URLs load without a challenge.
  6. Add allow rules for anything you actually want to keep working.
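Step 4 is the only DNS change involved. A sketch of the records, assuming the VPS sits at a placeholder address (203.0.113.10 is a documentation IP; substitute your own):

```dns
; zone file fragment for step 4
stayfresh.dev.       300  IN  A      203.0.113.10
www.stayfresh.dev.   300  IN  CNAME  stayfresh.dev.
```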

Why This Beats Faking It

Because it respects the deployment boundary. Pages stays static and cheap. The proxy layer does the dynamic work. That split is not glamorous, but it is solid.
