The Problem

AI coding agents (Claude Code, Cursor, etc.) run locally with access to your files, MCP servers, API credentials, and development environment. When you want to schedule recurring tasks - weekly CRM dashboards, nightly git syncs, log consolidation - you hit a fundamental tension:

Anything that can use your local tools must run on your machine. Anything that runs reliably on a schedule must NOT depend on your machine being on.

Remote schedulers (Anthropic’s RemoteTrigger, Cloudflare Workers, GitHub Actions) fire reliably but can’t access local files, MCP servers, or VS Code context. Local schedulers (cron, launchd, in-session timers) have full access but die when you close your laptop.

What Doesn’t Work

| Approach | Local access? | Survives laptop off? | Why it fails |
|---|---|---|---|
| Claude Code CronCreate (session) | Yes | No | Dies with session |
| Claude Code CronCreate (durable) | Yes | Only if CC is running | Doesn't survive restarts reliably |
| RemoteTrigger (Anthropic infra) | No | Yes | Sandboxed - no files, no MCP, no local tools |
| Cloudflare Worker alone | No | Yes | Same - can only make HTTP calls |
| macOS LaunchAgent alone | Yes | Survives reboot | No external trigger, can't handle laptop-off scenarios |

The insight: you need both halves - a reliable external scheduler AND a local executor - connected by a queue.

The Solution: GitHub as a Task Queue

Cloudflare Worker (cron, fires every hour)
    |
    +- Checks KV store for due jobs
    +- Pushes task .json to GitHub repo `tasks/` directory
    |
    v
GitHub Repository (tasks/ directory)
    |   <- task .json sits here until picked up
    |   <- if laptop is off, tasks queue up
    |   <- worker won't push duplicates (checks before pushing)
    |
    v
Local Daemon (macOS LaunchAgent, polls every 5s)
    |
    +- Picks up .json
    +- Runs: claude -p "<prompt>" --allowedTools "Bash(git:*) Read Grep"
    +- Posts result to Slack (optional)
    +- Deletes .json from GitHub
    |
    v
Result: logged locally + Slack notification
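
A minimal sketch of the daemon's half of that loop, assuming the GitHub contents API and hypothetical environment variable names (GITHUB_REPO, GITHUB_TOKEN); the real daemon also posts to Slack, tracks statusbar health, and handles errors more carefully:

```python
import os
import subprocess
import time

import requests  # third-party: pip install requests

# Illustrative configuration names; the real daemon reads these from .env
REPO = os.environ["GITHUB_REPO"]    # e.g. "you/task-queue"
TOKEN = os.environ["GITHUB_TOKEN"]  # PAT with repo scope
API = f"https://api.github.com/repos/{REPO}/contents/tasks"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def poll_once() -> None:
    resp = requests.get(API, headers=HEADERS)
    if resp.status_code == 404:  # tasks/ missing or empty: nothing to do
        return
    resp.raise_for_status()
    for entry in resp.json():
        if not entry["name"].endswith(".json"):
            continue  # non-.json files are ignored
        task = requests.get(entry["download_url"], headers=HEADERS).json()
        # Run non-interactively; "allowed_tools" is an assumed extra field
        subprocess.run(
            ["claude", "-p", task["prompt"],
             "--allowedTools", task.get("allowed_tools", "Read")],
            capture_output=True, text=True,
        )
        # Delete the task file so the commit history records it as done
        requests.delete(f"{API}/{entry['name']}", headers=HEADERS,
                        json={"message": f"done: {task['task_id']}",
                              "sha": entry["sha"]})

if __name__ == "__main__":
    while True:
        try:
            poll_once()
        except Exception as exc:  # keep polling; the statusbar surfaces errors
            print(f"poll failed: {exc}")
        time.sleep(5)
```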

Why GitHub as the Queue?

  • Already authenticated (daemon has a PAT)
  • Provides built-in audit trail (commit history shows every task pushed/deleted)
  • Stale .json files = tasks that never ran = visible failures
  • No new infrastructure - we already have the repo
  • Works with existing daemon polling loop (no architecture change)

Deduplication: The Laptop-Off Problem

If the laptop is off for 3 consecutive Monday nights, the CRM status cron fires 3 times. Without dedup, 3 task files pile up. When the laptop boots, the daemon runs all 3 sequentially - wasteful.

Solution: before pushing a new task, the worker checks GitHub tasks/ for any existing .json file with the same job ID prefix. If one exists, it skips pushing. The daemon processes the one queued task when it wakes up.
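
The check is a single directory listing against the GitHub contents API. A logic sketch in Python for readability (the worker itself runs on Cloudflare; filenames are assumed to start with the job ID):

```python
import requests

def task_already_queued(repo: str, token: str, job_id: str) -> bool:
    """True if tasks/ already holds a pending .json file for this job."""
    resp = requests.get(
        f"https://api.github.com/repos/{repo}/contents/tasks",
        headers={"Authorization": f"Bearer {token}"},
    )
    if resp.status_code == 404:  # no tasks/ directory yet, so nothing queued
        return False
    resp.raise_for_status()
    return any(
        entry["name"].startswith(job_id) and entry["name"].endswith(".json")
        for entry in resp.json()
    )
```

If this returns True, the worker skips the push; the one queued task runs when the laptop wakes.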

What I Learned

Claude Code Binary Path Breaks Silently on Updates

The daemon pointed CLAUDE_BIN at the npm/nvm install path. When Claude Code updated itself to a native binary at ~/.local/bin/claude, the old path vanished. The daemon kept polling (green statusbar icon) but every task execution failed with FileNotFoundError. There were no user-visible errors - just silent failures.

Fix: Statusbar now checks that CLAUDE_BIN in .env resolves to an existing binary on every poll cycle. Yellow icon = something’s wrong.

Lesson: Any tool that auto-updates its install location will break hardcoded paths. The statusbar health check pattern - verify your dependencies actually exist before claiming you’re healthy - applies to any daemon.
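
The check itself is tiny; a minimal sketch, run once per poll cycle (function name is illustrative), where a False result flips the statusbar to yellow:

```python
import os
import shutil

def claude_bin_healthy(claude_bin: str) -> bool:
    """Verify the configured Claude binary still exists and is executable."""
    # Resolve bare command names via PATH; absolute paths pass through.
    resolved = shutil.which(claude_bin) or claude_bin
    return os.path.isfile(resolved) and os.access(resolved, os.X_OK)
```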

Non-Interactive Claude Needs Explicit Tool Permissions

claude -p respects the same permission model as interactive Claude. Git push, file edits, and API calls all require approval. For unattended tasks, you must either:

  • Use --allowedTools "Bash(git:*) Edit Read" to whitelist specific tools per task
  • Use --dangerously-skip-permissions (only appropriate in sandboxed environments)

This means the daemon needs per-task tool allowlists. A CRM status pull only needs Bash(curl:*) Read. A git sync needs Bash(git:*). A rollout needs full permissions and therefore can’t run unattended.
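
One way to express that per-task allowlist, with illustrative job IDs and a read-only default for anything unrecognized:

```python
import subprocess

# Illustrative mapping from job ID to the minimum tools that job needs.
# Jobs needing full permissions (rollouts) are deliberately absent:
# they stay interactive instead of running unattended.
ALLOWED_TOOLS = {
    "crm-status": "Bash(curl:*) Read",
    "git-sync": "Bash(git:*)",
}

def run_unattended(job_id: str, prompt: str) -> subprocess.CompletedProcess:
    tools = ALLOWED_TOOLS.get(job_id, "Read")  # unknown jobs get read-only
    return subprocess.run(
        ["claude", "-p", prompt, "--allowedTools", tools],
        capture_output=True, text=True,
    )
```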

Stale Tasks Are a Feature

When a cron job pushes a task and the daemon can’t process it (laptop off, binary broken, timeout), the .json file stays in GitHub. This is visible - you can see which jobs didn’t run. Compare this to traditional cron where a missed job just… doesn’t happen, and nobody knows.

Components

1. Cloudflare Worker (task-scheduler)

  • Fires hourly at :05 via Cloudflare cron trigger
  • Job definitions stored in KV namespace (adding new crons = API call, no redeployment)
  • Cron matching logic handles standard 5-field expressions (sketched after this list)
  • Management API: list, create, update, delete, and manually trigger jobs
  • Protected by Bearer token
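
The matching step is small. A sketch in Python for readability (the worker runs on Cloudflare), handling * and comma-separated lists but not ranges or steps:

```python
from datetime import datetime

def field_matches(field: str, value: int) -> bool:
    """One cron field: '*' or a comma-separated list of numbers."""
    return field == "*" or value in {int(p) for p in field.split(",")}

def cron_is_due(expr: str, now: datetime) -> bool:
    """Standard 5-field expression: minute hour day month weekday."""
    minute, hour, day, month, weekday = expr.split()
    return (field_matches(minute, now.minute)
            and field_matches(hour, now.hour)
            and field_matches(day, now.day)
            and field_matches(month, now.month)
            and field_matches(weekday, now.isoweekday() % 7))  # 0 = Sunday
```

Because the trigger only fires at :05, job expressions effectively get hourly granularity; minute fields other than 5 or * never get a chance to match.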

2. Local Daemon

  • macOS LaunchAgent with RunAtLoad = true - starts automatically on boot
  • Polls GitHub tasks/ directory every 5 seconds
  • Executes tasks via claude -p (non-interactive Claude Code)
  • Deletes task files from GitHub on completion
  • Menu bar statusbar shows health: green (ok), yellow (degraded), red (stopped)

3. GitHub Repository (tasks/ directory)

  • The queue. Task files are .json with: task_id, prompt, channel_id, job_id, scheduled_at (example below)
  • Non-.json files are ignored (safe for notes, scripts)
  • Stale files = audit trail of failures
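
For concreteness, a queued task might look like this - all values are illustrative; only the field names come from the schema above, and a filename like tasks/crm-weekly-2026-01-05.json sharing the job_id prefix is what the dedup check keys on:

```json
{
  "task_id": "crm-weekly-2026-01-05T03-05",
  "prompt": "Pull this week's CRM numbers and post a status summary.",
  "channel_id": "C0123456789",
  "job_id": "crm-weekly",
  "scheduled_at": "2026-01-05T03:05:00Z"
}
```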

Architecture Decisions

Why not a database? GitHub is the queue. KV is the job store. We considered Cloudflare D1 and local SQLite, but the task volume is too low to justify either. A .json file per task is debuggable, visible, and version-controlled.

Why poll instead of webhooks? The daemon could listen for GitHub webhooks instead of polling. But polling is simpler, works behind any NAT/firewall, and the 5-second interval is fast enough for overnight tasks. If we ever need sub-second responsiveness, webhooks via Cloudflare Tunnel would be the upgrade path.

Why Cloudflare Worker instead of GitHub Actions? Both can fire on a cron. Cloudflare Workers have lower latency, a simpler execution model, and I already have the account/tooling. GitHub Actions would work too but adds YAML complexity and has a minimum billing granularity that’s wasteful for “push a 200-byte file” tasks.