I’m using GitHub as the human-in-the-loop review process for my AI-assisted publishing engine.
The agent drafts the post.
But GitHub handles the judgment layer:
• Every article becomes a pull request
• Labels define the content type
• CI check scores gate quality
• Comments become editorial feedback
• Merge means publish
• Closed means reject
It’s basically a developer-native CMS: pull requests for review, labels for structure, CI for quality checks, comments for feedback, and merge history as the audit trail.
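The judgment layer above can be sketched as a small decision function. This is a minimal illustration under my own assumptions; the `PullRequest` shape and the `type:` label convention are hypothetical, not the site's actual schema:

```typescript
// Sketch: mapping a pull request's review state to a publish decision.
// The PullRequest fields and "type:" label prefix are illustrative.
type ContentType = "article" | "guide" | "news";

interface PullRequest {
  merged: boolean;
  closed: boolean;
  labels: string[];      // e.g. ["type:article"] defines the content type
  checksPassed: boolean; // CI quality gates
}

type Decision =
  | { action: "publish"; contentType: ContentType }
  | { action: "reject" }
  | { action: "hold" };

function decide(pr: PullRequest): Decision {
  // Merge means publish, but only once CI quality checks pass.
  if (pr.merged && pr.checksPassed) {
    const label = pr.labels.find((l) => l.startsWith("type:"));
    const contentType = (label?.slice("type:".length) ?? "article") as ContentType;
    return { action: "publish", contentType };
  }
  // Closed without merging means the draft was rejected.
  if (pr.closed && !pr.merged) return { action: "reject" };
  // Everything else stays in editorial review.
  return { action: "hold" };
}
```

The point is that the whole editorial state machine already exists in the PR lifecycle; the publisher only has to read it.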
For lokerdollar.com, this means I can scale content production without removing human taste from the process.
AI writes fast.
GitHub slows it down just enough to make it useful.
I’m sharing the exact markdown → GitHub PR → Next.js publishing workflow soon.
lokerdollar.com — Remote Job Aggregator for Indonesian Talent
• Architected on Next.js 15 (App Router, RSC), TypeScript, Cloudflare Workers, D1, R2, and KV — zero-runtime-latency SEO pages powered by an ahead-of-time AI enrichment pipeline
• Designed a freemium monetization model (Guest / Free / Open Profile / Pro), a Talent Visibility opt-in, and a B2B employer discovery product
• Built an autonomous CI/CD repair pipeline using GitHub Actions + anthropics/claude-code-action@beta that detects pipeline failures and pushes fixes to PRs without manual intervention
• Runs a PostHog A/B testing loop on the hero section and conversion funnel; full observability via Sentry and Cloudflare Worker logs
• 100% of planning, implementation, and review flows through Claude Code, Codex, and Antigravity — spec-driven, agentic, and reproducible
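The four-tier model in the second bullet can be sketched as a minimum-tier entitlement check. The tier names come from the post; the gated features here are hypothetical examples, not the site's actual entitlement list:

```typescript
// Sketch of the Guest / Free / Open Profile / Pro tiers as an ordered
// ladder. The gated features below are hypothetical examples.
const TIERS = ["guest", "free", "openProfile", "pro"] as const;
type Tier = (typeof TIERS)[number];

// Each feature names the minimum tier that unlocks it.
const MINIMUM_TIER: Record<string, Tier> = {
  browseJobs: "guest",
  saveJobs: "free",
  visibleToEmployers: "openProfile", // Talent Visibility opt-in
  salaryInsights: "pro",
};

function canAccess(tier: Tier, feature: keyof typeof MINIMUM_TIER): boolean {
  // A tier unlocks a feature if it sits at or above the feature's minimum.
  return TIERS.indexOf(tier) >= TIERS.indexOf(MINIMUM_TIER[feature]);
}
```

Keeping tiers as an ordered list means adding a tier is a one-line change rather than a rewrite of every feature check.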