I built an open-source tool that cuts LLM token usage by 97% for AI browser agents
Here's the backstory: I noticed every AI agent hitting the same wall — a single webpage burns 100k-180k tokens when you send the raw DOM to an LLM. That's ~$0.02 per step on Claude, and over thousands of steps that adds up to a real problem.

So I built dom-distill — a zero-dependency TypeScript engine that runs inside page.evaluate() and distills the DOM down to only the interactive elements an LLM needs to act on.

Results on real sites:
→ GitHub: 147k tokens → 4.6k (96.8% reduction)
→ Stripe: 180k tokens → 9.4k (94.8%)
→ React Docs: 68k tokens → 6.4k (90.7%)
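To make the idea concrete, here is a minimal sketch of what DOM distillation looks like. The `ElementInfo` shape and `serialize` function are hypothetical stand-ins, not dom-distill's actual API: the point is that each interactive element collapses to one short indexed line instead of raw HTML.

```typescript
// Hypothetical distilled representation of one interactive element.
interface ElementInfo {
  index: number; // stable id the LLM can reference in its actions
  tag: string; // button, a, input, select, textarea, ...
  text: string; // trimmed visible text or accessible label
  attrs: Record<string, string>; // href, placeholder, aria-label, ...
}

// Serialize distilled elements into a compact, LLM-friendly listing:
// one short line per element instead of thousands of lines of raw DOM.
function serialize(elements: ElementInfo[]): string {
  return elements
    .map((e) => {
      const attrs = Object.entries(e.attrs)
        .map(([k, v]) => `${k}="${v}"`)
        .join(" ");
      return `[${e.index}] <${e.tag}${attrs ? " " + attrs : ""}> ${e.text}`;
    })
    .join("\n");
}

// In the browser (e.g. via Playwright's page.evaluate), a collector would
// walk the DOM and emit ElementInfo records for interactive nodes, roughly:
//
//   const elements = await page.evaluate(() =>
//     [...document.querySelectorAll("a, button, input, select, textarea")]
//       .map((el, index) => ({ index, tag: el.tagName.toLowerCase(), /* ... */ }))
//   );

const demo: ElementInfo[] = [
  { index: 0, tag: "a", text: "Sign in", attrs: { href: "/login" } },
  { index: 1, tag: "input", text: "", attrs: { placeholder: "Search" } },
];
console.log(serialize(demo));
```

A page with hundreds of interactive elements distills to a few hundred such lines, which is where the 90%+ token reductions come from.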
I also built a complete agent loop on top of it — 200 lines of code that can navigate, search, and click across real websites using any LLM. npm: https://lnkd.in/gyHpEFVh
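The agent loop described above boils down to observe → decide → act. A hedged sketch, with `observe`, `act`, and `callLLM` as hypothetical injected dependencies rather than the package's real API:

```typescript
// Actions the LLM may request, assumed here to arrive as one JSON object.
type Action =
  | { kind: "click"; index: number }
  | { kind: "type"; index: number; text: string }
  | { kind: "navigate"; url: string }
  | { kind: "done"; answer: string };

// Observe the page (distilled snapshot), ask the LLM for one action,
// execute it, and repeat until the model says it is done.
async function agentLoop(
  goal: string,
  observe: () => Promise<string>, // distilled page snapshot (~5-10k tokens)
  act: (a: Action) => Promise<void>, // executes click/type/navigate
  callLLM: (prompt: string) => Promise<string>,
  maxSteps = 20, // hard cap so a confused model cannot loop forever
): Promise<string> {
  for (let step = 0; step < maxSteps; step++) {
    const page = await observe();
    const reply = await callLLM(
      `Goal: ${goal}\nPage:\n${page}\nRespond with one JSON action.`,
    );
    const action = JSON.parse(reply) as Action;
    if (action.kind === "done") return action.answer;
    await act(action);
  }
  return "step budget exhausted";
}
```

Because `callLLM` is just a string-to-string function, any model provider can be plugged in, which is what makes a loop like this fit in a couple hundred lines.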
Abhiram: Cutting token usage by 97% is impressive. Optimizing DOM data before sending it to the LLM is a really smart approach.