Living Draft — Perception to Artifact
Living Draft is an automated pipeline that turns real-world observation into a published Figma Make site. I walked into Babe & Butcher in Charlotte wearing Ray-Ban Meta smart glasses, captured the space, and wrote creative direction on paper. Those images were saved to an iCloud folder, synced to my Mac, analyzed by Claude's vision API, turned into a Figma Make prompt, and submitted by Playwright, all autonomously, with no human at the keyboard. The result: a mobile homepage for Babe & Butcher that reflects the actual shop (its colors, its energy, its build-your-own-box experience), published and live on my phone while I was still on the patio.
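To make the hands-off loop concrete, here is a minimal sketch of the watcher that could sit on the Mac side, assuming the iCloud folder syncs to ~/Pictures/LivingDraft. The folder path and the two function names, analyzeCapture and submitToFigmaMake, are my own placeholders standing in for the Claude and Playwright steps described above, not details taken from the project.

```ts
// Minimal sketch of the Living Draft watch loop, assuming captures sync
// into ~/Pictures/LivingDraft. analyzeCapture and submitToFigmaMake are
// hypothetical stand-ins for the later pipeline steps.
import { watch } from "node:fs";
import { readdir } from "node:fs/promises";
import { join } from "node:path";
import { homedir } from "node:os";

// Implemented in the later sketches; declared here so the loop type-checks.
declare function analyzeCapture(paths: string[]): Promise<string>;
declare function submitToFigmaMake(prompt: string): Promise<void>;

const CAPTURE_DIR = join(homedir(), "Pictures", "LivingDraft"); // assumed sync target

// Debounce so a burst of freshly synced photos triggers one pipeline run.
let timer: NodeJS.Timeout | null = null;

watch(CAPTURE_DIR, () => {
  if (timer) clearTimeout(timer);
  timer = setTimeout(runPipeline, 5_000);
});

async function runPipeline(): Promise<void> {
  // Assumes captures arrive as JPEG/PNG; HEIC would need conversion first.
  const images = (await readdir(CAPTURE_DIR)).filter((f) => /\.(jpe?g|png)$/i.test(f));
  if (images.length === 0) return;

  const prompt = await analyzeCapture(images.map((f) => join(CAPTURE_DIR, f))); // Claude vision step
  await submitToFigmaMake(prompt); // Playwright step
}
```

The debounce matters because iCloud tends to sync a capture session as a trickle of files; waiting for a quiet window means the vision step sees the whole shoot at once.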
The hero interaction invites you to tap and build your own charcuterie board, ingredients accumulating the way ideas accumulate into form. That's the core metaphor of Living Draft: perception layers into artifact. Published Make site: https://hut-snack-14982944.figma.site
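As an illustration of that accumulation, a stripped-down version of the interaction could look like the React sketch below. The ingredient names and markup are invented for the example, not lifted from the published site.

```tsx
// Sketch of the build-your-own-board hero, assuming a React setup like the
// one Figma Make generates. Ingredient names are illustrative only.
import { useState } from "react";

const INGREDIENTS = ["prosciutto", "manchego", "fig jam", "marcona almonds"];

export function BoardHero() {
  const [board, setBoard] = useState<string[]>([]);

  return (
    <section>
      <h1>Build your board</h1>
      {INGREDIENTS.map((item) => (
        <button key={item} onClick={() => setBoard((b) => [...b, item])}>
          {item}
        </button>
      ))}
      {/* Each tap layers another ingredient onto the board, echoing the
          perception-layers-into-artifact metaphor. */}
      <ul>
        {board.map((item, i) => (
          <li key={i}>{item}</li>
        ))}
      </ul>
    </section>
  );
}
```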

From my LinkedIn post:

What if design started from what you see, not a blank canvas? For the Figma Makeathon 2026, I built Living Draft: a perception-to-artifact pipeline that turns real-world observation into a published website, autonomously.

Here's how it works. I walked into Babe & Butcher in Charlotte wearing Ray-Ban Meta smart glasses. I captured the space: signage, display case, wall colors, the energy of the room. I wrote creative direction on a sheet of paper. Then I saved those photos to a folder on my phone.

An AI agent handled everything from there. Claude's vision API analyzed the photos and my handwritten notes and generated a design prompt grounded in what it saw: the actual palette, textures, and vibe of the shop. Playwright submitted the prompt to Figma Make, and the site built itself and published, all with no human at the keyboard (both steps are sketched in code after this post). By the time I sat back down on the patio, a working homepage was waiting on my phone: a design that looked like the space I was standing in.

The bigger idea: Living Draft removes the blank-canvas problem entirely. Instead of learning complex tools or staring at an empty screen, you just observe a space and let an AI agent draft the first version. Then you refine it. The tool learning curve disappears. Analysis paralysis disappears. You start from perception, not from nothing.

This is early. The pipeline is rough. But the loop works: see something → capture it → a design emerges. That feels like where creative tooling is heading.

Published site: https://lnkd.in/e73xBkXy
Video: https://lnkd.in/eKUjcJT7
Figma Community: https://lnkd.in/ext5gS6n

Built with Ray-Ban Meta Smart Glasses, Claude by Anthropic, Figma Make, and Playwright.

#FigmaMakeathon #FigmaMake #Figma #LivingDraft #AIDesign #GenerativeDesign #AgenticAI #UXDesign #ProductDesign #SmartGlasses #DesignAutomation #Anthropic #Claude
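The vision step is the part that grounds the design in the real space. A minimal sketch of it, fleshing out the analyzeCapture stub from the earlier watcher, might look like the following; the model name, prompt wording, and JPEG assumption are mine, not the project's.

```ts
// Sketch of the vision step: send the captured photos (including the
// handwritten direction sheet) to Claude and ask for a Figma Make prompt.
import Anthropic from "@anthropic-ai/sdk";
import { readFile } from "node:fs/promises";

const client = new Anthropic(); // reads ANTHROPIC_API_KEY from the environment

export async function analyzeCapture(paths: string[]): Promise<string> {
  // Each capture is inlined as a base64 image block.
  const images = await Promise.all(
    paths.map(async (p) => ({
      type: "image" as const,
      source: {
        type: "base64" as const,
        media_type: "image/jpeg" as const, // assumes captures were converted to JPEG
        data: (await readFile(p)).toString("base64"),
      },
    })),
  );

  const message = await client.messages.create({
    model: "claude-sonnet-4-20250514", // model choice is an assumption
    max_tokens: 1024,
    messages: [
      {
        role: "user",
        content: [
          ...images,
          {
            type: "text",
            text:
              "These photos show a shop interior and a handwritten sheet of creative direction. " +
              "Write one Figma Make prompt for a mobile homepage grounded in the palette, " +
              "textures, and mood you actually see.",
          },
        ],
      },
    ],
  });

  // Return the first text block as the generated Figma Make prompt.
  const first = message.content[0];
  return first.type === "text" ? first.text : "";
}
```

Sending the handwritten sheet alongside the room photos lets one call merge creative direction with observed palette and texture, which is why the prompt asks for what the model "actually sees".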

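The submission step could then be a short Playwright script, filling in the submitToFigmaMake stub. This sketch assumes Figma Make's prompt box is reachable as a textbox in an already logged-in browser profile; the URL, selector, and publish wait are guesses about the page, not confirmed details.

```ts
// Sketch of the hands-off submission step. Selectors and timing are
// assumptions; the real Figma Make page structure may differ.
import { chromium } from "playwright";

export async function submitToFigmaMake(prompt: string): Promise<void> {
  // Reuse a persistent profile so the Figma session stays logged in.
  const context = await chromium.launchPersistentContext("./figma-profile", {
    headless: false,
  });
  const page = await context.newPage();

  await page.goto("https://www.figma.com/make/"); // assumed entry point
  await page.getByRole("textbox").first().fill(prompt);
  await page.keyboard.press("Enter");

  // Give Make time to build and publish before closing the browser.
  await page.waitForTimeout(120_000);
  await context.close();
}
```

A persistent context sidesteps automating the login flow itself, which keeps the agent's job to exactly one action: paste the prompt and let Make run.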