Brandi Kinard
What if design started from what you see — not a blank canvas?

For the Figma Makeathon 2026, I built Living Draft: a perception-to-artifact pipeline that turns real-world observation into a published website, autonomously.

Here’s how it works: I walked into Babe & Butcher in Charlotte wearing Ray-Ban Meta smart glasses. I captured the space — signage, display case, wall colors, the energy of the room. I wrote creative direction on a sheet of paper. Then I saved those photos to a folder on my phone.

An AI agent handled everything from there. Claude’s vision API analyzed the photos and my handwritten notes. It generated a design prompt grounded in what it saw — the actual palette, textures, and vibe of the shop. Playwright submitted the prompt to Figma Make. The site built itself and published — all with no human at the keyboard. By the time I sat back down on the patio, a working homepage was waiting on my phone. A design that looked like the space I was standing in.

The bigger idea: Living Draft removes the blank-canvas problem entirely. Instead of learning complex tools or staring at an empty screen, you observe a space and let an AI agent draft the first version. Then you refine it. The tool learning curve disappears. Analysis paralysis disappears. You start from perception, not from nothing.

This is early. The pipeline is rough. But the loop works: see something → capture it → a design emerges. That feels like where creative tooling is heading.

Published site: https://lnkd.in/e73xBkXy
Video: https://lnkd.in/eKUjcJT7
Figma Community: https://lnkd.in/ext5gS6n

Built with Ray-Ban Meta Smart Glasses, Claude by Anthropic, Figma Make, and Playwright.

#FigmaMakeathon #FigmaMake #Figma #LivingDraft #AIDesign #GenerativeDesign #AgenticAI #UXDesign #ProductDesign #SmartGlasses #DesignAutomation #Anthropic #Claude
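For the curious, the "photos → design prompt" step can be sketched roughly like this. This is a minimal illustration assuming the Anthropic Python SDK's Messages API with image content blocks; the folder name, model id, helper function, and prompt wording are my assumptions for the sketch, not the actual Living Draft code.

```python
# Sketch: package shop photos + handwritten creative direction into one
# Claude vision request, then ask for a grounded website design prompt.
import base64
from pathlib import Path

def build_vision_message(photo_paths, instruction):
    """Bundle images (base64 JPEG blocks) and a text instruction
    into a single user message in the Messages API shape."""
    blocks = []
    for p in photo_paths:
        blocks.append({
            "type": "image",
            "source": {
                "type": "base64",
                "media_type": "image/jpeg",
                "data": base64.b64encode(Path(p).read_bytes()).decode(),
            },
        })
    blocks.append({"type": "text", "text": instruction})
    return {"role": "user", "content": blocks}

# Hypothetical usage (requires the `anthropic` package and an API key):
# import anthropic
# msg = build_vision_message(
#     sorted(Path("shop_photos").glob("*.jpg")),
#     "Describe the palette, textures, and mood of this space, then write "
#     "a one-paragraph website design prompt grounded in what you see.",
# )
# client = anthropic.Anthropic()
# reply = client.messages.create(
#     model="claude-sonnet-4-5", max_tokens=1024, messages=[msg]
# )
# design_prompt = reply.content[0].text  # handed off to Playwright next
```

From there, a Playwright script would open Figma Make, paste `design_prompt`, and trigger the build — the part of the loop that keeps a human out of the keyboard entirely.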