Imagine a "daily pal" for AR or desk tablets—managing tasks and routines through natural gestures instead of a mouse. For #FigmaMakeathon, I moved beyond static design to build a live-synced prototype proving the potential of spatial interaction.
I stepped outside the standard use cases for #FigmaMake by wiring it to a custom computer vision layer. A custom MediaPipe script translates hand gestures into data, Supabase holds the system state in real time, and Figma Make consumes that data to trigger complex, animated changes.
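The heart of that pipeline is the step where raw hand landmarks become a named gesture that can be written to Supabase as shared state. Here's a minimal sketch of what that classification step could look like, assuming MediaPipe Hands' 21 normalized (x, y) landmarks (where y grows downward); the landmark indices and gesture labels are illustrative, not my exact script:

```python
# Sketch of a gesture classifier over MediaPipe Hands landmarks.
# Assumes 21 normalized (x, y) tuples; y increases toward the bottom
# of the frame, so "above" means a smaller y value.

FINGERTIPS = (8, 12, 16, 20)  # index, middle, ring, pinky tips
PIP_JOINTS = (6, 10, 14, 18)  # the middle knuckle of each finger

def classify_gesture(landmarks):
    """Map 21 (x, y) landmarks to a gesture label.

    A finger counts as extended when its tip sits above its PIP joint.
    The returned label is what a sync layer could upsert into a
    Supabase table so the Figma Make prototype reacts to it.
    """
    extended = sum(
        1
        for tip, pip in zip(FINGERTIPS, PIP_JOINTS)
        if landmarks[tip][1] < landmarks[pip][1]
    )
    if extended == 4:
        return "open_palm"  # e.g. open the task dashboard
    if extended == 0:
        return "fist"       # e.g. dismiss the current panel
    return "unknown"
```

In a full pipeline, this label would be upserted to a Supabase table on each change, and the Figma Make prototype would subscribe to that table over Realtime to trigger its animations.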
This experiment reframes prototyping for AR/VR: pairing the design tools we love with entirely new ways to interact.
Prototype
https://www.figma.com/make/hw3WKbK2OGJ9tjY5znn1AS/Design-Jarvis-Tablet-Dashboard?t=9wnLp7WiOJ7bUOAP-1 (you won't be able to try it without my script; see the video)
And yes, I don't code.