Carlos Leon's Work | ContraWork by Carlos Leon

Art Direction, AI-Enhanced Design, Digital Campaigns


Mexico City. Subway platform. Puma energy.

I wanted to explore what happens when sportswear storytelling meets cinematic fashion direction. This short AI-generated piece was created as an art direction experiment, treating a subway platform like a fashion runway. The goal wasn't just to generate visuals, but to direct the scene like a real campaign:

• framing and symmetry
• character movement and pacing
• lighting and mood
• wardrobe consistency
• editorial composition

AI becomes far more powerful when it's treated less like a generator and more like a creative production tool.
What happens when you approach action filmmaking as a systems design challenge?

Ghost of the Andes is an AI-generated action short built entirely in Higgsfield, focused on character consistency, physics-aware motion, environmental depth, and cinematic camera language. Instead of relying on spectacle, the goal was control:

• Maintaining identity continuity across dynamic combat sequences
• Calibrating motion timing for grounded biomechanics
• Designing POV transitions (drone → binocular → handheld)
• Layering volumetric fog, dust physics, and depth-of-field realism

This project wasn't about replacing filmmaking; it was about exploring how generative tools can extend creative direction, stunt choreography, and visual storytelling into new territory. AI is not the shortcut. It's the new production studio.
Exploring contrast in motion. A high-speed urban motorcycle sequence. Katana on the back. A kitten in a bubble backpack. Edge meets innocence.

Neon reflections, controlled pacing, subtle character beats, all set against a haunting cinematic western ballad with a female vocal. Country undertones layered into a modern city night created a tension that felt unexpectedly right.

Sometimes the most engaging creative direction comes from contrast:

• Speed × softness
• Steel × fur
• Urban × western

Fun fact: the co-pilot completely stole the frame.
A cinematic study in scale, motion, and atmosphere under extreme conditions. Every decision, from camera distance to pacing, was intentional, with AI used to accelerate exploration across image, sound, and voice.

Built using:

• Higgsfield AI (https://www.linkedin.com/company/higgsfield/) and Nano Banana Pro for visuals
• Kling AI (https://www.linkedin.com/company/kling-ai-api/) for video
• Suno (https://www.linkedin.com/company/sunomusic/) for music
• Noiz.ai (http://Noiz.ai) for narration

#CreativeDirection #VisualStorytelling #GenerativeAI #AICreative #CreativeAI #CinematicDesign #ConceptDevelopment #FutureOfCreativity #CreativeTechnology #DesignThinking #AIWorkflow
A still image is no longer the final frame. It's the starting point.

I've been experimenting with a new AI-driven storytelling pipeline, blending image, motion, sound, and voice into a single creative workflow.

❄️ FLORA (https://www.linkedin.com/company/floraai/) for scene building and visual composition
🎬 Hedra (https://www.linkedin.com/company/hedra-labs/) for video rendering exploration
🎙️ NOIZ (https://www.linkedin.com/company/noiz-be/) for expressive narration
🎵 Suno (https://www.linkedin.com/company/sunomusic/) for original background music

What excites me most isn't automation, it's direction. Shaping emotion. Designing camera language. Crafting atmosphere.

AI becomes the studio. The story remains human.
When culture meets motion, AI becomes a storytelling tool, not a shortcut. I wanted to see how far believable generative cinematography could go:

• Real wind on fabric.
• Natural fur movement.
• Consistent daylight.
• Subtle camera drift.
• A meteor streak passing overhead: quiet, cinematic, grounded.

This piece was built using FLORA for scene generation and Google Gemini Veo 3 for motion, then refined through an editorial film workflow. The end of the video shows the full process.

The goal isn't spectacle. It's authenticity, powered by new tools.

#CreativeDirection #GenerativeAI #FloraAI #Veo3 #CinematicAI #AIWorkflow #MotionDesign #DigitalStorytelling #FashionFilm #CulturalNarratives #FutureOfCreative #AIForCreatives
POV: Between rounds… but it's all AI 🥊

Testing how far AI filmmaking can go when motion, simulation, rendering, and sound work together like a real production pipeline, just without the film crew.

Workflow breakdown:

• Motion performance captured in Weavy (https://www.linkedin.com/company/weavy-ai/)
• Motion control & physical simulation in Kling AI (https://www.linkedin.com/company/kling-ai-api/) 2.6 Pro
• Photorealistic rendering in Nano Banana
• Original soundtrack generated in Suno (https://www.linkedin.com/company/sunomusic/)

No actor. No camera rig. No studio lighting.

At the end of the video, I included my full workflow for anyone curious about how these tools connect into a cinematic pipeline. AI filmmaking is no longer about shortcuts; it's becoming a new creative craft.
Winter action, AI edition ❄️ This was a quick motion experiment that turned into a full workflow test. Built the scene in Weavy (https://www.linkedin.com/company/weavy-ai/), refined stills with Nano Banana, then used Luma AI (https://www.linkedin.com/company/lumalabsai/) Reframe to resize a 1920×1080 clip into Story format without killing the moment. Same idea, different framing. Still chaotic. Still fun. Loving how these tools are less about “making things” now and more about reworking ideas across formats — which is honestly where most real creative work lives.
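The 1920×1080 → Story reframe above is, at its core, aspect-ratio geometry: a vertical 9:16 window cropped out of a 16:9 frame. A minimal sketch of that crop math, assuming a simple center (or subject-tracked) crop; this is generic geometry, not Luma Reframe's actual algorithm, and `subject_x` is a hypothetical stand-in for whatever point the tool tracks:

```python
def story_crop(width, height, target_ratio=9 / 16, subject_x=None):
    """Return (x, y, w, h) of a vertical Story-format crop window
    inside a landscape frame. Keeps full height, trims width."""
    crop_w = round(height * target_ratio)  # e.g. 1080 * 9/16 -> 608 px wide
    crop_h = height
    # Center on the frame midpoint unless a subject position is given.
    cx = width / 2 if subject_x is None else subject_x
    # Clamp so the window never leaves the frame.
    x = int(min(max(cx - crop_w / 2, 0), width - crop_w))
    return x, 0, crop_w, crop_h

print(story_crop(1920, 1080))  # centered crop: (656, 0, 608, 1080)
```

Roughly two thirds of the horizontal image is discarded, which is why the framing has to be rethought per format rather than just resized.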
This video started with real movement. I motion-captured a bachata dancer to preserve authentic rhythm, weight transfer, and timing—then paired that data with @FLORA × Kling motion control to translate the performance into believable, cinematic motion. The character image was generated using Nano Banana, allowing visual flexibility while keeping the movement grounded in a real human performance. I really like using Kling motion control because it helps characters come alive and feel more real—especially when the motion starts with an actual dancer rather than a synthetic loop. When choreography leads and AI follows, the result feels human. Note: Bachata music and dancer references are credited to their respective owners. The AI-generated work is my own.
Exploring how AI can reshape brand identity work. Experimented in @FLORA to restyle well-known logos using a soft iridescent aesthetic inspired by the new Apple TV logo. The goal was to study how light, materiality, and gradient behaviour can shift the tone of a brand while still preserving recognizability. AI continues to open up interesting pathways for rapid visual exploration, especially when testing alternate brand treatments, motion concepts, and material studies in seconds rather than hours.
Here’s a playful behind-the-scenes look at how I built this little moment — a kid on a tricycle, styled using an H&M (https://www.linkedin.com/company/h&m/) kids’ sweater and jeans as my clothing references 👕🚲✨ I created the whole scene in FLORA (https://www.linkedin.com/company/floraai/), mixing outfit inspo, pose shaping, and a clear visual storyline to unify the whole scene. Such a fun mix of fashion reference + AI creativity.
Experimenting with star fighter concepts in @FLORA today ✨ Crazy how a tiny toy render can look this real.
I explored how generative AI can capture emotion in small, human moments. I built this mother–daughter scene using FLORA (https://www.linkedin.com/company/floraai/) and Seedream 4.5 — focusing on subtle gestures, natural lighting, and storytelling details that make a moment feel real. What surprised me most was how well these tools handled warmth and authenticity: the excitement at the window, the turn, the hug. Even in an AI-generated environment, emotion can still be the anchor. Always experimenting. Always learning. Always creating.
I’m testing FLORA × Kling motion control to see if AI can feel performance, not just move through it.

A drummer isn’t a loop. It’s tension → release → rebound → breath.

Using motion control to sculpt:

• wrist snap + stick recoil
• shoulder drag between beats
• micro-pauses that create groove

This feels less like animation and more like choreographing energy from a single frame. Motion as material. Timing as language.

Always experimenting. Always learning. Always creating.