Ship Before You Hype #6
I thought AI would cut my design workload. That belief lasted until I actually used the tools.
Figma MCP to generate canvases from code. Google Stitch for prompt-to-UI. Pencil for rapid prototypes. Figma Make for iterative generation.
I tried them all. My design workload has not decreased.
Fixing What AI Gave Me
I generated a card design for a luxury watch brand using Figma MCP.
The output looked passable. Layout intact. Colors reasonable. Then I opened the Layers panel.
The container structure made no sense in Figma.
As CSS flexbox, the structure was correct. But Figma is not a code editor. The moment I tried to nudge text by a few pixels, the entire Auto Layout nesting collapsed. I rebuilt the whole thing from scratch.
The font came out as Inter. For a luxury watch brand, Inter screams “AI made this.” Swapped it for Cormorant Garamond.
AI-generated layers are not built for human hands.
The Order Is Reversed
Designers see the picture before they find the words.
Font weight. Whitespace rhythm. Color temperature. Already visible before any of it can be described. You move your hands and adjust until it clicks.
AI works the other way around. Words first. Picture second.
“Dark tone, Cormorant Garamond, watch image on right, rounded card.” Type that in, something reasonable comes out. Granular prompting probably gets you close.
But that means translating the finished image in your head into language before AI can even start. If you can do that, you are already faster on your own.
There is a crossover point where the effort to verbalize exceeds the effort to just build the thing.
“Isn’t this one cooler?”
Every designer says it daily. Two options side by side. You pick one instantly. Ask why, and you stall.
In the time it takes to explain that “this one” to an AI, a fast designer would have already built it.
Intuition does not translate to language. Design starts with a picture. AI starts with a prompt. That reversal is the root of every frustration.
Figma’s Team Does Not See This
Figma keeps shipping updates. AI generation. Code integration. Plugin ecosystem.
The development team probably does not use Figma the way working designers do.
Code-to-Figma exists. Figma-to-code exists. But “picture in my head, straight to canvas, no words required” does not exist.
If they moved toward reading hand-drawn sketches and converting them to structured layers, that would be a step. For now, the entry point is still a text box.
I have made peace with this. You have to, or you lose your mind.
AI Is Not the Problem
I am not anti-AI.
Code implementation speed has genuinely changed. Tasks that can be verbalized accelerate with AI. Even this blog post — structure and research were done with AI.
The problem is that the core of design lives in territory that resists language.
Color tuning by fractions. Layout differences measured in single pixels. The overall feeling when everything sits right. If someone can express that in a prompt, I want to meet them.
Design workload has not decreased. AI handles the periphery better, which frees up more time for actual design. That is a real benefit. But the expectation that AI shrinks the design work itself has not materialized.
The hands keep moving.