Design Thinking Assessment: Finding Problem-Solving Graphic Designers

Randall Carter

When I first started freelancing as a graphic designer, “design thinking” sounded like one of those buzzwords that lived in pitch decks and design conferences. But over time, I realized it shows up in the quiet moments—like when a client can’t explain what they’re trying to solve, and I have to figure it out anyway.
Now, design thinking is baked into how I approach every project. Whether I’m designing a landing page or reworking a brand identity, the way I listen, explore, and test ideas with clients comes straight out of that process.
The more I’ve worked with teams—especially remotely—the more I’ve noticed how design thinking isn’t just about creativity. It’s a way of working that helps people make better decisions, faster, with fewer assumptions.
It’s not always about finding the “right” answer. Sometimes it’s just about asking better questions.

Why Design Thinking Matters for Graphic Designers

Design thinking centers every decision around the person on the other side of the screen. That could be someone tapping through a mobile app, scanning a restaurant menu, or trying to understand a nonprofit’s message in five seconds or less. These are the scenarios where mobile designers for Media and Entertainment can improve user engagement.
Empathy is the first step—understanding what users feel, not just what they say. That’s often where the best ideas come from, even if they’re not the flashiest.
Testing isn’t just something that happens at the end. In practice, I’m constantly putting ideas in front of clients and users, even when they’re half-formed. It’s about learning what works before polishing what doesn’t.
Clients rarely ask for “design thinking” by name. But they always ask for solutions that work, visuals that connect, and designs that feel intuitive. That’s where this mindset shows up—quietly, consistently, and usually mid-draft.

“Design thinking is less about being clever and more about being curious.”

Graphic designers who work this way don’t just make things look good. They make things that work—for real people, in real situations, which is why problem-solving graphic designers are so critical.

5 Steps to Assess Design Thinking Skills

A structured process is necessary to evaluate whether a graphic designer applies design thinking consistently. Without a clear sequence, soft skills like empathy, or habits like iteration, are hard to spot. This framework uses five steps that mirror the design thinking model and can be tracked with specific outputs.
Success metrics for each step often include clarity of deliverables, logic of decisions, and adaptability to feedback. These are less about volume of work and more about how the designer solves problems in context.

1. Define Key Criteria

Start by identifying what kind of problem the designer is expected to solve. This includes understanding the target audience, business goals, brand constraints, and platform requirements.
Strong candidates articulate the problem in their own words before jumping into visuals. They ask clarifying questions, reframe vague briefs, and spot conflicting objectives early.
If a designer starts designing before talking, they’re probably skipping this step.

2. Measure Empathy

Empathy shows up in how a designer incorporates user perspectives into their process. This can be evaluated through user interviews, persona development, or storytelling exercises.
Look for designers who build user personas that reflect actual needs—not just demographic data. Their ability to mentally simulate user behavior helps them anticipate pain points before users experience them.
Empathetic designers also tend to explain why certain design choices solve user problems, not just why they “look good.”

3. Evaluate Ideation

This step focuses on how many different directions a designer explores before narrowing down. Mood boards, thumbnail sketches, and brainstorming notes are useful here.
Quantity of ideas is less important than variety. A designer who offers five variations that all look the same is ideating narrowly. A designer who pushes into unexpected territory—even if it doesn't all work—is showing creative range.
🧠 Ideation velocity—how quickly they generate and develop ideas—is another measurable trait in this phase.

4. Check Prototyping

Prototypes can be static mockups, low-fidelity wireframes, or interactive demos. The key is whether the designer uses them to test ideas early—not just polish final work. For immersive 3D concepts, some teams consider Unity experts to refine user interactions at an early stage.
Prototyping that includes real copy, click-through interactions, or sample user flows shows the designer is thinking past aesthetics. It also reveals how they handle constraints like layout, hierarchy, and accessibility.
A pretty prototype with broken logic won’t get far in testing.

5. Analyze Testing Feedback

This final step looks at how the designer interprets and integrates user feedback. It’s not about agreeing with every comment—it’s about spotting patterns and adapting the design accordingly.
Useful indicators include feedback summaries, revision logs, and insight documentation. Designers who resist revision or explain away user complaints tend to struggle with this phase.
Some teams also track how many iterations a designer makes after initial testing and whether those changes improved usability metrics.
✏️ Feedback isn’t the end of the process—it’s another form of research.
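One way to make iteration tracking concrete is to check, revision by revision, whether a tracked usability metric actually moved. The sketch below is a minimal illustration: the version records and the `task_success_rate` metric are hypothetical, not from any real tool.

```python
# Hypothetical iteration history; version records and metric are illustrative.
iterations = [
    {"version": 1, "task_success_rate": 0.62},
    {"version": 2, "task_success_rate": 0.71},
    {"version": 3, "task_success_rate": 0.69},
]

def improving_revisions(history, metric):
    """Count revisions where the metric improved over the previous version."""
    improved = 0
    for prev, curr in zip(history, history[1:]):
        if curr[metric] > prev[metric]:
            improved += 1
    return improved

print(improving_revisions(iterations, "task_success_rate"))  # → 1
```

A count like this says nothing on its own, but paired with revision logs it helps distinguish designers who iterate toward a goal from those who simply produce more versions.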

Tools to Evaluate Problem-Solving Abilities

Evaluating whether a designer can solve real-world problems through design thinking often requires more than reviewing a portfolio. Tools that track collaboration, decision-making, and iteration over time give clearer insight into how designers actually work.
Figma, Miro, and Notion are commonly used to observe how designers organize thoughts, respond to feedback, and build on input from others. In Figma, version history and comment threads provide a timeline of how a design evolved. For social good projects, Figma freelancers for Social Impact often harness these features to document iterative changes and gather real-time feedback. Miro boards often reveal how a designer clusters ideas or maps user flows. In Notion, documentation habits—like how mood boards, client notes, and research are linked—can show their thought process.
Some platforms now use plugin analytics to capture activity patterns. For example, plugins that log time spent on wireframes versus prototypes or track how often a designer revisits early ideas can help measure iteration depth.
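As a rough sketch of what such analytics could compute, the snippet below counts how often a version log steps back to an earlier phase. The log format, phase names, and their ordering are assumptions for illustration, not any real plugin's output.

```python
PHASES = ["wireframe", "prototype", "polish"]  # assumed phase ordering

# Illustrative version log; each entry records which phase was touched.
version_log = [
    {"phase": "wireframe"}, {"phase": "wireframe"},
    {"phase": "prototype"}, {"phase": "wireframe"},
    {"phase": "prototype"}, {"phase": "polish"},
]

def backward_steps(log):
    """Count transitions that move back to an earlier phase, a rough proxy
    for how often early ideas are revisited."""
    order = {p: i for i, p in enumerate(PHASES)}
    return sum(
        1 for prev, curr in zip(log, log[1:])
        if order[curr["phase"]] < order[prev["phase"]]
    )

print(backward_steps(version_log))  # → 1
```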

“If a designer’s file history looks like a straight line, it probably wasn’t design thinking.”

Task-based challenges offer another way to assess problem-solving. These can be 1-2 hour briefs simulating real client conditions, such as designing a responsive banner for a mobile campaign or reworking a cluttered homepage with accessibility targets.
Challenges that include constraints—like tight turnarounds, limited colors, or multilingual content—force designers to prioritize and justify decisions. These exercises don’t measure polish. They measure how well a designer can respond to ambiguity.
Collaboration-focused tasks involve shared files, async feedback, or live brainstorm sessions. These are structured less for final results and more for observing how designers communicate ideas, ask questions, and adapt based on team input.
💡 Even small choices—like renaming layers, organizing assets, or tagging teammates—can show how a designer thinks about other people using their work.
Some teams record short walkthroughs where designers talk through their decisions. These clips often reveal whether their process is reactive or reflective.
Watching someone design is more useful than reading about how they “think” through design.

Challenges to Hiring Skilled Designers

Hiring graphic designers based on static portfolios alone creates a gap between presentation and actual performance. Portfolios often reflect polished final outcomes, not the thinking, revisions, or problem-solving that led to them. It’s difficult to assess how those visuals were shaped by user needs, client constraints, or iterative feedback.
In live settings, many designers struggle to articulate their design decisions in real time. Some rely heavily on referencing visual trends or personal taste without tying choices back to user research or project goals. Others may default to passive agreement during critique sessions, indicating limited internal frameworks for evaluating their own work.
There’s also a mismatch between visual quality and cognitive depth. A designer with strong aesthetics might not demonstrate fluency in defining problem spaces or generating diverse concepts. Without observing their process, it’s unclear whether their work emerged from structured exploration or instinctive execution.

“A good portfolio shows what they made. A good assessment shows how they think.”

Another challenge is cultural context. Designers unfamiliar with a target audience's cultural norms may unintentionally skew messaging or usability. For example, gesture symbols, color associations, or reading order can vary significantly across regions. If cultural context isn’t considered during research or wireframing, the final design may alienate or confuse the intended users.
Some assessments overlook this by focusing only on visual polish or tool proficiency. But cultural fluency is a problem-solving skill—it requires empathy, context awareness, and the ability to adjust assumptions. In user research, this shows up when interviews are interpreted through the designer’s lens instead of the user’s lived experience.
When hiring internationally or remotely, these issues are harder to detect without structured assessments. Some clients specifically look for English-speaking graphic designers to ensure alignment with global audiences or brand messaging. Even detailed case studies might omit how feedback was interpreted or whether the designer questioned the brief. Without this transparency, it’s difficult to distinguish between polished executors and reflective problem-solvers.
🧩 This is why assessments that include scenario-based tasks, live collaboration, and user-centered constraints are more effective. They surface thinking patterns that portfolios often hide.

Tips for a Strong Assessment Process

Rubrics work best when broken into observable behaviors. Scoring should be specific—like tracking how many ideas a designer generates per brief, or how often they revise based on feedback. Avoid vague traits like “creativity” or “strong visuals” unless each is clearly defined with examples.
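A rubric like this can live as a small structured record. The sketch below is illustrative only: the criteria, weights, and 1-5 scale are assumptions, but it shows how observable behaviors can roll up into one comparable score.

```python
# Hypothetical rubric: each criterion is an observable behavior with a
# weight and a 1-5 rating. Names and numbers are illustrative assumptions.
rubric = {
    "ideas_per_brief":         {"weight": 0.25, "score": 4},  # distinct concepts explored
    "revisions_from_feedback": {"weight": 0.35, "score": 3},  # changes traceable to input
    "clarifying_questions":    {"weight": 0.20, "score": 5},  # questions asked before designing
    "decision_rationale":      {"weight": 0.20, "score": 3},  # choices tied to user needs
}

def weighted_score(rubric, max_score=5):
    """Return a 0-100 score from weighted per-behavior ratings."""
    total = sum(c["weight"] * c["score"] for c in rubric.values())
    return round(100 * total / max_score, 1)

print(weighted_score(rubric))  # → 73.0
```

Weighting feedback responsiveness highest reflects the article's point that revision behavior reveals more than any single deliverable; teams would adjust the weights to their own priorities.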
Pilot tasks help reveal how a designer thinks under realistic constraints. These are short, focused assignments—such as redesigning a product card with limited space, or adapting a landing page for accessibility. The task should simulate a real project’s ambiguity, not just test technical tools.
Continuous feedback loops are more reliable than one-time reviews. Instead of judging only the final design, assess how the designer responds to input over time. This includes how they revise, what they prioritize, and whether they clarify conflicting feedback.

“One round of edits tells you what they can do. Three rounds tell you how they think.”

Feedback can come from multiple sources—not just clients or managers. Peer critique, user reactions, and even self-reflection logs give a fuller picture of how a designer navigates trade-offs.
When working with freelancers, platform structure also affects how collaboration plays out. Commission-free systems like Contra simplify this by removing middle layers, making it easier to share files, exchange feedback, and iterate quickly without payment delays or platform restrictions.
This structure supports async workflows, especially when designers and clients are in different time zones. Quick turnarounds are easier when no one’s waiting on an external approval to release funds or unlock deliverables.
“A smooth feedback loop isn’t about speed—it’s about reducing friction.” 🧠
Rubrics, pilot tasks, and feedback history can all be stored in shared docs or tools like Notion, Google Sheets, or Airtable. Consistency matters more than complexity—what’s important is that everyone knows how the assessment works, what’s being measured, and how decisions are made.
Scoring systems that reward progress over perfection tend to reflect real-world collaboration better. For example: tracking how a designer adapted to unclear briefs, not just whether their final layout was pixel-perfect.
Commission-free models make it easier to run these kinds of assessments without artificially limiting scope or time. Freelancers are more likely to engage in testing and iteration when the structure respects their time, process, and pay.

Frequently Asked Questions about Design Thinking Assessment

Even with the rise of structured frameworks and tools, design thinking assessments still prompt regular questions. Many stem from inconsistent definitions, unclear evaluation standards, and the difficulty of observing cognition in a visual process. These responses address the most common areas of confusion.

What are the 5 P's of design thinking?

The "5 P's" is not a universal framework, but some practitioners use the term to describe stages or traits within design thinking. One interpretation aligns with commonly accepted models:
Problem: Understanding the user’s challenge or unmet need.
People: Empathizing with users through interviews, observations, and persona development.
Possibilities: Generating ideas and exploring potential solutions.
Prototypes: Creating drafts, mockups, or interactive samples.
Proof: Testing and validating solutions through feedback and iteration.
This version echoes the Stanford d.school’s five-stage model—Empathize, Define, Ideate, Prototype, Test—widely cited in education and practice (source).
Some teams swap in different “P’s” depending on their workflow. The order matters less than the thinking behind it.

How do you apply design thinking to graphic design?

Each design thinking phase maps directly onto visual decision-making. In the empathize stage, designers conduct user research that shapes tone, content hierarchy, and accessibility. In the define stage, they translate business goals and user needs into creative briefs.
During ideation, visual concepts emerge—often as moodboards, sketch iterations, or layout studies. The prototype phase includes mockups, interactive wireframes, or static visuals that simulate the final experience. In testing, designers collect feedback through usability tests, stakeholder reviews, or A/B comparisons, then revise accordingly.
🖼️ Graphic design artifacts—like typography, color, and spacing—become tools for solving user-defined problems, not just making things look good.

Are online assessments accurate?

Online assessments vary significantly in accuracy. Some tools focus on surface-level outputs, such as portfolio reviews or timed quizzes. Others use structured frameworks that examine behavior, like how often a designer iterates or how they respond to feedback.
Studies show that psychometrically validated tools like the Design Thinking Test Instrument (DTTI) can reliably measure design cognition when aligned with clearly defined scoring rubrics (source). However, tools that rely only on visual samples or self-reported answers often lack consistency across evaluators.

“If an assessment only asks what you’ve done—not how or why—it’s missing half the picture.”

To improve accuracy, some teams combine multiple data points: task-based challenges, behavior logs from tools like Figma, and short video walkthroughs where designers explain decisions.

Why is empathy important?

Empathy allows designers to understand the user's experience from their point of view. In early stages, this shapes the problem definition by identifying real user frustrations or unmet needs. Later, it influences layout, messaging, and interaction patterns.
Neurological research links empathy to specific brain activity in the temporoparietal junction, where mental simulation of others’ experiences occurs (source). This cognitive ability helps designers anticipate confusion points, emotional responses, or cultural misalignment before the user ever sees the design.
Empathy is also a feedback filter. Designers with stronger empathy skills are more likely to interpret user reactions constructively, rather than defensively.
🧠 It’s not about agreeing with the user—it’s about understanding why the user feels what they feel.

Next Steps for Creative Problem-Solving

As of April 16, 2025, design thinking assessments for graphic designers continue to evolve alongside how teams work—remote, hybrid, async, or distributed. The focus has shifted from polished outcomes to process transparency: how designers define ambiguous problems, how often they revise based on input, and how clearly they communicate their decisions.
Ongoing refinement, not static evaluation, is what separates surface-level creatives from problem-solvers. Collaborative tools that log iteration history, async feedback, and real-time ideation help surface this thinking. Platforms like Contra simplify this by removing commission structures that interrupt or delay collaboration. Designers and clients can engage directly, test ideas faster, and document progress without artificial constraints.
“The most useful design file might be the messy one with 14 versions, 3 dead ends, and 1 solution that actually works.” 🧠
Open communication—between clients, users, and designers—remains the most consistent marker of effective problem-solving. Not just in meetings, but in notes, revisions, and version history. The ability to work out loud, even in small ways, allows others to understand, reflect, and contribute. That’s where better design decisions happen—quietly, iteratively, and often mid-comment.

Posted Apr 17, 2025

Design thinking assessment helps identify graphic designers who solve real problems through empathy, iteration, and user-centered decisions.
