Deliverables. Designers need something better to aim for.
We’re stuck in a myth that project-led approaches, with their heavy upfront planning and rigid milestone structures, are the only practical way to make sure teams are building the right software. This mindset leads teams to race ahead towards the perfect plan and max out prototypes without real-world input, often wasting substantial resources—like piling up fully accessible Figma components that never get anywhere close to a production environment.
What I’ve learned over a decade of leading design teams is that no consultant will point out that this approach itself might be the flaw. Projects are typically laser-focused on hitting deadlines and budgets, but this often just means spending money and time while users wait on the sidelines.
I’ve spent a lifetime designing for money, and I can tell you that often, hours before a deadline, you might find me obsessing over a custom SVG icon that may not even be necessary. This isn’t just about being thorough; it’s a symptom of a deeper problem. That problem can’t be solved by a design thinking workshop; it takes relentless dedication to a mature approach to design.
UX vision is stale by the time it's presented in PowerPoint.
After that, if they're not horribly disfigured by the manufacturing process, strategic plans wind up in the creative recycling bin when reality pushes back. Product teams shouldn't have to dig through my Figma prototypes for current design documentation; they're starved for new ways to view their products and understand how users interact with them beyond the interface.
I spent most of 2016 experimenting with product design techniques while trying to align myself to the cadences of two software engineering teams running Scrum. At the time, I succumbed to the misconception that Lean UX was all about squeezing as much waste as possible out of product design budgets—an almost Brutalist focus on speed over quality and user needs.
Then, in the summer of 2018, while working with Agile teams for the Department of Defense, I discovered that Lean UX enhances Agile by enshrining a feedback loop and rigorous prioritization directly in the design process, establishing a continuous pipeline of user behavior and measurable business insight.
By December 2021, I'd worked closely with enough product and design teams in a humble, Midwestern-headquartered global technology consulting firm to become the most annoying guy in the office—The Lean UX Guy. I helped our team establish and refine foundational content and activities, convincing our design department leadership that we were ready to begin engaging Agile teams with the training.
My job was to write the facilitation script and direct the development of key workshop activities. My team and I were a little late to the Lean UX practitioner party—experimenting directly with methods from Gothelf & Seiden's book five years after it was first published. Still, we were able to extend core concepts across activities in meticulously designed whiteboards—a set of remote-first Lean UX coaching activities. I facilitated the first of these workshops and trained another facilitator for the second. It's important to note that, while I would like to praise each of the amazing people who made these two projects possible, nothing you're about to read in this case study reveals proprietary client information or compromises confidentiality agreements.
Lean UX is an iterative design methodology focused on finding the simplest way to test the riskiest and most ambitious ideas first.
In early 2022, our DesignOps team embarked on two significant projects using our refined Lean UX transformation methodology. The first project was with a major public utility company looking to improve their Asset Management Platform & Services (AMPS). The second project focused on enhancing a Market Sales Program for a leading insurance company. Both projects aimed to leverage human-centered design to address complex operational challenges within these diverse sectors. The participants in these workshops were product teams, subject matter experts (SMEs), and end-users of their current products.
We conducted a Mad Lib style exercise with key participants to generate and refine a Product Problem Statement. This involved answering questions about the product's domain, focus, observable gaps, and desired outcomes, which were then discussed and refined for clarity.
For ideation workshops, this is where we like to start—defining the problem—taking great care to ensure everyone in the workshop understands and agrees with the statement.
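To make the shape of that exercise concrete, here's a minimal sketch of what a Mad Lib-style Product Problem Statement template can look like, expressed in Python. The blank names and sample values are illustrative assumptions on my part, not the exact wording from our workshops.

```python
# A minimal sketch of a Mad Lib-style Product Problem Statement template.
# Blank names and sample values are illustrative, not the workshop's exact wording.
PROBLEM_STATEMENT = (
    "The current state of {domain} has focused primarily on {current_focus}. "
    "What existing efforts fail to address is {observable_gap}. "
    "Our product will close this gap by {desired_outcome}."
)

def build_problem_statement(domain: str, current_focus: str,
                            observable_gap: str, desired_outcome: str) -> str:
    """Fill in the blanks; the group then discusses and refines the result."""
    return PROBLEM_STATEMENT.format(
        domain=domain,
        current_focus=current_focus,
        observable_gap=observable_gap,
        desired_outcome=desired_outcome,
    )

if __name__ == "__main__":
    print(build_problem_statement(
        domain="field asset management",
        current_focus="logging work orders after the fact",
        observable_gap="a live view of asset health for crews",
        desired_outcome="surfacing asset risk before a truck is dispatched",
    ))
```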
Believe it or not, everyone heading into a design workshop already has a really good idea of how they would solve the problem, given unlimited resources and time.
In the weeks leading up to our workshops, participants were surveyed about the problem to reveal key assumptions they held.
How do I say we did "statistical analysis," but in a fun way? Prior to this activity, my team organized survey responses into thematic clusters and plotted them on matrices to highlight the assumptions that were most important and urgent to test first.
Participants were guided to introduce, affinity map, and prioritize their assumptions based on their perceived importance and the level of risk associated with them. High-priority assumptions to test (those in the upper-right quadrant) were the ones that would be either really good if correct or really bad if incorrect.
This approach helps ensure that the most important and urgent assumptions are addressed early, reducing overall risk and increasing the likelihood of project success.
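For readers who want the mechanics, here is a rough sketch of the sorting we did behind the scenes, assuming a simple 1-to-5 scoring of importance and risk; the scores, threshold, and example assumptions are invented for illustration, not our actual data or tooling.

```python
# A rough sketch of importance/risk prioritization; scores and examples are invented.
from dataclasses import dataclass

@dataclass
class Assumption:
    statement: str
    importance: int  # 1 (nice to know) to 5 (critical to the product's success)
    risk: int        # 1 (well understood) to 5 (we're mostly guessing)

def upper_right_quadrant(assumptions: list[Assumption], threshold: int = 4) -> list[Assumption]:
    """Return high-importance, high-risk assumptions, most urgent first."""
    quadrant = [a for a in assumptions if a.importance >= threshold and a.risk >= threshold]
    return sorted(quadrant, key=lambda a: a.importance + a.risk, reverse=True)

backlog = [
    Assumption("Crews will trust an automated asset-health score", importance=5, risk=5),
    Assumption("Users prefer a dashboard over an email digest", importance=3, risk=2),
    Assumption("Sales managers will act on weekly coaching prompts", importance=5, risk=4),
]

for a in upper_right_quadrant(backlog):
    print(f"Test first: {a.statement}")
```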
In each workshop, participants were divided into two groups for what the DM in me wants to call "Character Creation." Each group developed a proto-persona and an empathy map to represent a group of end-users. Creating characters for whom empathy can exist enabled participants to consider users other than themselves during subsequent design decisions.
This exercise also allowed any end-users of the product in the workshop to provide additional context about their needs and behaviors.
Task analysis. A day in the life. A journey map can have a variety of scopes. Each group was instructed to map every step of a key task in their proto-persona's journey with the group's product, identifying key interactions and pain points along the way. This activity enabled each group to benefit from candid conversations between subject-matter experts and end-users as they role-played each task and critiqued the status quo.
This activity makes critical moments in the user's journey visceral, providing insights into where users experience the most friction and where improvements would be most impactful. By prioritizing role-play in these moments, the teams quickly establish focus on areas where design change can make the biggest impact.
We took great care to facilitate journey mapping in a business context, which often needs to keep track of layers of automation and indirect actions triggered through direct interaction with the product. Our participants were guided to identify the key moments leading to critical business outcomes and focus design there first.
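As a loose illustration of those layers, here is how a single step of a business-context journey map might be modeled; the field names and the sample step are my own assumptions, not workshop artifacts.

```python
# A loose model of one journey-map step in a business context.
# Field names and the example step are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class JourneyStep:
    moment: str                 # what the persona is doing
    direct_interaction: str     # what they touch in the product
    automated_actions: list[str] = field(default_factory=list)  # what fires behind the scenes
    pain_points: list[str] = field(default_factory=list)
    business_outcome: str = ""  # the outcome this moment influences

journey = [
    JourneyStep(
        moment="Technician reviews the morning work queue",
        direct_interaction="Opens the assigned work-orders list",
        automated_actions=["Route optimization recalculates", "Parts availability check runs"],
        pain_points=["No indication of asset condition before arriving on site"],
        business_outcome="Repeat truck rolls",
    ),
]

# Focus design effort on moments that both hurt users and move a business outcome.
focus = [step for step in journey if step.pain_points and step.business_outcome]
```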
This was the fun part—solving the problem. Both workshop groups were guided to brainstorm a number of solutions they believed would be necessary to change outcomes during the moments they identified in their Journey Map.
Each solution was carefully aligned to business objectives and user outcomes during those critical moments. This alignment ensured that their designs would deliver value during key interactions and support organizational goals.
Below is a framework for creating consistent design hypotheses, similar to the one introduced to our workshop participants. Each design experiment involves clearly defining a hypothesis and setting the stage for testing and validation.
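A sketch of that framework follows, in the spirit of the hypothesis template popularized by Gothelf & Seiden's Lean UX; the structure is faithful to that style, but the field names and sample values are illustrative assumptions rather than the exact worksheet we used.

```python
# A sketch of a Lean UX-style design hypothesis, in the spirit of Gothelf & Seiden's template.
# Field names and sample values are illustrative, not our exact worksheet.
from dataclasses import dataclass

@dataclass
class Hypothesis:
    outcome: str   # the measurable business outcome we expect
    persona: str   # the proto-persona we believe will be affected
    benefit: str   # the user benefit that should drive the outcome
    feature: str   # the feature or change we will test
    signal: str    # the evidence that tells us we were right

    def __str__(self) -> str:
        return (
            f"We believe {self.outcome} will be achieved "
            f"if {self.persona} attain {self.benefit} "
            f"with {self.feature}. "
            f"We will know this is true when we see {self.signal}."
        )

print(Hypothesis(
    outcome="a measurable drop in repeat service visits",
    persona="field technicians",
    benefit="a clear picture of asset health before dispatch",
    feature="an asset-risk panel on the work-order screen",
    signal="fewer follow-up work orders within 30 days",
))
```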
After each breakout group generated ten hypotheses for their persona, room coaches led a guided review with both groups to socialize their assumptions and test methods for a broader understanding amongst all workshop participants. Each hypothesis was evaluated based on its risk and perceived value to users.
Through another rigorous prioritization activity, the hypotheses from both breakout groups were refined further into a list of the top ten, the most important and urgent for the groups to test. Each group then selected a single hypothesis to bring into the final day's activities.
With a single hypothesis selected by each group, participants joined a Design Studio, where breakout-room coaches facilitated collaborative sketching sessions. The activity was designed to harness the creativity and expertise of all participants, including non-design roles on product teams, subject-matter experts, and end-users alike.
Following a warm-up exercise, participants selected the one idea they were most interested in and spent more time articulating that concept. The resulting Solution Sketches are detailed sketches or diagrams that illustrate how a concept works. These sketches are often a new idea, a combination of ideas, or include elements from other participants' ideas. The aim was to put the hypothesis to the test using existing tools.
Below, you'll find the three top hypotheses from each workshop along with examples of a rapid design prototype I created to help carry out a test plan for each.
In the case of the Public Utility, their already cross-disciplinary Agile team treated the workshop as a "sprint" aimed at testing specific UI enhancements they referred to as "design spikes," which could be executed within the next sprint iteration. These teams found it helpful to scope their hypotheses in terms of vertical slices: end-to-end pieces of functionality that add measurable value to the user experience. I then aided their team in creating prototypes using their open-source design system, along with my professional experience in data visualization and semantic search solutions. By the end of their next iteration, the teams were scheduled to demo their solutions to panels of product end-users and gather feedback on their designs.
During the Sales Performance workshop, both teams focused on testing their assumptions using existing tools. Each team concentrated on defining a 'vertical slice' of value derived from various data sources. Because we had access, the decision was made to prototype using a leading data visualization tool, Tableau. However, we faced challenges due to a lack of required data sources and the formulas necessary to develop a fully automated reporting solution. Employing a "Wizard of Oz" technique for the test plan, I guided our design team to conceptualize several "Insight Widget" prototypes, which were then built manually in Tableau to test the effectiveness of performance insights, coaching strategies, and forecasts of the impact of behavioral changes.
The executive sponsors of the Sales Performance workshop expected the participants to deliver UI designs that would guide the development of a project budget. However, the workshop teams focused on using existing tools to test their hypotheses and busted critical assumptions about the nature of the product and the technical architecture required for its success. Their lean approach enabled them to test their hypotheses without a budget and ensure that their solutions were both practical and aligned with user needs.
In contrast, the team in the Public Utility workshop approached their tasks with a sprint mindset, treating the workshop as an opportunity to implement "design spikes" or focused design improvements. They concentrated on enhancing the usability of their existing product by leveraging an open-source design system to define specific UI enhancements that could be executed within their next sprint iteration, allowing for quick, iterative testing and implementation.
The hypothesis-driven approach demonstrated that even without extensive budgets, meaningful and user-centered product development is possible. By busting critical assumptions, teams can create solutions that truly meet user needs and deliver real value. This methodology encourages collaboration, continuous improvement, and a relentless focus on the user, ensuring that the right product is built, not just built right.