Where we started

We began by defining the complete user flow for the ATW scheduling feature. During this phase, it became clear that the number of variables and the technical limitations of the ATW appliance introduced a level of complexity that could easily overwhelm users.
Instead of committing to a single UX direction too early, we proposed exploring two different approaches and validating them through a structured A/B test with real users.

The process - Phase 1

Defining two UX approaches
We designed Version A and Version B with different structural logics and interaction patterns for managing ATW programs. One focused on step-by-step guidance; the other prioritized fast access and flexibility.

The process - Phase 2

Prototyping two experiences
Both versions were fully prototyped in Figma and prepared for testing. Once the prototypes were completed, they were connected to Maze and structured with a series of guided tasks. Each task followed a clear, progressive flow to test specific interactions and decision points — from setting up a weekday routine to associating thermostats and managing exceptions.

The process - Phase 3

Launching the A/B test via Maze
We recruited 100 users and randomly split them into two equal groups. Each received an email with instructions and a unique test link. All tests were run asynchronously, and data was collected automatically.
Tasks were focused on real-life scenarios, such as:

  • Creating a weekday/weekend routine

  • Copying time slots

  • Assigning thermostats


The process - Phase 4

Analyzing results and mapping insights
We reviewed quantitative data (success rate, time on screen, misclicks, drop-offs, usability score) and qualitative inputs from open-ended questions and screen recordings.
We then clustered user feedback on Miro, identifying clear pain points and opportunities across both versions.

The process - Phase 5

Synthesizing the final solution
The final solution combined the strongest elements from both prototypes, refined through real user insights:

  • Thermostat selection repositioned
    Moved to the top of the screen, reflecting users’ natural mental model: first where, then how.

  • Copy & paste functionality added
    From Version B — enabled quick duplication of daily settings across the week.

  • Microcopy revised
    Simplified and clarified labels and instructions based on user confusion.

  • New guided flow introduced
    A step-by-step scheduling assistant for less tech-savvy users:
    “What time do you wake up?” → “What temperature do you want when you’re home?”
    This generates a complete schedule without requiring manual input.

What changed

  • 🧠 Improved mental model: users first think “where”, then “how” — now reflected in the flow

  • 💡 Added flexibility with guided or manual scheduling options

  • 💬 More intuitive UI copy, reducing friction and abandonment

  • 🏆 Designed based on real behavior, not assumptions

What I learned

Designing something from scratch is already a challenge. But designing something complex and invisible — like a heating program — requires empathy, structure, and validation.
This project reminded me that asking users the right questions can unlock clarity, and that A/B testing isn’t just about choosing a winner — it’s about building confidence in what we deliver.

Let’s shape great digital products together — open to new opportunities, wherever they may be.