Dashboard
  
data visualization

turning data into decisions

Automation Reporting Dashboard

Automation Hero Image
At Act-On, marketers relied heavily on automation programs to nurture leads, drive engagement, and convert customers. But there was a critical gap: once a program launched, users had little visibility into how it was actually performing.

The legacy reporting tools were disjointed, outdated, and counterintuitive. Marketers couldn’t answer basic questions like: Are my emails driving engagement? Where are leads dropping off? Is this program helping me hit my goals?

I led the redesign of Act-On's Automation Reporting Dashboard to solve this exact problem.
PROJECT DETAILS
Role: Design Lead
Duration: July - November 2021
KEY CONTRIBUTIONS
User Interviews, Product Design, Usability Testing
PROJECT GOAL
Redesign the Automation Dashboard user interface to help customers understand the performance of their automated programs.

Results

Adoption and engagement rates for the Automation Dashboard have remained consistently high since the initial launch, and customer satisfaction has improved significantly. A few outstanding customer requests still need to be addressed.

60K
Total number of automated programs
20K
Automated programs currently running
73%
Accounts with at least one Automated Program
+15%
Increase in adoption rate over 8 months

80%+ of marketers struggle with building their programs and understanding their performance

AP chart
We kept hearing the same thing:
“I have all this data, but I don’t know what it means.”
“The numbers are there, but I don’t know what to do next.”
“I don’t know how well my automated program is performing.”
I was brought in to lead the redesign of the Automation Reporting Dashboard, with one goal:


Make complex performance data intuitive, actionable, and aligned with what marketers care about most - RESULTS.

“The Automation dashboard is not super useful for me. If I want to look at the performance of an AP currently, I would go to the campaigns actually. I usually pull up the individual stats and review them. And then I kind of compare the performance week over week... So a lot of my reporting is actually outside of Automation.”

-Emily, marketer at Phionline

Almost every user faces challenges when using the legacy dashboard

AP legacy screens

200+

User complaints and escalations about difficulties using Automated Programs

38+

Hours spent each week explaining and troubleshooting Automated Programs for customers

My Role

As the lead product designer, I owned the end-to-end design process:
- Defined product goals in collaboration with PMs and engineers
- Conducted user interviews and an internal design workshop
- Audited legacy UX and mapped current workflows
- Delivered interaction models, wireframes, high-fidelity UI, and dev specs
- Conducted usability testing and refined designs
- Partnered with engineering and QA for launch
AP Research Image

Pain point #1

Users struggle with getting started with Automation

The majority of our customers continue to use the Classic Editor instead of adopting the new Automated Journey Builder (AJB). The primary reasons for their reluctance to switch to the new UI include the need for extensive planning and concerns about potential errors or interruptions.

Pain point #2

Users have a hard time understanding how their programs are performing

The current interface is counterintuitive for navigating program steps and accessing specific statistics. Many users have expressed distrust in the accuracy of the dashboard data and suggested that comparing the effectiveness of emails would be more helpful.

Pain point #3

Automation is a misnomer

In theory, automation should run seamlessly on its own. In reality, customers still rely heavily on our Customer Success and Tech Support teams for help building and understanding automated programs. Reaching the information they need takes numerous steps, leaving users mentally fatigued after each setup.

The discovery - insights buried in chaos

Through stakeholder interviews and usability reviews, I uncovered three major issues:

Lack of context and outdated UI undermined trust in data

The legacy dashboard left users feeling confused and skeptical. Key metrics were presented without explanation or visual clarity, and the outdated, cluttered interface led many users to disregard the data altogether.

No sense of outcomes

Users struggled to understand whether their automation programs were successful. The dashboard surfaced raw metrics, but lacked clear indicators of impact, trends over time, or detailed reporting on individual email performance.

Hard to troubleshoot

Users were unaware that programs needed to be paused or stopped before making changes, leading to confusion and unintended errors. When other issues appeared, they lacked clear guidance on how to troubleshoot or resolve them—leaving many stuck or reliant on support.

Defining the vision

We chose Automated Programs as the initial interface to revamp because of its pressing usability issues and the diverse use cases it supports. In collaboration with the PM, we determined what was in scope and out of scope for the MVP.
AP Information Architecture Image

The Solution: Turning complex data into confident decisions

Data
Analyze the success of automated lead nurturing programs.
Contact Lists
Quickly locate contacts in each program step.
Performance
Track the success of email and SMS campaigns with an easy-to-use reporting dashboard, and make data-driven decisions for better results.
Steps
Tie the visual language of the Automated Journey Builder (AJB) into a digestible view, and guide users to the AJB when they’re ready to edit their program.
Micro-interactions
Subtle cues that guide users to pause or start programs and to fix program errors.

Prototype


Test

After designing functional prototypes, we wanted to learn about a few key areas in our high-fidelity designs, including the discoverability of the Edit Program and Add New Contacts actions.

In addition, we sought to understand user preferences for searching email message reports within the program dashboard and how easily users could understand the new design patterns introduced in it.

Discoverability of the Edit Program

Due to project constraints, users were initially required to pause a program before making any changes. To assess the clarity of this design, we conducted usability testing with 5 users. In the first round, only 1 out of 5 participants successfully paused the program in Version A. As a result, we made adjustments for the second round, removing the overflow menu and surfacing the "Pause Program" button. This change significantly improved user comprehension.
Version B 1
In Version A, we nested all program actions in a meatball menu associated with the Run/Pause action in the dashboard header.
For Version B, we surfaced some of these actions in the dashboard interface.

Add New Contacts Actions

We wanted to notify users when new contacts were waiting to be manually added to the program on a schedule. In the initial round, some users were confused, mistaking the "Contacts added on a schedule" text for a clickable element. To address this, in Version B we added a clarifying tooltip with additional information. This resolved the issue of the element looking clickable while still associating the action with the alert.
Usability testing image
Version B 2
In Version A, 2 out of 5 users thought the Add New Contacts icon was clickable rather than looking elsewhere for an action.
In Version B, we made the informational blurb appear less actionable, and then placed it beside the action menu that allows users to manually add contacts.

Data Cards

The data cards were easy to understand overall in both versions. Preferences were almost evenly divided, with two users highlighting a potential redundancy in the information presented in Version B. Based on those insights, we concluded that Version A was the way to go.
Version B 3
Usability testing image
In Version A, we showed only the number of contacts in each stage and didn’t show the total contacts for each data card.
In Version B, we added the total contacts count for each data card.

Email Performance

We faced a decision regarding the design of the "All Emails" data: whether to use a table format or data cards. In Version A, there was a misconception that the "All Emails" table served as a summary of the "Individual Emails" table. To address this, we redesigned the "All Emails" section to establish a clearer distinction and reduce the perceived connection to "Individual Emails".
Email performance Version A
Email performance Version B
In Version A, we designed the email summary section as a table and placed percentages and unique values on the same row.
In Version B, we redesigned the email summary so that it would feel less tied into the table of individual emails. We also indicated that more information was available by adding a dotted underline and further details in a tooltip.

The outcome: Data that drives action

The redesigned dashboard shifted reporting from static to strategic:
- Reduced support tickets from confused marketers
- Improved program optimization rates, as users could finally pinpoint friction
live product gif

RETROSPECTIVE

By listening to our users and making changes that provided real value to them, we drastically improved the UX and UI of one of the product's core functions. I grew as a designer, and our team walked away with a few takeaways:
Internal stakeholder feedback can be super valuable too. Listening to external user feedback helps you narrow in on the nuances, but internal stakeholders like the Technical Support and Customer Success teams often understand users' struggles from a more holistic view.
Involve the engineering team by asking them to be part of usability testing. We had regular check-in meetings with the engineering team even while we were still in the research phase. We also invited some engineers to do usability testing tasks and asked them to provide feedback. This helped make our design handoff process relatively smooth.