ACT-ON AUTOMATION

[Image: Automation hero]
Automation is a fundamental feature of the Act-On marketing automation platform. It empowers marketers to deliver tailored, personalized messages to their contacts automatically.

However, this crucial part of the product had remained largely unchanged since its creation. This project centers on the redesign of the Automation dashboard, with the primary objectives of mitigating customer frustrations and streamlining the transition process.
PROJECT DETAILS
Role: Design Lead
Duration: July – November 2021
KEY CONTRIBUTIONS
User Interviews, Product Design, Usability Testing
PROJECT GOAL
Redesign the Automation Dashboard user interface to help customers understand the performance of their automated programs.

Results

Adoption and engagement rates for the Automation Dashboard have remained consistently high since its initial launch, and we've seen a significant improvement in customer satisfaction. However, a few outstanding customer requests still require attention and resolution.

60K
Total number of automated programs
20K
Automated programs currently running
73%
Accounts with at least one Automated Program
+15%
Increase in adoption rate over 8 months

Almost every user faces challenges when using the legacy dashboard

[Image: Legacy Automated Programs screens]

200+
User complaints & escalations about the struggles of using Automated Programs

38+
Hours spent weekly explaining and troubleshooting Automated Programs for customers

80%+ of marketers struggle to build their programs and understand their performance

[Image: AP chart]

“The Automation dashboard is not super useful for me. If I want to look at the performance of an AP currently, I would go to the campaigns actually. I usually pull up the individual stats and review them. And then I kind of compare the performance week over week... So a lot of my reporting is actually outside of Automation.”

– Emily, marketer at Phionline

Design process

I partnered with the PM and the engineering team throughout the design phase to generate ideas and gather feedback. We talked to both internal stakeholders and end users to understand the challenges they experience when using the dashboard. Through a design workshop, I identified the main pain points to address in the redesign.

Among these, users struggle most with getting started with automations and with understanding how their programs are performing.
[Image: AP research]

Pain point #1

Users struggle with getting started with Automation

The majority of our customers continue to use the Classic Editor instead of adopting the new Automated Journey Builder (AJB). The primary reasons for their reluctance to switch to the new UI include the need for extensive planning and concerns about potential errors or interruptions.

Pain point #2

Users have a hard time understanding how their programs are performing

The current interface is counter-intuitive for navigating program steps and accessing specific statistics. Many users have expressed distrust in the accuracy of the dashboard data and have suggested that comparing the effectiveness of emails would be more helpful.

Pain point #3

Automation is a misnomer

Theoretically, Automation should operate seamlessly on its own. However, in reality, customers still heavily rely on our Customer Success and Tech Support teams for guidance in building and comprehending automated programs. The process involves numerous steps to access desired information, leaving users feeling mentally fatigued after each setup.

Addressing the pain points

To address these pain points, I facilitated another design workshop with multiple stakeholders to brainstorm product opportunities for Automation. I formulated the brainstorming question as:
How might we help customers measure and understand the impact of automations?

Defining the vision

We chose Automated Programs as the first interface to revamp due to its pressing usability issues and the diverse use cases it accommodates. In collaboration with the PM, we determined what was in scope and out of scope for the MVP.
[Image: AP information architecture]

Prototype

Data: Analyze the success of automated lead nurturing programs.
Contact Lists: Quickly locate contacts in each program step.
Performance: Track the success of email and SMS campaigns with an easy-to-use reporting dashboard, and make data-driven decisions for better results.
Steps: Tie the visual language of the Automated Journey Builder (AJB) into a digestible view, and guide users to the AJB when they’re ready to edit their program.
Micro-interactions: Guide users to pause, start, and fix program errors.

Test

After designing functional prototypes, we wanted to learn about a few key areas in our high-fidelity designs, including the discoverability of the Edit Program and Add New Contacts actions.

In addition, we sought to understand user preferences for searching email message reports within the program dashboard, and how easily users could understand the new design patterns introduced in the dashboard.

Discoverability of the Edit Program

Due to project constraints, users were initially required to pause a program before making any changes. To assess the clarity of this design, we conducted usability testing with 5 users. In the first round, only 1 out of 5 participants successfully paused the program in Version A. As a result, we adjusted the design for the second round by removing the overflow menu and surfacing the "Pause Program" button. This change significantly improved user comprehension.
[Image: Edit Program, Version A and Version B]
In Version A, we nested all program actions in a meatball menu associated with the Run/Pause action in the dashboard header.
For Version B, we surfaced some of these actions in the dashboard interface.

Add New Contacts Actions

We wanted to notify users that they had new contacts waiting to be manually added to the program on a schedule. In the initial round, some users were confused, mistaking the "Contacts added on a schedule" text for a clickable element. To address this, in Version B we added a clarifying tooltip with additional information. This resolved the issue of the icon looking clickable while still associating the action with the alert.
[Image: Add New Contacts, Version A and Version B]
In Version A, 2 out of 5 users thought the Add New Contacts icon was clickable rather than looking elsewhere for an action.
In Version B, we made the informational blurb appear less actionable, and then placed it beside the action menu that allows users to manually add contacts.

Data Cards

The data cards were easy to understand in both versions. Preferences were almost evenly divided, with two users highlighting a potential redundancy in the information presented in Version B. Based on those insights, we concluded that Version A was the way to go.
[Image: Data cards, Version A and Version B]
In Version A, we only showed the number of contacts in each stage, and didn’t show the total contacts for each data card.
In Version B, we added the total contacts count for each data card.

Email Performance

We faced a decision regarding the design of the "All Emails" data: whether to use a table format or data cards. In Version A, users had the misconception that the "All Emails" table served as a summary of the "Individual Emails" table. To address this, we redesigned the "All Emails" section to establish a clearer distinction and reduce the perceived connection to "Individual Emails".
[Image: Email performance, Version A]
[Image: Email performance, Version B]
In Version A, we designed the email summary section as a table and placed percentages and unique values on the same row.
In Version B, we redesigned the email summary so that it would feel less tied into the table of individual emails. We also indicated that more information was available by adding a dotted underline and further details in a tooltip.

Final product

In March 2022, we successfully launched the new Automation dashboard. Since its release, we've witnessed consistently high adoption and engagement rates. Additionally, this project has unveiled improved design patterns that hold potential for implementation in other areas of the product. Our ongoing efforts focus on validating the user interface at scale, prioritizing a seamless and enjoyable experience for marketers.
[GIF: Live product]

RETROSPECTIVE

By listening to our users and making changes that provided real value to them, we drastically improved the UX and UI of one of the core functions of the product. I grew as a designer, and our team walked away with a few takeaways:
Internal stakeholder feedback can be super valuable too. Listening to external user feedback can help you narrow in on the nuances, but internal stakeholders like the Technical Support and Customer Success teams sometimes understand users' struggles with a more holistic view.
Involve the engineering team by asking them to be part of usability testing. We had regular check-in meetings with the engineering team even while we were still in the research phase. We also invited some engineers to do the usability testing tasks and asked them to provide feedback. This helped make our design handoff process relatively smooth.