Audit reporting with Infrastructure as Code 

PROBLEM STATEMENT

Puppet's infrastructure-as-code platform eased the pain of configuration management for sysadmins. However, they still had to deal with the time-consuming task of generating security and regulatory audit reports. Since Puppet manages the systems being audited, we wanted to find a way for our tools to help.

ROLE

As the UX Researcher, I partnered with a UX Designer throughout all phases of this project. I took the lead during discovery research, problem definition and usability testing. The research and recommendations I provided at each step kept the project moving forward. 

DISCOVERY

We wanted to understand our users’ current auditing workflows, how Puppet fit in, and how we could fill the gaps. The goal of this research was to define the potential problems that Puppet could solve with new workflows. 

I conducted stakeholder interviews with subject matter experts within Puppet to formulate hypotheses about what we would learn. The information from this internal research informed the participant recruiting criteria, as well as the topics for the discussion guide. 

The discussion guide and the observer note-taking guide focused on workflow steps and artifacts so that we could look for similarities among participants. 

Research plan, discussion guide and note-taking guide for discovery interviews.

USER INTERVIEWS

I interviewed 8 Puppet Enterprise users who regularly do audit reporting, across a range of industries that covered the most common types of security and regulatory policies. The users shared the workflows and custom reporting tools they had used during a recent audit, so our conversations were grounded in real situations rather than hypothetical ones.

All interviews were conducted via video conference, and whenever possible, participants shared their screens to show us the actual artifacts they used. As facilitator, I focused on the reasons they had created particular solutions, so that we could reimagine those solutions given our ability to change how Puppet works under the hood.

DATA ANALYSIS WORKSHOP

I analyzed data from the discovery interviews and facilitated a cross-team workshop to visualize the workflows as a mental model. I worked with the UX designer, product owner and technical subject matter experts to group the common steps users took when handling audits. For each group we created a generic label, noted pain points and discussed opportunities for Puppet to address those issues.

Mental model process from collaborative data analysis work sessions.

MENTAL MODEL

I continued to iterate on the mental model in mural.ly (an online whiteboard tool), which let me show the workflows at a high level while also providing the details of each user’s particular scenario. The mental model also showed how Puppet is currently used in some steps of the workflows, either to report on or to actively manage the systems being audited.

Key Research Insight

Users want to use Puppet to report on the changes they’ve made for their security and regulatory policies, but they can’t differentiate those changes from the other types of changes that Puppet reports.

This is a problem that Puppet is best suited to solve.

Mental model of managing and reporting for audits, highlighting Puppet’s role and pain points.

EXECUTIVE REVIEW

I used the mental model to present findings and recommendations to executives and other leaders in the company. The executive team agreed with the analysis and funded a project to solve one of the main pain points we identified. 

Although the short discovery phase was independent of the engineering team’s day-to-day work, we quickly integrated into their agile workflow once we got executive approval.

DEFINITION

After identifying the problem to solve, we needed to talk to more users, this time with a narrower focus. The goal of this research was to understand and document the current experience around one particular issue: how users discover and resolve problems associated with unexpected change events on their systems.

USER INTERVIEWS

I interviewed 6 Puppet Enterprise users who are responsible for making changes in their infrastructure. I asked them to walk me through recent scenarios where they had unexpected changes in their systems, from how they first found out about the change to its final resolution.

EXPERIENCE MAP

I synthesized the interview data into an experience map that illustrated the commonalities in a seemingly bespoke troubleshooting process. Although there were pain points throughout the experience, the “assess” phase brought the most stress and uncertainty.

Experience map illustrating how users monitor, assess, investigate and resolve unexpected change.

Key Research Insight

Change is unexpected if it can’t be correlated with a change window, a code deploy or a ticket.

The participants described how they manually determined whether a change was unexpected. This gave us the rubric we needed to automatically categorize changes as ‘expected’ or ‘unexpected’ and give our users confidence in Puppet’s assessment.
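
As a rough illustration of that rubric, here is a minimal sketch in Python. The data sources and names are hypothetical stand-ins (a real implementation would pull change windows from a maintenance calendar, deploy times from CI/CD tooling and ticket references from a ticketing system), and it is a simplification rather than how Puppet Enterprise actually implements the categorization:

    from dataclasses import dataclass
    from datetime import datetime

    # Hypothetical stand-ins for the real correlation sources; none of
    # these names are Puppet APIs.
    CHANGE_WINDOWS = [(datetime(2017, 9, 1, 2, 0), datetime(2017, 9, 1, 4, 0))]
    DEPLOY_TIMES = [datetime(2017, 9, 1, 2, 15)]
    TICKETED_RESOURCES = {"Service[nginx]"}

    @dataclass
    class ChangeEvent:
        resource: str       # e.g. "File[/etc/ssh/sshd_config]"
        timestamp: datetime

    def is_expected(event: ChangeEvent) -> bool:
        """The interview rubric: a change is expected only if it correlates
        with a change window, a code deploy or a ticket."""
        in_window = any(start <= event.timestamp <= end
                        for start, end in CHANGE_WINDOWS)
        near_deploy = any(abs((event.timestamp - deploy).total_seconds()) <= 3600
                          for deploy in DEPLOY_TIMES)
        ticketed = event.resource in TICKETED_RESOURCES
        return in_window or near_deploy or ticketed

    event = ChangeEvent("File[/etc/ssh/sshd_config]", datetime(2017, 9, 3, 14, 7))
    print("expected" if is_expected(event) else "unexpected")  # -> unexpected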

The team working on the new feature, including engineers and tech writers, referenced the experience map often to make decisions about implementation.

One phase in the experience map showing the goal, tools, steps and user pain points.

USABILITY TESTING

My UX design partner created wireframes to surface the new status in the Puppet Enterprise dashboard. Using a clickable prototype of the new design, I conducted usability test sessions with 5 Puppet Enterprise users, including one participant from the discovery research. All sessions were remote.

SCENARIO Let’s say that you check the Puppet Enterprise console every morning to review Puppet activity, look for failures, and generally make sure everything is working as expected. You haven’t deployed any new Puppet code in the past week because most of your team has been on vacation.

TASK 1 Log in to the Puppet Enterprise console and check on the status of things. How would you describe the state of the infrastructure to your colleague?

TASK 2 Investigate the “corrective change” any way that makes sense to you.

Original statuses showing “successful changes” (left) vs. the proposed statuses, which split that type into “corrective changes” and “intentional changes” (right).
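
To make the proposed split concrete, here is a simplified sketch of the idea behind it, assuming a hypothetical event structure (the field names are illustrative, not Puppet Enterprise’s actual report format): a successful change is intentional when the desired state itself changed because of new Puppet code, and corrective when Puppet remediated drift from an unchanged desired state.

    from dataclasses import dataclass

    @dataclass
    class ResourceEvent:
        resource: str
        new_value: str          # the value this Puppet run enforced
        previous_desired: str   # the desired value from the last applied catalog

    def change_type(event: ResourceEvent) -> str:
        """Illustrative split of a 'successful change' into the two
        proposed statuses; a simplification, not PE's implementation."""
        if event.new_value != event.previous_desired:
            return "intentional change"  # driven by updated Puppet code
        return "corrective change"       # Puppet remediating outside drift

    # A service that drifted (someone stopped it by hand) and was restarted
    # by Puppet shows up as a corrective change:
    drift = ResourceEvent("Service[ntp]", new_value="running",
                          previous_desired="running")
    print(change_type(drift))  # -> corrective change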

RESEARCH SUMMARY

We were thrilled to hear each participant correctly describe the meaning of the new statuses, as well as the reasons they were glad to have this differentiation.

USER QUOTE “The corrective changes? That means that something else is changing something on our machines and Puppet’s probably correcting it or needing to reboot services or these are runs that are not in the desired state of a puppet catalog so it’s making some kind of corrective change.”

We also iterated on the design between sessions to address minor usability issues as they came up. By the time we met with the last participant, we had resolved most of those issues and were able to ship a version very similar to the final prototype.

Final version of the change reporting updates in the Puppet Enterprise console. 

OUTCOMES

This feature was one of three major product announcements at our annual user conference, PuppetConf. I worked with the UX designer to create a poster that showed part of the design process as well as the value of the new feature. The ‘before’ section repurposed the content of the experience map to show the experience of investigating change manually. The ‘after’ section, an illustration by the UX designer, showed the new feature as part of the ideal experience as designed.

Attendees viewed the poster during the opening reception and during breaks between talks. It prompted conversations among attendees, and several added comments and ‘+1’s via sticky notes.

Poster illustrating the improved process of investigating change. Printed at approximately 8’ x 4’ for display at PuppetConf.
PuppetConf attendees interacting with poster and usability testing station.

In summary, we shipped a major change to Puppet Enterprise reporting that headlined PuppetConf, attracted new users, and provided significant value to all our users with auditing and security reporting requirements.