A banner image containing design elements from the project.

Supporting a health agency in monitoring 300+ nursing facilities through data viz

Updated April 2024

Our client was a state health agency that came to us with two asks:

  1. Create a public-facing tool so residents could easily search and compare nursing homes
  2. Create an internal tool that a team would use to regularly monitor nursing homes for quality and make interventions, if needed

This project was interesting because I acted as UX designer, data analyst, and data engineer all at once. The tools we built would later be used by 5,000+ people. I learned how to user test, rapidly iterate, and design with empathy for the user.

I worked on a team with one other designer “Maisy” and one Tableau developer “Daisy.” Maisy and Daisy worked on the public tool, while Daisy and I worked on the internal tool. I owned designs and conducted testing and research.

Conducting user research

To begin, we scanned through state legislation to scope out the project. A bill that had been passed outlined the requirements for the public-facing dashboard.

Early on, we stressed the importance of speaking with the users to identify their pain points. We expanded our interview group to include personas we hadn’t initially considered, such as advocates and public health officials.

“What is something you want to do in one year that we can help you do today?”

We documented all our interviews and consolidated them in a Miro board. Based on this feedback, I interpreted the problem statement as "We want a way to see how nursing homes across the state are trending, identify outliers, and drill down into underperforming nursing homes."

I roughed out the average user journey through the tool. To keep it simple enough for a layperson, the idea had to fit on the back of a napkin.

A picture of the back of a napkin showing the user flow diagram starting at the state-wide overview page navigating all the way to facility intervention.

Identifying the information architecture

Was the internal tool going to be structured like DoorDash? Or like a news website? To identify the information architecture, I drew out the average user journey and indexed on the call-to-action, the "So what?", of every page.

Once I drew out the user journey, I realized the client first needed a summary of how the state was doing (Overview). Then they wanted a way to compare different facilities against each other (LTC scorecard). And lastly, they wanted to drill into a particular facility for more detail.

Through this exercise, we realized there were two levels of data granularity: the state level and the facility level.

From here, I created wireframes and tested the navigation structure with our clients.

A diagram of wireframes for the internal tool.

Rapid iteration

Next, we turned to Figma to create rapid mockups based on our information architecture. The visual design of the tool changed drastically between design critiques and testing sessions.

Since this was one of my earlier projects, I learned the importance of font hierarchy and of using color sparingly. At one point, I switched to mocking up designs in black and white to focus on the message, not the UI.

A diagram of wireframes for the internal tool.

User testing

Throughout the above processes, we conducted several rounds of user testing. There were two takeaways that I now apply to all projects.

First, there is one group of stakeholders that is typically overlooked in product design:

Developers are also users. When user testing, I shared designs with developers to get their feedback on what was feasible and whether the data could support our designs. This meant we caught issues early and didn't have to redo our prototypes.

Second, prototype with real data. Despite starting testing sessions with "this is all fake data," our client would try to find insights in lorem ipsum, or question how there were suddenly 123,456 facilities in the state.

Creating a robust data pipeline

State legislation required the public tool to be updated at least quarterly. After combing through the data, we learned that nursing homes frequently change their names, which meant the tool had to be refreshed more often than that. Given the frequency of some data points, we settled on monthly updates.

I worked with Daisy and the client’s IT team to develop a data pipeline to extract, transform, and load (ETL) data to Amazon S3. I wrote scripts to consolidate files into three tables, ready for ingestion into Athena and Tableau.

A diagram of 14 data files moving through five Python scripts to become three consolidated tables stored on AWS Athena, before being read into Tableau.
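To give a rough sense of what one consolidation step looked like, here is a minimal sketch, not the actual scripts: the file names, bucket, and column names are hypothetical, but the shape (stack the monthly extracts, keep the latest record per facility, push the result to S3 for Athena) reflects the pipeline described above.

```python
# consolidate_facilities.py - hypothetical sketch of one consolidation step.
# Combines several monthly extracts into a single table and uploads it to S3,
# where Athena (and then Tableau) can read it.
from pathlib import Path

import boto3
import pandas as pd

RAW_DIR = Path("data/raw")        # assumption: local folder of extracted files
BUCKET = "example-ltc-dashboard"  # hypothetical bucket name


def build_facility_table() -> pd.DataFrame:
    """Stack the monthly facility extracts and keep the latest record per facility ID."""
    frames = [pd.read_csv(f) for f in sorted(RAW_DIR.glob("facilities_*.csv"))]
    combined = pd.concat(frames, ignore_index=True)
    # Nursing homes change their names often, so keep only the most recent
    # record for each facility ID.
    return combined.sort_values("report_month").drop_duplicates("facility_id", keep="last")


def upload(df: pd.DataFrame, key: str) -> None:
    """Write the table as CSV and upload it to the S3 bucket Athena reads from."""
    local = Path("/tmp") / Path(key).name
    df.to_csv(local, index=False)
    boto3.client("s3").upload_file(str(local), BUCKET, key)


if __name__ == "__main__":
    upload(build_facility_table(), "tables/facilities.csv")
```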

Outputs from some Python scripts were used as inputs to others, and the source files were updated at different frequencies.
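Because of those dependencies, the refresh had to run the scripts in a fixed order. The sketch below is a simplified illustration of that idea, with hypothetical script names, not the project's actual runner.

```python
# run_pipeline.py - hypothetical sketch of running the scripts in dependency order.
import subprocess

# Scripts listed in the order their outputs are needed downstream;
# the real project had five scripts, refreshed at different frequencies.
PIPELINE = [
    "clean_inspections.py",      # monthly source data
    "clean_staffing.py",         # quarterly source data
    "consolidate_facilities.py", # depends on the two cleaning steps above
]

for script in PIPELINE:
    print(f"Running {script}...")
    subprocess.run(["python", script], check=True)  # stop the refresh if any step fails
```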

Effective hand-off

Once development was close to the finish line, we set up biweekly calls with the clients to walk them through how to update the dashboards.

Just after our team wrapped up development, we asked the client to take the lead on updating the dashboards. We said we'd be there in the background if they needed support, but they were on their own. This made them self-reliant, and when the pipeline first broke, they were able to deal with it themselves.

Daisy and I also left behind about 18 pages of documentation, including an FAQ section.

Conclusion

On a closing note, this was one of my most fulfilling projects. I contributed to a tool that 5,000+ people used to search for a nursing home for their loved ones or for themselves. I learned how to user test, rapidly iterate, and design with empathy for the user.

Here are some screenshots of the internal tool that I worked on. All data is fake.

A collage of screenshots of the internal tool.