Under the Every Student Succeeds Act (ESSA), all states are required to publicly display clear and concise data on students’ academic performance, graduation rates, and other key metrics. State-level data dashboards serve as a powerful tool for this purpose, thanks to their intuitive visual design.
District and school leaders can use these dashboards to better understand which groups of students need additional support or may be missing out on key educational opportunities. Beyond educators, dashboards can increase transparency for caregivers and families about how their kids’ schools are performing.
In an ideal world, data dashboards facilitate a shared understanding and productive conversations about how to improve learning experiences for students. But in reality, dashboards aren’t always user-friendly or accessible to those seeking information. In fact, many audiences—such as parents and school staff—may find dashboards hard to navigate or may feel overwhelmed by the amount of data they offer.
So how can we make sure data dashboards are user-friendly tools that spark inquiry and action? To begin, we can create a process for improving dashboards that involves regularly collecting and incorporating user feedback.
One State’s Approach
Recently, REL Northeast & Islands partnered with the Rhode Island Department of Education (RIDE) to support the state’s work on increasing student access to and participation in early college opportunities. RIDE leaders identified their state’s postsecondary success dashboard as a key tool for helping district and school leaders better understand their data on student participation and success in dual and concurrent enrollment and Advanced Placement programs. To enhance these efforts, RIDE sought REL Northeast & Islands’ support to ensure that the dashboard was user-friendly and accessible.
To understand users’ experiences with the dashboard, RIDE and REL Northeast & Islands began facilitating a series of structured feedback sessions with school principals and assistant superintendents—two groups identified as key dashboard users.
Establishing a Structured Feedback Session
To get meaningful feedback, it is vital to ask users the right questions about their experiences using the dashboard. In our listening sessions with Rhode Island, we wanted to capture three dimensions of feedback, which correspond to Munzner’s levels of threats to visualization validity (see image below).1
“No issues. I appreciate the defined calculations that were used to reach the metrics.”
—Feedback Session Participant
We designed three short activities to uncover this information (see feedback from participants in call-out boxes).
For the first exercise, participants focused their feedback on any difficulties or questions they had while navigating the dashboard. While some participants had extensive experience using the dashboard, others had never used it before. This exercise helped new users get comfortable using the dashboard in preparation for the next two exercises and gave RIDE feedback on the dashboard’s performance across different devices.
“As a parent, I might miss the overall tab and go right to the school data tab where there is no definition of post-secondary success. A note redirecting me back to the information would be helpful.”
—Feedback Session Participant
For the second exercise, participants put themselves in the shoes of a parent, school guidance counselor, district administrator, or potential higher education partner. We asked them to identify a specific town or region and then explore the data to consider:
- What can this user learn from the dashboard?
- How does it feel to use the dashboard?
For the third exercise, we developed questions that could be answered using the dashboard. We asked participants to select one of the questions and to try to find the answer using only information from the dashboard. As they searched for the answer, we asked them to think about:
- How difficult is it to find the answer to this question?
- How confident are you in the answer?
“It was useful to learn different ways to get feedback and learn about the importance of letting users know how feedback is incorporated.”
—RIDE Team member
Enhancing the Dashboard
RIDE walked away from these sessions with several specific, actionable suggestions to consider and prioritize in its dashboard improvement process. REL Northeast & Islands then met with the RIDE data team to review the feedback and decide which suggestions to implement.
After updating the dashboard, RIDE identified ways to integrate ongoing feedback mechanisms into the dashboard design, such as adding a feedback button directly to the dashboard. RIDE also identified regular meetings and touchpoints with educators as opportunities to inform users about changes made based on their feedback.
New Data Dashboard Feedback Resource
At a past REL Northeast & Islands Governing Board meeting, we shared an overview of our work in Rhode Island. Several members echoed the challenges of creating user-friendly dashboards and were excited to learn about the process we used for gathering and incorporating feedback.
In response to this broader interest, we recently developed a resource detailing the process we used with RIDE for engaging users in dashboard improvement efforts from start to finish. This new resource presents a five-step continuous improvement process that states can reference while enhancing their own data dashboards. To learn more, visit Improving Data Dashboards: A Feedback Process Used to Engage Users in Rhode Island.
1 Munzner, T. (2014). Visualization analysis and design. A K Peters Visualization Series, CRC Press.