Admin Reporting

At Newsela, we had primarily focused on designing teacher and student experiences. Over time, however, we realized we were overlooking another crucial audience on the platform: school and district administrators. As a product team, we recognized that admins needed their own experience to assess improvement and usage patterns across their schools. These insights could also help them make more informed purchasing decisions for their classrooms.

Role:

Product designer

Company:

Newsela

Timeline:

2021 – 2022

Team Size:

~14 members

Phase 1: Discovery

General info on the product

The goal of Newsela is to provide meaningful classroom content for every student, specifically through current events and partner content. We offer teachers and students a library of articles with customizable Lexile levels, along with various assignments to complete.

At Newsela, the product process is divided into three phases: discovery, ideation, and definition. With the general goal set out, we moved into the discovery phase for the admin reporting project.

The discovery workshop.

Our plan was to kick off this phase with a discovery workshop that included key figures from the product, technology, and customer teams. This required those of us on the direct product team (the product manager, my fellow product designer, and me) to pre-plan the workshop, including gathering research and building the workshop format.

Our discovery workshop broke into three stages over the course of a two-day work session: defining the initiative, reviewing past research, and prototyping basic flows.

Our first stage was to formally define the initiative. We divided all workshop participants into two groups and tasked each with writing its own definition of the business goal for the admin reporting experience. We then discussed the two proposed goals and merged the components that resonated with everyone into a single, accurate statement of the goal. Our jointly conceived business goal is outlined below:

Our business goal

Empower admins to make informed decisions with insights on performance, success, and areas of growth.

Administrators need a way to see real-time classroom activity, longer-term trends, and actionable insights in order to plan group-wide instruction changes, tend to struggling students, and celebrate school and district successes.

The workshop wrap-up.

Stage two involved reviewing all relevant past research pertaining to the initiative. Our product manager led a walkthrough of insights we had drawn from both internally conducted and externally sourced research on administrator insight needs, as well as teacher insight needs.

Based on this previous research, it became evident that administrators required in-depth transparency into student and teacher usage and performance. Moreover, the ability to swiftly access an overview of their schools' overall successes and areas of growth was also deemed crucial. Additionally, we identified a distinction between the data needs of school administrators and district administrators: the former necessitates more detailed information, while the latter primarily requires high-level insights.

We concluded the workshop with the third stage: prototyping the basic user flow. We reconvened in our previously defined groups to conceptualize potential optimal user flows, crafted with the newly defined business goal and research findings in mind. Each member contributed their own ideas, and each group then composed a unified flow guided by its product designer. These user flows would later be elaborated upon and tested in the next phase: ideation.

Phase 2: Ideation

The ideation phase.

With user flows generated from the discovery workshop, my next task was to refine the UX design for user testing. This step was essential for validating the decisions we had made before moving forward.

Our first round of testing focused on gathering feedback for the optimal overall user journey, targeting both new and returning administrators. My fellow product designer and I created one wireframe prototype each for testing, utilizing the flows we had built in our workshop groups as a starting point. Upon their completion, we teamed up with our user researcher and product manager to create the test script, ensuring we could gather specific feedback on where we needed it the most.

First round interview findings.

Overall, the prototypes were well-received by the interviewees, confirming many of our initial insights and assumptions. The following insights emerged as significant guiding lights as we progressed further in shaping the overall user experience of this new platform.

Flexible filtering is crucial.

Regardless of the admin type (school or district), every user emphasized the necessity for regular and comprehensive data filtering.

Differing levels of data needs.

We consistently confirmed that high-level administrators (i.e., district level) seek aggregated data at a broader level, while school administrators need the ability to drill down into individual student-level details.

Data visualization clarity.

During testing, certain charts and data visualizations required additional details, with a few users even requesting that the data visualization be supplemented by a descriptive explanation.

Visual ideations.

Based on these findings, we began developing another iteration of prototypes. This time, we prioritized a more deliberate UI experience, emphasizing seamless navigation from aggregated data down to finer levels. Our aim remained refining how administrators could move through the data they sought, from the broadest aggregation to individual users. Lastly, we tested which data visualizations would most effectively convey that information.

Additional testing.

Once more, the prototypes garnered positive feedback from our testing group. The following insights were the most significant takeaways, enabling us to transition into the definition phase.

Widgets as a navigational medium.

Administrators favored the prototype that employed interactive widgets both as a means of representing aggregate data and as a navigation element to access more individualized information.

"At a Glance" highlights.

Testers expressed interest in the prototype that employed visually simple and direct highlights. Regardless of the approach, users also indicated the importance of including actionable steps related to areas of growth that were highlighted.

Accessibility of data visualizations.

Some users encountered difficulty in visually interpreting the data visualizations. To address this, we needed to establish accessibility standards to ensure charts are fully comprehensible.

Phase 3: Definition

The definition phase.

Building upon the user feedback from ideation, we finalized our MVP approach: a dashboard experience incorporating school highlights and data visualization widgets for aggregated information. This interface would seamlessly transition to individualized data pages, providing deeper and more detailed insights.

Widgets for our design system.

When we announced the widget approach during our weekly design team huddle, we learned that other product designers were also considering future widget usage within their respective domains. Consequently, I would need to incorporate the component into our design system.

I began by establishing the foundation of the widget: standardizing its typographic system and its "shell" system. While integrating them into the design system, I had to ensure that any data visualization could fit seamlessly within the widget's frame. I also had to design pathways for additional interactions that could be expanded gradually, while guarding against the component growing too complex over time.
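
To make this concrete, here is a minimal sketch of what such a shell can look like, assuming a React/TypeScript stack; the component and prop names are hypothetical, not Newsela's actual implementation:

```tsx
// Hypothetical widget "shell": a standardized frame that any data
// visualization can slot into. Names and props are illustrative.
import React from "react";

interface WidgetShellProps {
  title: string;                // standardized heading slot
  subtitle?: string;            // optional supporting copy
  onNavigate?: () => void;      // widgets also act as navigation elements
  children: React.ReactNode;    // any chart that fits the shell's frame
}

export function WidgetShell({ title, subtitle, onNavigate, children }: WidgetShellProps) {
  return (
    <section className="widget-shell" aria-label={title}>
      <header>
        <h3 className="widget-title">{title}</h3>
        {subtitle && <p className="widget-subtitle">{subtitle}</p>}
      </header>
      <div className="widget-body">{children}</div>
      {onNavigate && (
        <button type="button" className="widget-drilldown" onClick={onNavigate}>
          View details
        </button>
      )}
    </section>
  );
}
```

Keeping the shell agnostic about its children is what lets any visualization fit the frame, while the optional navigation handler supports the widgets-as-navigation pattern that tested well during ideation.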

After finalizing the shell design, my next task was to create the data visualizations that would serve not only the MVP Admin Reporting experience but also any potential future data visualization needs.

Accessible data vis.

While it may seem obvious, data visualizations primarily rely on visual interpretation. Consequently, we as a product team had to build supplementary accessibility systems around our data visuals, specifically to ensure users with varying levels of visual impairment can comprehend the data depicted in the charts.

My initial task was to ensure that our data visualizations didn't rely solely on visual perception. This involved establishing a device-agnostic disclosure state system, writing graph-to-text descriptions, implementing a table view mode, and designing pattern modes for data visualizations.
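
As an illustration of that idea, here is a sketch, again assuming a React/TypeScript stack with hypothetical names, of a chart wrapper that pairs every visualization with a text description and a toggleable table view of the same data:

```tsx
// Illustrative sketch (not the production code) of a chart wrapper that never
// relies on visuals alone: it pairs the chart with a plain-text description
// and lets the user switch to a table view of the underlying data.
import React, { useState } from "react";

interface DataPoint {
  label: string;
  value: number;
}

interface AccessibleChartProps {
  description: string;   // graph-to-text summary, also read by screen readers
  data: DataPoint[];
  renderChart: (data: DataPoint[]) => React.ReactNode;
}

export function AccessibleChart({ description, data, renderChart }: AccessibleChartProps) {
  const [showTable, setShowTable] = useState(false);
  return (
    <figure>
      <figcaption>{description}</figcaption>
      <button type="button" onClick={() => setShowTable((s) => !s)}>
        {showTable ? "Show chart" : "Show table"}
      </button>
      {showTable ? (
        <table>
          <thead>
            <tr><th>Label</th><th>Value</th></tr>
          </thead>
          <tbody>
            {data.map((d) => (
              <tr key={d.label}><td>{d.label}</td><td>{d.value}</td></tr>
            ))}
          </tbody>
        </table>
      ) : (
        renderChart(data)
      )}
    </figure>
  );
}
```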

My attention then shifted to color. The existing brand and status colors proved inadequate for our data visualization accessibility requirements. To resolve this, I used various color accessibility tools to ensure that the new colors we proposed would work for all types of visual impairment while remaining aligned with our brand design.
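
For reference, the check such tools perform is based on the WCAG 2.x relative-luminance and contrast-ratio formulas; here is a small TypeScript sketch of that math (my own illustration, not Newsela code):

```typescript
// WCAG 2.x contrast check, the kind of calculation used when vetting
// candidate palette colors for data visualizations.
function channel(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

function luminance(r: number, g: number, b: number): number {
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// Contrast ratio ranges from 1 (identical colors) to 21 (black on white).
function contrastRatio(
  rgb1: [number, number, number],
  rgb2: [number, number, number]
): number {
  const l1 = luminance(...rgb1);
  const l2 = luminance(...rgb2);
  const [hi, lo] = l1 > l2 ? [l1, l2] : [l2, l1];
  return (hi + 0.05) / (lo + 0.05);
}

// Example: WCAG 1.4.11 asks for at least 3:1 for graphical objects
// such as chart segments. Gray #767676 on white comes out around 4.54:1.
console.log(contrastRatio([255, 255, 255], [118, 118, 118]).toFixed(2));
```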

Quick glance data highlights.

As mentioned earlier, one of our users' needs was swift access to comprehensive top-level data. This drove the creation of a robust school highlight component. Given the data-driven nature of this component, the design specifications for engineers had to be comprehensive, accounting for every potential edge case and data representation state.
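
One way to make those states explicit, sketched here as a hypothetical TypeScript model rather than the actual spec, is a discriminated union that forces every data state to be handled:

```typescript
// Hypothetical model of the states a data-driven highlight has to handle;
// enumerating these explicitly keeps the engineering handoff unambiguous.
type HighlightState =
  | { kind: "loading" }
  | { kind: "empty" }                                 // no activity this period
  | { kind: "partial"; note: string; value: number }  // e.g., data still syncing
  | { kind: "error"; retryable: boolean }
  | { kind: "ready"; value: number; trend: "up" | "down" | "flat" };

function highlightCopy(state: HighlightState): string {
  switch (state.kind) {
    case "loading": return "Loading highlights…";
    case "empty":   return "No activity recorded for this period.";
    case "partial": return `${state.value} (partial: ${state.note})`;
    case "error":   return state.retryable ? "Couldn't load. Retry?" : "Unavailable.";
    case "ready":   return `${state.value} (trending ${state.trend})`;
  }
}
```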

Robust filtration.

Given the significant demand for data filtration that emerged during testing, another focal point was developing a modal filter system with intuitive navigation and visuals. The experience also had to be adaptable, allowing for expansion as additional filters users needed were identified.
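
A common way to achieve that adaptability, sketched below under the assumption of a data-driven filter schema (field names are illustrative), is to define filters as configuration so new ones can be added without reworking the modal:

```typescript
// Illustrative filter schema: new filters are added as data, so neither the
// modal UI nor the query layer needs rework each time one is introduced.
interface FilterDefinition {
  id: string;                   // e.g., "school", "grade", "dateRange"
  label: string;
  type: "multiselect" | "dateRange" | "toggle";
  options?: { value: string; label: string }[]; // for multiselect filters
}

const adminFilters: FilterDefinition[] = [
  { id: "school", label: "School", type: "multiselect", options: [] /* loaded per district */ },
  {
    id: "grade",
    label: "Grade level",
    type: "multiselect",
    options: [
      { value: "k-5", label: "K-5" },
      { value: "6-8", label: "6-8" },
      { value: "9-12", label: "9-12" },
    ],
  },
  { id: "dateRange", label: "Date range", type: "dateRange" },
];

// Active selections serialize to a simple map the reporting API can consume.
type ActiveFilters = Record<string, string[] | { from: string; to: string } | boolean>;
```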

Final Thoughts

Launch of experience.

We maintained a close collaboration with engineers to ensure that the MVP we launched closely adhered to our designs. Through ongoing user acceptance testing and meticulous design quality assurance, we successfully launched an experience that matched our expectations. While the visuals and our initial deliverables were adapted to align with our timeline and the back-end team's capabilities, we managed to ship the experience as a stable release to our beta user group.

Feedback from our beta users was predominantly positive, and we continued to receive constructive input on how to enhance the dashboard further. However, a restructuring in product management and decisions from the executive team led to a temporary pause in the beta phase. This pause was essential to provide time for our leaders to define a comprehensive strategy regarding the data that administrators should have access to.

On continued success.

Future plans beyond the MVP stage encompassed several initiatives, such as further expanding our data visualization library, leveraging A/B testing to enhance user data provisioning, and iteratively refining the filtration system to provide an even more user-friendly solution.
