Student Data Insights

Problem

How might we break down the silos of student datasets to provide richer, actionable insights to educators?

Approach

As lead designer, I guided product strategy, facilitated an innovation design sprint, and designed MVP wireframes.

Impact

In 10 weeks, we delivered a validated MVP concept and earned stakeholder consensus with high-fidelity MVP wireframes.

Design Approach

How might we break down the silos of student datasets to provide richer, actionable insights to educators?

The company was looking for a lean team of senior staff to explore next-generation features for our data platform.

As the design lead, I partnered with a lead product manager to conduct internal and external research to guide the product strategy, facilitated the design sprint that defined our first concept, and iterated that concept into MVP wireframes.

Sprint 1: Discovery (4 weeks)

  • Prioritize use cases (and our innovation sprint theme) by reviewing them with internal experts

  • Conduct a GV-style design sprint

  • Conduct additional user testing and synthesize research

Sprint 2: Iteration (6 weeks)

  • Scrape relevant data from databases,

  • Get data science support and finalize resourcing,

  • Develop two iterations of wireframes, and

  • Finalize a high-fidelity concept.

Note: To protect the speculative and proprietary nature of this work, some details will be omitted.

Client
Panorama Education

Duration
10 weeks (Mar–May 2023)

My Role
Lead Product Designer, Workshop Facilitator

Team
1 Product Manager, 1 Executive Sponsor, 1 Data Scientist, 1 Researcher, 1 Engineer

Why Was Insights So Hard?

Business challenge:
Panorama had not yet defined its role in providing insights

Data-driven decision-making helps educators make the most of their limited resources, but who decides which data or insights need action?

As a team, we debated Panorama’s role: do we report the data, guide interpretation or interpret the data ourselves?

How did we mitigate?

  • Chose a use case that balanced the considerations above.

  • Examined the market to understand how we might differentiate and learn from competitors.

  • Aligned our approach with long-term corporate strategy.

Data challenge:
Insights depend on good, reliable data

Student data is notoriously inconsistent and incomplete. It’s often stored in multiple systems and prone to errors from manual data entry. We wondered: Did we have enough data? Could we trust the data we received?

How did we mitigate?

  • Identified data sources so users could make their own decisions about how much to trust them.

  • Acknowledged users’ agency to assess these key questions themselves.

  • Narrowed our focus to data types and sources that we deemed more reliable.

Insights challenge:
Data analysis is complicated, and misinterpreting data is risky

Knowing whether data is correlated, covariant, or causally related is as much art as science. How we presented data or insights could lead someone to infer relationships that weren’t really there.

How did we mitigate?

  • Took care not to over-interpret the data.

  • Worked closely with our research partners and data scientists throughout the process to know when we were pushing that line.

  • Made careful determinations about the use of population-level vs. student-level analysis.

Sprint 1: Discovery

Design Approach

Through a series of cross-functional activities over 4 weeks, we consulted internal experts, defined our primary use case, assembled peers to create a prototype and validated that prototype with users and stakeholders.

➊ Co-facilitated internal stakeholder reviews

We held brainstorms and discussions with data scientists, academic researchers, other product managers, and educational thought leaders.

➋ Designed and facilitated a GV-style design sprint

➌ Completed 9 concept tests with educators

We tested risks and hypotheses on the Friday of the sprint week and in additional interviews over the following 2 weeks.

Discovery Challenges

How do we ease discomfort with this new sprint approach?

Sprint participants were unfamiliar with the sprint format and clearly uncomfortable making decisions on limited information.

How did we mitigate?

  • Documented key discussion topics/challenges along the way,

  • Held debriefs with individuals throughout the sprint, and

  • Modified the sprint agenda to create more space for agreement.

The 5-day sprint included participants from product, design, research and engineering.

Which comes first: the use case or the data insight?

Designing an insights platform required knowing a) what questions users might ask of the data and b) what our data could tell us about students. Without having analyzed our own data yet, how could we know what it could teach us?

How did we mitigate?

  • Tapped into and shared internal stakeholder experience, documenting as many potential user questions about student data as possible,

  • Exposed sprint participants to users’ day-to-day work and jobs-to-be-done, helping them “think like a user”,

  • Assessed data streams by frequency and client usage,

  • Acknowledged the tension and fact that we were in unknown territory, and

  • Documented the risks and challenges of presuming what the data would tell us.

The sprint team wrestled with issues around data and equity. We developed our best hypotheses into a prototype for end-of-week concept testing.

➍ Presented prototype and findings to executive stakeholders

We presented a Q2 plan for resourcing and next steps, and were approved for a slimmed-down plan to iterate on the concept.

Sprint 1: Discovery Results

The sprint resulted in a Figma prototype with 3 core components: Insights Cards, Graph Filters, and Graph Over Time.

We successfully validated our prototype through user interviews, documented engagement archetypes and presented a plan for an MVP delivery this year.

...Even though I’m a Spanish teacher and the screen isn’t about my class, it shows me my student is moving towards improved outcomes and we can continue to support the student and encourage them on the rising path, across all their classes... I like that I can pick and choose what I want to see.
— W, High School Teacher

Sprint 2: Iteration

Before my sudden departure from Panorama, I developed:

  • 4 medium-fidelity wireframes for data visualization, and

  • 2 high-fidelity wireframes with real student data.

For our MVP, I weighed concept-test feedback, data-visualization best practices, and the need for simplicity.

Round 1

I identified 4 viable approaches for displaying 3 distinct data sets, outlining the pros and cons of each graphical approach across topics such as data integrity, page usability, and potential for future expansion.

Iteration Challenges

How do we move fast — without design system components?

After discussions with other designers, I learned that we had no frameworks for custom data components or visuals, meaning Figma wireframes could take significant time.

How did I mitigate?

  • Chose not to move to Figma until we had coalesced on an overall data-visualization approach.

  • Guided design reviews with key questions related to interaction, data availability, and design principles.

  • Used color coding and symbols in sketches to integrate visual-design touches without getting lost in design-system definitions.

Round 2

After an initial round of feedback, there was no obvious direction, so I aggregated the feedback into a new set of options.

My goals for this next iteration were to:

  • show interaction details,

  • prove these types of visualizations could support real student data,

  • share what I learned from graphing real student data, and

  • recommend a design approach for specific data types based on frequencies and idiosyncrasies of real data.

How do we design for all students — when data changes dramatically as they mature?

Older students naturally have more data from more sources than younger students. While this is helpful for insights, it makes visualizing that data far more challenging.

How did I mitigate?

  • Discussed limiting the number of data sources that could be viewed at once, either by number of sources OR by preventing certain data from being compared.

  • Sketched extreme edge cases with colors and symbols to bring the design limits of “too much data” to life.

  • Documented a series of approaches to follow up on with our data scientists and, later, in our Figma design system.

How might we manage the delta between recorded dates and actual dates?

I learned through my exploration that the story told by the data itself was not always true: I saw situations where data was recorded well after the event it described.

How did I mitigate?

  • Separated data streams rather than overlapping them all, so that educators could evaluate each stream on its own timeline.

  • Captured research questions for our data scientists to understand how widespread and problematic this pattern was.

Insights Sprints Impact

In the course of 10 weeks, my co-conspirator and I successfully:

  • Narrowed our insights product strategy to a relevant, differentiated use case,

  • Created a clickable prototype that was validated by end users,

  • Identified key risks and hypotheses to guide the design and development process,

  • Designed a work plan that would achieve an MVP launch by back-to-school 2023, and

  • Showed stakeholders the potential of an insights product in less than one quarter’s time.

While I was unable to execute changes, my next steps would have been to:

  1. transition to Figma for high-fidelity mockups to begin exploring design system implementation,

  2. work with my lead product manager to conduct buyer interviews and define MVP data streams.
