

Analytics
Problem
Instructors needed visibility into how students and entire cohorts were performing across assessments, but gathering this information required manually compiling and analyzing results from multiple sources. Identifying where students struggled, especially at the topic level, was time-consuming and difficult to scale. A more efficient way to surface performance insights was needed to help instructors quickly understand learning gaps and support students more effectively.


Research & Wireframes
Because we did not have direct access to instructors or students for formal user research, we worked closely with the college professor leading the company to understand how instructors currently analyze assessment performance and identify learning gaps.
During a collaborative working session with the professor and product manager, we mapped out how instructors review results today and what insights are most valuable when evaluating student understanding. The discussion focused on how instructors compare performance across a class, identify struggling students, and determine which concepts require additional instruction.
Using a whiteboard, we sketched rough wireframes and defined the key requirements for the analytics experience. A central goal was enabling instructors to move between cohort-level insights and individual student performance, while also allowing them to drill down into concept-level data powered by the platform’s tagging system.
These working sessions helped establish the structure of the analytics views and clarified which metrics and visualizations would be most useful for instructors when evaluating both class-wide trends and individual learning gaps.


Solution
I designed an analytics dashboard that allows instructors to quickly understand assessment performance at both the cohort and individual student level, eliminating the need to manually compile and analyze results.
The main dashboard provides a cohort-level overview, highlighting overall assessment performance and surfacing the concepts where students struggled the most. This gives instructors an immediate understanding of class-wide learning gaps and helps identify topics that may need additional instruction.
From this view, instructors can drill down into individual student attempts to see how a specific student performed on the assessment. Detailed breakdowns show how each question was answered and which concepts contributed to correct or incorrect responses.
Because questions and answers are tagged with learning concepts, the system can aggregate performance by topic. This allows instructors to quickly identify patterns across the class while also understanding the specific concepts a student may need help with.
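The concept-level aggregation described above can be sketched in a few lines. This is a hypothetical illustration, not the platform's actual implementation: the data shape (`student`, `concepts`, `correct`) and the function name are assumptions made for the example.

```python
from collections import defaultdict

# Hypothetical assessment results: each answered question records the
# concept tags attached to it and whether the response was correct.
attempts = [
    {"student": "A", "concepts": ["loops"], "correct": True},
    {"student": "A", "concepts": ["loops", "recursion"], "correct": False},
    {"student": "B", "concepts": ["recursion"], "correct": False},
    {"student": "B", "concepts": ["loops"], "correct": True},
]

def performance_by_concept(attempts):
    """Aggregate correct/total counts for each concept tag,
    returning the fraction correct per concept."""
    totals = defaultdict(lambda: {"correct": 0, "total": 0})
    for attempt in attempts:
        for concept in attempt["concepts"]:
            totals[concept]["total"] += 1
            if attempt["correct"]:
                totals[concept]["correct"] += 1
    return {c: t["correct"] / t["total"] for c, t in totals.items()}

print(performance_by_concept(attempts))
# "loops" was answered correctly 2 of 3 times; "recursion" 0 of 2.
```

Because every question carries its concept tags, the same pass over the data yields both the class-wide view (all attempts) and a single student's view (that student's attempts only).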
Together, these dashboards transform raw assessment results into actionable insights, helping instructors evaluate class performance, identify struggling students, and target remediation more effectively without manual analysis.