
Assessment Builder & Deployment

Problem

Instructors needed a reliable way to create and deploy placement exams that could accurately evaluate students and place them at the appropriate course level. However, the existing workflow was fragmented and time-consuming: professors had to manage assessments, student cohorts, and placement decisions across several separate tools and manual processes.

Authoring assessments with mathematical notation was difficult, organizing questions by learning concept was unsupported, and creating multiple versions of the same exam for different student groups required additional manual coordination. Instructors also needed to define the score thresholds that determine course placement, but this step was typically handled outside the assessment system.


Beyond administering exams, instructors lacked visibility into how students performed across different concepts. Without structured tagging or analytics, it was difficult to identify which topics students struggled with or use assessment results to guide remediation.


A more integrated system was needed to help instructors easily create assessments, manage deployments to student cohorts, and generate meaningful insights into student understanding while reducing the manual effort required to administer placement testing.

Research

Because we did not have direct access to instructors or students for usability testing, we worked closely with the college professor leading the company to understand how placement testing and concept mastery are evaluated in academic settings.

A key insight from these discussions was that assessments needed to measure more than just whether a student answered a question correctly. The platform’s core value was identifying which underlying concepts a student understood and which ones they struggled with. To support this, questions and individual answer choices were designed to be tagged with specific learning concepts.


This tagging model allowed the system to capture meaningful learning signals from both correct and incorrect answers. Incorrect answers were intentionally written to reflect common conceptual mistakes, allowing professors to see not only that a student got a question wrong, but which concept led them to that mistake.
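The tagging model described above can be sketched in a few lines. This is an illustrative data shape only, not the platform's actual schema; the class and function names are my own. The key idea is that a distractor carries tags for the misconception it was written to reveal, so a wrong answer still yields a diagnostic signal:

```python
from dataclasses import dataclass, field

@dataclass
class AnswerChoice:
    text: str
    correct: bool
    # For a distractor, the misconception(s) it was written to reveal.
    concept_tags: list[str] = field(default_factory=list)

@dataclass
class Question:
    prompt: str
    choices: list[AnswerChoice]
    # Concepts the question as a whole assesses.
    concept_tags: list[str] = field(default_factory=list)

def diagnose(question: Question, chosen_index: int) -> list[str]:
    """Return the concepts implicated by the student's chosen answer."""
    choice = question.choices[chosen_index]
    if choice.correct:
        return []
    # If the distractor itself is untagged, fall back to the question's tags.
    return choice.concept_tags or question.concept_tags
```

A correct answer implicates no concepts; a tagged wrong answer tells the professor which specific misunderstanding likely led to it.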


These concept tags became the foundation for the platform’s analytics, enabling instructors to view performance trends across students and classes while also allowing the system to automatically recommend targeted video lessons and practice assignments to help students improve in specific areas.
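The analytics layer reduces to pooling tagged responses across students and computing per-concept accuracy. A minimal sketch, with hypothetical names (the real system's aggregation was richer than this):

```python
from collections import defaultdict

def concept_accuracy(responses):
    """responses: iterable of (concept_tag, was_correct) pairs pooled
    across students; returns fraction correct per concept."""
    totals = defaultdict(lambda: [0, 0])  # tag -> [correct, attempts]
    for tag, correct in responses:
        totals[tag][1] += 1
        if correct:
            totals[tag][0] += 1
    return {tag: c / n for tag, (c, n) in totals.items()}
```

Concepts with low accuracy are the ones the system would target with recommended video lessons and practice assignments.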


Working closely with the professor, I translated this concept-driven assessment model into a structured interface that made it easy for instructors to author tagged questions while enabling the platform to generate meaningful insights from student responses.

[Screenshots: Create Deployment (assessments typed); Assessment Builder 2.0 (add answer, search hover)]

Solution

I designed a flexible assessment authoring and deployment system that allowed instructors to easily create, manage, and distribute placement exams across different student cohorts. Building on early concepts for the feature, I refined and expanded the interface to better support the complexity of real academic workflows.

The assessment builder enabled instructors to write questions using a structured editor with KaTeX support for mathematical formulas, making it easier to create assessments for math and technical subjects. Questions and answers could be tagged with specific concepts, allowing instructors to later analyze performance by topic.


To support real classroom workflows, the system allowed instructors to create multiple versions of the same assessment, enabling different cohorts of students to receive alternate versions while maintaining consistent evaluation criteria.


The deployment workflow gave instructors clear control over when assessments were available, which student cohorts would receive them, and how score cutoffs mapped students into specific courses. This made the platform especially useful for placement testing, where results determine appropriate course levels.
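The cutoff-to-course mapping behaves like a threshold lookup: each course has a minimum score, and a student is placed into the highest course whose minimum their score meets. A minimal sketch with made-up course names (the actual cutoff configuration was instructor-defined per deployment):

```python
import bisect

def place_student(score: float, cutoffs: list[tuple[float, str]]) -> str:
    """cutoffs: ascending (minimum_score, course) pairs.
    Returns the highest course whose minimum the score meets."""
    thresholds = [minimum for minimum, _ in cutoffs]
    # Index of the last cutoff with minimum <= score.
    i = bisect.bisect_right(thresholds, score) - 1
    return cutoffs[max(i, 0)][1]
```

For example, with cutoffs of 0, 60, and 85, a score of 72 lands in the middle course, and a score exactly at a boundary places into the higher course.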


Finally, instructors could view analytics on assessment performance, helping them identify trends in student understanding and evaluate how well questions measured specific concepts.

Together, these tools transformed what had previously been a manual and fragmented process into a structured system for creating, deploying, and analyzing academic assessments at scale.
