Problem

The process of declaring a major and choosing courses can be overwhelming, especially at the UW, where many majors are capacity-constrained or have minimum requirements. Students often turn to online forums to talk through the process, or build their own tools around courses, GPA, and instructors using publicly available data. These student-built tools are not centrally located, accessible, secure, or routinely refreshed with current data. An official, student-facing tool that combines all of the most useful data into an accessible, approachable web feature would ideally empower students with knowledge and confidence, so they can make the best use of the time they spend with their advisers.

Discovery

Before I joined the team, designers in AXDD had run student focus groups on existing academic tools containing prerequisite, GPA, and academic journey data. I began work on DawgPath with an understanding of what sets of information were useful to students and why:
  • Prerequisite data was helpful in exploring majors and planning course schedules

  • Major GPA data gave students a sense of what they needed to get into competitive majors

  • Popular courses in a major helped students identify whether they might enjoy that major

  • Seeing when other students took courses helped them think about their own academic journey

User stories

With this in mind, the other designers on my team and I wrote user stories that ultimately determined our initial use cases for the first major release of the web app:
  • Explore majors: Pre-major students can search different majors in the tool to learn more about the program, courses taken by students in the program, potential career outcomes, and the GPA range for admission.

  • Discover courses related to a major: Pre-major students and students already in a major can discover courses through searching majors and viewing the courses students in that major have taken. They can also discover courses by searching a course in the tool and viewing which courses are available to them upon completion of the searched course.

  • Browse courses to plan out a schedule: Pre-major students and students already in a major can search courses they are interested in taking that quarter to view which courses become available to them the following quarter upon completion. They can also view when other students in that major have taken that course, helping them plan which quarter they might take it as well.

  • Browse courses to get a sense of the major: Pre-major students can search different majors and view the courses most commonly taken by students who declared the program. This can help give pre-major students an idea of what the major might look like.

  • Avoid a “toxic mix” of courses: Students can search for undergraduate courses to get a sense of how difficult a particular course may be. This can help them better balance their schedules and timing for academic success.

Initial designs

I took the lead in creating iterative mockups. Using the existing data, I envisioned what a usable website containing multiple data visualizations could look like. I sought feedback on the initial wireframes from the designers and students on my team.

Course and major views



Usability Evaluation Round 1

With something as critical as deciding on and applying for a competitive major, our primary concerns were:
  • Students feeling dissuaded from applying to their major of choice due to intimidating data presentation

  • Underrepresented minority students self-selecting out if they feel underqualified

  • Misconceptions around the idea of a “GPA cutoff” for certain majors, especially where mean GPA data of admitted students was presented without context

  • Distilling complex data science into a visualization that would help students assess how challenging a given course may be 

My colleague Kathy Bui and I designed and conducted our first round of user research to understand how best to meet student needs through the design of this tool.

Research questions: Is the information understandable? Is it useful?

Method: User Interviews, 1 hour

Participants: Four departmental students (Human Centered Design and Engineering majors)

My role: Study co-designer and co-facilitator; note-taker

Data sets evaluated:

  • Declared Major GPA Distribution - This data shows the GPA ranges of students accepted into each major over the last 1-5 years

  • Course Grade Distribution - This data includes the aggregated grade distribution of all students who took this course in the last 5 years, regardless of instructor or quarter.

  • Course (Difficulty) Index - This index is calculated by estimating the number of FWs (fail/withdrawal grades) for a course and subtracting the estimated FWs from the actual FWs, creating an index that we can rescale from 0 to 5 (a rough sketch of the calculation follows this list). The curve represents the course indexes for all courses at the UW, showing where the specific course falls relative to the rest. We intentionally avoided using the word “difficulty” in the study to see how students reacted to this information. (The metric was later named “Course Outcome Index.”)
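To make that calculation concrete, here is a minimal sketch in Python. The study materials describe only the high-level formula (actual FWs minus estimated FWs, rescaled to 0–5); the estimation model and the exact rescaling our data scientists used are not covered in this case study, so the simple min-max rescale below is an illustrative assumption rather than the team's actual method.

```python
def course_difficulty_index(actual_fw: float, expected_fw: float,
                            all_raw_scores: list[float]) -> float:
    """Sketch of the index described above: actual FWs minus estimated FWs,
    rescaled to a 0-5 range across all UW courses.

    `expected_fw` would come from the data science team's predictive model
    (not covered here); the min-max rescale is an assumed placeholder.
    """
    raw = actual_fw - expected_fw                      # more FWs than expected -> higher raw score
    lo, hi = min(all_raw_scores), max(all_raw_scores)  # raw scores for every course at the UW
    return 5 * (raw - lo) / (hi - lo)                  # rescale so every course falls between 0 and 5
```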

Samples of test material:

Analysis and takeaways

Kathy and I conducted our analysis using affinity mapping, where we grouped our takeaways into use cases, interpretations, points of confusion, and chart rankings.

Affinity mapping using Miro

Major data:

  • Use cases: applying for a major

  • Histograms were clear winners for course/major data

  • Granular data may prevent self-selecting out

  • The clearer the information, the less they generalized

  • Students scanned for a median and a range

Course data:

  • Use cases: setting expectations around difficulty; major application goals

  • Perceived difficulty may affect when students take a course, or whether they take it at all (if optional)

  • More granularity preferred

Course Difficulty Index:

  • Half of the participants did not find it useful

  • Fail/withdrawal not always equated with difficulty

  • Difficult to understand, both without a description and in general

 

Usability Evaluation Round 2

Study design

The first study helped us decide how to present course and major data, but there were still lingering questions around the Course Index. Additionally, it was important to us to recruit participants from underrepresented minorities at UW, as well as pre-major and non-STEM students.

Kathy and I collaborated with data scientists in our department to create improved data visualizations for the Course Difficulty Index, based on what we learned from the first usability study.

We reached out to OMAD (the Office of Minority Affairs and Diversity) and recruited five EOP students. The Educational Opportunity Program promotes academic success and graduation for underrepresented ethnic minority, economically disadvantaged, and first-generation college students at the University of Washington. The group included one pre-major student and two non-STEM students.

Research questions: What would make the Course Difficulty Index (CDI) understandable? Useful?

Method: User Interviews, 1 hour

Participants: 5 EOP students

My role: Study co-designer and co-facilitator; note-taker

Updated data visualizations:

Analysis, recommendations, and takeaways

After a second affinity mapping exercise, and in conjunction with our earlier takeaways, we finalized our designs and created the new “Course Outcome Index.”

Key takeaways:

  • Better emotional response: Pass/completion, the inverse of fail/withdrawal, was found to be a friendlier and less intimidating metric. Students found the fail/withdrawal framing stressful.

  • Aligned with students’ goals: Students felt that seeing pass/completions was more helpful, since that is the outcome they want from a course.

  • More intuitive: Many students expressed that the scale going from “less to more” felt more intuitive, comparing it to reading from left to right.

  • Equally useful: Students said they would use the data the same way they would fail/withdrawals (course decisions, setting expectations, schedule balance).


Why this design?

We tested seven visualizations in total for this data. For the final version, we combined the number scale and the color scale from our second round of research.

  • Easier to read. Students expressed this chart was more accessible for all students, not just those more familiar with reading graphs.

  • Comparison values add information. Students found that including comparisons (average XXX level course, average course in XXX curriculum) helped contextualize the scores.

  • Color aids in processing. Including color made it easier for students to process and quickly see the course’s proximity to a target outcome. We decided to use shades of the same color on the scale to remove snap judgments associated with certain colors (e.g. red = bad).

  • Labeling helps with clarity. Labels on the scale helped students quickly understand what the course outcome index is and what the score means.

Interactions

  • Toggle between comparisons. Selecting “Course in Context” shows the additional data points for “Average course in X curriculum” and “Average X00 level course at the UW”. The default shows the data point for the course.

  • Highlight data points. Hovering over a data point shows its label and COI score.

  • Find extra information. Clicking the (i) icon displays a pop-up describing what the COI score means. It should also include a link to the FAQ page that further details the COI.

Accessibility

  • All text in the COI should be screenreader accessible, not embedded in an image.

  • Course and major data should be presented in table format as well as histograms.

Final data visualizations:

Final designs

Drawing on our findings, I iterated on the design and copy with a focus on clarity, ease of use, and accessibility. After we had a prototype built, I met with the UW Accessibility Team to learn how we could improve usability for screenreader and keyboard users. I also employed color and typography in line with UW-IT’s style guide, and implemented those style decisions in the code.

User feedback

Upon launch, we added a short Google form to the live site to capture user feedback. Ten users responded, and the results indicated that we met our usability and comprehension goals. (In the responses, 1 indicates “not at all” and 5 indicates “very.”)