Deadline Management System

 

Best Project Award for CPSC 444 Advanced Human-Computer Interaction Course - Spring 2014
(Among 50+ students; awarded by judges from industry)

An online interface that supports academic deadline management for students and professors.

Skills and Tools
Field Study (6 Students and 4 Professors)
Cognitive Walkthroughs
Balsamiq
HTML/CSS

Stakeholders
Professors and Students

My Role
User Researcher
UX Designer
Usability Tester
Project Manager

Team
5 people

Project Duration
3 months

Experiment
16 Participants
4 Isomorphic Tasks
4 Different Design Interfaces
2 x 2 Design
A vs B
Post-Interview



Current Shortcomings

1. No central system for professors and students to manage course deadlines.
2. Inefficient workflow for professors when planning an entire term.
3. No data visualization tools to help students and professors better plan their workloads and deadlines.

The Challenge
Professors post course content, announcements, and exam dates across UBC Connect, Piazza, and individual course websites, so students have to check multiple places to find their deadlines and then manually enter them into their own calendars. This process is prone to human error.

Professors use multiple tools for course planning, such as paper notes, spreadsheets, Piazza, and their own calendars. No single tool does it all, and planning or rescheduling events after the term has started is challenging.

We’re still staring at text lists and calendars when planning out our workload, which doesn’t help us visualize how many hours our CPSC 444 assignment will take compared with upcoming assignments in our other courses. In a calendar, an 8-hour assignment looks the same as a 1-hour assignment.


User Research

Before designing the system, we conducted field observations and user interviews with our target audiences: students and professors. We interviewed 6 students in the Human-Computer Interaction (HCI) lab and 4 professors in their offices.

Being in the professors' offices was important, as we needed to observe the tools they were currently using to manage their course deadlines. Since students do not have a permanent office to work in, we concluded that the HCI lab was an appropriate setting and asked the student participants to show us how they used any tools they had with them to manage their deadlines. Two of the students used paper agendas and to-do lists.

UBC Human-Computer Interaction Lab




Key Findings from Participants

Professors use a combination of tools to manage deadlines and plan their courses. Because of this, most were open to a new system and expressed a need for a single tool that handles both deadlines and course planning.

Students also used a combination of tools to keep track of their deadlines, such as Google Calendar, paper agendas, and memory, but overall they were happy with their current way of juggling deadlines, so a new system was less clearly needed for them.


Low-Fidelity Prototype - Student View

For the Student View, we used a pie chart to visualize the breakdown of daily workloads.
For both Low-Fidelity interfaces, we used a single-month calendar view.
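As a rough, illustrative sketch of that idea (separate from the actual Balsamiq mockups), the snippet below renders a daily workload breakdown as a pie chart with matplotlib; the course names and hour estimates are hypothetical.

```python
import matplotlib.pyplot as plt

# Hypothetical hours of work due on a single day, broken down by course.
daily_workload = {
    "CPSC 444 assignment": 8,
    "CPSC 422 reading": 2,
    "STAT 200 problem set": 3,
}

# One slice per course, sized by estimated hours.
plt.pie(list(daily_workload.values()),
        labels=list(daily_workload.keys()),
        autopct="%1.0f%%")
plt.title("Daily workload breakdown (hypothetical data)")
plt.show()
```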


Low-Fidelity Prototype - Professor View

Based on our findings, we created our Low-Fidelity Prototypes using Balsamiq.
We created a Professor View and a Student View.
For the Professor View, we used bar graphs to visualize, for each date, the percentage of students with other academic deadlines conflicting on that date.
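A minimal matplotlib sketch of that visualization follows; the dates and conflict percentages are made up and only illustrate the idea, not the prototype itself.

```python
import matplotlib.pyplot as plt

# Hypothetical percentage of students with another academic deadline on each candidate date.
dates = ["Mar 2", "Mar 3", "Mar 4", "Mar 5", "Mar 6"]
percent_with_conflicts = [35, 60, 20, 45, 10]

# One bar per date: shorter bars suggest better dates for a new deadline.
plt.bar(dates, percent_with_conflicts)
plt.ylabel("% of students with conflicting deadlines")
plt.title("Deadline conflicts per candidate date (hypothetical data)")
plt.show()
```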



Experimental Design

Based on the feedback from our Cognitive Walkthroughs and user interviews, we decided to limit our scope to the Professor View, since students were already happy with their own personalized systems.

We wanted to measure the effect of 2 factors. The first was chart type: we chose to compare bar and pie charts, which are among the most common data visualizations today. Each has its own strengths and weaknesses, and we wanted to see which was better suited for academic deadline management.

The second factor was calendar type. Though many time-management systems use a single-month calendar view, we wanted to see how helpful a multi-month view would be for professors. The idea was that with a multi-month view, professors could see the entire term and, for example, how one change in February could force a second change in March.

We ran a 2 x 2 experiment to test both factors, compared the resulting interfaces in an A vs B design, and conducted post-experiment interviews.


High-Fidelity Prototypes


Participants and Tasks

To counterbalance our experiment, we recruited 16 participants, each of whom performed 4 isomorphic tasks on 4 different interfaces. These 4 interfaces were combinations of our 2 factors, chart type and calendar type, and each task was timed for a quantitative analysis of performance.
Our combinations for the experiment were (a counterbalancing sketch follows the list):
1. Bar chart and single-month calendar
2. Bar chart and multi-month calendar
3. Pie chart and single-month calendar
4. Pie chart and multi-month calendar
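One way to generate these four conditions and a counterbalanced presentation order is sketched below, using a standard balanced Latin square; the ordering scheme shown is illustrative, not necessarily the exact scheme we used in the study.

```python
from itertools import product

# The two factors of the 2 x 2 design.
charts = ["bar", "pie"]
calendars = ["single-month", "multi-month"]

# The four interface conditions are the cross product of the two factors.
conditions = [f"{c} chart + {cal} calendar" for c, cal in product(charts, calendars)]

def balanced_latin_square(n):
    """Balanced Latin square for an even n: each condition appears once per
    position, and each condition follows every other condition equally often."""
    offsets = [0 if j == 0 else (j + 1) // 2 if j % 2 else n - j // 2
               for j in range(n)]
    return [[(i + off) % n for off in offsets] for i in range(n)]

# Assign each of the 16 participants one of the 4 counterbalanced orderings.
square = balanced_latin_square(len(conditions))
for p in range(16):
    order = [conditions[i] for i in square[p % len(square)]]
    print(f"Participant {p + 1:2d}: {order}")
```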



Experimental Results

Our goal for the experiment was to answer:
“Which factor levels performed better, and was there a combination of the 2 that worked best?”
For task completion time, we found no statistically significant evidence that any combination of factors outperformed the others.
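As a minimal sketch, one way to check for effects of chart type and calendar type on completion time is a simple two-way ANOVA like the one below; this is not necessarily the test we ran, the timing data is randomly generated placeholder data rather than the study data, and the repeated-measures structure of the design is ignored for brevity.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Placeholder completion times (seconds) for 16 participants x 4 conditions.
rng = np.random.default_rng(444)
rows = [
    {"participant": p, "chart": chart, "calendar": cal, "time": rng.normal(40, 6)}
    for p in range(16)
    for chart in ("bar", "pie")
    for cal in ("single-month", "multi-month")
]
df = pd.DataFrame(rows)

# Two-way ANOVA: main effects of chart type and calendar type, plus their interaction.
model = ols("time ~ C(chart) * C(calendar)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```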
But our data did not come from numbers alone. An important part of our experiment was the qualitative analysis.


Qualitative Findings


During our post-experiment interviews, we asked the participants for their personal preference between the single-month and multi-month calendar layouts; 13 of the 16 participants chose the multi-month view.

They found it easier to compare dates across different months and to get an overview of the whole term.


For data visualization methods, 9 people chose bar graphs, while 7 people chose pie charts.

People who chose bar graphs said it was easier to compare bar heights across dates to find the day with the fewest conflicts.

On the other hand, people who chose pie charts said it was easier to compare the portion of the circle each category took up.


Future Work

For future work, we would like to build another medium-fidelity prototype iteration based on these results. We would also like to expand the functionality of the 4 prototypes, for example by adding a database and letting users choose and switch between different graph visualizations based on personal preference.

From there, we would like to recruit professors from different faculties and schools and run future experiments that involve more comprehensive tasks around course deadline management.

In the future, we would like to build a medium-fidelity prototype of the Student View for further investigation.