Owl Express: Usability Testing

Owl Express is a web utility for Kennesaw State University students that serves as a hub for essential tasks such as class registration, major and minor declaration, and parking management, yet it has drawn frequent criticism from students for its unintuitive architecture. My team and I conducted a series of usability tests with students to document these glaring issues.

TEAM: Kyle Franklin | Andrew Mayfield | Matthew Steinbauer | Jake Winnenberg

CLIENT: School | YEAR: 2018

With our testing, we set out to highlight the major underlying problems that Kennesaw State University students report experiencing. Chief among them is the poorly conceived navigation, which results in avoidable complexity.

Assigning Roles

To evaluate the site properly, we had each participant run through an identical usability test individually. Each session involved:

  1. A moderator to walk the user through the steps of the test.

  2. A technician to operate the screen capture software, microphones, and camera.

  3. A note-taker to record and evaluate behaviors, both audible and physical.

  4. A help desk to answer any questions the participant had while completing the tasks.

Deciding Our Methods

To keep the results as consistent as possible, the moderator was given a script to read from while conducting the test. To begin, the participant was greeted by the moderator and handed the consent and anonymity forms. The recording devices included screen capture software, an audio/video camera, and a separate audio recorder. The test itself examined the user’s knowledge of, and reactions to, the website.

The user was given a brief description of what to expect during the test, which consisted of four goals to complete. Upon completion of the tasks, all participants filled out System Usability Scale (SUS) forms to give us further insight into their experiences using the website. Then we had them choose 5 of the 118 Product Reaction Cards (PRC) displayed on a table to describe their experience.
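To turn those card selections into something comparable across participants, we tallied how often each word was chosen and how many of the selections were positive. Below is a minimal sketch of that tally in Python; the card picks shown are hypothetical placeholders, not our participants’ actual selections.

    from collections import Counter

    # Hypothetical picks: each participant chose 5 of the 118 cards.
    # These words are placeholders, not our participants' actual selections.
    picks = [
        ["confusing", "slow", "dated", "hard to use", "comprehensive"],
        ["confusing", "cluttered", "frustrating", "useful", "complex"],
        ["overwhelming", "confusing", "time-consuming", "useful", "rigid"],
        ["frustrating", "cluttered", "slow", "inconsistent", "busy"],
    ]

    # The subset of cards the method classifies as positive (sample only).
    POSITIVE = {"comprehensive", "useful"}

    counts = Counter(word for selection in picks for word in selection)
    positive = sum(n for word, n in counts.items() if word in POSITIVE)
    total = sum(counts.values())

    print(counts.most_common(3))                       # most frequently chosen cards
    print(f"positive selections: {positive}/{total}")  # rough desirability ratio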

Formulating Scenarios

My team began our search for possible improvements to Owl Express by performing a walkthrough with participants to identify which specific aspects of the site were confusing, misleading, or in need of restructuring.

Our usability test consisted of three parts: a walkthrough of Owl Express, the SUS questionnaire, and the PRC exercise from Microsoft’s Desirability Testing method (2002). To view the website from the perspective of someone who had never used it before, we developed two scenarios based on anecdotal evidence of common but difficult tasks the site is used for. These brief scenarios would later inform our task creation process.

Scenario #1: An incoming KSU freshman, attempting to register for classes for the first time.

Scenario #2: A KSU sophomore with moderate knowledge of Owl Express who now has to navigate to an unfamiliar page to pay a parking ticket.

Setting the Tasks

  1. The first task asked the participant to change their major; it was complete once the user clicked the “Submit” button on the “Request Change Degree Program” screen.

  2. The second task told the user to view their parking citations without knowing a citation number; it was complete when the user reached the “View Parking Citations” screen.

  3. The third task required the participant to check their registration time ticket for the fall 2018 semester; it was complete when the time ticket was displayed on the screen.

  4. The fourth and final task told the user to register for English 1101, Section 4, for the fall 2018 semester by adding the Course Reference Number (CRN) to the worksheet; it was complete when the user added the correct class and section to the worksheet.

Finding Participants

Our test consisted of four participants in total, recruited as a convenience sample by the team. Every user participated on a volunteer basis, with no compensation involved. Participant 1 was completing their final semester at Kennesaw and had advanced experience with certain parts of Owl Express. Participant 2 was not a Kennesaw student and had no prior experience with Owl Express. Participants 3 and 4 were both juniors at Kennesaw with a moderate amount of experience using Owl Express.

Post-Testing Metrics

All participants completed the study in under 15 minutes. Some required only minimal additional help, while others needed much more guidance from the designated help desk in the test room.
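If you wanted to reduce those observations to summary numbers, the bookkeeping is simple. A sketch of that summary follows, using hypothetical times and assist counts in place of our raw notes; only the under-15-minute ceiling comes from the study itself.

    # Hypothetical session logs: total minutes and number of assists per participant.
    # The values are placeholders; only the under-15-minute ceiling is from our study.
    sessions = {
        "P1": {"minutes": 9.0, "assists": 1},
        "P2": {"minutes": 14.5, "assists": 4},
        "P3": {"minutes": 12.0, "assists": 2},
        "P4": {"minutes": 13.0, "assists": 3},
    }

    completed = sum(1 for s in sessions.values() if s["minutes"] < 15)
    avg_assists = sum(s["assists"] for s in sessions.values()) / len(sessions)

    print(f"completed under 15 minutes: {completed}/{len(sessions)}")
    print(f"average assists per session: {avg_assists:.1f}")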

SUS Results

Participant 1 - “Not Acceptable / Marginally Acceptable”

Participant 2 - “Not Acceptable”

Participant 3 - “Not Acceptable”

Participant 4 - “Not Acceptable”

Participant 1, who had the most experience, scored a 50 on the SUS sheet, which falls on the border between not acceptable and marginally acceptable. Participants 2, 3, and 4 scored 37.5, 32.5, and 25, respectively, all within the not-acceptable range. The website therefore fits firmly into the category of “Not Acceptable”.
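For context, a standard SUS score is derived from ten questionnaire items rated 1–5: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to produce a 0–100 score. A short sketch of that arithmetic follows; the item responses shown are hypothetical, and only the four final scores above come from our study.

    def sus_score(responses):
        """Standard SUS scoring: 10 items rated 1-5, yielding a 0-100 score."""
        assert len(responses) == 10
        total = sum(
            (r - 1) if i % 2 == 0 else (5 - r)  # odd-numbered vs. even-numbered items
            for i, r in enumerate(responses)
        )
        return total * 2.5

    # A hypothetical, perfectly middling response sheet scores exactly 50.
    print(sus_score([3] * 10))  # 50.0

    # The four scores from our study average well below the common ~68 benchmark.
    scores = [50, 37.5, 32.5, 25]
    print(sum(scores) / len(scores))  # 36.25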

PRC Results

[Image: participants’ PRC selections]

What We Learned

Not every participant shared the same experience navigating through the tasks, but one glaring commonality was that they all needed assistance from the help desk or moderator at some point. Even the more experienced Owl Express users had a difficult time completing certain tasks.

Users became visibly upset and frustrated while using the site, throwing their hands up in the air or cursing. There were many instances of a user being on the correct page but unable to find the information they needed, saying “maybe I clicked on the wrong thing” and navigating away. The participants spent a lot of time scrolling through content they considered “redundant” and “confusing”, and two of them reached a point where they were ready to give up on the assigned task.

Final Thoughts

The most glaring problem with the site was an overly complex navigation system that could be simplified by establishing a clearer information hierarchy: revamping individual pages, condensing related content into shared pages and categories, and being more specific about what each page and category accommodates.

Signing up for classes is a prime example of this unnecessary complexity. Owl Express offers many paths to accomplish this one task, and while some may perceive that as a good thing, it only clutters the navigation with numerous links that all perform the same job.
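To make that recommendation concrete, here is one hypothetical consolidation, sketched as a simple data structure. The menu labels are illustrative and do not reproduce Owl Express’s actual navigation.

    # Hypothetical "before": several top-level links that all lead to registration.
    before = [
        "Add or Drop Classes",
        "Registration Worksheet",
        "Look Up Classes to Add",
        "Quick Registration",
    ]

    # Hypothetical "after": one category, one entry point per distinct job.
    after = {
        "Registration": {
            "Register for Classes": "search, add, and drop in one place",
            "View Schedule": "read-only view of the current term",
        }
    }

    # Fewer same-purpose links means users can rule pages in or out faster.
    assert len(after["Registration"]) < len(before)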