Usability Evaluation: Chicago Cares
Chicago Cares is a volunteer-run non-profit organization based in Chicago that provides the public, corporations, and youth service groups with opportunities to volunteer throughout the Chicagoland area. As part of this project, we sought to help Chicago Cares improve the usability of their website so that potential volunteers could more easily register for volunteer opportunities.
I worked on this project with three classmates as part of my Master's program. My role involved conducting a heuristic review and cognitive walkthrough of the Chicago Cares website, creating scripts for the usability tests, analyzing data gathered from the usability and A/B tests, and providing design recommendations for the Chicago Cares website.
Understanding Stakeholder Challenges
Initially, we met with a Chicago Cares volunteer coordinator to understand their vision and challenges. The client expressed concern regarding the usability of their website, specifically with users' ability to easily search for and sign up for volunteer opportunities.
Heuristic Review & Cognitive Walkthrough
We each conducted a heuristic review and cognitive walkthrough of the Chicago Cares website, identifying general usability issues and obstacles associated with the tasks of registering and dropping out of volunteer opportunities.
Here are some of the issues I identified while conducting a heuristic review of the volunteer registration page:
In addition to uncovering issues with the website, the cognitive walkthrough helped us identify four key tasks associated with searching for and signing up for volunteer opportunities.
Using the insights from our heuristic reviews and cognitive walkthroughs, we decided to conduct a usability test to measure the effectiveness and ease of use of four tasks related to searching for and registering for volunteer opportunities on the Chicago Cares website:
- Searching for volunteer events
- Signing up for volunteer events
- Registering to be a new member
- Dropping out of a volunteer opportunity
Defining Usability Test Objectives & Measures
As a group, we discussed how we could measure the ease of use and effectiveness of each task and derived the following test objectives and measures.
Conducting the Usability Test
We conducted a usability test with eight participants who had never previously used the Chicago Cares website. I moderated and took notes during two of the sessions. During each test, we asked participants to perform each of the four tasks described above and recorded their on-screen activity. After giving participants 5 minutes to complete each task, we facilitated a 15-minute debriefing session to further explore points of confusion, frustration, and other notable behavior we observed during the usability test.
Analysis & Findings
Using a combination of Google Forms, recorded on-screen activity, and field notes, we identified several issues associated with each task.
Using the insights derived from our usability test, we proposed several recommendations to our client at Chicago Cares.
After reviewing the results of our usability study, we decided to conduct an A/B comparison test in order to identify if modifications to the Chicago Cares registration form could positively impact user experience and satisfaction. We focused on the registration form experience because our usability test results showed that users experienced more points of confusion and frustration with this particular task.
Two prototypes of the Chicago Cares user registration form were created and hosted on a secure DePaul University server. The first form (Form A) mirrored Chicago Cares' current registration form. The second form (Form B) revised the original form based on some of the errors and points of confusion observed in the previous usability study.
Conducting the Test
We conducted a between-subjects, asynchronous A/B usability study with 40 people. Participants were contacted via e-mail and instructed to fill out either Form A or Form B as if registering as a user with Chicago Cares. After clicking the "Submit" button, participants were directed to a 10-question System Usability Scale (SUS) survey and asked to complete it.
Data collected from the survey were automatically stored in Google Forms. I used the standard SUS scoring method to calculate a score for each participant and then ran an independent-samples t-test in SPSS to determine whether there was a significant difference between the mean SUS scores of the two conditions (Form A, Form B).
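For readers unfamiliar with the mechanics, the standard SUS scoring scheme and the independent-samples t-test (which we actually ran in SPSS) can be sketched in Python. The function names and the score data below are illustrative, not our real study data:

```python
import math
from statistics import mean, variance

def sus_score(responses):
    """Convert ten 1-5 Likert responses into a 0-100 SUS score.

    Per the standard SUS scoring method, odd-numbered items contribute
    (response - 1) and even-numbered items contribute (5 - response);
    the sum of contributions is then scaled by 2.5.
    """
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

def independent_t(a, b):
    """Pooled-variance independent-samples t statistic for two groups."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / math.sqrt(pooled_var * (1 / na + 1 / nb))

# Made-up SUS scores for each condition, for illustration only:
form_a = [62.5, 70.0, 55.0, 67.5]
form_b = [65.0, 72.5, 60.0, 70.0]
t = independent_t(form_a, form_b)
```

Significance would then be judged by comparing t against the t distribution with na + nb - 2 degrees of freedom; SPSS reports the corresponding p-value directly.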
Findings and Limitations
There was no statistically significant difference between the SUS scores of Form A and Form B, so we could not conclude that our modifications to the registration form improved the usability of this task. Limitations of our study included a small sample size and an uneven distribution of participants across the two conditions. Several participants recruited for the Form A test had previously taken part in the Chicago Cares usability study; because they had prior experience with the registration form, they may have been more adept at completing the task and consequently may have given the site higher scores.