A child holding up a colorful heart made of interlocking puzzle pieces in red, yellow, blue, and green. The puzzle heart is in focus, while the child’s face is blurred in the background.

Project 1: User Experience Audit & Usability Testing for Alternative Teaching Strategy Center 

This project evaluated Altteaching.org through heuristic review, cognitive walkthroughs, usability testing, and interviews.

Conducted via DePaul University 

Project Type: UX Audit & Usability Testing

Primary Tools: Mural & Canva

My Role: Researcher

Duration: 8 weeks

Project Overview

Background: What is the Alternative Teaching Strategy Center (ATSC)?

Poster for ATSC Non-Profit Organization. The top shows a woman teaching math with a chalkboard labeled "a+b=" and a man with a heart speech bubble, representing services. Below are icons of a brain, gear, and people. Text reads: Academic Tutoring and Psychological Services for children and young adults with cognitive, learning, developmental, and neurological differences.

ATSC is a non-profit organization that provides academic tutoring and psychological services for children and young adults with cognitive, learning, developmental, and neurological differences.
 
We chose to evaluate ATSC’s website because of a personal connection with the founders, who had voiced ongoing concerns about the site.

Problem:

Despite ATSC’s valuable services, its site posed several usability concerns that prompted a deeper investigation.

Three illustrated icons showing organizational challenges: (1) a web browser window with a magnifying glass over a declining bar chart labeled Decline in website traffic, (2) two smiling students with a downward arrow labeled Decrease in student enrollment, and (3) a hand holding a broken heart labeled Decline of donations.

Research Questions:

To better understand these issues and pinpoint specific areas for improvement, the following research questions were developed:

1. How easily can users locate and understand ATSC's enrollment process? (icon: webpage with a location pin)

2. What paths do users take to learn about ATSC's teaching methods and services? (icon: magnifying glass with a question mark)

3. How well do the site’s structure and navigation align with users’ expectations? (icon: map with a location pin)

4. How easily can users find out how to donate to ATSC? (icon: hand holding a heart)

Goals:

These research questions guided our investigation and shaped the goals of the project:

#1. Identify barriers that hinder user comprehension and decision-making (warning triangle and checklist icon).

#2. Analyze the site’s flow (magnifying glass icon).

#3. Deliver actionable recommendations for improvement (award ribbon hovering over an open hand icon).

My Process:

In this project, I was involved in all stages: 

Planning (visual: light bulb icon):
- Created a test plan that aligned with the research questions and incorporated usability frameworks.
- Recruited participants through personal and professional connections.

Testing (visual: hand icon with tap symbol):
- Conducted a cognitive walkthrough using task-based scenarios.
- Performed a heuristic evaluation using Nielsen Norman Group’s guidelines and severity scoring.
- Facilitated usability tests and interviews.

Reporting (visual: document icon):
- Synthesized findings into user profiles and actionable recommendations.

Understanding the Problem


Methods- Cognitive Walkthrough:

Our evaluation began with a one-round solo cognitive walkthrough of three tasks derived from our research questions. Each task simulated how a new user would perform key actions related to the problem statement.

 

This approach allowed us to assess the site’s usability based on authentic user needs and concerns from ATSC. Each step for each task was evaluated using a customized measure based on a 5-point severity rating scale (1 = no issue to 5 = critical issue) across the following criteria: Learnability, Comprehension and Success rate, and Error frequency.
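A minimal sketch of how step-level severity ratings like these could be rolled up into a per-task score; the task names match the walkthrough, but every rating value, the step counts, and the aggregation by simple mean are illustrative assumptions, not the team's actual data:

```python
from statistics import mean

# Hypothetical step-level severity ratings (1 = no issue, 5 = critical issue)
# recorded per criterion during the walkthrough; all values are illustrative.
ratings = {
    "Learn the enrollment process": {
        "learnability": [4, 5, 4],
        "comprehension_success": [5, 4, 5],
        "error_frequency": [4, 4, 5],
    },
    "Explore the teaching methods": {
        "learnability": [3, 2, 3],
        "comprehension_success": [3, 3, 2],
        "error_frequency": [2, 3, 2],
    },
    "Make a donation": {
        "learnability": [2, 1, 2],
        "comprehension_success": [2, 2, 1],
        "error_frequency": [1, 2, 1],
    },
}

def task_severity(criteria: dict) -> float:
    """Average severity across all steps and criteria for one task."""
    return mean(score for steps in criteria.values() for score in steps)

# Rank tasks from most to least severe to prioritize fixes.
for task, criteria in sorted(ratings.items(), key=lambda kv: -task_severity(kv[1])):
    print(f"{task}: {task_severity(criteria):.1f}")
```

Ranking by mean severity surfaces the enrollment task as the most broken flow, which mirrors the walkthrough findings below.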

Findings- Cognitive Walkthrough:

The following was uncovered for each task:

Task Outcomes and Key Insights:

Task 1: Learn the enrollment process (dark teal column)
- Outcome: Task failure across the majority of steps.
- Key insight: Content overload and unclear or missing feedback caused confusion while navigating.

Task 2: Explore the teaching methods (brown column)
- Outcome: Task completed with some difficulty.
- Key insight: Disorganized content and messy page flows made it hard to understand actions and system responses.

Task 3: Make a donation (gold column)
- Outcome: Task completed with minor difficulty.
- Key insight: Outdated design layout and missing donation details raised doubts about legitimacy.

Methods- Heuristic Review:

Another research method utilized to evaluate ATSC's site usability was a heuristic review. This method consisted of a one-round evaluation using Nielsen Norman Group’s Heuristic Guidelines and a 5-point severity rating scale (1 = no issue to 5 = critical issue), to explore navigation, layout, and structure. 

This evaluation enabled us to prioritize user pain points, identify actionable design opportunities, and propose potential solutions to enhance learnability, efficiency, and overall satisfaction.

Findings- Heuristic Review:

Our team identified and ranked 47 heuristic violations that impact user experience. The top four guidelines violated were:

- Recognition rather than recall (8 violations): Unclear site structure, missing labels, and unclear section headings forced users to remember content locations and categories.
- Visibility of system status (7 violations): Users felt unsure of what was happening due to missing or unclear feedback (e.g., no active-page indicators, video playback indicators, or form submission confirmations).
- Aesthetic & minimalist design (6 violations): Pages felt visually overwhelming, with dense text, cluttered navigation, and inconsistent formatting distracting from key content.
- Flexibility & efficiency of use (5 violations): Video content required multiple clicks; some sections opened 10+ subpages without shortcuts or filters, creating content overload and fatigue.
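A ranking like this can be produced mechanically from a flat log of findings. A hedged sketch, assuming each logged violation carries one heuristic label: the top-four counts mirror the table above, while the remaining 21 entries are distributed across the other Nielsen Norman Group heuristics purely for illustration:

```python
from collections import Counter

# Hypothetical flat log of findings, one heuristic label per violation.
# The top-four counts follow the evaluation; the rest are placeholders.
violations = (
    ["Recognition rather than recall"] * 8
    + ["Visibility of system status"] * 7
    + ["Aesthetic & minimalist design"] * 6
    + ["Flexibility & efficiency of use"] * 5
    + ["Match between system and the real world"] * 4
    + ["User control and freedom"] * 4
    + ["Consistency and standards"] * 4
    + ["Error prevention"] * 4
    + ["Help users recognize, diagnose, and recover from errors"] * 3
    + ["Help and documentation"] * 2
)

counts = Counter(violations)
for heuristic, n in counts.most_common(4):
    print(f"{heuristic}: {n} violations")
```

`Counter.most_common(4)` yields the same top-four ordering reported above once every finding is tagged consistently.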

Methods- Usability Test & Interviews:

To gain real-world insights into the challenges revealed in the heuristic review, a one-round usability test and interview session were conducted. Participants were recruited through personal and professional connections. During testing, participants were asked to think aloud as they navigated the site to complete 4 tasks related to the problem statement.

Afterward, a short interview was conducted in which participants rated their experiences on multiple 5-point scales (Strongly Disagree to Strongly Agree, and Very Difficult to Very Easy). The sessions were then manually transcribed, and the team performed a thematic analysis of the transcripts using open coding.

This approach led to a deeper understanding of users’ expectations, preferences, and areas where they felt improvements could be made, allowing us to compare findings and uncover key themes across all participant interviews.
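A minimal sketch of how an open-coding tally like this can be compared across participants, assuming coded excerpts are reduced to (participant, code) pairs; all codes, theme groupings, and participant IDs here are invented for illustration:

```python
from collections import defaultdict

# Hypothetical coded excerpts: (participant, open code) pairs assigned
# during group analysis; codes and groupings are illustrative only.
coded_excerpts = [
    ("P1", "couldn't find enrollment steps"),
    ("P2", "couldn't find enrollment steps"),
    ("P1", "too much text per page"),
    ("P3", "unsure donation was secure"),
    ("P4", "unclear menu labels"),
    ("P3", "too much text per page"),
]

# Group open codes under candidate themes agreed on by the team.
code_to_theme = {
    "couldn't find enrollment steps": "Hard-to-find key tasks",
    "unclear menu labels": "Hard-to-find key tasks",
    "too much text per page": "Content overload",
    "unsure donation was secure": "Trust and legitimacy",
}

themes = defaultdict(set)
for participant, code in coded_excerpts:
    themes[code_to_theme[code]].add(participant)

# A theme is stronger when more distinct participants voice it.
for theme, participants in sorted(themes.items(), key=lambda kv: -len(kv[1])):
    print(f"{theme}: {len(participants)} participants")
```

Counting distinct participants per theme, rather than raw excerpt counts, keeps one talkative participant from dominating the analysis.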

Findings- Usability Test & Interviews:

The following four themes were uncovered:

Impact & Next Steps


User Profiles:

Based on usability tests and interview themes, two personas were developed to reflect the key user types: caregivers navigating special education and donors supporting these programs. 

This is a headshot of a younger woman with curly hair and glasses, wearing a white shirt and smiling outdoors.

Their goals and challenges reveal key usability issues, reinforcing our focus on simplifying content, streamlining decision-making, and making program details easier to access.

This is a headshot of an older man with gray hair and glasses, casually dressed in a dark shirt.

Their frustrations and needs reveal opportunities to build trust through stronger communication and clearer reporting on how donations are used, supported by proof of impact.

These archetypes helped ground our recommendations in real user goals and expectations.

Actionable Recommendations:

We used insights from all testing methods to develop several actionable recommendations that address key user pain points. 

Ongoing Opportunities:

While this project revealed valuable insights and design opportunities, we also identified a few limitations and lessons to help guide future work.

A circular infographic divided into two halves: Areas of Improvement and Future Directions.

Areas of Improvement (left side):
#1. Small sample size – insights may not represent all potential users, including families with special needs, donors, and partners.
#2. Missed accessibility gaps – time restrictions limited our ability to fully assess accessibility issues for users with sensory and motor disabilities.
#3. Time-heavy synthesis process – aligning usability and interview insights with the personas took longer than expected and required significant coordination.

Future Directions (right side):
#1. Move beyond one-time evaluation – involve staff and user groups in co-design workshops and continuous testing after design changes.
#2. Enhance accessibility testing – use tools like WAVE, JAWS, and keyboard navigation checks to ensure WCAG compliance and usability.
#3. Speed up analysis – automate transcription and simplify synthesis with tools like TurboScribe to focus on actionable insights.


© 2024 by Jessica Polk-Williams.
