Round 13 UEF Pattern Testing Usability Findings

Results overview from Round 13 of the user feedback sessions

Background

The UEF team conducted usability testing to evaluate specific UEF patterns in the context of a linear application on mobile and desktop devices.

The following patterns were evaluated in Round 13 Testing:

  • Dropdown Button
  • Progress Bar
  • Tree Structure
  • Multi Select
  • Yes/No
  • Toggle Button

Testing was conducted on the following types of devices: Smartphones (iOS and Android), Tablets (iOS), and Laptops (Windows 7).

The Prototype

The prototype was designed to incorporate two separate scenarios. The first scenario walked the participant through completing information for an already submitted MySSA application, testing all patterns listed above except the Tree Structure. The Tree Structure pattern was shown in the second scenario (time permitting).

Viewport Sizes

As with prior responsive prototypes used in Mobile UEF testing, this prototype was designed with a single breakpoint at 768 pixels. Devices with a viewport width of less than 768 pixels in portrait orientation included the iPhone 5, iPhone 6, and Samsung Galaxy S3.

Devices with a viewport width of 768 pixels or greater included the iPad, Laptop, and Galaxy Tab S2.

The viewport sizes for each device used in this round of testing are as follows:

| Device | Viewport Size | Operating System |
| --- | --- | --- |
| iPhone 5 | 320 x 568 | iOS |
| Samsung Galaxy S3 | 360 x 640 | Android |
| Samsung Galaxy Tab S2 | 800 x 1280 | Android |
| iPad | 768 x 1024 | iOS |
| Laptop | 768 x 1024 | Windows |
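The single-breakpoint behavior described above can be sketched as a simple width check. This is a hypothetical illustration, not the prototype's actual code; the function name is invented, and only the 768-pixel threshold comes from the report:

```typescript
// Hypothetical sketch of the prototype's single-breakpoint logic:
// viewports narrower than 768 px receive the small-screen layout,
// and viewports 768 px or wider receive the large-screen layout.
type Layout = "small-screen" | "large-screen";

function layoutForViewport(widthPx: number): Layout {
  return widthPx < 768 ? "small-screen" : "large-screen";
}

// Portrait widths from the table above:
// iPhone 5 (320) and Galaxy S3 (360) fall below the breakpoint;
// iPad (768), Laptop (768), and Galaxy Tab S2 (800) meet or exceed it.
```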

What We Did

Working with members of the general public, Mobile UEF Team members:

  • Conducted user testing with 17 participants on October 23, 2017 at the Howard County Public Library, Miller Branch location in Ellicott City, MD
    • 17 participants tested on one of the following types of devices:
      • Smartphone: 8 total participants
        • 5 using a Galaxy S3
        • 3 using an iPhone 5
      • Tablet: 6 total participants
        • 6 marked as using an iPad (Facilitators did not indicate whether or not a Galaxy Tab S2 was used)
      • Laptop: 3 Total participants
        • 3 using an HP Laptop
  • Collected participant information in a pre-test demographic survey, which indicated:
    • Participants ranged in age from 22 to 69, with a median age of 54;
    • 16 participants own and use at least one type of mobile device;
    • 16 participants use their mobile device to complete one or more online activities;
    • 5 participants have used Social Security’s online services;
    • 6 participants would use a smartphone (2) or tablet (4) to access SSA.gov or a MySocialSecurity account.
  • Analyzed the results, including:
    • Navigation methods and preferences;
    • Participant issues or comments regarding specific UEF patterns or screen details;
    • User satisfaction scores on the overall experience as indicated in a post-test questionnaire.

Challenges & Constraints

As with prior Mobile UEF testing sessions, volunteer participants were recruited on-site during the testing session through outreach to a broad range of library patrons. The usability test scenario and tasks were designed to be completed within 10 minutes; prior mobile testing had shown this time range yielded the optimal balance of participants and data in any single day.

Metrics

Metrics for this usability test were established by the Mobile UEF Workgroup as follows:

  • Completion Rate – Percentage of participants who successfully completed the application without assistance
    • Target > 80% for each device type
  • Ease of Use – Percentage of participants who indicated the application was “easy” or “very easy” to use, as measured by Questions #3, #5, and #8 of the post-test survey
    • Target > 80% for each device type
  • User Satisfaction – Percentage of participants who indicated they were “satisfied” or “very satisfied,” as measured by questions #4 and #7 of the post-test survey
    • Target > 80% for each device type
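Each metric above is a simple proportion of participants compared against the 80% target. The sketch below is an invented illustration of that calculation (the function names are not from the workgroup's actual tooling):

```typescript
// Illustrative sketch (not the workgroup's actual tooling): a metric is the
// percentage of qualifying participants, and it passes when that percentage
// meets or exceeds the 80% target.
function metricPercent(qualifying: number, total: number): number {
  return (qualifying / total) * 100;
}

function meetsTarget(percent: number, targetPercent: number = 80): boolean {
  return percent >= targetPercent;
}

// For example, if 5 of 6 tablet participants rated the site "easy" or
// "very easy", the Ease of Use metric would be about 83.3%, which passes.
```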

What We Learned

Metrics for task completion, ease of use, and user satisfaction, as measured by the post-test questionnaire, were as follows:

| Metric | Target (All Devices) | Actual Smartphone | Actual Tablet | Actual Laptop |
| --- | --- | --- | --- | --- |
| Completion Rate | >=80% | 99% | 98% | 85% |
| Ease of Use | >=80% | 85% | 80% | 95% |
| User Satisfaction | >=80% | 94% | 70% | 95% |


Post-Test Questionnaire

The following table lists the Post-Test Questionnaire responses by device type as well as overall.

Scale of 1-5, where 1 = lowest and 5 = highest.

| Question | Smartphone (n=8) | Tablet (n=6) | Laptop (n=3) | Overall (n=17) |
| --- | --- | --- | --- | --- |
| 1. How well did the website match your expectations? | 4.6 | 4.2 | 3.8 | 4.2 |
| 2. How well did the website support the task you were asked to perform? | 4.8 | 4.6 | 4.5 | 4.6 |
| 3. How difficult or easy was the website to use? | 4.44 | 4.2 | 3.8 | 4.1 |
| 4. Are you satisfied with the content? | 4.3 | 4.0 | 4.0 | 4.1 |
| 5. How difficult or easy was it to move through sections of the website? | 4.5 | 4.0 | 4.0 | 4.1 |
| 6. How easy were the words on the website to understand? | 4.4 | 4.5 | 4.8 | 4.5 |
| 7. How satisfied are you with the speed at which you can complete tasks? | 4.2 | 4.2 | 4.0 | 4.1 |
| 8. How difficult or easy was it to find information you needed? | 4.1 | 4.6 | 4.4 | 4.3 |
| 9. How long would it take you to learn to use this website? | 4.3 | 5.0 | 4.4 | 4.5 |
| 10. How confident did you feel using this application? | 4.0 | 4.3 | 4.3 | 4.2 |
| 11. Average User Satisfaction Score by device type | 4.4 | 4.4 | 4.2 | 4.3 |

Qualitative Assessment

This section discusses the usability issues, observations, and participant comments, with findings grouped by the patterns tested in this evaluation.

Dropdown Button

  1. There were no major issues with this pattern.
  2. 13 participants successfully opened the Dropdown Button.
  3. Two participants (iPhone 5 and Samsung Galaxy S3) could not see the button because it was below the fold.

Dropdown Button

Progress

  1. 15 of 17 participants had no issues with the Progress pattern.
  2. One participant (iPad) said that the “processing” label below the progress bar was confusing.
  3. One participant (Samsung Galaxy S3) wanted to see more information in the progress pattern.

Progress

Toggle

  1. 6 of 17 participants were confused by this pattern.

    1.1 Four participants (2 iPad; 2 Samsung Galaxy S3) asked where to find the "Yes" option.

    1.2 Two participants did not know what to do with the pattern.

    1.3 Two participants (Desktop and Galaxy S3) used a sliding gesture instead of a tap/click gesture and were confused when it did not work.

    1.4 One participant (Galaxy S3) was unsure whether they had responded correctly.

    1.5 One participant (Galaxy S3) did not expect the circle to move.

    1.6 One participant (Desktop) was confused because the toggle's white circle visually matched the progress bar circle, even though the two elements meant different things and carried contradictory messages.

    1.7 One participant (Samsung Galaxy S3) was reluctant to click the circle, thinking it would open another page where they could see all the options.

Toggle Yes Toggle No

Multi-Select

  1. Most participants had no major issues with this pattern.
  2. Two participants (iPad & Desktop) liked and expected that the items were listed in alphabetical order.
  3. One participant (iPad) entered “China” into the search field to test the search functionality. (Search did not function in the prototype.)
  4. One participant (Galaxy S3) was unable to click the arrow to open the modal.
  5. One participant (Galaxy S3) did not think anything in the pattern was clickable.

Multi-Select Field Default

Multi-Select Modal on Desktop

Multi-Select Field with Items Selected

Tree Structure

  1. 11 of the 17 participants did not know the chevron was clickable.

    1.1 Two participants (both iPad) did not select the arrow.

    1.2 One participant (iPad) said the chevron looked similar to a bullet and did not realize it was clickable.

    1.3 One participant (iPad) stated that the pattern was frustrating.

    1.4 One participant (Desktop) saw the red hover state on the name but none on the chevron, and therefore did not think the chevron was clickable.

    1.5 One participant (iPad) clicked the name but not the arrow and never saw the nested names below.

    1.6 One participant (iPad) thought a plus sign would be a better icon.

    1.7 One participant (iPhone 5) thought the arrow and the name should have the same functionality.

Tree Structure

Summary

Overall, most participants thought the site was very easy to use. Some participants commented that it was too technical. One participant commented that the site was "neat and clean."

Two participants commented that they did not like the tree structure as they were worried about changing the content in the structure.

As seen in past testing, some participants noted that the “font and screen was too small to complete tasks [on a smartphone].” A couple stated that they would prefer to complete these tasks on a desktop.

Recommendations and Next Steps

Based on this round of testing, the following patterns were found to be problematic for enough participants to necessitate retesting or design refinements:

  • Progress
  • Tree Structure
  • Toggle

Pattern recommendations based on the findings are below.

| Pattern | Recommendation | Rationale |
| --- | --- | --- |
| Multi Select | Continue with the current design. | There were no major issues with this pattern. |
| Yes/No | Continue with the current design. | There were no major issues with this pattern. |
| Progress | Re-evaluate, including more information about the current step. | Participants understood that it was a progress bar but wanted to see more information. |
| Tree Structure | Consider testing again with a new icon. | Participants did not click the tree structure's icon; rather, they clicked the text. |
| Toggle | Re-evaluate, including A/B testing. | Participants did not understand whether this pattern was clickable. |