The Mobile UEF team conducted usability testing to evaluate how a complex, linear application could be designed to work on mobile as well as desktop devices, using a responsive design approach. The existing iAppeals application was modified to evaluate:
- A proposed navigation structure allowing for linear progression as well as the ability to navigate to prior sections; and
- UEF patterns that might pose usability issues on smaller screens, thus needing further analysis or redesign.
The Mobile UEF team also sought to identify any emerging trends or patterns since beginning mobile usability testing this year. Patterns tested during this round included:
- Definition Link
- State
- SSN
- Date (of Event)
- Error Summary
- Password (Create)
- Save this Page Link
- Secondary Navigation
Testing was conducted on the following types of devices:
- Smartphones (iOS and Android)
- Tablet (iPad)
- Desktop (PC)
Working with members of the general public, Mobile UEF Team members:
- Conducted user testing with eleven participants at the Charles P. Miller Branch of the Howard County Public Library in Ellicott City, MD, on October 23, 2013.
- Participants tested on one of the following types of devices:
  - Mobile smartphone: 5 total participants (4 using an iPhone 5; 1 using a Samsung Galaxy S3)
  - iPad tablet: 4 participants
  - Desktop: 2 participants
- Collected participant information in a pre-test demographic survey, which showed:
  - Participants ranged in age from 38 to 72, with a median age of 46
  - Ten of the eleven participants owned and used at least one type of mobile device
  - Two of the eleven participants had used Social Security’s online service to apply for, or to access, their benefits. (One participant did not complete this question.)
- Analyzed the results, including:
  - Navigation methods and preferences
  - Participant issues or comments regarding specific UEF patterns or screen details
  - User satisfaction scores on the overall experience, as indicated in a post-test questionnaire
As with prior Mobile UEF testing sessions, volunteer participants were recruited on-site during the testing session through outreach to a broad range of library patrons. The usability test scenario and task were designed to be completed within 15-20 minutes, as prior mobile testing had shown this length to yield the best balance of participants and data for analysis in a single testing day.
Metrics for this usability test were established by the UXG as follows:
- Completion Rate – Percentage of test participants who successfully complete the application without assistance
  - Target = 80% for each device type
- Ease of Use – Percentage of test participants who indicated the application was “very easy” to use on Question #5 of the post-test survey
  - Target = 80% for each device type
- User Satisfaction – User satisfaction score, per post-test questionnaire
  - Target = 80% for each device type
Actual performance exceeded the target for every metric on each device type.
Metric | Target (All) | Actual (Phone) | Actual (Tablet) | Actual (Desktop) |
---|---|---|---|---|
Completion Rate | 80% | 100% | 100% | 100% |
Ease of Use | 80% | 100% | 100% | 100% |
User Satisfaction | 80% | 86% | 89% | 94% |
The following table lists the Post-Test Questionnaire responses by device type as well as overall.
Scale of 1-5, where 1 = lowest and 5 = highest
Questions | Smartphone (n=5) | Tablet (n=4) | Desktop (n=2) | Overall |
---|---|---|---|---|
How well did the website match your expectations? | 4.4 | 3.50 | 5.0 | 4.2 |
How well did the website support the task you were asked to perform? | 4.4 | 4.50 | 5.0 | 4.6 |
How difficult or easy was the website to use? | 4.2 | 4.50 | 5.0 | 4.6 |
Are you satisfied with the content? | 4.2 | 4.25 | 4.5 | 4.3 |
How difficult or easy was it to move through sections of the website? | 4.4 | 4.50 | 5.0 | 4.6 |
How easy were the words on the website to understand? | 4.2 | 4.50 | 5.0 | 4.6 |
How satisfied are you with the speed at which you can complete tasks? | 4.6 | 4.75 | 5.0 | 4.7 |
How difficult or easy was it to find information you needed? | 3.8 | 4.50 | 4.0 | 4.0 |
How long would it take you to learn to use this website? | 4.6 | 4.50 | 4.5 | 4.6 |
How confident did you feel using this application? | 4.2 | 4.75 | 4.5 | 4.5 |
Average User Satisfaction Score by device type | 4.3 | 4.45 | 4.7 | - |
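The User Satisfaction percentages in the metrics table above appear to correspond to these average scores expressed as a percentage of the maximum possible rating of 5 (e.g., 4.3 / 5 = 86% for smartphone, 4.45 / 5 = 89% for tablet, and 4.7 / 5 = 94% for desktop).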
Usability issues, as well as observations and participant comments of note, are listed below.
The "Go to…" display that allows the user to go directly to completed sections of the application was visible as a button on the tablet (in portrait orientation) and the smartphones.
Smartphone
- 1 of the 5 participants went directly to the “Go to…” button to go back to a particular section in the application and again to return to where he had left off.
- Of the 4 remaining participants, 2 used the “Go to…” button to return to where they’d left off in the application; the other 2 instead clicked the “Next” button to page forward, despite it taking longer
Tablet
- 2 of the 4 participants used the “Go to…” button to go back to a particular section in the application and then return to where they’d left off.
- 1 of the remaining participants used it to return to her prior place in the application after the functionality was explained by the facilitator; the other used the “Next” button to page forward
Desktop
- 1 of the 2 participants used the “Go to…” navigation to go back in the application; the other participant used it to get back to where she’d left off after the functionality was explained by the facilitator
Definition Link
9 of the 11 participants did not know what “AR” meant, yet did not click the definition link for more information.
Smartphone
- 1 participant interpreted the dotted underline as indicating a misspelling.
- 2 participants stated the link should be blue: 1 because the dotted lines are so small, and 1 because it would be clearer.
- 1 participant indicated he noticed the dotted lines but didn’t know what they meant.
Tablet
- 2 of the 4 tablet participants clicked the definition link for more information
Desktop
- The 2 desktop participants did not know what “AR” meant but did not click the definition link
State
- 2 iPhone users were confused after clicking the dropdown: they expected to see states but saw only the Armed Forces options, because fewer options are visible at once on the iOS scroll wheel. This input was not problematic for the Android phone user.
  - 1 of these 2 participants stated “it would be better if states were shown first”
- There were no issues or comments for tablet or desktop users
SSN
- 2 mobile phone participants commented on the SSN format; 1 would like it “chunked” and 1 stated it would be easier to use if dashes were “automatically put in”
- There were no issues or comments for tablet or desktop users
Date (of Event)
- 1 mobile phone participant expected to input Date of Birth via a single scroll wheel providing month, day, and year, rather than individual wheels or inputs
- There were no issues or comments for tablet or desktop users
Error Summary
- 2 participants suggested the links would be more useful if they provided additional information, such as where to find the routing and account numbers on a check
- The only participants who clicked on the error links provided in this pattern were 2 tablet users
- There were no issues or comments from desktop users
- 4 of the 5 smartphone users clicked on the inline link.
- 2 of the 4 tablet users clicked on the inline link.
- All (2) desktop users clicked on the inline link.
Password (Create)
- 3 of the 5 participants mentioned noticing the indicator
- 1 of the 4 participants indicated noticing the indicator
- Neither of the desktop participants commented on the indicator
- 2 of the 5 participants mentioned noticing the password matching indicator
- 1 of the 4 participants mentioned she liked the password matching indicator graphic
- Neither of the desktop participants commented on the password matching indicator
Save this Page Link
The link was problematic across device types regarding what was actually being saved. Once shown what the link did, participants wanted a variety of options for saving the Reentry number.
Smartphone
- At least 2 participants believed they were saving the entire application rather than just the screen with the Reentry number
  - 1 of these 2 participants raised the issue that access to his information would now be on someone else’s phone (per the testing scenario)
  - The other participant thought she could access the saved page online rather than on her phone; she did not expect a download onto the device
- 2 participants would like options for saving the Reentry number, such as saving it to notes or emailing it to themselves.
Tablet
- 1 participant voiced uncertainty as to where on the device itself the PDF would be saved
- After seeing what happened when the link was clicked, 1 participant stated he’d like to take a screenshot of it; another would like to email it to himself
- 1 participant thought information would be saved to both her desktop AND ssa.gov
Secondary Navigation
Comments are based on the following number of respondents: smartphone = 4; tablet = 4; desktop = 2
Green Circle
- 1 smartphone participant thought green indicated “correct”; 3 thought it meant “done” or “finished”
- 2 tablet participants thought it meant “everything is good”
- 1 tablet participant thought it meant complete
- 1 tablet participant didn’t know it indicated status
- Both desktop participants thought it meant complete
Red Circle
- 2 smartphone participants thought it indicated an error
- 1 smartphone participant thought it indicated an omission
- 2 tablet participants thought it indicated an error
- 1 tablet participant thought it indicated a section that was unnecessary to complete
- 1 tablet participant didn’t know it indicated status
- Both desktop participants thought it indicated an omission/something must be completed
Blue Text
- 2 smartphone participants understood blue text indicated a link
- 2 smartphone participants were unsure
- 2 tablet participants understood it indicated a link
- 1 tablet participant erroneously thought it indicated steps still to be completed
- 1 tablet participant who had used it during the application process was still unsure what it indicated
- Both desktop participants understood blue text indicated a link
Highlighted section
- 2 smartphone participants understood the highlighted section indicated their current location in the application
- 2 smartphone participants were unsure
- 3 tablet participants knew the highlighted section indicated their location in the application
- 1 tablet participant was unsure
- Both desktop participants understood it indicated their location in the application
Gray Text
- 2 smartphone participants understood gray text meant unavailable/unclickable
- 2 smartphone participants were unsure
- 3 tablet participants thought gray text indicated a section they had not yet reached
- 1 tablet participant was unsure
- 1 desktop participant understood it to indicate a future section of the application
- 1 desktop participant was unsure
Based on this round of testing, the following patterns were added to a Mobile UEF pattern “watch list” to monitor moving forward. These patterns did not cause participants notable difficulty in completing their tasks, but they may be candidates for future improvements:
- Date of Event
- State Input
Additionally, the following patterns will undergo further design and testing:
- Definition Link
- Save this Page Link