Results overview from Round 6 of the user feedback sessions
The Mobile UEF team conducted usability testing to evaluate specific UEF patterns in the context of a linear application on mobile and desktop devices. The existing MySSA application was modified to evaluate particular patterns, including three different Container with Tabs styles and a refined Menu design. The Mobile UEF team also sought to identify any emerging trends or patterns since beginning mobile usability testing in August 2013. Patterns tested during this round included:
- Template Navigation
- Sign In/Create Container
- Password (Create)
- Container with Tabs
- Document Viewer
- Archive File Type
- Email Link
- Error Summary
- Date Picker
- Help Link
- Informational Modal
Testing was conducted on the following types of devices:
- Smartphones (iOS and Android)
- Tablets (Kindle Fire and iPad)
- Computers (Laptop - PC)
The MySSA application used for testing was modified to allow for the use of specific patterns the Mobile UEF team wished to test. This included the use of three different Container with Tabs pattern styles currently being evaluated: an accordion style, a tabbed style, and a slide menu style. These patterns were displayed vertically in that order in the prototype.
A new container style created specifically for user sign in and/or account creation was also included. This design placed the two options (sign in or create account) in close proximity to one another, thus allowing the user to see both options at once. This design addressed an issue detected in previous testing, in which users could see only one option without scrolling on the smartphone.
As with prior responsive prototypes used in Mobile UEF testing, this prototype was designed with a single breakpoint. The devices with a viewport size of < 768 pixels included: the iPhone, Samsung Galaxy S3, and Kindle Fire HD (in portrait view).
The devices with a viewport size of 768 pixels and larger included: the Kindle Fire HD (in landscape view) and iPad. (Note: participants using the Kindle device in this round of testing held it in portrait view only.)
The viewport sizes for each mobile device used in this round of testing are as follows:
Device | Viewport Size (px) |
---|---|
iPhone | 320 x 568 |
Samsung Galaxy S3 | 360 x 640 |
Kindle Fire | 600 x 963 |
iPad | 768 x 1024 |
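The single-breakpoint behavior described above can be sketched as a simple classifier. The 768-pixel threshold and the portrait widths come from this report; the function name and device list are illustrative, not part of the prototype.

```python
# Classify devices against the prototype's single 768 px breakpoint.
BREAKPOINT_PX = 768

def layout_for(viewport_width_px: int) -> str:
    """Return which layout a device receives at a given viewport width."""
    return "large" if viewport_width_px >= BREAKPOINT_PX else "small"

# Portrait viewport widths of the devices used in this round of testing.
devices = {
    "iPhone": 320,
    "Samsung Galaxy S3": 360,
    "Kindle Fire (portrait)": 600,
    "iPad": 768,
}

for name, width in devices.items():
    print(f"{name}: {layout_for(width)} breakpoint")
```

This reproduces the grouping above: the three narrower devices receive the small-breakpoint layout, while the iPad (and the Kindle in landscape) receive the large one.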
Working with members of the general public, Mobile UEF team members:
- Conducted user testing with 14 participants at the Charles P. Miller Branch of the Howard County Public Library in Ellicott City, MD, on August 27, 2014.
- 14 participants tested on one of the following types of devices:
- Mobile smartphone: 7 total participants
- 4 using an iPhone 5
- 3 using a Samsung Galaxy S3
- Tablet: 5 participants using an iPad 3
- Desktop: 2 participants
- Collected participant information in a pre-test demographic survey, which showed:
- One of the 14 participants did not complete this survey, leaving 13 survey respondents
- Participants ranged in age from 22-70 with a median age of 46
- 12 of the 13 participants owned and used at least one type of mobile device
- Two of the 13 participants had used Social Security’s online service to apply for, or to access, their benefits
- Six of the 13 participants stated they would use a tablet or smartphone to access SSA.gov or MySocialSecurity.
- Analyzed the results, including:
- Navigation methods and preferences
- Participant issues or comments regarding specific UEF patterns or screen details
- User satisfaction scores on the overall experience as indicated in a post-test questionnaire
As with prior Mobile UEF testing sessions, volunteer participants were recruited on-site during the testing session through outreach to a broad range of library patrons. The usability test scenario and task were designed to be completed within 15-20 minutes; prior mobile testing had shown this time range yielded the optimal balance of participants and data in any single day.
Metrics for this usability test were established by the UXG as follows:
- Completion Rate – Percentage of test participants who successfully complete the application without assistance
- Target = 80% for each device type
- Ease of Use – Percentage of test participants who indicated the application was “very easy” to use on Questions #3, #5, and #8 of the post-test survey
- Target = 80% for each device type
- User Satisfaction – Percentage of test participants who indicated they were “very satisfied” on questions #4 and #7 of the post-test survey
- Target = 80% for each device type
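Each metric above reduces to the same calculation: the share of participants whose response meets a criterion, compared against the 80% target. A minimal sketch, where the function name and the sample responses are hypothetical (not actual test data):

```python
# Sketch of the metric calculation: the fraction of participants whose
# response equals the passing value, checked against the 80% target.
TARGET = 0.80

def metric_rate(responses, passing_value):
    """Fraction of responses equal to the passing value."""
    return sum(1 for r in responses if r == passing_value) / len(responses)

# Hypothetical ease-of-use responses for one device type.
ease_of_use = ["very easy", "very easy", "easy", "very easy", "very easy", "very easy"]
rate = metric_rate(ease_of_use, "very easy")
status = "met" if rate >= TARGET else "not met"
print(f"Ease of Use: {rate:.0%} (target {TARGET:.0%}, {status})")
```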
Metrics for task completion, ease of use and user satisfaction, as measured by the post-test questionnaire, were as follows:
Metric | Target (All) | Actual (Phone) | Actual (Tablet) | Actual (Desktop) |
---|---|---|---|---|
Completion Rate | 80% | 100% | 100% | 100% |
Ease of Use | 80% | 83% | 100% | 50% |
User Satisfaction | 80% | 67% | 100% | 100% |
The following table lists the Post-Test Questionnaire responses by device type as well as overall.
Scale of 1-5, where 1 = lowest and 5 = highest
Questions | Smartphone (n=6) | Tablet (n=5) | Desktop (n=2) | Overall (n=13) |
---|---|---|---|---|
How well did the website match your expectations? | 4.17 | 4.80 | 3.50 | 4.31 |
How well did the website support the task you were asked to perform? | 4.67 | 4.60 | 4.00 | 4.54 |
How difficult or easy was the website to use? | 4.33 | 4.60 | 4.00 | 4.38 |
Are you satisfied with the content? | 4.00 | 4.80 | 4.50 | 4.38 |
How difficult or easy was it to move through sections of the website? | 4.00 | 4.80 | 4.50 | 4.31 |
How easy were the words on the website to understand? | 4.33 | 4.80 | 4.50 | 4.38 |
How satisfied are you with the speed at which you can complete tasks? | 3.17 | 4.60 | 3.50 | 3.77 |
How difficult or easy was it to find information you needed? | 4.33 | 4.60 | 3.50 | 4.31 |
How long would it take you to learn to use this website? | 4.50 | 4.60 | 4.50 | 4.54 |
How confident did you feel using this application? | 4.83 | 4.60 | 4.50 | 4.69 |
Average User Satisfaction Score by device type | 4.23 | 4.68 | 3.95 | 4.36 |
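The "Overall" average in the bottom row is consistent with a participant-weighted mean of the per-device averages. This sketch reproduces that figure from the group sizes and averages in the table; the function name is illustrative.

```python
# Reproduce the overall satisfaction score as a participant-weighted mean
# of the per-device averages (group sizes and averages from the table).
groups = {"smartphone": (6, 4.23), "tablet": (5, 4.68), "desktop": (2, 3.95)}

def weighted_mean(groups):
    total_n = sum(n for n, _ in groups.values())
    return sum(n * avg for n, avg in groups.values()) / total_n

overall = weighted_mean(groups)
print(round(overall, 2))  # 4.36, matching the table's overall average
```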
Usability issues, as well as observations and participant comments, are listed below.
Small Breakpoint: Below 768 pixels (n=7)
Large Breakpoint: 768 pixels and above (n=7)
- 2 participants did not notice the Menu button.
- 2 participants suggested that the Menu button would stand out more with a different font color, a different background color, or bold text.
- There were no issues with the functionality of the menu.
- Participants seemed to like the way the menu worked.
- 1 participant suggested the header and menu be “sticky” (i.e., stay on the screen as the user scrolls down) in order to easily access the menu while scrolling the page.
- There were no major issues with this pattern on the large breakpoint.
- There were no issues with signing out or placement of this item in the Menu.
- There were no issues with signing out on the large breakpoint.
- 1 participant was confused about which tab she was on.
- 1 other participant stated that they liked the look of the Sign-In Container.
- There were no major issues with this pattern on the large breakpoint.
- Some participants did not notice the green check marks appearing as they met password requirement criteria; they seemed focused on password creation and entry.
- Some participants did not notice the confirm field appear because the phone keyboard covers a large percentage of the screen while typing the password.
- 1 participant stated that they liked the design for creating a password.
- Participants seemed to find using this container easy and straightforward.
- When participants did not interact with this container it was usually due to prototype and test scenario design.
- 1 participant did not like this container because he found the prototype to be “laggy” and thought this was not a good style for “laggy devices.”
- 2 participants were unsure that they were on the home page while on a different tab in the accordion.
- This style of container was not shown in the large breakpoint.
- There were no major issues with this container style.
- 2 participants suggested using the tab label “Other” instead of “More” for the drop down tab.
- 1 participant stated that they would prefer the drop down menu to be larger.
- This style of container was not shown in the large breakpoint.
- All participants utilized the arrows with this container to advance it. No one attempted to swipe.
- 1 participant had difficulty touching the menu items and thought the text was too small for people with larger fingers.
- This style of container was not shown in the large breakpoint.
- Participants had difficulty finding the document viewer because it was buried in a container with tabs.
- Of those who found it, at least 3 participants expected to use the browser back button to return to the previous screen, rather than the “x” button to close the viewer.
- 1 participant attempted to touch the gray area to close out of the viewer.
- 1 participant stated that they would like the option to email the document.
- Some older participants using the larger breakpoint were unfamiliar with the concept of a lightbox and were unsure of its purpose.
- None of the participants knew what this icon was intended to represent.
- Participants who guessed thought the file was a:
- Word Doc
- .txt file
- Fax machine feed tray icon
- When asked what would make it more clear, participants offered these alternative icon suggestions:
- Standard icon for Zip files
- Zipper
- Box icon or “filing cabinet”
- Name the file with a .zip extension
- 2 participants stated that it did not matter to them what the icon was, that it would just open anyway.
- Participants using the large breakpoint had the same issues with this pattern as the Smartphone users.
- There were no issues with this pattern on either breakpoint.
- There were no major issues with the error summary and field errors.
- Many of the participants liked the error summary and the field errors together.
- “Saw what they were at the beginning and then saw what they were when you got to them.” ~P4
- At least 3 of the participants clearly read the Error Summary before scrolling to correct their errors.
- None of the participants used the links in the summary.
- 1 participant mentioned that she would prefer the field errors to be to the right of the highlighted field.
- One participant stated they wanted the error names in the Error Summary pattern to match the field label.
- 2 iPad participants found using both the error summary and the field errors to be redundant.
Date Picker
Small Breakpoint
- 2 participants stated that they felt the dates in the calendar were too small but liked the functionality of it.
- 1 participant mentioned that he liked the “sticky” header for the date picker.
- 1 participant expected the gray dates (indicating unavailability) to be clickable.
- The date picker pattern did not load properly in 2 of the 7 smartphone sessions.
(Screenshot does not show the dates that were rendered to the user at the time of the test.)
Large Breakpoint
- 1 participant mentioned they would like the option to type the date into the field as well.
- 1 participant stated that the calendar icon was “too small for old people.”
- This pattern did not load properly in 3 of the 5 iPad sessions.
(Screenshot does not show the dates that were rendered to the user at the time of the test.)
Help Link
Small Breakpoint
- 1 participant did not realize the ‘?’ icon was for help.
- All other participants had no issues with understanding the ‘?’ icon.
Large Breakpoint
- 3 participants stated that they thought the ‘?’ icon on its own was not meaningful enough and suggested including the word ‘help’.
- 2 participants stated they would have liked the ‘?’ icon to be bigger.
Informational Modal
Small Breakpoint
- 2 participants suggested increasing the font size of the information within the modal.
- 1 participant found having both ‘X’ and ‘Close’ was confusing because “You are providing two options for the same action.” They suggested only using one option and to be consistent.
Large Breakpoint
- There were no major issues with this pattern.
- 1 participant mentioned that the ‘Close’ button should be bigger to make it easier to touch (iPad).
During the post-task questions, six smartphone participants were asked to provide their preferences between the three different container with tab styles they encountered earlier in the scenarios. These included the Accordion style, More Menu, and Slide Menu, presented vertically in that order. They were first asked to name their favorite style, and then asked their least favorite style. The results are shown in the tables below:
Favorite Container | # of Participants |
---|---|
Accordion | 3 |
Slide Menu | 1 |
More Menu | 2 |
Participants stated that they liked the accordion-style container because it was simple, easy to navigate, and gave them the ability to hide unwanted information. The participant who preferred the More Menu style container liked it because the drop-down was colored differently (i.e., the drop shadow) and was easier to see.
Least Favorite Container | # of Participants |
---|---|
Accordion | 1 |
Slide Menu | 3 |
More Menu | 1 |
No Preference | 1 |
The participants who did not like the Slide Menu container stated it was because the menu was too small and they had difficulty touching the menu items. One participant stated, “If I had to do it on something this small, I’d always hit the wrong button.”
Twelve participants were asked to provide their preference between two different template styles. These included a neutral, white version (used during the scenarios) and a style similar to the current OCOMM template. These are the results:
Favorite Template | # of Participants |
---|---|
Neutral | 6 |
OCOMM | 6 |
The participants who liked the Neutral template preferred it because they thought the white stood out more on the gray and it was easier on the eye.
The participants who preferred the OCOMM style template liked it because they liked the branding, found it to be more colorful, or liked the gray background better. In referring to the neutral template, one participant stated that he thought, “People tend to glaze over white and black.”
Based on this round of testing, the following patterns were found to be problematic for enough participants to necessitate a new or revised design:
- Archive File Type
- Document Viewer
- Container with Tabs (Slider style)
The team recommends the following next steps:
- Template: either style will work; the two are so similar that the team recommends picking one and proceeding.
- Archive File Type pattern: re-evaluate alternative icons as replacements
- Document Viewer pattern: Re-evaluate the design, including the animation of the viewer and the header controls; refine and retest.
- Date-picker pattern: troubleshoot pattern loading issues and retest on all platforms.
- Container with Tabs pattern styles: test both the Accordion and More Menu styles in the next round of testing, reversing the order in which participants interact with them, to see whether one style is more usable and/or preferred.
- Password Create pattern: re-evaluate the placement of the password requirements for this pattern due to the keyboard obscuring them.
- Error Summary pattern: continue listing a summary of errors at the top of the page as with this pattern; participants did read it, although none used the links to navigate to the individual errors.
- Troubleshoot the prototype slowness problem to improve prototype performance in future tests.
- Research new venues for testing to recruit participants from different socio-economic groups.