Usability Testing Report by Ruth Eckles, Juanita Wrenn, and Rachael Ward
justWink is an iPhone e-card application. Covering a wide variety of occasions, justWink specializes in e-cards that are sassy, sarcastic, and generally geared towards adults. Although originally created for the iPhone, the application is also available on Android devices, the iPad, and as a website.
The usability test was administered to assess the ease of use of standard touchscreen motions and actions. These include, but are not limited to, scrolling with a finger, tapping, signing a signature with a finger, and typing. The justWink application was chosen for the test because its interface allows for a broad range of motions.
The usability test was administered to sixteen individuals covering a wide range of demographics. The test itself involved two questionnaires and five tasks. Test administrators observed participants complete the tasks while both quantitative and qualitative data were recorded using a predetermined coding sheet. Upon test completion, the data were compiled and analyzed. Although a large amount of data was collected, the primary focus was on how easily participants were able to navigate between different screens, type on the tablet, use a slider bar to change text size, and send an email. The total time it took to complete the tasks was also measured. The results of the tests were then compared to a post-test questionnaire to determine perceived ease of use.
The main findings were:
· Younger participants generally had an easier and more satisfactory time completing the tasks.
· Generally all participants were satisfied with the ease of use of the application.
· The most difficult task was using the slider bar to resize text.
· The second most difficult task was selecting a card.
· Further testing should be done to study major gestures used by touchscreens.
justWink is a free e-card application that covers a wide variety of topics ranging from life events (birthdays, marriages, births) to seasonal and topical events (Halloween, Christmas, election season). The majority of the cards convey a more adult tone and can be sent to recipients via email, text, Facebook, or through the US Postal Service for a fee. Although originally produced for the iPhone, the application has been expanded to include Android devices, the iPad, and a standalone website. For the purposes of this test, the participants used the application on an iPad.
justWink is a casual app that provides quick and easy access to a variety of greetings. While it is certainly geared towards being mobile, it is also an application that performs at its best when the participant is allowed to sit and explore the application at their leisure. Once an individual becomes more familiar with the cards, it could be an easy way to send a quick note, but part of the fun of the application is noting all the interesting details the developers have put into the application.
The application tries to afford participants every possible method of interaction, both physical and social. Participants are first presented with a home screen consisting of featured cards in a swipeable array, a top menu bar that prompts them to look at more categories, and a smaller menu bar that has three buttons: categories, notifications, and more. The top menu bar has a tabbed arrow that participants can tap to open the categories sub-screen; alternatively, they can drag the sub-screen down or tap the bottom categories button.
When participants have selected their category, they can either view the cards in “list mode” or “shelf view”. List mode allows participants to see the outside of the card next to text transcribing both the front of the card and the inside cover. In shelf view, participants only see the covers of the cards and must tap a card to view it. In both modes, participants navigate the list of cards by swiping to scroll through the pages. If a participant decides to examine a card, its image comes to the forefront of the screen. Participants can then either tap the card open or drag the front cover open, mimicking the actions of looking at a real card.
Once the final card is selected, participants are presented with the interior of the card. The left side of the card allows for a picture to be placed, while the right side has room for the predetermined card text, participant-generated text, and a signature. When inputting participant-generated text for the card, an individual uses the standard keyboard layout to type the message. After the message is complete, the text is shown and the option is given to resize it using a slider bar at the bottom of the page. Then, the participant may sign the card using either a stylus or their finger. There are several options for stroke width, and the tablet must be in landscape mode to complete the task.
Finally, when the participant is satisfied with the card, they have the option to preview the finished product, or send it. Individuals may send cards via Facebook, email, text message, or by sending a real card through the mail for a small fee.
Although justWink has many other features, this basic course of action was laid out in the tasks presented to the participants in the usability test. In particular, the aim was to see how participants found the ease of use of the touchscreen through the use of this application.
A total of 16 participants were tested ranging from 18 to 67 years of age. The majority of the participants spend time on the computer on a daily basis. When we first initiated the testing process, we didn’t intend to segment older users from younger users, but after looking at the data, there was a clear correlation between age and ease of use with the application. This will be discussed in more detail in the results section.
The playful interface design of justWink led us to believe that the intended users for the application are professional young women 20 to 30 years old. Part of our reasoning for this demographic is that we surmised women are more likely to send birthday cards than men. We imagined young professional women would most likely be pressed for time, and therefore might choose a faster option such as an e-card. We guessed that this demographic would be more likely to use the latest technology such as iPad applications. Judging from the quirkiness of the cards offered on justWink, we also suspected that the intended user for justWink might be a little bit on the “artsy” side, with a sense of humor.
Our actual participants were older than the intended user. While we did test quite a few younger people, the majority of our participants were over the age of 30. In addition, while we imagined the intended users of justWink to be female, our participants were divided equally by gender.
Participants were selected on the basis of convenience. They were the family and friends of the testers, and a few of the participants were fellow students in the iMedia program. While the participants had diverse characteristics in terms of age and occupations, they did share certain similar characteristics. Most were white and college-educated. Although we didn’t officially measure these demographics, we are certain of it since we personally knew all the people tested.
Because justWink is a card sending application, the scenario created asked the participant to send a card to a (fictional) sister. The scenario was chosen because it reflected the natural circumstances of a person sending a card to a loved one. We also wrote the task scenario to reflect the typical user (young professional female, artsy, sense of humor).
The task scenario read:
“Please select a birthday card to send to your sister Alice. She is celebrating her 27th birthday in Vermont. Alice is a fun-loving free spirit involved in the arts, who teaches at a day care center. She has a great sense of humor and enjoys friends remembering her birthday.”
Tasks were broken down into parts that were based on the chronological process of sending a card with the justWink application. Tasks were kept general rather than specific in order to objectively assess the ease of use of the application. The tasks were as follows:
1. Select the “birthday” card of your choice.
2. Write a message in the card. You don’t need to attach a photo.
3. Make the text bigger.
4. Sign the card.
5. Email the card to firstname.lastname@example.org
The performance criteria for each task were measured using a coding form, which contained five performance criteria, one for each task on the task list. Performance criteria for ease of use with the interface were evaluated as follows:
1. Selecting the card: Uses category button, uses down arrow, has difficulty finding categories, selects “list” view, selects “shelf” view. The tester checks the box for “yes” and leaves it blank for “no”.
2. Write message: “Has difficulty writing message?” Check boxes are “yes” and “no”.
3. Make text larger: “Found text slider?” (yes or no), “Has difficulty finding text slider?” (yes or no), “Goes back to previous screen?” (yes or no). If “yes” is checked, a blank is provided to record the number of times the participant had to go back to search for the text slider.
4. Sign card: Uses stylus, uses finger, continues to use stylus, puts stylus away; “Has difficulty signing the card?” (yes or no).
5. Email Card: “Do they choose preview?” (yes or no). Notes section provided to record any difficulty with this task.
Participants were tested in a variety of environments ranging from their own homes, to the tester’s home, to the private editing suites at Elon’s iMedia program.
Tests performed in the participant’s home provided more opportunities for distractions, such as pets or a ringing phone, whereas tests performed at the tester’s home provided more control. The tester set up a specific area in her house that was consistent for each participant she tested. One participant brought a small child with her. Although her husband watched the child while she took the test, it still might have been a slight distraction. Elon’s iMedia suites probably provided a better feeling of insulation for concentration. However, those participants were students recruited on the fly while they were busy with other work.
Video and audio devices or software were not used. Qualitative and quantitative data were gathered through observation, a coding form, and pre- and post-test questionnaires. A few photos were taken of participants while they were taking the test, primarily for the purpose of the presentation, but nothing too obtrusive.
Test Administrator Tools
Video and audio recording devices were not used to record data. Silverback could not be used with the iPad, video would not sufficiently capture the data on the screen, and audio would not be a sufficient measure of ease of use. As a result, data was collected using pre- and post-test questionnaires and a coding form, which are outlined in more detail below (see the Appendices to view the actual forms):
Prior to reading the task scenario and performing the tasks mentioned earlier, the participants were given a pre-test questionnaire that was meant to assess their level of experience with the application, e-cards, traditional cards, and their tendency to send funny links. We felt this would give us a good picture of our participants’ interest in and literacy with these types of applications.
Immediately after the pre-test questionnaire was administered, the participant was instructed to read the task scenarios and begin performing the tasks on the task list. These tasks and scenarios were mentioned earlier in the report.
After the tasks were completed, the participants were given a post-test questionnaire to complete. The purpose of the post-test questionnaire was to assess the participants’ perceived ease of use with the application. Participants were asked to rate the ease of use of each task they performed on a scale of 1 to 9 (1 being the most difficult, 9 being the easiest). Participants rated the ease of use of the following tasks: selecting a card, adding a message, changing the text, signing the card, and emailing the card.
Participants were also asked to rate the level of enjoyment they experienced with the application on a scale of 1 to 9. They were also asked to explain what they did and did not like about the site and were provided with a space to write their explanations.
The post-test questionnaire also contained questions about whether participants would use the application again, and how likely they would be to use it on their iPhone or computer. They were also asked how often they might use the application on these devices. The questions were multiple choice, ranging from less than once a year to 6 or more times per year.
Demographic questions determining age, occupation, gender, and computer usage were placed at the end of the questionnaire and participants were provided with a blank to fill in.
The coding form was used by the tester to record and measure the participant’s ease of use with selecting a card, writing a message in the card, signing the card, and emailing it to the specified recipient. The coding form also measured the time it took for the participants to complete all the tasks, and it contained the participant ID number and the date of the test. Using the coding form, the tester observed the participant while they were engaged in the specific aspects of the interface that the tasks involved. The coding form contained several boxes the tester could check to measure the ease of use in the participant’s interaction with these aspects of the interface. A check means “yes”, a box left blank means “no”, and each section includes space for the tester to write extra observations.
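The coding form described above is essentially a small record of yes/no checks plus notes and timing per participant. A minimal sketch of how such records could be represented and tallied is below; the field and criterion names (e.g. `found_text_slider`) are illustrative assumptions, not taken from the actual paper form.

```python
# Hypothetical representation of one completed coding form; field names
# are illustrative, not copied from the actual paper form.
from dataclasses import dataclass, field

@dataclass
class CodingForm:
    participant_id: str
    test_date: str
    total_time_seconds: int
    # A checked box maps to True ("yes"); a blank box maps to False ("no").
    checks: dict = field(default_factory=dict)
    notes: dict = field(default_factory=dict)

def tally(forms, criterion):
    """Count how many participants were checked 'yes' for a criterion."""
    return sum(1 for f in forms if f.checks.get(criterion, False))

# Two made-up participants for illustration.
forms = [
    CodingForm("JW01", "2012-10-01", 412, {"found_text_slider": True}),
    CodingForm("JW02", "2012-10-02", 655, {"found_text_slider": False}),
]
print(tally(forms, "found_text_slider"))  # → 1
```

Treating an unchecked criterion as “no” mirrors the form’s convention that a blank box means “no”.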
Sequence of events in chronological order:
1. Participants were greeted, and settled in to the space where they would take the test.
2. Testers read aloud a brief orientation script that introduced participants to the testing procedure. Each participant was read the same script to maintain uniformity in the process.
3. Participants were asked to fill out a consent form.
4. Participants were given a pre-test survey to fill out.
5. Participants were given a list of tasks to complete.
6. Participants were given a post-test questionnaire to fill out.
7. Participants were thanked for taking part in the study.
Each testing session involved just two people—the tester and the participant. The tester was present with the participant throughout the entire process, sitting in close proximity to the participant in order to observe and record their actions on the coding sheet. Occasionally a participant would have difficulty performing the tasks on the list. Rather than help the participant with the task, the tester would encourage the participant to continue exploring the interface. Testers recorded participants’ comments that indicated difficulty, frustration, or pleasure on the coding form. The participants were not compensated in any way.
The main findings indicate that 1) younger participants found justWink to be easier to use than older participants, 2) the text slider was difficult for most participants to find and use, and 3) selecting a card was rated as the second most difficult task. The study also shows that signing the card was surprisingly intuitive and easy. Several technical issues arose that involve the iPad interface rather than the application interface.
Generally, younger participants found the tasks to be easier than older participants did. However, participant #JW04 (67 years old; see Fig. 1) reported all tasks as “easy,” while data collected from observation suggests the participant did have difficulty with several tasks. We feel that this discrepancy makes this participant an “outlier.” With the outlier data removed, results show that younger participants found justWink to be easier to use than older participants.
[Text Box: Fig. 2 Participants grouped by age with outlier removed.]
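The age-group comparison above amounts to averaging the 1-to-9 ease-of-use ratings by age bracket, with and without the outlier. The sketch below illustrates that computation; the participant ratings and ages are invented for illustration and are not the study’s actual data (only JW04’s age and “rated everything easy” behavior come from the report).

```python
# Illustrative sketch of the age-group analysis; ratings below are
# made-up example values, NOT the study's actual data.
ratings = {  # participant_id: (age, mean ease-of-use rating, 1-9 scale)
    "JW01": (24, 8.4), "JW02": (29, 8.0), "JW03": (55, 5.2),
    "JW04": (67, 9.0),  # observed to struggle, yet rated all tasks "easy"
    "JW05": (61, 4.8),
}

def group_mean(data, min_age=None, max_age=None, exclude=()):
    """Mean rating for participants in an age band, optionally excluding IDs."""
    vals = [r for pid, (age, r) in data.items()
            if pid not in exclude
            and (min_age is None or age >= min_age)
            and (max_age is None or age <= max_age)]
    return round(sum(vals) / len(vals), 2)

print(group_mean(ratings, max_age=30))                    # younger group → 8.2
print(group_mean(ratings, min_age=31))                    # older, with outlier → 6.33
print(group_mean(ratings, min_age=31, exclude={"JW04"}))  # outlier removed → 5.0
```

With the outlier included, the older group’s average is inflated toward the younger group’s; removing it widens the gap, which is the pattern Fig. 2 reports.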
Which tasks were more difficult?
The task difficulty was rated by participants using a questionnaire (Fig. 3). Their ratings generally matched the observed difficulty when compared to the Task List (bottom of Fig. 3). Participants rated changing the text size as most difficult. Many participants spent time looking for a text-enlarger icon on the message-typing page and at the top of the page containing the slider. Eventually most participants “found” that the slider increases the text size.
Selecting the card was rated the second most difficult task. Most participants used the shelf view, as it was the default view. Many users went back to the homepage looking for more cards. It was not readily evident that swiping “up” would show more card choices; users often tried to swipe left and right to look along the “shelf.” In addition to a less intuitive swipe motion, many participants accidentally selected the lower navigation bar and pulled it up when trying to get to the cards behind it. While younger participants generally spent less time completing all the tasks than older participants, total time was not solely related to ease of use: mid-range age participants often enjoyed spending time browsing cards and writing witty messages.
AREAS FOR IMPROVEMENT AND FURTHER STUDY
While the findings of the usability test are very interesting, there are definite areas where the test could be refined and focused. Additionally, there is certainly opportunity for follow-up study.
It is only in hindsight that it became apparent that the design of our pre-test questionnaire could use some improvement. Not only was the layout confusing for some participants, but it also did not measure whether the individual was a power user or a novice, something that should have been factored into the research design. Another recurring issue was straying from the original research question: what is the ease of use of a tablet? After examining the data it became very clear that there were too many questions on both the pre- and post-test that asked users about their e-greeting-card habits and how they felt about the content of the cards, rather than about ease of use. Questions such as “What motions seemed natural to you?” and “What actions did you find difficult/confusing?” would have been more helpful.
In this same vein, it was only after examining the data that we began to see a trend between power users and touchscreen novices. It would have been useful to ask in the pre-test questionnaire what experience each user had with touchscreen devices. This line of questioning also leads to an area of further study: it would be interesting to compare touchscreen novices with touchscreen power users across all demographics. Although an interesting field of research, it could prove problematic, since as more time passes, it will become harder to find young individuals who have limited to no experience with touchscreen devices. Additionally, a larger sample size and coding for these individuals would allow for a broader understanding of the findings.
Our design suggestions involve the aspects of the interface that participants had the greatest difficulties with—the text slider and the card selection.
We feel the text slider is counterintuitive and remote. Having the “Make Text Bigger” label at the top of the screen while the slider sits at the bottom of the screen further distracts the user from the slider. Moving the slider closer to the “Make Text Bigger” label would prompt users to relate the slider function to increasing the text size. An additional design suggestion would be to include plus (+) and minus (–) icons to increase and decrease font size.
Card selection issues involved the list and shelf view options. Shelf view, the default view, emulates a card-store shelf on which rows of cards are lined up left to right. We feel that this orientation misleads users into wanting to swipe left and right, while the actual action needed is up and down. Additionally, the cards peeking from behind the footer are too low to touch and “pull up” with a swiping motion. Simply moving the third row of cards up would solve many of the “mis-touches” that caused frustration. Perhaps making the list view the default would offer a more user-friendly experience.