EXECUTIVE SUMMARY
Testing was recently conducted on the TOPIC interface used for freshman English at Texas Tech University. Four tasks were evaluated: logging in to TOPIC as a new user, submitting a draft, locating and reviewing the submitted draft, and finding the next assignment due.
Testing was conducted using the same computer in the testing room for each test, with a test assistor and a test logger in the testing room with each participant; a test facilitator and a camera operator were present in the observation room for each test. The four participants completed each of the four tasks named above. Two experienced participants and two inexperienced participants were tested, in addition to one experienced user in a trial run.
The goals of the test were to identify problems within each of the above tasks. Specific goals included identifying shortcomings or unintuitive designs in each task, comparing the experiences of new and experienced users, and measuring user satisfaction, efficiency, and effectiveness for each task.
Major findings include difficulty among first-time users in locating the new user log in link, confusion among inexperienced users when submitting a draft, and the inability to view a recently submitted draft when the "BACK" button is used.
Our recommendations include moving the new user link and redesigning the "choose semester" page for clarity, relabeling the submission link to read "Submit This Assignment" and adjusting the background color, and adding explicit instructions directing users back to the main menu, rather than to the "BACK" button, to view a newly submitted draft.
INTRODUCTION
The usability test was designed to give a brief assessment of how students respond to the interface used for freshman English at Texas Tech University. TOPIC, the interface tested, is designed to give students online access to assignments and notes, as well as to provide a portal for submitting completed homework and drafts. The test subjects were split evenly between new users with no prior experience and experienced users with at least one semester of experience.
Providing the participants with a list of specific tasks to complete yielded measurable results for each individual task. Each task could be completed in more than one way, and each participant was allowed to use his or her own method. This helped to ensure the reliability of the test by not restricting the test subjects to a single way of approaching each task.
The goal of the test was to provide clear and precise examples of users’ experiences with certain aspects of TOPIC. Through the use of questionnaires, analysis of the data, and examination of the two specific user groups, the test results will provide the information needed to determine possible shortcomings or unintuitive designs within the interface.
The data presented in this report will help the creators of TOPIC design a more comfortable and user-friendly experience in future updates and revisions.
METHODOLOGY
This section of the report provides participant information, the context of the tasks used in the test, the measures used for gathering data, the testing procedures, the participant general instructions, and the measures used to determine satisfaction, efficiency, and effectiveness.
Participant information:
Our participants included two experienced users and two inexperienced users. Of the experienced users, one was female and the other male; both were 18-year-old freshman composition students. Both inexperienced users were female, one 20 years old and the other 23 years old.
All four users had three or more years of computer experience and spent approximately one to three hours per day on their computers. All four had experience with, and currently used, an IBM-compatible computer running Microsoft Windows; one user had additional experience with a Macintosh.
Applications used by our participants include word processing, spreadsheet, presentation and CD-ROM multimedia software. The most common use of the participants’ computers is
word processing. Other uses for their computers include games, accounting, finance, data storage and programming.
Each participant is familiar with web browsers, including Microsoft Internet Explorer and Netscape Navigator, and has browsed the web for more than one year.
Of the users experienced with TOPIC, each has been using the interface for only one semester and maintains an active account. The experienced participants report using TOPIC for three to six hours per week and have previously completed the tasks involved in this test.
The context of tasks used in the test:
The test was conducted in the English department’s usability lab on November 21, 2002, from 3:00 to 5:00 p.m. No time constraints were placed on the participants, who were informed that the test was being conducted on the TOPIC interface and not on the users themselves. Each participant was informed that he or she would be videotaped, audiotaped, and visually observed by the test administrators.
The participants were asked to complete four tasks. The first task was to log in to the TOPIC interface as a new user. Given only a minimal set of written instructions, the participants were to rely on the instructions provided on the TOPIC website.
The second task was to submit a draft to TOPIC using the completed draft on the 3½” floppy disk, which the test logger had previously inserted into the computer. The draft was titled “Sample Draft” and was to be submitted under the assignment due date of December 1, 2002.
The third task was to locate and review the recently submitted draft entitled “Sample Draft.” This task relied on the successful completion of the previous task.
The final task was to locate the next assignment due relative to the date of the test. This task could be completed by accessing the next assignment due after November 21, 2002.
After completing each task, participants were asked to complete a task assessment survey relevant to the most recent task completed. After the test was completed, a post-test survey was issued to each participant.
Our test administrators completed checklists before, during, and after each test.
Measures used for gathering data:
Data was gathered through direct observation, audio and video recording, and analysis of user interactions.
Testing procedures:
Each of our four test administrators played an important role in the completion of the usability test, and all four helped greet each test subject. After the initial greetings, each administrator assumed his or her testing position. RF acted as our test logger
and directly observed and noted each participant’s interaction with the interface from within the testing room. BG was our camera operator and monitored the video and audio equipment in the observation room. BR was the test facilitator and monitored overall operations from the observation room. RW acted as the test assistor and was seated next to each participant to allow for communication during testing.
Upon arrival, the test facilitator provided each participant with a non-disclosure agreement before the participant entered the testing room. Upon entrance to the testing environment, the test assistor provided verbal pre-test training and briefing, along with written task instructions. After each task, the test assistor was also responsible for supplying the participant with a task assessment survey. At the completion of each testing session, participants completed a post-test survey, which was collected by the test assistor. All materials used during the testing session were then submitted to the facilitator for organization and review. To show gratitude to the participants, the test logger verbally thanked each of them and offered them gift bags.
Participant general instructions:
During each test, the participants were provided with a set of written instructions that told them they could raise a hand to request assistance from the test assistor. Any questions the participants raised were addressed with return questions from the assistor.
All materials used with the participants are included in the appendices to this report.
Measures used to determine satisfaction, efficiency, and effectiveness:
After each task was completed, the participant was presented with a task assessment survey. This survey allowed the participant to rate the task according to ease, performance satisfaction, and level of success, using a scale of one to five, with one being difficult and five being very easy. After the test was completed, the participant was given a post-test survey asking him or her to rate agreement with a series of five statements on a five-point scale ranging from strongly disagree to strongly agree. In addition, the participant was asked to rate the ease of each task on a scale of one to five, with one being difficult and five being very easy.
Each task was timed manually by the test logger and automatically by the video recording system. These redundant measurements allowed the test administrators to record complete task completion times, which served as the measure of task efficiency.
Effectiveness was measured through direct observation and review of the video and audio recordings.
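The task assessment ratings and recorded completion times lend themselves to straightforward tabulation. The short Python sketch below is a minimal illustration of how the mean success rating and mean completion time per task could be computed, assuming the ratings and times are entered by hand; the sample values are taken from the Task 1 results for the two experienced users reported in the Results section, and no such script was part of the testing process itself.

# Minimal sketch: tabulating hand-entered task-assessment data.
from statistics import mean

# Each record: (participant, task, success rating on a 1-5 scale, minutes to complete).
# The values below are illustrative only, drawn from the Task 1 results for the
# two experienced users; they are not a complete data set.
observations = [
    ("experienced 1", "Task 1: log in as a new user", 4, 2),
    ("experienced 2", "Task 1: log in as a new user", 5, 5),
]

# Group the observations by task.
by_task = {}
for participant, task, rating, minutes in observations:
    by_task.setdefault(task, []).append((rating, minutes))

# Report the mean success rating (a satisfaction proxy) and the mean completion
# time (the efficiency measure) for each task.
for task, records in by_task.items():
    ratings = [rating for rating, _ in records]
    times = [minutes for _, minutes in records]
    print(f"{task}: mean success {mean(ratings):.1f}/5, mean time {mean(times):.1f} min")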
RESULTS
This section of the report contains the results of the usability test. The results are broken down into two categories, experienced users and inexperienced users, and are presented in order of task completion.
Experienced users:
The experienced users both rated the ease of the first task, logging in to TOPIC as a new user, as easy, and both agreed that they were completely satisfied with their performance in completing this task. The success level for these users was rated at four and five on a scale of one to five, with five being the highest level of success. However, even with the high ratings given by the experienced users, the test administrators noted difficulty and confusion during the task. The first experienced user hesitated while logging in as a new user and then became frustrated as the main page slowly loaded from the server; this user successfully completed the task in two minutes. The second experienced user had difficulty locating the first-time registration link and received an error after selecting an incorrect course number. After returning to the log in page, this user successfully completed the task in five minutes and then asked whether the log in was complete.
The second task, submitting a draft in TOPIC, received an overall positive rating from the experienced users. The first user rated the ease of the task as easy, agreed that the task performance was completed satisfactorily, and felt their level of success was a four. The second user rated the ease of the task as very easy, strongly agreed that the task performance was completed satisfactorily, and felt their level of success was a five. Both users submitted the draft using the same method: they used the drop down box on the main TOPIC page to select the date the draft was due and then submitted the draft. Once at the submit draft page, the users titled the document and used the mouse to paste the document into the draft section. Both users successfully submitted the draft in two minutes.
The third task, viewing the submitted draft, received an overall positive rating from both experienced users. Both users rated the ease of the task as very easy, strongly agreed that the task performance was completed satisfactorily, and felt their level of success was a five. For this task, the users chose different methods. The first user used the link to “review the class writing” and scrolled down the page to find the draft. The second user used the drop down box on the main TOPIC page to select the date and then viewed the draft. Although different methods were used, both users successfully completed this task in less than one minute.
The fourth task, finding the next assignment, also received a positive rating from both users. Each user rated the ease of the task as very easy, strongly agreed that the task performance was completed satisfactorily, and felt his or her level of success was a five. Both users chose to use the “Select Other Date” drop down box and successfully completed the task in less than one minute.
Overall both users took approximately four minutes to complete the test.
Based on the post-test surveys, the first user agreed that TOPIC was easy to use overall, while the second user strongly agreed with this statement. The first user agreed that there would be no problem logging in to TOPIC as a new user in the future, and the second user strongly agreed. When asked whether submitting or viewing a draft in TOPIC would be a problem in the future, the first user agreed that it would not, and the second user strongly agreed. Both users strongly agreed that finding the next assignment due would not be a future problem. The first user found logging in to TOPIC as a new user to be an easy task, while the second user rated the difficulty level as moderate. The first user found submitting and viewing a draft to be easy, while the second user rated these tasks as very easy. The first user rated the task of finding the next assignment in TOPIC as easy, whereas the second user rated this task as very easy.
Inexperienced Users:
For the first task, logging in to TOPIC as a new user, the first inexperienced user rated the task as moderately difficult, disagreed that the task performance was completed satisfactorily, and rated her level of success as a two. The second inexperienced user also rated the task as moderately difficult, was neutral as to whether the task was completed satisfactorily, and rated her level of success as a four. The first inexperienced user hesitated throughout the process, asked several questions, and took three minutes to complete the task. The second inexperienced user, like the first, hesitated throughout the process and referred to the written task instructions more than once; she took four minutes to complete the task.
For the second task, submitting a draft in TOPIC, the first inexperienced user rated the task as difficult, while the second user rated it as moderately difficult. The first user disagreed that the task performance was completed satisfactorily, whereas the second user agreed that it was. The first inexperienced user rated her level of success as a two, and the second rated hers as a four. The first inexperienced user was not successful in submitting the draft and could not find the link without some assistance from the test assistor; the task took three minutes. The second inexperienced user successfully submitted the draft, although there was a glitch in the testing process and Microsoft Word had been left open between tests; the task took two minutes.
For the third task, viewing the submitted draft, the first inexperienced user rated the ease of the task as a one, whereas the second inexperienced user rated it as a three. Both inexperienced users disagreed that the task performance was satisfactorily completed. The first user rated her level of success as a one, while the second rated hers as a three. The first inexperienced user could not locate the draft because she used the “BACK” button to return to the main menu, which returned her to the page as it appeared before the draft was submitted and prevented her from viewing the draft. She spent four minutes on the task without completing it. The second inexperienced user successfully viewed the draft in two minutes.
For the fourth task, finding when the next assignment was due, both inexperienced users rated the task as easy and agreed that the task performance was completed satisfactorily.
However, the first user rated the level of success as a four, while the second user rated it as a three. Both users had difficulty finding the next assignment, and the second user questioned whether the assignment had been located correctly. Both inexperienced users took one minute to complete this task.
Overall, the first inexperienced user took eleven minutes to complete the test, and the second inexperienced user took twelve minutes to complete the test.
Based on the post-test surveys, the first inexperienced user disagreed that the TOPIC interface was easy to use overall, whereas the second user agreed. Both users agreed that logging in to TOPIC as a new user would not be a problem in the future, and both agreed that submitting a draft in TOPIC would not be a problem. When asked whether viewing a draft in TOPIC would be a problem in the future, the first inexperienced user felt that it could be, whereas the second user agreed that it should not be. Both users agreed that finding an assignment due date in the future would not be a problem.
Problems:
Based on the results presented above, we found the following problems:
Users experienced difficulty logging in to TOPIC for the first time. This problem can be attributed to a lack of emphasis on the link to the new user log in page.
Inexperienced users experienced confusion and difficulty when submitting a new draft into TOPIC. This problem can be attributed to a lack of emphasis on the instructions on the submit draft page.
Viewing a recently submitted draft is not possible when the “BACK” button is used. This problem can be attributed to a lack of specific instructions telling the student how to review the submitted draft.
The inexperienced users hesitated when finding the next assignment due date. However, we feel that some hesitation is normal and expected when a user is presented with a new task and that it is not related to the interface itself. Therefore, we find that this task is effective and requires no design alterations.
RECOMMENDATIONS
Based on the test results, we offer the following recommendations to improve the TOPIC interface.
To improve the process of logging in to TOPIC as a new user, we recommend moving the new user link to add emphasis and improve usability. In addition, we recommend that the “choose semester” page be redesigned for clarity. Appendix I illustrates
an example of a possible redesign for this page using color and layout changes to achieve the above recommendations.
To improve the process of submitting a draft to TOPIC, we recommend changing the current link to read “Submit This Assignment.” We also recommend changing the background color so that the focus is placed on the actual links.
To improve the process of viewing a recently submitted draft in TOPIC, we recommend adding text which states, “To view your submitted draft, please return to the main menu. DO NOT USE THE BACK BUTTON. Using the back button will result in you not being able to view your newly submitted document.” This text should appear on the page under the statement “Your draft has been successfully submitted.”
CONCLUSION
The usability test, conducted by four test administrators on four participants (two experienced and two inexperienced with the TOPIC interface), was completed within two hours on a single day. Participants were able to use their previous computer knowledge to navigate the TOPIC interface using only a brief set of written instructions. Based on observations made during the tests, we found only minor cosmetic problems with the TOPIC interface. We feel these problems can be easily addressed through minor changes in text, graphics, and instructions.