Usability Assessment Report

Remote Sensing Using Satellites

 

 

 

Prepared for:

PAGE

Program for the Advancement of Geoscience Education

 

 

Prepared by:
Michael Hughes

December 16, 1998

 

 

Contents

Executive Summary

Introduction

Background

Assessment Team

Purpose

Objectives

Deliverables

Limitations

Participants (Users)

Criteria

Sample Population

Methods

Findings and Recommendations

 

Appendix A: Scenarios

Appendix B: Recruitment Questionnaires

Tables and Figures

Figure 1: Usability Lab

Table 1: Actions and Findings (sorted by Action #)

 

Executive Summary

On December 14-16, 1998, a usability assessment of the web site Remote Sensing Using Satellites was conducted at the request of Dr. Mary Marlino, Director of the Program for the Advancement of Geoscience Education (PAGE), and under the supervision of Dr. Thomas Reeves, Professor of Instructional Technology at the University of Georgia. The assessment was conducted at the facilities of The Usability Center of Atlanta.

Although the results showed that, in general, the site was well received and favorably reviewed by the participants who used it, the assessment identified some significant usability issues that would prevent students from reaching the site's content or from realizing its intended benefit.

The evaluation team recommended several specific actions to remove these usability blocks. These recommendations cover such areas as page structure and navigation, labeling and terminology, graphics and animation controls, and question feedback.

The Findings and Recommendations section of this report provides the detailed recommendations of the assessment team.

Dr. Reeves also has a list of items to examine more closely for their pedagogical effectiveness based upon the observations and recommendations of the assessment team. These items are also listed in the Findings and Recommendations section.

 

Introduction

This document communicates the results of a usability assessment conducted on the web site Remote Sensing Using Satellites. This assessment was designed based upon meetings, correspondence, and conversations with Dr. Mary Marlino, Director of the Program for the Advancement of Geoscience Education (PAGE); Sue Kemner-Richardson, product developer; Dr. Tom Reeves, product evaluator from the Department of Instructional Technology at the University of Georgia; and Michael Hughes, doctoral student in Instructional Technology at the University of Georgia and Senior Consultant with The Usability Center of Atlanta, Georgia.

Background

The mission of PAGE is to enhance teaching and learning in undergraduate geoscience education through the application of contemporary pedagogies and educational technologies. The product under evaluation is Remote Sensing Using Satellites, a web-based program developed by PAGE to teach college students how satellite images are analyzed and used in meteorology.

PAGE enlisted the services of Dr. Tom Reeves to conduct an evaluation of this web site. After a preliminary review of the site, Dr. Reeves felt that a usability test would be helpful in eliciting the input of typical users. He also felt that the overall instructional evaluation would be more helpful if any obvious or easy-to-fix usability issues were corrected before continuing with the evaluation.

Assessment Team

The following individuals participated in the usability assessment:

Mary R. Marlino, Ed.D., Director, Program for the Advancement of Geoscience Education (PAGE)

Thomas Reeves, Ph.D., Professor of Instructional Technology, UGA

Michael Hughes, Assessment facilitator

Kevin Peng, Graduate assistant in Instructional Technology, UGA

Michael Matzko, Graduate assistant in Instructional Technology, UGA

Michael Sullivan, Graduate assistant in Instructional Technology, UGA

 

The remainder of this report discusses the purpose and limitations of the assessment, the participants and methods used, and the findings and recommendations of the evaluation team.

Purpose

This assessment is part of a formative evaluation; that is, its results will be used to make design decisions and to point out changes that need to be made before a further, more in-depth evaluation of the product is conducted. This report covers only the usability assessment portion of the total evaluation project.

Objectives

The usability assessment was conducted to identify the critical usability requirements, as perceived by actual users, that will have an impact on the success of the web site. Specifically, it was designed to identify the usability problems students encounter during typical tasks with the site and to recommend specific corrective actions.

Deliverables

In addition to participating as part of the evaluation team during the assessment, the client has been provided with this report, which documents the findings and recommendations of the assessment team.

Limitations

This section identifies possible limitations to the interpretation and generalizability of these evaluation activities.

The main focus of a usability assessment is to determine the ease-of-use of a product during typical user tasks. This aspect of a product is critical to success, but it is not a comprehensive evaluation of the total scope of the product. The following limitations apply:

  1. The assessment examined typical user tasks with the product and did not examine every screen and feature. This is a generally accepted limitation of scenario-based usability testing, which concentrates on the tasks most representative of actual use.
  2. This process does not evaluate transfer or efficacy of instruction. Its focus is on determining how easily the materials are accessed and understood during instructional use; it does not measure later performance in the user's environment. For this reason, the client is advised to conduct additional evaluation procedures to assess how well the product meets learners' needs and instructional goals.

Participants (Users)

Criteria

Users were chosen through a criteria-based, purposive selection process. Criteria were established around the following user characteristics: current enrollment as a full-time college student; use of a PC for school, personal, or business purposes; and use of the Internet (see the recruitment questionnaire in Appendix B).

 

Sample Population

Volunteers were recruited from among three sessions of an undergraduate Technical Writing course taught by Michael Hughes at Southern Polytechnic State University. See Appendix B for a copy of the recruitment questionnaires (student identities have been blocked out).

Methods

The assessment was conducted at a commercial usability testing lab:

The Usability Center
8601 Dunwoody Place
Atlanta, GA 30350
(770) 641-1522

Users were given realistic tasks (scenarios) to perform and were observed by the assessment team. See Appendix A for the scenarios that were used. (The order of the scenarios was varied when appropriate, based on the emergent data that were gathered.)

A think-aloud protocol was used to help the evaluation team "hear" user thoughts and reactions to tasks and problems. To determine how well students were understanding the material presented on the site, the facilitator also used a guided recall protocol (e.g., "Show me again what you did when ..."), teach-back (e.g., "Tell me what you know about ..."), or a coached intervention (e.g., "Try going to ... and let's see what happens") to debrief users after each scenario. Figure 1 shows the layout of the lab in which the evaluations were conducted and observed.

 

Figure 1: Usability Lab

During each session, a time-stamped log was maintained by Dr. Reeves that noted user actions and comments. After each session, the assessment team reviewed the log and their individual notes and identified the usability problems they had detected during that session. The facilitator recorded these problems on flip charts for ongoing review by the team. Table 1 contains a list of all the findings and the entries from the time-stamped log that triggered the team to note them.

Findings and Recommendations

The assessment team met and reviewed all of the findings and determined what actions to take to correct each finding. (See Table 1.) In some cases, one action resolved several findings. Each action was assigned a responsible party (Who) for implementing it and given a priority rating:

1 = Must fix as soon as possible

2 = Important to fix but not mission critical

3 = Warrants further investigation

 

 

The time-stamped entries in the Findings column are direct excerpts from the session log that triggered the team to remember the incident. These entries are not intended to be full-text explanations of the events that led to the associated Action. The use of multiple perspectives in recalling and discussing each session immediately after observing it provided a rich base of data upon which these recommendations are made, one that is not fully reflected in the brief Findings descriptions in the table.

The finding number (F#) notes the order in which the usability findings were recorded on the flip chart.

 

Table 1: Actions and Findings (sorted by Action #)

A# Action Who Priority F# Findings
1

Restructure Web Site's Opening Page

  1. Redesign Home page as more of a high-level Table of Contents to the site
  2. Redesign cover graphic to emphasize satellites and satellite images
  3. Redesign graphic treatment
  4. Drop parchment background from content pages
  5. De-emphasize italics and script, emphasize sans-serif body fonts
  6. Readdress web-page title
  7. Have three "introductory" topics:

Student

  • Learner-centered objectives
  • Catalog description
  • Site map (with liberal links from other pages)

Instructor

  • Educational Goals
  • K-12 tips
  • Using thumbnails as independent graphics

Technical Support

  • Technical Requirements

PAGE 1 3

Getting Started-not about content

1 17:46:17

She seems stuck a bit on the goals screen. "I wasn't expecting this."

1 17:55:29

I thought getting started would give me information about a table of contents ... about the content ... like stars and stuff.

3 12:44:46

First Impressions: Getting Started wasn't helpful.

4 15:54:11

Scrolling through and not finding what she wants.

1 See above.     39

Wanted Introduction

3 12:43:04

Then went back to Remote Sensing. Was looking for an Introduction.

1 See above.     1

Explore Hurricane not seen as an option

1 17:43:04

She read everything but "Explore Hurricanes."

1 See above.     5

Motion Working Graphic- clicked on it

1 17:47:40

She is reading the Quick Start. She is clicking on the motion graphic to see if something might happen.

1 See above.     7

Link Color Text-confusing

1 17:49:03

She is reading the navigation options screen. She is confused by the linking colors.

1 See above.     2

Getting Started-clicked on bottom graphic

1 17:44:14

She is clicking on the "Getting started" at the bottom instead of the real option at the top.

1 See above.     4

Start Here- clicks on text

1 17:44:51

She goes to the getting started screen. She notices the "Start Here" text, and she clicks on it.

1 See above.     8

Did not know how to navigate back to TOC

1 17:58:41

She wants to go back, but she can't see how.

1 See above.     11

Missed TOC up arrow

1 18:04:26

She missed the table of contents UP arrow.

1 See above.     29

Selects Hurricane Features twice

2 09:04:06

She went to Hurricane Features twice.

1 See above.     34

Smoke & Surface Features-could not find

2 09:42:40

She is looking for smoke. She missed it on the first menu.

1 See above.     45

Does Explore Hurricanes before Features

3 13:38:29

Went to Explore Hurricanes first. Skipped the "Features of Hurricanes" option.

4 16:27:37

She starts with "Explore Hurricanes." This seems like the most intelligent place to start.

1 See above.     53

Wants Home button on top of page

4 16:42:16

Do you have any ideas for improving this? Home at the top would be nice!

1 See above.     12

Preload-didn't know it was done.

1 18:05:02

She is on the preload graphics screen. "I don't know if it is done."

1 See above.     6

Technical Requirements not important

1 17:48:07

She is looking at technical stuff and says, "It doesn't seem important to me."

2

Eliminate text on introductory screen.

(Keep disclaimer below the scroll line)

PAGE 1 40

Hurricane Centric

3 12:52:04

Mike asks him to explain VIS. Gives a reasonable explanation. Thinks there is a hurricane in the image, but there isn't.

3
  • Refer to location of picture, not behavior (not "still" but "upper" or "lower")
  • Always show at least a portion of every picture above the scroll line.
  • Remove "Remote Sensing" and menu bar and replace with topic title and Home and Site Map navigation. (Do for all topics to create more useful room on page.)
PAGE 1 26

Lower Right-second image off screen

1 18:51:58

"Lower right" was confusing for her.

1 18:56:41

She explains why the "lower right" direction confused her.

2 09:52:58

She does not see the still in the lower right.

3 13:06:09

He has the LOWER RIGHT problem. But he then scrolls down and sees the still picture.

4

Provide a structure map pointing out:

  • Image
  • Label
  • Legend
  • Map overlay
PAGE 2 21

Did not notice state maps

1 18:27:23

I didn't even pay attention that this was the states.

1 18:38:54

How am I supposed to know that is Wisconsin and Illinois?

5 Investigate in pedagogical review Tom 3 43

VIS uses temperature

3 13:26:20

He says that VIS also uses temperature.

5 See above.     10

Wants to click on Brochure & NOAA

1 18:03:10

Where is this brochure? It is bold and it caught my attention.

5 See above.     14

IR & VIS not understood.

1 18:07:14

What is IRS or VHS?

2 09:19:03

She does not know what IR means.

2 09:59:19

She now finds out what IR means.

5 See above.     19

Question Format-not as statements to be completed

1 18:13:38

This is the first one that actually asks me a question.

5 See above.     20

VIS-text difficult to understand

1 18:26:14

That could have been said clearer, but oh well, I guess you ponder and ...

5 See above.     22

Mexico looked like water; text- dark = water

1 18:27:23

I didn't even pay attention that this was the states.

5 See above.     24

Smoke-needed more info

1 18:42:41

That was odd. I was looking for a little bit more. That smoke looks like clouds. There is no explanation about the difference between smoke and clouds.

5 See above.     30 Does not understand IR (duplicate entry-see F# 14)
5 See above.     32

Dense Paragraphs

2 09:25:44

She comments on the length of the paragraph. Wants it to refer to the picture.

2 10:12:10

Reading level is good, she says, at college level, but some parts have too much "wordiness."

4 16:23:29

I have forgotten the question after reading the content.

5 See above.     33

Confused about Remote vs In-Situ

2 09:32:19

He asks her about the differences between in situ and remote, and she says she is still a little confused.

5 See above.     38

Storm breaks up

2 10:03:31

She misinterprets the nature of the loop. But she does answer the question correctly.

5 See above.     49

Cloud graphic should be IR

4 16:14:26

She clicks on Next topic to go to Thunderstorms and IR. She says that the picture has absolutely nothing to do with the text. It should be an infrared graphic.

5 See above.     52

Wanted graphic on storm surge

4 16:33:27

She would like a graphic with the material about damage on the right and left sides of the hurricane.

5 See above.     23

Smoke-needed more feedback

1 18:39:57

She clicks lots of times. "I can't believe that is it! That's not even telling me anything."

5 See above.     28

Alt text-doesn't want to wait

1 19:03:21

She went to in situ sensing, but she did not read the title. "I don't know why these pictures are here." She doesn't want to wait to see the picture labels.

6
  • Provide an obvious Download progress message with a Done button to go back to the module.
  • Keep thumbnails below scroll line.
PAGE 1 51

Preload/thumbnail obsession

4 16:28:43

She clicks graphics preload. She starts clicking thumbnails.

7 Re-label PAGE 1 17

Tell Me More-it's a question

1 18:10:14

She clicks "Tell me more"

1 18:10:42

Oh that was a question! Let me go back.

1 18:17:30

I thought the tell me more would tell me more, but it quizzed me a little bit.

2 09:09:47

The nature of the question confuses her a bit.

2 09:15:00

Selects Tell Me More. "I am starting to see that Tell Me More is asking me questions. It needs to be labeled quizzes or something."

 

8
  • Separate maps by hurricane (3 side-by-side)
  • Within each map, label one VIS button and one IR button to indicate what the user would see.

 

PAGE 1 13

3 hurricanes 4 phases confusing

1 18:06:22

She is looking at the major hurricanes screen. She was confused by the number ... 4 or 3.

9
  • Don't assign new meaning to standard icons
  • Label controls: Pause, Play, Next Frame, Prev. Frame

  1 18

Getting Started- wanted VCR buttons to work

1 17:52:23

She is looking at the VCR controls. "I am not really interested in this."

9 See above.     16

Flashing Labels

1 18:08:12

They flash so quick and you can't tell.

9 See above.     37

Doesn't understand VCR buttons at first

2 09:51:07

She is not using the VCR controls correctly at first, but then she figures it out.

3 13:02:57

He is trying to understand the VCR controls. After a little time, he figures them out.

4 16:01:10

She clicks on next screen and she again comments that she doesn't like the motion imagery while she is trying to read. There are VCR controls on the second picture. She uses the VCR buttons semi-correctly.

10
  • Make feedback more prominent by repositioning to top and emphasizing with typography.
  • Make graphic on feedback screen "playable."
  • Restructure question and graphic

PAGE, Tom 2 25

Did not read feedback on snow vs cloud answer

1 18:48:52

She tries to make a distinction in the images. She did not read the feedback.

3 13:03:59

He is struggling to answer the question. He thinks brightness is the clue. He has stopped the picture and can't see the real information. He got the right answer, but for the wrong reason.

11 Make whole graphic hot. Map wrong answer to current location or "wrong" message. PAGE 1 35

Hand gives away answer.

2 09:48:35

She notes that the cursor turns to a hand when she scrolls over the smoke.

3 12:55:02

Notices the finger cursor is the cue to the right answer on the smoke screen graphic.

4 15:58:38

She is unsure about her response on the Smoke question, so she uses the browser's Back button to go back. She notices that she can't click on anything but the right answer because the cursor changes to a hand.

12 Outline Plume. PAGE 1 41

Thought smoke labels were vectors

3 12:55:52

Thinks the area lines are an indication of something else.

13 Label all graphics. PAGE 1 44

Did not recognize water level graphic

3 13:31:34

Does not understand graphics.

14 All animations need controls (but should start active) PAGE 1 48

Animated Graphics-distracting

4 15:51:09

She says that the animated graphic on the remote sense screen was distracting.

4 15:54:57

She doesn't like the movie loop. She says it is distracting while she reads.

15 Consider Glossary Tom 3 36

Smoke Plume-does not understand

2 09:49:05

I am not sure what a smoke plume is.

99 No Action     31

Did not notice preload

User 2 in general

User 3 in general

99 No Action     46

Relies on browser Back button

3 13:39:50

Still using back arrows on browser rather than the navigation at bottom of screen.

99 No Action     47

Uses Back to make link on Simpson Scale

3 13:39:50

Still using back arrows on browser rather than the navigation at bottom of screen.

99 No Action     9

Graphics-responds to graphic

1 18:00:21

The graphics are cute.

99 No Action     15

Weather Channel-metaphor

1 18:07:34

That is pretty neat. It is like the weather channel thing.

99 No Action     27

Why picture of world?

1 19:02:26

Why is the picture of the world there?

99 No Action     42

Red as "hottest" or "highest"

3 13:21:11

He interprets the red as the hottest part of the IR image.

99 No Action     50

Doesn't want audio

4 16:25:57

She says she doesn't want audio. "I usually turn audio off."

Appendix A: Scenarios

Scenario: Look Around

You have enrolled in a science course on meteorology. One of the resources for the course is a web-based module called Remote Sensing Using Satellites. In the syllabus, it says that the web site will be used for labs and outside study. You want to see what the site is like and find out a little about the subject matter. Go to the web address the facilitator gives you and investigate it the way you normally would if you were actually going to take this course.

  1. Go to the web address the facilitator gives you.
  2. Look around the site or do whatever you would normally do on a first visit.
  3. THINK OUT LOUD.
  4. Call the facilitator when you want to quit (or the facilitator may interrupt you).

Scenario: Hurricanes

In class this week you are discussing hurricanes. You decide to go to the web site and see what you can learn about hurricanes.

  1. Go to the web site.
  2. Learn what you can about hurricanes.
  3. THINK OUT LOUD.
  4. Call the facilitator when you want to quit (or the facilitator may interrupt you).

Scenario: VIS (1)

For lab, the professor has assigned you to review the two topics "What the VIS Channel Senses" and "VIS Imagery Legend and Labels."

  1. Go to the web site.
  2. Find the assigned topics and review them.
  3. THINK OUT LOUD.
  4. Call the facilitator when you are through reviewing the two topics.

Scenario: VIS (2)

The professor has directed everyone to go to the topic "Surface Features on VIS Imagery" and choose a topic of interest. You choose "Smoke."

  1. Find the topic "Surface Features on VIS Imagery."
  2. Review the topic.
  3. THINK OUT LOUD.
  4. Call the facilitator when you are done.

Scenario: VIS (3)

The assigned lab exercise is the Feature Locating Exercise in the section on the Visible (VIS) Channel.

  1. Find the exercise.
  2. Answer all of the questions.
  3. THINK OUT LOUD.
  4. Call the facilitator when you are done.

Scenario: Remote Sensing

You missed class last week, and in the lecture today, the professor talked about remote and in-situ sensing. You said "huh?" and the professor did not look amused. She suggested that you visit the web site and catch up on what you missed.

  1. Go to the web site.
  2. Learn about the differences between remote and in situ sensing.
  3. THINK OUT LOUD.
  4. Call the facilitator when you are done.

Scenario: IR (1)

In class this week you discussed thunderstorms and cloud temperature. The professor suggests that you look at the topic that discusses using infrared technology to look at thunderstorms.

  1. Go to the web site.
  2. Find and review the appropriate topic and answer all the questions.
  3. THINK OUT LOUD.
  4. Call the facilitator when you are done.

Scenario: IR (2)

For lab, the professor has asked that you do the exercise on comparing features in VIS and IR images.

  1. Find the exercise.
  2. Answer all the questions.
  3. THINK OUT LOUD.
  4. Call the facilitator when you are done.

Scenario: Getting Started [optional scenario: use only after non-directed data are saturated]

Today is the first lab day in your meteorology class. The professor has directed you to go to the web site and take the Getting Started module.

  1. Go to the web site.
  2. Take the Getting Started module.
  3. THINK OUT LOUD.
  4. Call the facilitator when you are done.

 

 

Appendix B: Recruitment Questionnaires

Usability Test Recruitment Form

What is it?

An exercise that lasts about two hours. You will use a computer product and be videotaped, observed, and interviewed. The purpose is to determine how the product can be made more user-friendly.

What do I get?

At the end of your session, you will receive an honorarium of $75.00.

When? (Check the time(s) when you would be available.)

_ 6:00 PM Monday evening (December 14)

_ 9:00 AM Tuesday morning (December 15)

_ 12:30 PM Tuesday afternoon (December 15)

_ 3:00 PM Tuesday afternoon (December 15)

Where?

The Usability Center in Roswell. (Map & directions will be provided.)

Qualifiers

  1. Are you currently enrolled as a full-time student in college? (Required)

    _ Yes Where?

    What is your major?

  2. Do you use a PC for school, personal, or business use? (Required)

    _ Yes What programs do you use?

     

     

  3. Do you use the Internet? (Required)

    _ Yes How often do you use the Internet?

    _ Less than once a week

    _ Several times a week

    _ Every day

  4. Gender (Information only)

    _ Male

    _ Female

  5. Age (Information only)

_ less than 21

_ 21-25

_ 26-30

_ Over 30

Name    
     
Daytime Phone #   Evening Phone #

If you are selected, someone will call you and confirm your appointment time.