Evaluation Report for
"Remote Sensing Using Satellites"

An Interactive Learning Environment Developed By
Program for the Advancement of Geoscience Education (PAGE)
University Corporation for Atmospheric Research (UCAR)
September 1999

1. Introduction

This web document presents the findings of an evaluation of the "Remote Sensing Using Satellites" interactive learning module that was conducted during the 1998-99 academic year. The primary evaluators were Dr. Thomas C. Reeves, Mike Hughes, Kevin Peng, Mike Matzko, and Mike Sullivan, all from the Department of Instructional Technology in the College of Education at The University of Georgia.

The original plan for the evaluation was negotiated in the Fall Semester of 1998 in consultation with the major clients for the evaluation, Dr. Timothy Spangler, Dr. Greg Byrd, and Dr. Joe Lamos from the Cooperative Program for Operational Meteorology, Education, and Training (COMET), and Dr. Mary R. Marlino, from the Program for the Advancement of Geoscience Education (PAGE).

Additional assistance in both planning and carrying out this evaluation was provided by other stakeholders in this evaluation, including Ms. Sue Kemner-Richardson, the instructional designer for the program, Ms. Kathryn Ginger, the PAGE meteorology expert, and Ms. Lynne Davis, the PAGE software engineer. Ms. Eileen McIlvain, the PAGE Office Manager, has provided invaluable logistical support throughout the evaluation.

Most importantly, this evaluation could not have been completed without the generous cooperation of the faculty and students at Iowa State University, Jackson State University, and Carroll Community College.

2. Background

In 1998, the University Corporation for Atmospheric Research (UCAR), under the sponsorship of the National Science Foundation, developed a World Wide Web-based interactive learning environment called Remote Sensing Using Satellites (hereafter called RSUS). This dynamic meteorology module was jointly produced by UCAR's Cooperative Program for Operational Meteorology, Education and Training (COMET®), the Program for the Advancement of Geoscience Education (PAGE), and the University of Nevada's Desert Research Institute (DRI).

The RSUS lab module is freely available on the WWW as a proof-of-concept or demonstration of the viability of using computer-based learning technology in teaching science to undergraduates. The project and its instructional module serve the following three purposes:
- motivate non-science majors with relevant and active learning about science;
- provide dynamic imagery, graphics, and animations of atmospheric science concepts for use by both instructors and learners; and
- provide a model and template for Web-based atmospheric science instruction for undergraduates.

The RSUS module is being integrated into the laboratory portion of undergraduate survey courses in meteorology. It is expected that it can also be integrated into earth science, physical geography, or related science courses. The target learners are primarily lower division non-science majors, and the course may be the only college-level science/geoscience they take.

The RSUS lab module has the following instructional objectives:

1. Provide learners with synopses of essential content knowledge that assist them in developing a basic understanding of the principles of remote sensing, what satellite-based sensors observe, and what imagery is created from remotely sensed data.

2. Provide learners with synopses of essential content knowledge and assist them in developing a basic understanding of the principles of satellite interpretation.

3. Provide learners with synopses of essential content knowledge and assist them in developing a basic understanding of some classic characteristics of a severe weather system (hurricane, tornado, or thunderstorm).

4. Provide learners with a well-supported exploratory environment in which they can inquire about phenomena, develop and test hypotheses, and make calculations.

The module is intended to help students learn practical skills in understanding and interpreting satellite weather data. Satellite meteorology is used to explore surface and atmospheric phenomena. Hurricane systems provide focus to the investigations. The instructional approach includes two primary elements: brief, illustrated synopses of instructional content and an interactive exploratory hurricane environment.

The content synopses provide brief summary information for remote sensing, satellite imagery interpretation, and hurricane tracking. Dynamic satellite imagery and animations augment each synopsis, illustrating referenced features or concepts. This information is intended to complement other modes of instruction (lecture, textbook, etc.). Additionally, each major segment provides learners with brief, practical exercises.

The interactive exploratory environment of the Hurricane Matrix provides learners with the opportunity to investigate, compare, and contrast several hurricanes over time. Questions are provided to guide the students' exploration. Participants investigate three recent hurricanes from well before landfall until dissipation after landfall. Students can choose hurricanes, time periods, and satellite imagery types in any order. This innovative design can be reused as a template when populated with other content, e.g., a module about remote sensing using radar.

3. Purposes

The evaluation of the RSUS lab module had four primary purposes:
1. To identify corrections, improvements, and extensions to the RSUS lab module,
2. To describe the implementation and effectiveness of the RSUS lab module in a variety of undergraduate contexts,
3. To estimate the value of the RSUS lab module to the instructors and students in undergraduate geoscience, and
4. To judge the feasibility of the RSUS lab module as a model and template for WWW-based instruction in undergraduate geoscience courses.

The focus of the evaluation has been formative, i.e., intended to enhance the RSUS module, rather than summative, i.e., intended to judge its ultimate effectiveness and worth. Recent estimates of the number of pages on the WWW run into the hundreds of millions, but few of them integrate substantive content with meaningful instructional interactions. The need for models and templates in higher education is widely recognized. It is important to optimize the quality of these types of materials before they are subjected to extensive summative evaluation, and thus the formative focus is justified.

Specific aspects of RSUS that were evaluated included:
- the appropriateness of the scientific content for undergraduate courses,
- its design, user interface and navigation components,
- its pedagogical dimensions,
- its instructional implementation strategies,
- its technical implementation requirements,
- its effectiveness as a learning environment,
- its feasibility for integration within different types of undergraduate courses, and
- its utility as a model or template for Web-based instruction in geoscience courses.

4. Audiences

The primary client for this evaluation is Dr. Mary R. Marlino and her colleagues at the Program for the Advancement of Geoscience Education (PAGE). Important evaluation audiences include personnel at PAGE, the Cooperative Program for Operational Meteorology, Education and Training (COMET®), the University Corporation for Atmospheric Research (UCAR), the University of Nevada's Desert Research Institute, and the National Science Foundation as well as all members of the larger PAGE community. This evaluation should also be of interest to other educators who are developing web-based resources for science education within both higher education and K-12 sectors.

5. Decisions

The purpose of formative evaluation is to direct decisions affecting revisions of the program being evaluated as well as the design of future programs. Evaluations are not ends in themselves, but means to enable better decision-making by stakeholders. If this evaluation was going to influence decision-making, it was important to anticipate as many probable decisions as possible before the evaluation began. The following chart specifies the decisions we anticipated and the considerations that have guided our evaluation activities.

Decisions and Considerations
What changes should be made in the RSUS module interface?
  • How intuitively the RSUS user interface presents what to do and how to do it.
  • How clearly the user interface keeps the user oriented to his or her place in the program.
What changes should be made in navigation?
  • How easily the user can find and access other places of interest in the module.
What changes should be made in feedback?
  • How clearly instructional feedback messages guide learners to correct decisions.
  • How effectively error messages help learners move ahead in the program.
What changes should be made in terminology and narrative?
  • How easily the scientific content is understood.
  • How easily learners understand technical terms.
What changes should be made in help components?
  • How effective the help function is in answering learner questions.
What changes should be made in the interactions?
  • How effectively learners engage in and remain engaged in the interactive aspects of the module.

6. Questions

Based upon the anticipated decisions listed above, the following questions were addressed during this evaluation:

1. Is the scientific content of the RSUS lab module appropriate for integration into undergraduate geoscience courses?

2. To what degree do the goals and objectives of the RSUS lab module align with the goals and objectives of a range of undergraduate geoscience courses?

3. Are the pedagogical dimensions of the RSUS lab module appropriate for integration into undergraduate geoscience courses?

4. What is the range of instructional alternatives for integrating the RSUS lab module into undergraduate geoscience courses?

5. What are the technical requirements of implementing the RSUS lab module within the context of undergraduate geoscience courses?

6. With respect to its stated goals and objectives, what is the effectiveness of the RSUS lab module within the context of undergraduate geoscience courses?

7. With respect to unexpected outcomes, what is the effectiveness of the RSUS lab module within the context of undergraduate geoscience courses?

8. Given the intended undergraduate learners who will interact with the RSUS lab module, how can the design and navigational components of its graphical user interface (GUI) be enhanced?

9. What is the overall value of the RSUS lab module according to the participants (students and faculty) in this evaluation?

10. What recommendations can be made for enhancements in the RSUS lab module and similar modules?

11. What recommendations can be made for the integration of web-based instruction into undergraduate geoscience education?

12. What recommendations can be made for the enhancement of the RSUS lab module as a model or template for Web-based instruction in undergraduate geoscience courses?

7. Methods

There were two primary methods used for the evaluation of the RSUS lab module:

Usability Testing

Mike Hughes, who is both a usability expert and a doctoral student at The University of Georgia, facilitated the usability testing process. The usability testing allowed the PAGE team to see their product through the eyes of potential users in a lab environment. The usability testing approach had seven steps:

Beta Testing

Three diverse institutions of higher education agreed to serve as formal beta sites for testing the RSUS lab module:

An evaluation team consisting of Dr. Reeves, the primary evaluator, and a team member from the PAGE, either Ms. Kathryn Ginger (Iowa State and Jackson State) or Ms. Lynne Davis (Carroll Community College) visited each site to observe the RSUS lab module in use, interview students, and interview faculty.

8. Sample

Students for the usability testing sessions were recruited from Southern Polytechnic State University in Marietta, Georgia. These students were selected because they matched the primary criteria for participation in the evaluation:

The three beta test sites were selected from a list of twelve candidate sites because of the desire to see the RSUS in situ at three different types of institutions: a large land grant research university, a historically minority-oriented university, and a rural community college.

We were very fortunate to recruit the full participation of the faculty and students at the following institutions:

9. Results

Usability Testing Results

The usability testing of the RSUS lab module was conducted on December 14-16 at The Usability Center in Atlanta, Georgia. The usability test was facilitated by Mike Hughes, a professional facilitator who is also a doctoral student in the Department of Instructional Technology at The University of Georgia. Additional personnel assisting in the usability test included Tom Reeves, Kevin Peng, Mike Matzko, and Mike Sullivan from the Department of Instructional Technology at The University of Georgia. PAGE was represented by its Director, Dr. Mary R. Marlino.

Usability is an important issue in the design of any software, including interactive learning environments such as the RSUS lab module. Usability is a combination of the following user-oriented characteristics: 1) ease of learning, 2) high speed of user task performance, 3) low user error rate, 4) subjective user satisfaction, and 5) user retention over time. In usability testing, the user (specifically in this case, the learner) is the instrument. The user is given realistic tasks to perform and is observed by the evaluation team. A think-aloud protocol was used to help the evaluation team hear user thoughts and reactions to problems.

Usability testing is a rigorous, intensive process that generally requires the use of special facilities. This evaluation was aided greatly by access to a professional usability lab thanks to the generosity and support of Loren Burke, Director of The Usability Center in Atlanta. Figure 1 shows the layout of the lab in which the evaluations were conducted.

Figure 1: Illustration of usability lab.

The full usability testing report is available in Appendix A. The usability test indicated that the RSUS lab module was generally a well-designed exemplar of a web-based learning environment. However, several important problems in the design and graphical user interface of the module were detected, and specific actions to remove these usability blocks were recommended. These recommendations cover such areas as:

To address these results before the beta testing occurred, the following actions were taken by the design team at PAGE:

Beta Testing Results

The site visits took place according to the following schedule:

 Site                        Date                 Site Visit Team
 Iowa State University       April 6-8, 1999      Tom Reeves, Katy Ginger
 Jackson State University    April 28-29, 1999    Tom Reeves, Katy Ginger
 Carroll Community College   May 4-6, 1999        Tom Reeves, Lynne Davis

Three primary evaluation strategies were utilized at each site:

The results of each of these evaluation strategies are reported below by site.


Iowa State University (ISU)

Instructor Interview at ISU

Tom Reeves and Katy Ginger interviewed Doug Yarger in his office at Iowa State University. The following is not an actual transcript of the interview because the interview was not recorded. However, it is an approximate recreation of the interview based upon extensive notes.

Please describe your first reactions to the "Remote Sensing" module.

When I first looked at the module, I thought it wouldn't work in my course. It seemed too much like simple information dissemination. But then, I reconsidered. Hurricanes are a captivating topic, and I thought I could use the program to reinforce other things in the course, to check their understanding of related material. As it turned out, I think the module allowed me to offload some of the learning that I couldn't accomplish in class.

What did you have to do to incorporate this module into your course?

[Laughs] Oh, you don't want to know. Actually, I spent days reconfiguring this module. The original design gave me no information about how students would do with this module. But I was able to configure it for my use. I scaffolded the module so that students could benefit from it and I would know what they did. It was a lot of work, but every new thing is both a burden and a resource.

Why did you feel the need to reconfigure the module?

I thought the questions in the module were not the ones I would ask. On the other hand, there aren't any inappropriate questions in the module. But it is important for me to define the questions in such a way that students' answers would show me what they know.

Would you prefer a module that was less structured, one that includes only pieces for you to configure?

No! [Emphatically] This program had the whole fabric. I could modify it, but it was good the way it was. I would not want to construct a whole module on my own.

How would you improve this type of module?

The module didn't allow students to do anything. It is not really open-ended in any place. I prefer open-ended activities. I want students to look at data and tell me what they see. The satellite part with the hurricanes is really powerful, and I was able to use our course management tool to work with that.

What would you recommend to PAGE concerning future development efforts?

First, faculty need to be involved in the design, more than one faculty. You need to build resources with faculty, not for faculty. There are lots of resources out there, but people won't use them. They aren't theirs. Let's say you involved ten schools in the design, then you would have at least ten users. Cluster development projects would lead to ownership and use.

Second, you need to recognize that the pedagogical underpinnings for this sort of development are quite foreign for most faculty. We are not rewarded for teaching. Research is. To use technology like this in teaching, the buy-in has to be low penalty and high payoff.

Third, I think faculty actually need to go to Boulder to work with people there to develop resources. They need to get away so they can concentrate. They need to have other people around who think what they're doing is OK. A support team is critical for development. This is a long process.

What other comments do you have?

In this course, I am not trying to teach prediction skills per se, but it is a way for them to develop better mental models of science. They learn how to collect data, organize it to express what they see, explain and interpret it, and make conclusions. This is what is really important in a course like this. Remote sensing is an important secondary mental model in this course, and the module helps with this.

Thank you.

Observations at ISU

Tom Reeves and Katy Ginger attended a full class session of Meteorology 206 - Introduction to Meteorology. The class was held in a lecture hall that would accommodate more than 300 students. April 7, 1999 was an absolutely beautiful sunny day in Ames, Iowa, and before the class, Dr. Yarger had expressed concern that attendance might be low because the weather was so nice. But the class seemed well-attended.

Classroom activities began with a few announcements followed by a group response activity. The group response strategy is a technique Professor Yarger uses to remind students of prior learning and reinforce basic concepts. The students use half sheets of paper to record their responses and these are handed in at the end of each class. The data is used to check on class progress and possible misconceptions.

The next section of the class was Professor Yarger's lecture, which was supported by multimedia, including PowerPoint slides, video, and portions of the RSUS lab module. The topic was hurricanes, and the students viewed some especially gripping video of hurricane survivors talking about their terrifying experiences. When Professor Yarger used the RSUS lab module in his presentation, some of the images from the module, especially the video weather maps, appeared too small for viewing, especially for those students who were sitting toward the rear of the large classroom. However, no one complained. In fact, most of the students appeared to be very engaged throughout the lecture, and indeed throughout the whole class period.

The class ended with a brief weather discussion led by a graduate teaching assistant. This is also a daily feature of the class during which students discuss the current weather. Interestingly, there were some powerful systems building up that day, and there were a number of damaging tornadoes throughout Iowa the next morning.

Student Interviews at ISU

Tom Reeves and Katy Ginger interviewed nine students who agreed to remain after class as a group. The following is not an actual transcript of the group interview because the session was not recorded. However, it is an approximate recreation of the interview based upon extensive notes.

What were your first impressions of this remote sensing module?

How much time do you think you spent with the module?

How easy was it to interpret the graphics in the program?

What are the advantages of this type of web-based learning over other approaches?

What are the disadvantages of this type of web-based learning?

How does this program compare to other computer programs used in this course?

What are some of the outcomes of this module from your perspective?

How did you actually use the module?

What other comments do you have about the module?


Jackson State University

Instructor Interview at JSU

Tom Reeves and Katy Ginger interviewed Paul Croft in his office at Jackson State University. The following is not an actual transcript of the interview because the interview was not recorded. However, it is an approximate recreation of the interview based upon extensive notes.

How did you first find out about the "Remote Sensing" module and what were your first reactions to it?

I first heard about the module during the Satellite Workshop last June in Boulder. At that time I just looked through the first few screens because I didn't have time to do anything else. My first impression was excitement because it seemed to cover some important, basic materials. It has good pictures, not just satellite images but all kinds of graphics. The down side was that it wasn't clear how I could integrate it into my course. How you use something like this is what is really important.

How did you first use the RSUS lab module in your courses?

I was teaching a course about meteorology and the environment. This was a graduate course, Environmental Meteorology. There were eight students, six graduate students and two advanced undergraduates. I had my class look at it over two days, one fifty-minute class period the first day, and another 30 minutes the next class. I more or less threw them into the pool, telling them "Use this lab." We downloaded it so it was freely available in our lab. Then, I followed the session up with discussion and questions about remote sensing and related topics. I wanted them to be able to answer questions like: What is the importance of remote sensing in environmental meteorology? How could remote sensing data be used in real time? Or for research? How could remote sensing data, both visual and infrared, be used to interpret air quality? Those kinds of questions. Frankly, their knowledge of remote sensing was very limited before they used this module, even the graduate students. But the lab module helped a lot. They went through it individually, and then we had a group discussion.

How are you using the module now?

My colleagues and I are using it several ways, even with freshman students. We feel that our majors should know all of the content in this module, but they don't, so we have pointed them to it several times. Material on the Internet like this really depends on the personal relevance to the student at a specific time. After all, some sections of the module, or any online materials for that matter, are boring and tedious, but if it is relevant to something you need to know, it becomes a valuable resource. Whenever we use Internet materials like this, we always follow up with class discussions. Inevitably, new issues are raised by the module, and there are important insights that come out of the discussion.

Would you prefer a module that was less structured, one that includes only pieces for you to configure?

This does present a dilemma. On the one hand, with a self-contained module, everything is there. You can decide whether or not you want to use it. It's like a textbook. You either decide to use the whole book or not at all. You don't use some chapters from one book and others from a different one. The downside of a self-contained program is that if it doesn't fit your curriculum, it can be hard to integrate the whole thing into a course. On the other hand, I don't think that just providing the pieces is a better way. It takes too much time to figure out what to use where. I prefer the complete package. If the whole thing fits into a course, then it is all there. Even if it is self-contained, I can take some pieces out, although this takes some extra work.

What other web resources do you use?

We use our own web pages, and we use weather maps and some of the training course materials online. We also use some CD-ROM materials. The University of Illinois has a great site that is helpful for our students. But we wouldn't assign them to the whole site at Illinois because it's too complex. I like to use some videos in my courses, but they are hard to find. We don't have a projection system so we have to crowd around a monitor to see the material.

How would you improve this type of module?

I would add more help and support. If we assign this type of resource out of class, the students groan! They don't want to work independently outside of class. Access would be a big issue with some of the students. They would have to go to the library to use the module, and the computers there are always busy. Commuting students would complain the most. Interestingly, some of our students spend a lot of time online in chat rooms, but they don't seem to want to work on the web. I think it may be because when they encounter problems, they feel that they have no help. Technological literacy is still a big challenge. There are all kinds of things that can go wrong such as monitor settings, browser versions, Windows compatibility, and so forth. You really have to be pretty savvy to deal with all these issues. It really comes down to a literacy issue.

What would you recommend to PAGE concerning future development efforts?

Obviously, I am in favor of the development of more modules. We would like to be involved if possible.

What other comments do you have?

None really, except that I think it's important that these materials are being evaluated. I look forward to seeing the results.

Thank you.

Observations at JSU

Tom Reeves and Katy Ginger attended two class sessions during which the RSUS lab module was used, one class with meteorology majors and one with general undergraduates. The materials were used in the meteorology lab. As illustrated in Figure 2, the lab has state-of-the-art computers, but space in the lab is very tight. The lab tables were clearly not intended for computer use, but the students did not seem to mind the space limitations.

Figure 2. Meteorology lab at Jackson State University.

Most of the students appeared to prefer working in pairs through the RSUS lab module, although several worked on their own. The students freely discussed the content in the module, and they asked questions of each other and of the students nearby. When the system asked them to answer a question, there was usually some discussion and negotiation before one of the students entered a response.

Although they may have been influenced by the presence of two visitors in the lab, the students appeared to be genuinely engaged in interacting with the module. No students appeared to be accessing web material that was unrelated to meteorology. The students in both classes worked with the module for about thirty minutes, after which the instructor led a discussion related to the module and the current weather forecast. Each day a student in the class is assigned to make a forecast, and the content of that day's lesson is related to the forecast. This seems to be an excellent strategy for increasing the relevance of the material for the students.

Student Interviews at JSU

Tom Reeves and Katy Ginger interviewed nearly twenty students as a group. The following is not an actual transcript of the group interview because the session was not recorded. However, it is an approximate recreation of the interview based upon extensive notes.

What were your first impressions of this remote sensing module?

How much time do you think you spent with the module?

How easy was it to interpret the graphics in the program?

What are the advantages of this type of web-based learning over other approaches?

What are the disadvantages of this type of web-based learning?

How does this program compare to other computer programs used in this course?

What are some of the outcomes of this module from your perspective?

What other comments do you have about the module?


Carroll Community College

Instructor Interview at CCC

Tom Reeves and Lynne Davis interviewed Bill Kelvey in the library at Carroll Community College. The following is not an actual transcript of the interview because the interview was not recorded. However, it is an approximate recreation of the interview based upon extensive notes.

How did you first find out about the "Remote Sensing" module and what were your first reactions to it?

I really don't remember how I found out about it. Probably at a conference. My first impressions were that it was really easy to navigate. I liked the look and feel of it. It is easy on the eyes.

How did you first use the RSUS lab module in your courses?

I am not sure that I have figured out the best way to incorporate it into my courses yet. I don't want the students to feel that it is busy work. I have a real mixed class, mostly young students but a few older ones with none in the middle. We spent most of a class period using the site earlier this week. The younger students wanted to know right away, "Is this going to be on the test?" We're going to talk about it some tonight.

How else are you using the web here?

There is a lot of pressure to get some courses on the web here. Next semester, I'll be teaching meteorology totally on the web. I'll be a facilitator rather than a lecturer. I don't know how many students I can handle this way so I am capping enrollment at 20 for now. When I have used the web in regular classes, I have been disappointed by the degree to which students communicated and engaged in online discussions. This is a concern. We have a web master here who can help with the technical details, but we really need some instructional designers to help with course development.

Would you prefer a module that was less structured, one that includes only pieces for you to configure?

Not really. It's good to have a complete module. I am glad that I have learned to develop my own web pages, but I don't think most faculty should have to learn FrontPage! Plus handling graphics and scanning! Most faculty would prefer to have complete modules around important topics.

What challenges do you face in using the web in your courses?

Access is less of a problem here than some other places because we have over 600 computers for about 3,000 students. And many of our students have access at home or elsewhere. But not all do, so access is still an issue. As we move more and more to the web, do we have to guarantee access to computers? This is an unanswered question. The county closes us down on Sundays and holidays. And what will we do when the technology goes down? This is why I have my students do some things by hand. I have them draw maps. And they learn to draw isobars. You know these students are supposed to be the new high technology generation, but they're not. You still have to help them a lot with the technology.

How would you improve this type of module?

I would like to see more tests and quizzes. In fact, the tests could even be on paper with the images the students would interpret on the web. I would like to see a section about how other faculty are using this module. That would help.

What would you recommend to PAGE concerning future development efforts?

Keep us up to date with recent trends and hot topics. What is the latest research that we can use in the classroom? Provide us with real data in ways that we can use it. There are lots of meteorology materials out there, but someone needs to filter them. I teach four courses, and only one of them is meteorology. I teach two oceanography classes and an earth science course. There is a lot of preparation time. I don't have time to go out on the web and look for lots of materials, much less figure out how to integrate them into a course. I need whatever help I can get.

What other comments do you have?

I am glad that I used this module in my course. The students seemed to like it and I'm sure I'll use it again.

Thank you.

Observations at CCC

Tom Reeves and Lynne Davis attended a full class session of Geoscience 201 (Meteorology). There were eight young students in the class (late teens to early twenties), both male and female, and three older women (forties or older). The latter three students asked a disproportionate number of questions, and they were also the most willing to respond to the instructor's questions. The class began with a "chalk talk" (Bill Kelvey's words) about cloud formations. This was an engaging lecture during which Bill tried to keep the students engaged by asking frequent questions. The students seemed genuinely interested, especially in the explanation related to the formation of thunderstorms. During the second half of the class, the students viewed a well-produced Discovery Channel video about tornado chasers, and they seemed fascinated by it. The last part of the class was a lab exercise.

Student Interviews at CCC

Tom Reeves and Lynne Davis interviewed all eleven students in the Geoscience 201 (Meteorology) class. The following is not an actual transcript of the group interview because the session was not recorded. However, it is an approximate recreation of the interview based upon extensive notes.

What were your first impressions of this remote sensing module?

How easy was it to interpret the graphics in the program?

What are the advantages of this type of web-based learning over other approaches?

What are the disadvantages of this type of web-based learning?

How does this program compare to other computer programs used in this course?

What are some of the outcomes of this module from your perspective?

What other comments do you have about the module?

10. Discussion

The discussion of the results of the evaluation is organized according to the twelve overall questions that guided this evaluation:

1. Is the scientific content of the RSUS lab module appropriate for integration into undergraduate geoscience courses?

All three instructors clearly agreed that the content of the RSUS module was appropriate for undergraduates. The diversity of the institutions and the courses into which the RSUS lab module was integrated supports its appropriateness for its target audiences.

2. To what degree do the goals and objectives of the RSUS lab module align with the goals and objectives of a range of undergraduate geoscience courses?

The RSUS module was integrated into a wide range of courses at the three beta sites ranging from general geoscience courses to specific meteorology courses. Across these courses, it is clear that students need to understand the functions and utility of remote sensing using satellites, the differences between VIS and IR imagery, and how satellite imagery enables the tracking of weather systems such as hurricanes.

3. Are the pedagogical dimensions of the RSUS lab module appropriate for integration into undergraduate geoscience courses?

The instructional interactions in the RSUS module are appropriate, but both instructors and students believe that the program could have more interactions. Students especially desire more quizzes or tests that enable them to assess their understanding of the concepts and their ability to apply the knowledge to activities such as forecasting.

4. What is the range of instructional alternatives for integrating the RSUS lab module into undergraduate geoscience courses?

There is a wide range of instructional alternatives for integrating the RSUS lab module into undergraduate geoscience courses, including the following:

5. What are the technical requirements of implementing the RSUS lab module within the context of undergraduate geoscience courses?

The beta test sites in this evaluation used a variety of approaches for providing their students with the RSUS lab module. The program can be downloaded to a local server so that students have faster access to the module than if they were accessing it via the WWW. Where students have reliable, high speed access to the WWW, the preferred mode is access directly from the COMET/UCAR server. The most sophisticated and technically challenging approach to using the module was the initiative by Professor Yarger at Iowa State University who integrated portions of the RSUS lab module into his own web-based assessment and course management system. It is unlikely that many faculty will undertake this type of technical challenge.

6. With respect to its stated goals and objectives, what is the effectiveness of the RSUS lab module within the context of undergraduate geoscience courses?

The faculty and students at both Iowa State University and Jackson State University were unanimous in their endorsement of the RSUS lab module within the context of their courses. The students at Carroll Community College were less positive in their assessment of the program, although their instructor perceived it as an effective example of web-based instruction. The students at Carroll Community College used the program during a one-hour lab period, and some of them clearly disliked having to read the material online during a relatively short period of time. They expressed preferences for being able to print out the text portions of the program and being able to study the material at home on their own time.

7. With respect to unexpected outcomes, what is the effectiveness of the RSUS lab module within the context of undergraduate geoscience courses?

The most enthusiastic responses to the RSUS lab module were found at Jackson State University. Many of the students we interviewed there are meteorology majors, and they expressed a desire to have more materials like this. Outside the context of the interviews, several of the students expressed a strong interest in helping to develop similar interactive materials or perhaps modifying the current module so that it could be used in K-12 schools.

8. Given the intended undergraduate learners who will interact with the RSUS lab module, how can the design and navigational components of its graphical user interface (GUI) be enhanced?

The usability testing of the RSUS lab module eliminated most of the user interface problems evident in the first version of the program. There are still a few "glitches" that should be fixed. For example, in the middle of most of the screens is a "squashed" graphic that appears to read "Return Home," but it should actually read "Preload Graphics." There is also still a need to enhance the video controls as indicated in the usability test results. With respect to design, the most important need is to increase the size of the graphics and video to full screen or nearly full screen. This is especially important for those images that call for fine discriminations of phenomena such as snow and cloud cover. It seems reasonable that the imagery the students see should be at least the same size as the imagery that real meteorologists see on their systems.

9. What is the overall value of the RSUS lab module according to the participants (students and faculty) in this evaluation?

Most of the faculty and students at our beta test sites were very enthusiastic about the overall value of the RSUS lab module. The most compelling evidence of this was the frequent requests that similar modules be developed for other topics such as tornadoes.

10. What recommendations can be made for enhancements in the RSUS lab module and similar modules?

The primary recommendations center around three issues: larger graphics, more interactive exercises, and additional assessment options.

11. What recommendations can be made for the integration of web-based instruction into undergraduate geoscience education?

The instructors expressed an interest in having more information about how other faculty are integrating this type of resource into their courses. Integration is the key issue for faculty who are concerned that students may perceive this type of program as peripheral to the main instructional activities in a course.

12. What recommendations can be made for the enhancement of the RSUS lab module as a model or template for Web-based instruction in undergraduate geoscience courses?

The instructors clearly prefer modules such as the RSUS lab module to be provided as complete programs rather than pieces that they would need to assemble or integrate into their own web pages. The current RSUS lab module appears to be a robust prototype for future modules, provided the recommendations of larger graphics, more interactive exercises, and additional assessment options are followed. All faculty and several students expressed strong interest in being involved in the development of future modules.

11. Recommendations

The following recommendations are warranted based upon the findings of this evaluation:

1. If additional funding can be procured, the RSUS lab module should be modified to include larger graphics, more interactive exercises, and additional assessment options.

2. Regardless of whether the aforementioned enhancements can be undertaken, efforts to disseminate the current RSUS lab module should continue because it has been demonstrated to be appropriate for integration into a wide range of geoscience courses. This should include dissemination of information about how various faculty are integrating the module into their courses.

3. Given the recommended enhancements, the RSUS lab module is an appropriate template for the development of future interactive modules.

4. Efforts to involve faculty and perhaps even students in the development of future modules should be given a high priority.

12. Limitations

Usability Testing

The main focus of the usability test of the RSUS lab module was to determine ease of use of the learning environment during typical tasks in which a student might engage. This aspect of an interactive learning environment is critical to success, but it is not a comprehensive evaluation of the total effectiveness of the program. The following limitations apply:

Beta Testing

Beta testing was limited to three institutions of higher education, primarily because of budget restrictions for travel to out-of-state sites. Although the three sites certainly reflect a wide range of diversity with respect to the types of institutions included, they cannot be construed as a representative sample of all members of the PAGE community. It is up to the audiences of this evaluation report to determine the applicability of the findings within their own contexts.