Monitoring Student Performance in Online Courses: New Game - New Rules

George Pappas, Ellen Lederman, Brooke Broadbent

VOL. 16, No. 2, 66-72

Abstract

Online learning environments are more challenging than face-to-face settings because the visual and aural cues of student performance are missing and there are limited process safeguards to avoid plagiarism and to ensure that the learner participating is the same person who supplies the course work. This poses challenges for a virtual instructor trying to monitor and assess online students' progress and to gain some assurance that knowledge transfer has been satisfactorily completed. This article presents the experience of a virtual instructor with adult students completing a health and safety course entitled Lockout. Three typical cases are presented in which the instructor must evaluate student performance.

Résumé

Online learning environments present a greater challenge than face-to-face ones because the visual and oral cues of student performance are absent, and the process safeguards for avoiding plagiarism and ensuring that the participating learner is really the person who submits the course work are limited. This poses a challenge for the virtual instructor in attempting to monitor and evaluate the online student's progress and to ensure the satisfactory completion of knowledge transfer. This article presents the experience of a virtual instructor with adult students completing a health and safety course entitled Lockout. Three typical cases are presented in which instructors must evaluate student performance.

Every culture must negotiate with technology, whether it does so intelligently or not. A bargain is struck in which technology giveth and technology taketh away. (Postman, 1992, p. 5)

Introduction

The growing trend toward using electronic means for knowledge and skills transfer requires the online facilitator to rethink the monitoring of student performance. The different medium demands a different way of evaluating student participation to ensure that the necessary knowledge and/or skills have been developed from the learning experience. The technology for written passwords, voice passes, and corneal verification has not yet reached a point of mass use that would ensure that the individual participating is also the one supplying the work. Even if the technology provides evidence of individual participation and contribution, issues such as plagiarism remain when the participant has access to a plethora of other sources. Where the old game allowed visible cues and clues of student performance, clearly the new game needs new rules. Online learning environments are more challenging than face-to-face settings because “the cues that govern ordinary social interaction are absent: eye contact, body language, facial expressions, and voice tone, for example” (Holt, Kleiber, Swenson, Rees, & Milton, 1998, p. 48). Given the Web instructor's other responsibilities, “creating the environment, guiding the process, moderating the process, managing the content and creating a community” (p. 48), assessing student performance can become quite challenging.

The body of literature on the evaluation of online learning is growing (1 million references were found on a recent Internet document search). The focus of most evaluation is on the materials rather than the participant. Research papers range from a University of Glasgow study on developmental evaluation (pretesting) of materials (Doughty, 2000) to a University of Malaysia report on formative evaluation as a means for improving distance education materials (Thompson, 1987). Although one recent study on social work reports that “learning technologies impact the learning of the social worker as well as their satisfaction and preferences for on-line learning” (Hick, 1999), there remains little on monitoring or assessing participants using a technology-based learning method.

In the immediate term, we need to examine the facilitator’s role in monitoring the online learning process so that the final result provides some assurance that knowledge transfer has been satisfactorily completed within the virtual environment. A new game requires new rules for both the coach and the players.

This report examines an online instructor’s experience with monitoring online student performance in a Web-based course entitled Lockout, developed by the Industrial Accident Prevention Association (IAPA). Lockout means physically neutralizing all energies (such as mechanical, electrical, hydraulic, chemical, steam, and gravitational) in a piece of equipment before any maintenance or repair work is done. The need for lockout training is broad in the manufacturing sector. Most workplaces have machinery and equipment that use one or more energy sources. If proper lockout is not performed, serious accidents can occur. Exposure to energy is a leading cause of industrial worker injury (IAPA, 2000). Federal and provincial Canadian governments regulate the practice of isolating and reducing exposure to energy sources. Individuals involved require both knowledge and skills. Training would include theory of energy movements and skill training in shutting down, locking out, and ensuring no accidental movement of equipment. This in turn would be followed by site-specific and hands-on procedural skills training or orientation. In the normal course of face-to-face training, visual cues of performance (e.g., work group participation, class discussion, asking questions) would be used to evaluate participation, and quizzes, tests, or applied skills would evaluate knowledge.

The Course

The Web course was developed to help participants learn to identify the need for, and types of, lockout procedures required in their workplace. Participants would learn about the concept of lockout, energy sources, relevant legislation, and responsibilities, and how to prepare a workplace action plan that would identify, assess, and recommend controls for worker exposure to lockout hazards. This was primarily a knowledge-based course with one testing opportunity to apply the knowledge to a case scenario. A record of successful completion would be awarded to individuals who met the established criteria: submitted exercises, participation in class discussions via regular e-mail or threaded discussions, and successful completion of a submitted final course quiz. The course was offered over a two-week period and took two to five hours to complete.

The Process

The new game required new rules. If the facilitator could not visually confirm participation, then other methods had to be used to reveal the work being done. The Web template allowed an assessment of hits (the number of times a student went into an area), the time spent by a user by date and by day of the week, and the work done in the course sections visited (e.g., exercises, Web sites, content). The one variable that could not be isolated was verification that the individual logging on was in fact the person performing the work.
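The article does not describe the template's internal data structures, but the kind of log review described above can be made concrete with a minimal sketch. The Python below assumes a hypothetical activity log in which each record carries a login ID, the course section visited, a timestamp, and minutes spent; all field names and example values are invented for illustration.

```python
# Illustrative sketch only: the IAPA template's actual log format is not
# documented in the article. This assumes a hypothetical record with a
# login ID, course section, timestamp, and minutes spent.
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime


@dataclass
class LogEntry:
    student: str        # login ID (cannot prove who was at the keyboard)
    section: str        # e.g., "exercises", "web_sites", "content"
    timestamp: datetime
    minutes: float


def engagement_summary(log: list[LogEntry]) -> dict[str, dict]:
    """Summarize hits, time on task, active days, and sections per student."""
    summary: dict[str, dict] = defaultdict(
        lambda: {"hits": 0, "minutes": 0.0, "days": set(),
                 "sections": defaultdict(int)}
    )
    for entry in log:
        stats = summary[entry.student]
        stats["hits"] += 1
        stats["minutes"] += entry.minutes
        stats["days"].add(entry.timestamp.date())
        stats["sections"][entry.section] += 1
    return summary


# Example: compare two students' effort before reviewing their postings.
log = [
    LogEntry("A", "exercises", datetime(2001, 5, 7, 19, 30), 45),
    LogEntry("A", "content", datetime(2001, 5, 8, 20, 10), 60),
    LogEntry("B", "exercises", datetime(2001, 5, 8, 21, 0), 5),
]
for student, stats in engagement_summary(log).items():
    print(student, stats["hits"], stats["minutes"], len(stats["days"]))
```

Summaries of this kind would support, but not replace, the facilitator's judgment, and, as noted above, they still cannot verify who was actually at the keyboard.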

In order to replace the face-to-face discussions and promote the sharing that might otherwise occur in a traditional classroom, participants were obliged to post their assignments directly to a threaded discussion forum for review by all the other participants. It was determined that the best system for encouraging participation was an open forum in which all participants could view one another's submissions and reply directly to them. Participation in the process would be required through e-mail, threaded discussion, and submission of assignments, as well as through the final quiz.

The final quiz would be viewed and evaluated only by the Web instructor as an assessment that the transfer of knowledge had occurred. This meant that the final quiz needed to be a one-shot (no changes) reply submitted to the facilitator on course completion. The final quiz required participants to apply the knowledge learned about recognizing, assessing, and controlling the lockout hazard to a real case scenario.

Evaluation

As mentioned above, in a normal face-to-face encounter of knowledge and skill training, the facilitator uses cues and observations to make judgments about the participants’ role in the process. The virtual environment lacks body language and other nonwritten cues, which reduces the facilitator’s ability to judge what the participant is doing during the learning sessions. The e-learning process does not in itself, using current mass technology, provide assurance that the individual responding possesses any or all of the information. The facilitator, who has no access to the physical or individual cues provided in normal learning situations, must rely on other techniques. Strategies must be developed to overcome these deficiencies and reduce the effect of partnering, especially where several persons may be registered from the same organization. Although working together needs to be encouraged, for evaluation purposes e-learning needs some form of process safeguards.

In the IAPA model, these safeguards were included in the measurement of individual involvement. The process itself provided techniques for assessing participation that, when combined with the answers to the final quiz, provided evidence that the knowledge transfer had been satisfactorily completed. The parameters for monitoring the student’s performance would enable the facilitator to assess whether sufficient work had been accomplished during the process. Immediate and complete submission of the final quiz answers without following the parameters of the new game rules eliminated the possibility of recognition of completion. Ensuring that the student was an active participant who read and used the theory therefore played a significant part in determining who would be granted this record of completion. The facilitator, who reviewed the involvement and evaluated the response based on the time and effort expended, could make the judgment that a record of satisfactory completion was in fact earned.
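The decision described here rested on the facilitator's judgment rather than an automated rule. As an illustration only, the sketch below encodes one plausible reading of the criteria (submitted exercises, participation in the threaded discussion, sufficient time on task, and a passing final quiz); the time threshold is an assumption, and the 50% passing mark is taken from the cases described below.

```python
# Illustrative only: the article describes a human judgment, not an algorithm.
# This simply makes explicit the idea that process evidence and the summative
# quiz must BOTH be satisfactory before a record of completion is granted.
from dataclasses import dataclass


@dataclass
class ParticipationRecord:
    exercises_submitted: bool
    posted_to_threaded_discussion: bool
    minutes_on_task: float   # e.g., from the activity-log sketch shown earlier
    quiz_score: float        # fraction correct on the final quiz


def earns_record_of_completion(
    p: ParticipationRecord,
    min_minutes: float = 120.0,   # assumed threshold, not stated in the article
    passing_score: float = 0.5,   # the 50% minimum mentioned in the cases below
) -> bool:
    """Require evidence of participation in the process plus a passing quiz."""
    participated = (p.exercises_submitted
                    and p.posted_to_threaded_discussion
                    and p.minutes_on_task >= min_minutes)
    return participated and p.quiz_score >= passing_score


# A passing quiz without participation (as in Case 2 below) earns nothing.
print(earns_record_of_completion(
    ParticipationRecord(True, False, 20, 0.55)))  # -> False
```

A rule of this shape mirrors the point of the cases that follow: a passing quiz alone, without evidence of participation in the process, would not earn the record of completion.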

Application

The following case examples demonstrate some approaches taken by the IAPA facilitator to assess whether the participants had earned successful records of completion in a variety of real situations.

Case 1

Participants A and B decide to work together. Their postings are identical. Individual responses to questions may come from either individual. They make no effort to respond to other participants' inputs. The facilitator reviews the time spent by each and finds that A is expending considerable time and effort whereas B is simply copying answers from the first (based on time on task and hits). A private e-mail from the facilitator to B questions the work done and requests clarification. If B responds in a manner that demonstrates comprehension, the facilitator then asks A for similar information on a different exercise. Comparisons of style and form are part of the equation. In reviewing final quiz answers, the time of day of responses may play an integral part in whether a certificate of successful completion is to be awarded. The facilitator, after reading and determining that the partnership worked to the mutual benefit of both individuals, awards a certificate of completion to each.

Case 2

Individual A fails to provide information throughout the two-week window of opportunity. On the final day, the individual posts all of the information required, including answers to the final quiz. The facilitator reads the postings and determines that a significant portion of the information came from sources other than the course documents. A review of time spent supports this view. Although the final quiz response could technically meet the minimum knowledge requirement (50%), the facilitator declines to issue a certificate of completion because of the lack of participation in the process.

Case 3

Individual A spends considerable time and effort in the process. The answers posted reflect much of the documentation found in the materials. Many resemble pat answers and may have been cut and pasted from the documentation, but they meet or exceed the posting requirements. The final quiz answer, however, lacks the detail required to demonstrate adequate knowledge, although the 50% minimum is met. Despite the involvement in the process, the individual is clearly not able to apply the information to the case scenario. The facilitator sends a private e-mail providing additional ideas about how the case study might be approached. The individual, having spent the time and effort in the system, should be able to identify the required actions. If the individual fails to respond or fails to supply the required information, the facilitator does not issue a record of completion.

Conclusion

If the culture of this new generation must negotiate with technology as Postman suggests, then the benefits that accrue from the facility and speed of use must also allow for the facilitator's need to modify the game rules. In order to minimize the costs and optimize what the technology is offering, the facilitator of a Web-based course must ensure that some controls are in place to identify involvement in the learning process. The facilitator does not have access to the total environment from which the participants in the e-learning forum arrive. Nor can it be determined what pressures led to the selection of the medium; these may range from a simple desire to learn to enforced participation. But the need to ensure successful completion falls directly on the facilitator, who has no physical cues and few means of verifying that the individual participating is in fact providing the work. Clearly more work is required to minimize false-identity responses, including the allocation of passwords, time, and so forth. This remains to be addressed.

Although there are no safeguards against false-identity participation, the IAPA has included at least one measure of assurance by building the monitoring and assessment of the learning process into the course structure. The facilitator can rely on the fact that the final score (the summative evaluation result) can be related to the effort expended in the process. By negotiating the game rules to suit what this new technology is offering, the facilitator can reasonably monitor student performance in the learning game.

References

Doughty, G. (2000). Evaluation of learning with information and communication technologies. Retrieved July 6, 2001 from rccenquire@gla.ac.uk/tcc/projects/elicit/

Hick, S. (1999). Formative evaluation of an innovative model for an online introduction to a social work course. Office of Learning Technologies. Retrieved July 5, 2001 from http://olt-bta.hrdc-drhc.gc.ca/publicat/reports_e.html/

Holt, M.E., Kleiber, P.B., Swenson, D.J., Rees, F.E., & Milton, J. (1998). Facilitating group learning on the Internet. New Directions for Adult and Continuing Education, 78, 48.

Industrial Accident Prevention Association (IAPA). (2000). Lockout: Hazard specific training. Web course. Toronto, ON: Author.

Postman, N. (1992). Technopoly: The surrender of culture to technology. New York: Knopf.

Thompson, R.D. (1987). Responsive formative evaluation: A flexible means for improving distance learning material. Journal of Distance Education, 2, 1.

George Pappas is IAPA's first virtual instructor and has pioneered the beginnings of Web facilitation standards for IAPA. He has 17 years of experience working in the health and safety field and has a graduate degree in education. He can be reached at gpappas@iapa.on.ca.

Ellen Lederman has been involved with the management and development of distance education initiatives for the last seven years. She holds graduate and postgraduate degrees in education and can be reached at elederman@iapa.on.ca. For a preview of IAPA Web courses, check out IAPA's homepage at www.iapa.on.ca.

Brooke Broadbent is an independent e-learning analyst and the founder of e-LearningHub.com, a center for e-learning excellence. He is writing a book about e-learning to be co-published next year by the American Society for Training and Development and Jossey-Bass. He may be reached at www.e-learninghub.com.

ISSN: 0830-0445