


Volume 33, Number 4
Summer 1996



Effectiveness of Computer Simulation for Enhancing Higher Order Thinking

Anu A. Gokhale
Illinois State University

The National Education Association Research Division (1994) stated that student acquisition of higher order thinking skills is now a national goal. In a world in which technology is changing rapidly, workers need to be able to think creatively and solve problems so that the United States can stay economically competitive. A primary objective of today's teachers is to prepare students for the world of tomorrow. Pogrow (1994) indicated that if students are to be competitive in the years to come, faculty need to be able to provide their students with the cognitive strategies that will enable them to think critically, make decisions, and solve problems.

According to Leutner (1993), in traditional education, the teacher is responsible for the students' learning. Teachers typically lecture to students who take notes and then memorize and recall the material to perform well on examinations. This type of learning environment is not appropriate for college students who bring life skills and increased reasoning ability to the classroom. In such a situation, it may be appropriate for students to take responsibility for their own education. One method of transferring the responsibility from the teacher to the student is through guided discovery.

Guided discovery was developed by Dr. Charles E. Wales at the Center for Guided Design, West Virginia University (Leutner, 1993). In contrast to unguided exploratory activities, guided discovery has been found to be an effective learning method (Veenman, Elshout, & Busato, 1994). This method stimulates group interaction and is challenging enough to push students to use resources beyond those available in the classroom. Menn (1993) evaluated the impact of different instructional media on student retention of subject matter and found that students remember only 10% of what they read; 20% of what they hear; 30% if they see visuals related to what they are hearing; 50% if they watch someone do something while explaining it; but almost 90% if they do the job themselves, even if only as a simulation. In other words, guided discovery through labs and computer simulations that are properly designed and implemented could revolutionize education.

The purpose of this study was to incorporate the characteristics of guided discovery in the use of computer simulation activities to explore the impact on the problem-solving ability of students. These activities were integrated into a traditional lecture-lab sequence.

Theoretical Framework

Although the importance of hands-on labs to the technology curriculum cannot be denied, Garcia (1995) cites several advantages of computer simulations over laboratory activities. First, there appear to be important pedagogical advantages to using computer simulations in the classroom. Second, the purchase, maintenance, and updating of lab equipment is often more expensive than computer hardware and software. Third, simulations raise no concerns about students' physical safety.

Thomas and Hooper (1989) discuss the instructional use and sequencing of computer simulation and its effect on students' cognitive processes. The sequence in which learning occurs influences the stability of cognitive structures (Ausubel, 1968). New knowledge is made meaningful by relating it to prior knowledge, and proper sequencing determines how effectively that prior knowledge is used. According to Gokhale (1991), simulations used prior to formal instruction build intuition and alert the student to the overall nature of the process; used after formal instruction, they offer the student an opportunity to apply the learned material.

According to Pogrow (1994), a learning strategy based on the Higher Order Thinking Skills Project (HOTS), involves three principles:

  1. Creating an intriguing learning environment,
  2. Combining visual and interactive learning experiences that help students to form mental representations, and
  3. Developing cognitive architecture that unifies their learning experiences.

Interactive computer simulations based on this strategy help students to create explanations for the events and argue for the validity of those explanations using a mixture of their own ideas and technical concepts in the simulation. In addition, simulations that employ an array of media will help bridge the gap between learning styles of students and teaching styles of instructors.

There is evidence that simulations enhance students' problem solving skills by giving them an opportunity to practice and refine their higher-order thinking strategies (Quinn, 1993). Computer simulations were found to be very effective in stimulating environmental problem solving by community college students (Faryniarz & Lockwood, 1992). In particular, computer simulation exercises based on the guided discovery learning theory can be designed to provide motivation, expose misconceptions and areas of knowledge deficiency, integrate information, and enhance transfer of learning (Mayes, 1992). In three studies, students using the guided version of computer simulation surpassed unguided students on tests of scientific thinking and a test of critical thinking (Rivers & Vockell, 1987). As a result of implementing properly designed simulation activities, the role of the teacher changes from a mere transmitter of information to a facilitator of higher-order thinking skills (Woolf & Hall, 1995). According to Magnusson and Palincsar (1995), simulations are seen as a powerful tool to teach not only the content but also thinking or reasoning skills that are necessary to solve problems in the real world.

In spite of the advantages of simulations, hands-on labs are tremendously important in the industrial technology curriculum, which is based on Dewey's experiential learning theory (Tanner, 1991). The basic premise of this theory is that students learn as a result of doing or experiencing things in the world, and learning occurs when mental activity is suffused with physical activity (Dewey, 1938; Smith, 1995). The professional success of a technologist is directly related to her/his ability to transfer knowledge gained in the academic environment to real-world situations. Acquisition of manipulative skills is only possible through the use of real instruments and real experimental data. Therefore, to enhance student learning, the technology curriculum must integrate the effective characteristics of both computer simulations and lab activities.

Purpose of the Study

This study examined the effectiveness of integrating guided discovery computer simulation into traditional lecture-lab activities to enhance the problem-solving ability of the students. The following research questions were examined in this study:

  1. Will there be a significant difference in achievement based on a problem-oriented test between students in the experimental group versus the control group?
  2. Will there be a significant difference in achievement on a drill-and-practice test between students in the experimental group versus the control group?

Methodology

The independent variable in this study was the method of instruction, a variable with two categories: computer-simulation and lab. The dependent variable was the posttest score. The posttest was made up of problem-oriented and drill-and-practice type items. The subject matter was small-signal amplifiers.

The study used a nonequivalent control group design. The level of significance (alpha) was set at 0.05. The meaningful difference or effect size was estimated at 0.8. Since the significance criterion, effect size, and sample size were known, standard tables were used to determine the statistical power of the test (Cohen, 1987). It was found to be 0.75; in other words, the Type II error rate was 0.25.
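The power figure above can be approximated with a short calculation. The sketch below is illustrative rather than a reproduction of the study's procedure (which used Cohen's tables): it estimates power for an independent-samples t-test with 16 students per group, d = 0.8, and alpha = 0.05, using a normal approximation to the noncentral t distribution. The normal approximation slightly overstates power relative to exact tables, and the result depends on whether a one- or two-tailed test is assumed.

```python
# Approximate post-hoc power for a two-sample t-test.
# Assumed parameters from the study: d = 0.8, n = 16 per group, alpha = 0.05.
from math import sqrt
from statistics import NormalDist

def approx_power(d, n_per_group, alpha=0.05, tails=2):
    """Approximate power of an independent-samples t-test via the
    normal approximation (ignores the negligible opposite tail)."""
    z = NormalDist()
    # Noncentrality: expected value of the t statistic under H1,
    # for two equal groups: d * sqrt(n / 2)
    ncp = d * sqrt(n_per_group / 2)
    z_crit = z.inv_cdf(1 - alpha / tails)
    # Probability that the test statistic exceeds the critical value
    return 1 - z.cdf(z_crit - ncp)

print(round(approx_power(0.8, 16, tails=1), 2))
print(round(approx_power(0.8, 16, tails=2), 2))
```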

A pretest was administered to all subjects prior to the treatment to assess students' prior knowledge of amplifiers and also to test the initial equivalence between groups. A posttest was administered to measure treatment effects. This test was designed to assess the content that was previously learned and to have the students apply the learned material. The pretest and posttest were not identical but parallel forms of the same test. Since the study was conducted over a period of six weeks, there was limited concern for the students becoming "test-wise."

Subjects

The sample for this study included 32 students enrolled in two sections of an electronics course offered in an industrial technology department at a state university in the Midwest. Sixteen students were enrolled in each section. A course in basic electronics was a prerequisite for enrollment. One section was randomly assigned to the control group and the other to the experimental group.

Treatment

Each group met twice a week for a period of one hour and fifty minutes. Both the control group and the experimental group were assigned to design, build, and test a three-stage amplifier within a six-week period. Both groups did a 35-minute prelab followed by a 75-minute lecture during one class period, and then a 110-minute postlab during the next class period. Students in both groups worked in self-selected pairs. Overall, treatment time was the same for both groups.

The prelabs for both groups were designed with four objectives in mind:

  1. Prepare students to learn new material,
  2. Build intuition and alert the student to the overall nature of the process,
  3. Expose misconceptions and areas of knowledge deficiency, and
  4. Establish the need for deeper understanding.

The lecture served to answer questions that were raised by the students. The postlab was designed to help students apply the learned material. The guided activities in the prelabs and postlabs helped to achieve a cognitively similar treatment for both groups.

There was only one difference between the treatments for the control group and the experimental group. The experimental group did all the prelabs and postlabs, and completely designed and tested the amplifier using computer simulation software; they built the amplifier in the lab only after the three-stage amplifier had been completely designed on the computer. The control group also did all the prelabs and postlabs but designed, built, and tested the amplifier in the lab itself.

The software used in this study was Electronics Workbench, developed by Interactive Image Technologies, Ltd. in Toronto, Canada. The program models a workbench for electronics. The large central area on the screen acts as a breadboard for circuit assembly. On the top is a shelf of test instruments and program controls, and on the right is a bin of parts. Clicking a mouse button causes an action to occur.

The program lets users construct a schematic for an electronic circuit on a computer display, simulate the activity of that circuit, display its activity on test instruments contained within the program, and print a copy of the circuit, the instrument readings and parts list. The program does not present lessons but rather allows users to follow any course they wish. At the same time, it contains on-line help including information on electronic parts and principles.

The simulation software was installed only on computers in a lab that did not have open lab hours. Thus, the students could access the software only during scheduled class time. Also, the electronics lab was made available to the students only during scheduled class hours. Therefore, there was limited concern regarding students' doing hands-on work on this assignment in excess of the time assigned by the instructor or, in this case, the researcher.

Both groups were told that, because of a limited number of Electronics Workbench licenses, only one section could use the software at a time. It was made clear that the other section would use the computer simulation software for its next six-week assignment. This was done to reduce the effect of treatment bias.

Instruments

The instruments used in this study were developed by the author. The pretest and posttest were not identical but parallel forms. They were designed to measure student understanding of small-signal amplifiers and hence belonged to the cognitive domain. Bloom's taxonomy (1956) was used as a guide to develop a blueprint for the pretest and posttest. On analysis of the pilot study data, Cronbach's alpha reliability coefficient for the tests was found to be 0.83. The paper-and-pencil tests consisted of 15 problems and 20 drill-and-practice items. Items belonging to the knowledge, comprehension, and application classifications were categorized as drill-and-practice items; these focused on design concepts. Items belonging to the analysis, synthesis, and evaluation classifications were categorized as problems; these dealt mostly with troubleshooting concepts.
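The reliability coefficient reported above is computed from an examinee-by-item score matrix. The sketch below shows the standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), applied to hypothetical item responses; the study's actual pilot data are not published.

```python
# Cronbach's alpha from an examinee-by-item score matrix.
# The response data below are hypothetical, for illustration only.

def cronbach_alpha(scores):
    """scores: list of rows, one row per examinee, one column per item."""
    k = len(scores[0])                      # number of items

    def variance(xs):                       # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical responses: 5 examinees x 4 dichotomously scored items
data = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
print(cronbach_alpha(data))
```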

Findings

A total of 32 subjects participated in this study. A seven-item questionnaire was developed to collect descriptive data on the participants. It revealed that the average age of the participants was 21 years, with a range of 19 to 36, and that the mean grade point average was 2.33 on a 4-point scale, with a range of 1.93 to 3.78. Three participants were female and 29 were male. Seventeen students were classified as sophomores and 15 as juniors. All students had taken only the prerequisite basic electronics course and had no additional coursework in electronics. Only three students stated that they had some work experience in electronics.

A t-test was conducted on the pretest scores for the two treatment groups. The mean of the pretest scores for the simulation group (3.45) was not significantly different from the lab group (3.56) (t = 1.29, p > 0.05). Hence, it was concluded that treatment groups were similar. By using Bartlett's test, the data were also tested for the assumption of homogeneity of variance. It was found that the variances were not significantly different (F = 0.31, p > 0.05).
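The two preliminary checks described above can be sketched as follows. The pretest scores here are illustrative placeholders, not the study's raw data; the sketch assumes scipy's `ttest_ind` and `bartlett` functions are available.

```python
# Pretest equivalence checks: pooled-variance t-test on the means and
# Bartlett's test for homogeneity of variance.
# The scores below are illustrative, not the study's raw pretest data.
from scipy import stats

simulation = [3, 4, 2, 5, 3, 4, 3, 2, 4, 5, 3, 4, 2, 3, 5, 4]  # 16 students
laboratory = [4, 3, 3, 5, 2, 4, 4, 3, 5, 3, 4, 2, 4, 3, 5, 3]  # 16 students

# Independent-samples t-test on pretest means (equal variances assumed)
t_stat, p_value = stats.ttest_ind(simulation, laboratory, equal_var=True)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Bartlett's test for the homogeneity-of-variance assumption
b_stat, b_p = stats.bartlett(simulation, laboratory)
print(f"Bartlett statistic = {b_stat:.2f}, p = {b_p:.3f}")

# p > 0.05 on both checks would support treating the groups as equivalent
```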

Because the pretest and posttest were not identical forms, the difference between pretest and posttest scores was not meaningful. The posttest scores were analyzed to determine the treatment effects. In addition, an analysis of covariance procedure, with the pretest as a single covariate, was used to reduce error variance by an amount proportional to the correlation between the pretest and the posttest. The correlation between them was significant (r = 0.56, p < 0.05).
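The covariance analysis can be sketched as a model comparison: fit the posttest on the pretest alone, then on the pretest plus a group indicator, and test whether the group term reduces residual error. This is a generic ANCOVA sketch with fabricated example data, not the study's analysis code.

```python
# ANCOVA as model comparison: does group membership explain posttest
# variance beyond the pretest covariate?  Illustrative data only.
import numpy as np

def ancova_f(pretest, posttest, group):
    """F statistic for the group effect, adjusting for the pretest."""
    n = len(posttest)
    y = np.asarray(posttest, dtype=float)
    # Reduced model: intercept + pretest covariate
    X_r = np.column_stack([np.ones(n), pretest])
    # Full model: intercept + pretest + group dummy (0 = lab, 1 = simulation)
    X_f = np.column_stack([np.ones(n), pretest, group])

    def sse(X):  # residual sum of squares from a least-squares fit
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        return float(np.sum((y - X @ beta) ** 2))

    df_error = n - X_f.shape[1]
    return (sse(X_r) - sse(X_f)) / (sse(X_f) / df_error)

# Fabricated example: the second group scores ~4 points higher after
# adjusting for the pretest
pretest  = [2, 3, 4, 5, 6, 2, 3, 4, 5, 6]
group    = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
posttest = [6, 8, 10, 12, 14, 10, 12, 15, 16, 18]
print(ancova_f(pretest, posttest, group))
```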

Research Question I

Will there be a significant difference in achievement based on a problem-oriented test between students in the experimental group versus the control group?

As shown in Table 1, the mean of the posttest problem scores for the simulation group (13.6) was significantly higher than that of the lab group (9.1) at the 0.05 alpha level (t = 2.89, p < .05). An analysis of covariance also yielded a significant F-value (F = 2.97, p < 0.01).

Research Question II

Will there be a significant difference in achievement on a drill-and-practice test between students in the experimental group versus the control group?

The mean of the posttest drill-and-practice scores for the simulation group (18.6) was slightly higher than that of the lab group (17.1), a difference that was not significant (t = 1.53, p > .05). The result is shown in Table 1. An analysis of covariance also yielded an F-value that was not statistically significant (F = 1.66, p > 0.05).

Table 1
Comparison of Test Performance by Instructional Method

Test Type            Method        N    Mean    SD      t
Problems                                              2.89*
                     Laboratory   16     9.1   4.32
                     Simulation   16    13.6   3.26
Drill-and-Practice                                     1.53
                     Laboratory   16    17.1   3.68
                     Simulation   16    18.6   3.75

Note. *p < .05

Discussion

After conducting a statistical analysis on the test scores, it was found that students who used the computer simulation software integrated into lecture-lab activities performed significantly better on the problems than students who were taught using the traditional lecture-lab method of instruction. Both groups did equally well on the drill-and-practice items. This finding is supported by the literature. According to Veenman, Elshout, and Busato (1994), problem-oriented simulations help develop higher-order thinking strategies and improve the cognitive abilities students employ in recall, problem solving, and creativity.

The computer-based simulation software enabled students to experiment interactively with the fundamental theories and applications of electronic devices. It provided instant and reliable feedback. Thus, it gave students an opportunity to try out different options and evaluate their ideas for accuracy, almost instantly. The lab students presumed that the lab equipment was not always accurate and reliable and they sometimes made the mistake of attributing their design errors to experimental errors. Thus, the simulation activity focused mainly on the mental activity that took place within the learner. The lab activity focused on the physical as well as mental activity.

In addition, the time needed for hands-on work may have contributed to the difference between the two groups. The lab group had to physically implement their ideas with real components and then test them, which took a lot more time. The lab students could evaluate only a limited number of options within the allotted time. Also, based on informal observations, many students in the lab group appeared to be easily frustrated if they took time to build a circuit to test an idea and it did not work as expected. In contrast, the students in the simulation group appeared excited, perhaps because it took relatively less time to test new ideas and concepts and they received immediate accurate feedback.

Another important aspect of this research was guided learning. During previous semesters, the author had observed that when students were given unguided assignments, the learning curve was relatively flat. This is supported by research showing that haphazard and poorly designed computer simulation activities do not enhance learning (Cope & Simmons, 1994; Rivers & Vockell, 1987). Students who are not certain where to start or what to try end up learning inefficiently. This applies to both laboratory and simulation activities.

Conclusions

Based on the results of this study, it can be concluded that effective integration of computer simulation into traditional lecture-lab activities enhances the performance of the students. Guided computer simulation activities can be used as an educational alternative to help motivate students into self-discovery and develop their reasoning skills. The lab activity can then focus on the actual transfer of knowledge. This strategy helps improve the effectiveness and efficiency of the teaching-learning process.

In situations where the objective of instruction is to learn the facts without application or transfer, method of instruction is not a significant factor. However, if the educational goal is for students to transfer and apply the knowledge to real-world problems, then simulations integrated into the class structure may be an effective learning strategy. Also, these activities should be based on guided exploratory learning and be designed to stimulate students' thinking processes.

It is recommended that further research be conducted to evaluate the effects of using guided-discovery instructional strategies on enhancing the problem-solving ability of students with different achievement levels, using different academic subject material. Also, there is a need to investigate the different cognitive models that students employ in understanding and evaluating technical concepts. This will provide the research community with vital insight into the design of computer simulations for improving higher-order cognitive skills.

Author

Gokhale is Associate Professor, Department of Industrial Technology, Illinois State University, Normal, Illinois.

References

Ausubel, D. P. (1968). Educational psychology: A cognitive view. New York: Holt, Rinehart, & Winston.

Bloom, B. S. (1956). Taxonomy of educational objectives, Handbook 1: Cognitive domain. New York: Longmans Green.

Cohen, J. (1987). Statistical power analysis for the behavioral sciences. Hillsdale, NJ: Erlbaum.

Cope, P., & Simmons, M. (1994). Some effects of limited feedback on performance and problem-solving strategy in a Logo microworld. Journal of Educational Psychology, 86(3), 368-379.

Dewey, J. (1938). Experience and education. New York: Macmillan.

Faryniarz, J. V., & Lockwood, L. G. (1992). Effectiveness of microcomputer simulations in stimulating environmental problem solving by community college students. Journal of Research in Science Teaching, 29(5), 453-470.

Garcia, J. R. (1995). Use of technology in developing problem-solving/critical thinking skills. Journal of Industrial Technology, 11(1), 14-17.

Gokhale, A. A. (1991). Effectiveness of computer simulation versus lab, and sequencing of instruction, in teaching logic circuits. Journal of Industrial Teacher Education, 29(1), 1-12.

Leutner, D. (1993). Guided discovery learning with computer-based simulation games. Learning and Instruction, 3(2), 113-132.

Magnusson, S. J., & Palincsar, A. (1995). The learning environment as a site of science education reform. Theory into Practice, 34(1), 43-50.

Mayes, R. L. (1992). The effects of using software tools on mathematical problem solving in secondary schools. School Science and Mathematics, 92(5), 243-248.

Menn, D. (1993, October). Multimedia in education. PC World, M52-M60.

National Education Association Research Division Publication. (1994). Multimedia: Its promises and challenges for public education.

Pogrow, S. (1994). Students who just don't understand. Educational Leadership, 52(3), 62-66.

Quinn, C. N. (1993). Cognitive skills and computers: "Framing" the link. Proceedings of the Fifth International Conference on Thinking, Townsville, Australia.

Rivers, R. H., & Vockell, E. (1987). Computer simulations to stimulate scientific problem solving. Journal of Research in Science Teaching, 24(5), 403-415.

Smith, E. (1995). Where is the mind? Knowing and knowledge in Cobb's constructivist and sociocultural perspectives. Educational Researcher, 24(6), 13-22.

Tanner, L. N. (1991). The meaning of curriculum in Dewey's laboratory school (1896-1904). Journal of Curriculum Studies, 23(2), 101-117.

Thomas, R. A., & Hooper, E. (1989). Simulations: An opportunity we are missing. Paper presented at the annual meeting of the International Association for Computing in Education, San Francisco, CA.

Veenman, M. V., Elshout, J., & Busato, V. (1994). Metacognitive mediation in learning with computer-based simulations. Computers in Human Behavior, 10(1), 93-106.

Woolf, B., & Hall, W. (1995). Multimedia pedagogues: Interactive systems for teaching and learning. IEEE Multimedia, 74-80.

Reference Citation: Gokhale, A. A. (1996). Effectiveness of computer simulation for enhancing higher order thinking. Journal of Industrial Teacher Education, 33(4), 36-46.

