The Design of an Instrument
to Assess Problem Solving Activities
in Technology Education
Roger B. Hill
Literally millions of dollars have been spent during recent years to build and equip new or renovated technology education laboratories and to implement contemporary instructional strategies (R. Barker, personal communication, February 14, 1997). In the state of Georgia alone, over $23.9 million has been spent since 1989 on modular-type programs (Gossett, 1997). Modular curriculum designs have been widely adopted, and the integration of math, science, and technology has been explored. Modular designs typically provide students, working in pairs, opportunities to progress through a series of guided learning activities with an emphasis on problem solving and a hands-on, minds-on approach to learning about technology. Modular lessons are available to address over twenty different technological topics, and more are being developed on a regular basis.
Although adoption of modular curriculum models is one of the most visible trends in contemporary technology education, other significant issues are also impacting the profession. In particular, efforts to equip students with the ability to solve problems, to think analytically, and to apply technical knowledge to real world situations have become integral to technology education (Technology for All Americans Project, 1996). Whether using a modular approach, a traditional unit lab approach, or some other organizational strategy for instruction, problem solving activities are a relevant and important part of technology education.
Proponents of adopting modular curriculum programs for technology education have cited numerous anecdotal accounts in support of the value and accomplishments of their programs, but systematic methods of defining and measuring student outcomes have not been sufficiently developed. As a result, the assessment of modular technology education programs and instruction has not been adequately implemented to guide allocation of resources, substantiate curriculum change, and establish the value of these educational activities within the larger educational community. While similar concerns might be raised about other trends in education, the movement toward modular curriculum designs in technology education has been one of the most prominent and significant issues for technology education professionals.
The Need for Assessment
Assessment is a process that uses information gathered through measurement to analyze or judge a learner's performance on some relevant work task (Sarkees-Wircenski & Scott, 1995). The process can also be applied to a systematic examination of materials, programs, or activities for the purpose of formulating a value judgement about their suitability for a particular application. Procedures used in performing an assessment should be predicated upon a clear understanding of goals for instruction and the desired learning outcomes, whether assessing learner performance or some other aspect of the learning environment. Just as a compass on a ship allows the captain to determine direction of travel and make course corrections, assessment provides the feedback needed by an instructor to successfully guide student learning activities.
In response to public and political pressure to assure accountability and reduce expenditures, assessment of educational programs is viewed as increasingly important (Lewis, 1995; Sewall, 1996). It is therefore essential that technology education professionals be equipped with tools to effectively assess how instructional materials and teaching methodologies are facilitating learning (Custer, 1996).
Assessment of technology education must go beyond the tacit approval sometimes afforded after a cursory look at facilities and activities. Whether one observes the spellbound visitors on the floor of the annual ITEA Conference Trade Show or the expressions of awe from first-time visitors to recently renovated technology labs, it is evident that technological gadgetry can initially fascinate. Care must be taken that this effect does not overshadow the outcomes that technology education should be producing. Students who successfully participate in technology education activities should develop a number of intellectual qualities, including "understanding and competence in designing, producing, and using technology products and systems, and in assessing the appropriateness of technological actions" (Wright & Lauda, 1993, p. 4). Creating appropriate assessment strategies, as well as establishing effective technological literacy efforts at each level of schooling, should be a primary goal of the profession (Technology for All Americans Project, 1996).
A key element in the study of technology and the development of technological literacy is the task of solving problems. The Technological Method Model (Savage & Sterry, 1990), described in the Conceptual Framework for Technology Education, spoke to the issue of how humans use technology to solve problems. This model specifically addressed problem solving as an essential component of working and competing in the modern workforce. The professional literature in the field of technology education is replete with references to problem solving and the importance of this intellectual process within the contemporary world (Johnson, 1987, 1994; McCade, 1990; Shlesinger, 1987; Tidewater Technology Associates, 1986; Waetjin, 1989). It is therefore imperative that professionals in the field incorporate problem solving concepts and strategies as a significant element in curriculum design and implementation.
The task of solving problems can be undertaken in a variety of ways, ranging on a continuum from simple trial and error to highly complex approaches. Many technology educators espouse the need to create opportunities for students to learn multiple approaches to problem solving, with movement toward models that foster student growth in strong mental methods of inquiry when solving technological problems (Herschbach, 1989; Hutchinson & Hutchinson, 1991; Todd, 1990; Wicklein, 1993; Zuga, 1989).
One of the aspects that should distinguish technology education from other program areas addressing technological content is the integrated study of technological processes, knowledge, and context. In presenting A Rationale and Structure for the Study of Technology, the Technology for All Americans Project (1996) identified these three components as universals for the study of technology. Knowledge and processes related to technology are taught within the context of manipulative activities with information, physical, or biological systems. Hands-on activities are important, but they are not aimed toward development of vocational competencies; they provide a setting for experiential learning related to technology. Of significance to this study, knowledge of technology and manipulative skills related to technology are relatively easy to measure and assess, while the use of technological processes, with the associated thinking and problem solving skills, is often challenging to measure and assess accurately.
Mental Processes as a Basis for Assessment
Halfin (1973) conducted the seminal research that identified the mental processes used by practicing technologists. Beginning with a review of the writings of ten high-level technologists, including persons such as Thomas Edison, Frank Lloyd Wright, and Buckminster Fuller, Halfin identified, and used a Delphi technique to validate, 17 processes that were universal to the work of technological professionals. Wicklein (1996) has since undertaken a follow-up study to re-evaluate these processes and further define each of them. In both instances, research was conducted for the benefit of industrial arts or technology education professionals, but the work would be applicable to anyone with an interest in the mental processes used by technologists.
The processes identified by Halfin were operationally defined in his work. In addition, Wicklein developed examples for each of the mental processes to describe their meanings more clearly so that each could be discriminated from the others (Hill, 1996). This task was completed following a thorough study of Halfin's work. Wicklein used these operational definitions and examples in developing instrumentation to re-evaluate the mental processes, and these materials were also a critical element in the development and use of the assessment described here. Table 1 lists the mental processes and the operational definition Halfin developed for each.
From a practical perspective, the mental processes used by practitioners in technological occupations provide a useful guide for the assessment of instructional activities and content in technology education programs. In some respects, basing curriculum content on mental processes, a relatively constant set of constructs, is more logical than focusing on technological products, which are constantly changing. Instructional use of mental processes and product technologies are not mutually exclusive; technology education inherently includes hands-on experiences with materials and instruction about technical content. When considered in the proper perspective, however, content related to materials and technical processes is characterized by rapid obsolescence, while technological mental processes remain relatively stable and continue to be useful for many years. Both should be included in technology education instruction, but the primary emphasis should be placed on the mental processes.
Table 1
OPTEMP codes and definitions for Halfin's mental processes

(DF) Defining the Problem or Opportunity Operationally. The process of stating or defining a problem that will enhance investigation leading to an optimal solution. It is transforming one state of affairs to another desired state.

(OB) Observing. The process of interacting with the environment through one or more of the senses (seeing, hearing, touching, smelling, and tasting). The senses are utilized to determine the characteristics of a phenomenon, problem, opportunity, element, object, event, system, or point of view. The observer's experiences, values, and associations may influence the results.

(AN) Analyzing. The process of identifying, isolating, taking apart, breaking down, or performing similar actions for the purpose of setting forth or clarifying the basic components of a phenomenon, problem, opportunity, object, system, or point of view.

(VI) Visualizing. The process of perceiving a phenomenon, problem, opportunity, element, object, event, or system in the form of a mental image based on the experience of the perceiver. It includes an exercise of all the senses in establishing a valid mental analogy for the phenomena involved in a problem or opportunity.

(CO) Computing. The process of selecting and applying mathematical symbols, operations, and processes to describe, estimate, calculate, quantify, relate, and/or evaluate in the real or abstract numerical sense.

(CM) Communicating. The process of conveying information (or ideas) from one source (sender) to another (receiver) through a medium using various modes. (The modes may be oral, written, pictorial, symbolic, or any combination of these.)

(ME) Measuring. The process of describing characteristics (by the use of numbers) of a phenomenon, opportunity, element, object, event, system, or point of view in terms which are transferable. Measurements are made by direct or indirect means, are on relative or absolute scales, and are continuous or discontinuous.

(PR) Predicting. The process of prophesying or foretelling something in advance, anticipating the future on the basis of special knowledge.

(QH) Questioning and Hypothesizing. Questioning is the process of asking, interrogating, challenging, or seeking answers related to a phenomenon, problem, opportunity, element, object, event, system, or point of view. Hypothesizing is the process of stating a theory of tentative relationship between two or more variables to be tested, the variables being aspects of a phenomenon, problem, opportunity, element, object, event, system, or point of view.

(ID) Interpreting Data. The process of clarifying, evaluating, explaining, and translating to provide (or communicate) the meaning of particular data.

(MP) Constructing Models and Prototypes. The process of forming, making, building, fabricating, creating, or combining parts to produce a scale model or prototype.

(EX) Experimenting. The process of determining the effects of something previously untried in order to test the validity of a hypothesis, to demonstrate a known (or unknown) truth, or to try out various factors relating to a particular phenomenon, problem, opportunity, element, object, event, system, or point of view.

(TE) Testing. The process of determining the workability of a model, component, system, product, or point of view in a real or simulated environment to obtain information for clarifying or modifying design specifications.

(DE) Designing. The process of conceiving, creating, inventing, contriving, sketching, or planning by which some practical end may be effected, or proposing a goal to meet societal needs, desires, problems, or opportunities to do things better. Design is a cyclic or iterative process of continuous refinement or improvement.

(MO) Modeling. The process of producing or reducing an act, art, or condition to a generalized construct which may be presented graphically in the form of a sketch, diagram, or equation; presented physically in the form of a scale model or prototype; or described in the form of a written generalization.

(CR) Creating. The process of combining the basic components or ideas of phenomena, objects, events, systems, or points of view in a unique manner which will better satisfy a need, either for the individual or for the outside world.

(MA) Managing. The process of planning, organizing, directing, coordinating, and controlling the inputs and outputs of the system.
A major impediment in past efforts to assess technology education outcomes has been the difficulty of defining and measuring such abstract concepts as technological literacy and problem solving ability. This dilemma has been exacerbated by the mindset of ex post facto assessment: the typical pattern has been to provide some type of instructional experience or treatment and then to test participants for learning outcomes. New forms of assessment, such as portfolios and journals, are gaining acceptance and are intended to enhance the learning process in addition to providing evidence of learning (Kane & Khattri, 1995; Travis, 1996). Unless actual development of materials is observed in some purposeful manner, however, even portfolios do not provide a means of assessing learning activities as they occur.
Educational research, particularly within the field of cognitive psychology, has provided a theoretical framework upon which to base an alternative form of assessment. In particular, research has shown a clear relationship between teaching behaviors and student achievement (Brophy & Good, 1986). Teaching behaviors include not only the actions of the teacher, but the instructional activities facilitated by that teacher. With this as a premise, alternative forms of assessment can be considered. Rather than only testing after learning activities are completed, assessment can be conducted during the learning activities. Used in conjunction with traditional forms of testing, this form of assessment holds great potential to provide additional feedback regarding the learning process. If outcomes measured in ex post facto testing are substandard, data gathered during the learning process can be used to analyze the causal factors. In addition, such a tool could assess processes that are not conducive to traditional testing, either because no appropriate test exists or because available tests are not sensitive enough to accurately measure the outcomes.
The purpose of this research was to develop and field test a technique for assessing the mental processes used by students as they participate in instructional learning activities in technology education. The technique focused on the mental processes used by technology practitioners in their work, as identified by Halfin (1973), and provided an objective measurement that could be used to assess the procedural content of a learning activity. The procedure developed in this study can be readily applied to programs that use a modular approach to instruction because of the relatively structured activities involved, the focus on problem solving, and a level of student movement that is conducive to observation. The technique, however, could be adapted and used with other forms of instructional problem solving activities. The results would be relevant for any program focused on developing the mental processes, of the type used by practicing professional technologists, for problem solving.
Method
The focal point of the assessment tool developed through this research was a measure of the duration and frequency of selected mental processes, necessary for effective problem solving, used by students in the completion of technology education learning activities. The tasks necessary to accomplish this included (1) developing a procedure for identifying the mental processes as they were used by students, (2) creating a tool to aid in analyzing the duration and frequency of the mental processes used by students, and (3) testing the system for consistency and reliability.
It is relevant to note that while the term assessment is often used within a context where a value judgement is made and one thing is determined to be better than another, this study uses the term operationally to describe procedures for identifying particular activities, determining how long they last, and determining how frequently they are repeated in practice. The procedure would enable an observer to determine whether a learning activity accomplished objectives related to the use of mental processes in problem solving. It was not, however, designed to directly measure the products or outcomes of the activities involved.
For purposes of this study, the mental processes identified by Halfin (1973) were used. A document was assembled in which a definition was stated and examples were listed for each of these processes. This document provided a ready reference to clarify each of the processes and was frequently referred to during the assessment procedures. Two-letter codes were also developed for each of the mental processes to be used for recording purposes. Table 1 provides a list of the two-letter codes, the mental processes, and their definitions. The document used during the study also included from 6 to 10 behavioral examples for each of the mental processes. It was not included in this manuscript for the sake of brevity, but a copy of it can be obtained directly from the author.
The basic procedure for identifying which mental processes were being used by students consisted of carefully studying the written materials and instructions for a particular learning activity, and then observing students as they completed the activity. Videotapes of students completing modular technology education learning activities were used in developing the assessment procedure reported in this study; this was necessary in order to test the consistency and reliability of the procedure. With appropriate instruction and experience, it was anticipated that the assessment could be performed during a live observation session if preferred. It should be noted that accurate identification of the mental processes required viewing the students themselves, the written materials they were following, and the apparatus being used, as well as hearing the students' conversations.
Three pairs of subjects were videotaped for purposes of this study. All students attended a high school or middle school located in the southeastern United States. One pair of male high school students, one pair of male middle school students, and one pair of middle school students consisting of a male and a female were voluntary participants in the study. The high school students were videotaped as they completed two activities of a construction module prepared by a commercial vendor of technology modules and equipment. The middle school students were videotaped as they worked through two activities, covering word processing fundamentals and page layout, of a color computer-aided publishing module produced by a supplier of computer peripherals. The authors who developed the computer-aided publishing module were experienced technology educators with a combined total of 28 years in the profession. The equipment used by the middle school students consisted of a desktop microcomputer, desktop publishing software, and a color ink-jet printer.
To field test the assessment procedure, written materials for the instructional activities were first carefully reviewed and notations were made, using the two-letter mental process codes listed in Table 1, to identify the mental processes students were likely to use at various points during the lesson. The list of mental processes with codes, operational definitions, and behavioral examples was kept at hand during the actual assessment procedure.
Preliminary testing of the observation procedure was done using a timer to record the duration and frequency of each mental process observed. Two independent observers each completed two observation procedures using the videotape of the two high school students working with the construction technology module. This phase of field testing demonstrated that agreement could be achieved between observers independently viewing videotaped technology education activities. In this initial test, the rate of agreement was 100% for identifying the mental processes being used, and the duration and frequency measures reached an 80% rate of agreement between the two independent observers. The recording process was cumbersome, however, and the tabulation of long observations tedious.
In response to difficulties noted in the preliminary testing phase, a computer program was developed as a tool for recording and analyzing the duration and frequency of the various mental processes used by students. The computer served as both a timer and a counter, allowing the observer to simply key in the two-letter mental process codes as they were observed and letting the program tabulate the results upon completion. The program, written in BASIC by the principal investigator of the study, was named the Observation Procedure for Technology Education Mental Processes (OPTEMP).
The refined assessment procedure was initiated by running the OPTEMP program and selecting the Mental Processes Measurement option from the main menu. After the observer responded to questions identifying the observation subject and the observer, timing was ready to start. The videotape was started and, as actions reflecting various mental processes were observed, the two-letter codes were keyed into the computer. A mental process code was entered at each change of activity. The computer program timed each event, tallied the frequency for each, and, following completion of the session, provided a printed summary for each mental process code entered.
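The original BASIC program is not reproduced here, but the timer-and-counter logic just described can be illustrated. The following Python sketch is a hypothetical reconstruction, not the original code: the observer keys in a two-letter code at each change of activity, and the program accumulates duration and frequency per code. The function and variable names, the prompt text, and the subset of codes shown are illustrative assumptions.

    import time

    # Subset of the two-letter codes from Table 1 (see Table 1 for the full list).
    CODES = {
        "DF": "Defining the Problem", "OB": "Observing", "AN": "Analyzing",
        "CM": "Communicating", "ME": "Measuring",
        "QH": "Questioning and Hypothesizing",
        "MP": "Constructing Models and Prototypes",
        "EX": "Experimenting", "CR": "Creating",
    }

    def observe():
        """Key in a two-letter code at each change of activity; Q ends the session."""
        durations = {}    # total minutes accumulated per mental process
        frequencies = {}  # number of coded episodes per mental process
        current, started = None, None
        while True:
            entry = input("Code (or Q to quit): ").strip().upper()
            if entry != "Q" and entry not in CODES:
                print("Unknown code; see Table 1.")
                continue
            now = time.time()
            if current is not None:  # close out the episode in progress
                durations[current] = durations.get(current, 0.0) + (now - started) / 60.0
            if entry == "Q":
                break
            frequencies[entry] = frequencies.get(entry, 0) + 1
            current, started = entry, now
        # Printed summary, analogous to the OPTEMP report for each code entered.
        for code in sorted(durations):
            print(f"{code} ({CODES[code]}): {durations[code]:.2f} min, "
                  f"{frequencies[code]} episodes")

    if __name__ == "__main__":
        observe()

Keying in DF, then CM, then Q, for example, would credit the time between the first two entries to DF and the remaining time to CM, with one episode tallied for each.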
To establish reliability of the OPTEMP procedure, two observers independently observed and recorded the duration and frequency of mental processes using the three videotaped sessions of middle school students completing activities using the computer-aided publishing module. Prior to beginning the first observations, the modular materials used by the students were reviewed and the observers discussed definitions of the mental processes. Observer 2 found it helpful to classify and label the anticipated mental processes in the instructor copy of the printed materials used by the students. This observer made adjustments and additions as the videotape was viewed, but the initial analysis aided the accuracy of the final observations recorded.
The first OPTEMP observation was completed using videotape of the male and female middle school students completing a page layout activity. The additional observations used videotapes of the male and female middle school students completing a word processing activity and the two male middle school students completing the page layout activity. The typical pattern used by the students as they worked together was for one student to read aloud the instructions and identify the significant steps while the other student worked with the computer. They worked together in this manner throughout each of the activities and discussed the instructions provided in the module as they completed the steps described there.
Results
The summary reports were the key artifacts used in assessing the outcomes of the three videotaped field tests analyzed using the OPTEMP. The total duration in minutes and a frequency count were recorded for each mental process as interpreted independently by the two observers. Table 2 provides the results for the first test of inter-rater reliability. Pearson correlation coefficients were calculated to determine how reliable the OPTEMP results were for the two independent observers. The correlation coefficient for overall duration on the mental processes observed in the first inter-rater reliability test was .94 and the correlation coefficient for frequency was .95 (see Table 3). There were two discrepancies in the mental processes coded, with observer 1 coding instances of analyzing and observer 2 coding experimenting. In addition, a 2.39 minute difference in total duration for the procedure indicated a need for better cueing of the videotaped session.
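As an illustration of how such coefficients can be computed, the Python sketch below pairs the Table 2 duration and frequency columns across the two observers. The study does not state how processes recorded by only one observer (the "--" cells) were treated; entering them as zeros is an assumption made here, so the printed values may differ slightly from the reported coefficients.

    # Inter-rater correlation from the Table 2 columns (a sketch, not the
    # original analysis; "--" cells are entered as zero by assumption).
    from statistics import correlation  # Pearson's r; available in Python 3.10+

    # Durations in minutes and frequency counts per mental process,
    # in Table 2 row order (DF, OB, AN, CM, ME, QH, MP, EX, CR).
    obs1_time = [1.70, 0.77, 0.42, 7.65, 2.18, 1.33, 7.53, 0.00, 3.40]
    obs2_time = [2.33, 0.65, 0.00, 5.30, 2.98, 0.75, 5.98, 0.75, 3.85]
    obs1_freq = [3, 7, 5, 34, 10, 11, 26, 0, 4]
    obs2_freq = [3, 3, 0, 18, 8, 5, 20, 4, 2]

    print(f"duration  r = {correlation(obs1_time, obs2_time):.2f}")
    print(f"frequency r = {correlation(obs1_freq, obs2_freq):.2f}")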
Results from the second inter-rater reliability test are presented in Table 4. The correlation coefficient for duration in this test was .88 and the correlation coefficient for frequency was .92. Two discrepancies in mental processes occurred with observer 1 recording experimenting and observer 2 recording instances of creating. Overall durations for the two observations were approximately equivalent for this session.
The results for a third inter-rater reliability test produced a correlation coefficient of .93 for duration and a correlation coefficient of .91 for frequency. The observers were in agreement concerning the mental processes that were observed (see Table 5) and the overall durations of the observations were suitably close.
Observer 1 completed a test and re-test for the videotaped session used in the first inter-rater reliability test, the tape of male and female students completing the page layout activity. The results of this test are presented in Table 6. The correlation coefficient for duration in this repeated test was .95 and the correlation coefficient for frequency was .97. One discrepancy in mental processes occurred with instances of experimenting being observed during the second viewing of the videotape.
Table 2
Inter-rater reliability for OPTEMP used by two independent observers with male and female middle school students completing page layout activity

Mental Process    Obs. 1 Time    Obs. 2 Time    Obs. 1 Freq.    Obs. 2 Freq.
DF                       1.70           2.33               3               3
OB                        .77            .65               7               3
AN                        .42             --               5              --
CM                       7.65           5.30              34              18
ME                       2.18           2.98              10               8
QH                       1.33            .75              11               5
MP                       7.53           5.98              26              20
EX                         --            .75              --               4
CR                       3.40           3.85               4               2
Totals                  24.98          22.59             100              63

Note. Time is in minutes. Obs. = Observer; Freq. = Frequency.
Table 3
Pearson correlation coefficients for observations using OPTEMP

Observation                                                        Duration    Frequency
Inter-rater reliability for two observers with male and
  female completing page layout activity                               .94          .95
Inter-rater reliability for two observers with male and
  female completing word processing activity                           .88          .92
Inter-rater reliability for two observers with two males
  completing page layout activity                                      .93          .91
Repeated observations by same observer with male and
  female completing page layout activity                               .95          .97
Table 4
Inter-rater reliability for OPTEMP used by two independent observers with male and female middle school students completing word processing activity

Mental Process    Obs. 1 Time    Obs. 2 Time    Obs. 1 Freq.    Obs. 2 Freq.
DF                        .88           1.95               5               8
OB                       2.32            .97              20               6
AN                       1.23            .97              10               3
CM                       9.17           9.18              45              36
ME                       1.13           1.38               5               3
QH                       2.02           3.77              16              15
MP                       9.32           6.80              43              27
EX                       1.23             --               2              --
TE                       5.28           4.38               1               4
DE                         --            .48              --               3
CR                       1.43           4.20               2               7
Totals                  34.01          34.08             149             112

Note. Time is in minutes. Obs. = Observer; Freq. = Frequency.
Table 5
Inter-rater reliability for OPTEMP used by two independent observers with two male middle school students completing page layout activity

Mental Process    Obs. 1 Time    Obs. 2 Time    Obs. 1 Freq.    Obs. 2 Freq.
DF                        .60           1.38               7               5
OB                       3.63            .97              43              12
AN                       2.02           1.12              24               7
CM                      11.88          13.43              80              64
ME                       1.82           3.58              16              13
QH                       1.27            .83              14               8
MP                      10.77           8.57              76              39
EX                        .85           1.12               4               4
CR                        .68           2.28               4               7
Totals                  33.52          33.28             272             159

Note. Time is in minutes. Obs. = Observer; Freq. = Frequency.
In addition to the numerical results reported in the tables, several conclusions were noted regarding the OPTEMP system. Both observers found that use of the procedure caused them to analyze what was happening during the technology education instructional activities in more detail than they had before. They also noted that the nature of the modular curriculum materials aided the process: because the observer could become familiar in advance with the basic path the students would follow, identification of most of the mental processes students would use was facilitated.
The overall agreement among independent observers regarding which mental processes were being used provided some evidence that the OPTEMP was valid. To provide additional evidence of validity, feedback from students about their own interpretation of the mental processes they used during technology education activities could be obtained and compared with the OPTEMP results. This technique was precluded in the present study by the young age of the middle school participants and their limited understanding of the mental processes as defined, but upper-level high school or postsecondary students would be capable of comprehending and distinguishing their own use of the mental processes.
Table 6
Observation times and frequencies for OPTEMP repeated by the same observer for male and female middle school students completing page layout activity

Mental Process    1st Obs. Time    2nd Obs. Time    1st Obs. Freq.    2nd Obs. Freq.
DF                         1.70             1.50                 3                 5
OB                          .77             1.35                 7                16
AN                          .42              .63                 5                 5
CM                         7.65             8.82                34                44
ME                         2.18             2.08                10                15
QH                         1.33              .87                11                10
MP                         7.53             8.28                26                38
EX                           --              .42                --                 3
CR                         3.40              .92                 4                 6
Totals                    24.98            24.87               100               142

Note. Time is in minutes. Obs. = Observation; Freq. = Frequency.
Conclusion and Recommendations
Based on the inter-rater reliability test results and the repeated test results, the OPTEMP was determined to be an effective tool for assessing the use of mental processes during completion of modular technology education learning activities. The data gathered using this technique would enable an instructor or researcher to accurately determine which mental processes, used by practicing technologists, were being implemented by technology education students. Activities could be modified and further assessed to incorporate key processes not being used and changes could be made to adjust the duration and frequency of processes presently in use.
The use of modular technology education instructional activities in the development and testing of the OPTEMP facilitated this study because students working in that environment were readily available, their limited physical movement facilitated videotaping, and the instructional content incorporated numerous opportunities for problem solving. The specific comments provided in the remainder of this section reference the use of modular curriculum materials, but they are not intended to preclude the use of the OPTEMP with other instructional formats.
These comments are also based on two key assumptions. The first is that the duration and frequency of using a mental process are related to learning outcomes specific to that process; in other words, the more a student uses a mental process, the more proficient that student will become at using it. The other assumption is that it is appropriate for technology education to facilitate student development of the mental processes used by practicing technologists. Halfin (1973) and Wicklein (1993, 1996) have provided a sound rationale for this.
The modular technology education lessons used in this study were not purposefully designed to provide students with opportunities to apply the mental processes identified by Halfin. As is the case with all well-designed technology education modules, problem solving activities were included, and the mental processes were used incidentally in the work necessary to complete the instructional activities. The OPTEMP has significant value for technology education professionals who intend to encourage development of mental processes in a deliberate way. By providing a tool for assessing the process content of modular learning activities, the OPTEMP facilitates comparison of the various instructional products on the market and makes possible more informed choices about the use of scarce educational resources.
Further development and revision of previously installed technology education modules is another concern for most practicing technology educators. These changes and enhancements are typically guided by problems such as apparatus not working correctly, student confusion about instructions, or the time requirements of an activity. Use of the OPTEMP could further inform revisions of existing materials. By identifying mental processes with low duration and frequency of use, revisions could be directed to provide problem solving experiences that include a more balanced treatment of the mental processes used by technologists.
The OPTEMP could find meaningful use in the preparation of technology education teachers. In considering curriculum design and other related issues, the observer using the OPTEMP would gain a heightened awareness of what is really happening for students as a learning activity is conducted. Just as a color or other characteristic of an object can go unnoticed unless attention is called to it, student learning activities can be observed but not really understood in the absence of a systematic approach for analysis. Use of the OPTEMP during curriculum classes, intern experiences, and perhaps as a tool during student teaching could enhance the preparation of future technology education teachers.
Wicklein (1993) has proposed development of a process-based technology education curriculum. In such a design, technology education instructional activities would be organized around the mental processes of technologists rather than around a product-based technology. Content related to communication, transportation, manufacturing, and construction would still be included, but the arrangement of this content would differ. Goodlad (1966) recommended designing curriculum that had a systematic, carefully considered approach, but not necessarily a step-by-step sequence of topics in ascending order of difficulty. Structuring curriculum around the mental processes would emphasize the more permanent constructs associated with thinking and problem solving. Technical content would be included, but specific skills and knowledge of materials would be presented with the understanding that changes could be quickly expected. Emphasis on the relatively stable mental processes would provide learning experiences that would have lasting value for students.
For those who would develop and adopt a process-based approach to curriculum, the OPTEMP provides a readily available assessment tool for analyzing the results. Instructional materials designed around traditional approaches to technology education can also be evaluated, but the instrument aligns with the stated purpose of process-based materials and would therefore provide a more accurate assessment of their effectiveness.
In this initial work with the OPTEMP, the focus was directed more toward the design of the learning activities than on individual learner performance. Further research is needed to explore uses of the OPTEMP for assessment of learning outcomes. At present no claim has been made other than that OPTEMP scores indicated the duration and frequency of mental processes that students used as they went about the problem solving activities included in technology education learning activities. It is anticipated that correlation between OPTEMP scores for student use of appropriate mental processes and gain scores on established tests would be moderate to high. Those students who spend time off task or who make significant use of inappropriate mental processes would be expected to be less efficient in their learning about technology. Use of the OPTEMP in conjunction with traditional tests might produce a diagnostic assessment which would aid in helping students develop more effective work habits and learning strategies.
Additional testing of the OPTEMP beyond this initial study is also needed to address issues of variability. Use of the procedure by a greater variety of observers and with different types of students, perhaps with differing cultural backgrounds, is requisite to further establishing reliability and validity. In addition, further testing with several additional types of modular technology education activities, and perhaps with other instructional strategies that incorporate problem solving, would determine the versatility of the procedure. The study described here has provided a description of the procedure and offered results of initial testing, but much additional work is needed to verify the usefulness of the OPTEMP and to further refine procedures for its application.
The issue of assessment will be of critical importance to every area of the educational enterprise as the new century dawns (Stiggins, 1995). The assessment system developed and tested in this study was shown to hold promise as a reliable and useful tool for analyzing important components of technology education problem solving activities and should be of benefit to the profession. The potential benefits range from aiding the identification of quality instructional materials to assisting in the preparation of technology education teachers. As previously mentioned, work is needed to further refine the OPTEMP procedures, to enhance the computer program, and to establish validity and utility of the system. Without adequate assessment procedures, technology education cannot reach its full potential and it will continue to struggle for recognition and acceptance within the greater educational community.
References
Johnson, S. D. (1987). Teaching problem solving. School Shop, 46(7), 15-17.
Johnson, S. D. (1994). Research on problem solving instruction: What works, what doesn't. The Technology Teacher, 53(8), 27-36.
Lewis, A. C. (1995). Performance testing. Education Digest, 61(4), 73.
McCade, J. (1990). Problem solving: Much more than just design. Journal of Technology Education, 2(1), 28-42.
Sarkees-Wircenski, M., & Scott, J. L. (1995). Vocational special needs. Homewood, IL: American Technical Publishers.
Tidewater Technology Associates. (1986). Problem solving. The Technology Teacher, 46(3), 15-22.
Todd, R. D. (1990). The teaching and learning environment. The Technology Teacher, 50(3), 3-7.
Travis, J. E. (1996). Meaningful assessment. Clearing House, 69(5), 308-312.
Wicklein, R. C. (1993). Developing goals and objectives for a process-based technology education curriculum. Journal of Industrial Teacher Education, 30(3), 66-80.
Zuga, K. F. (1989). Relating technology education goals to curriculum planning. Journal of Technology Education, 1(1), 34-58.
Roger B. Hill is an Assistant Professor in the Department of Occupational Studies at the University of Georgia, Athens, GA.