Abstract
This study ranked constructs articulated by Childress and Rhodes (2008) and identified the key indicators for each construct as a starting point to explore what should be included on an instrument to measure the engineering design process and outcomes of students in high schools that use the PLTW and EbD™ curricula in Idaho. A case-study design was used. Data were collected in two stages. In the first stage, a content analysis of the PLTW and EbD™ curricula was conducted to identify the indicators associated with the six constructs articulated by Childress and Rhodes (2008). In the second stage, the constructs and key indicators or concepts were placed on a survey and sent to experts, who rated each indicator’s importance for assessment and difficulty to assess. Main findings included engineering and human values and the application of engineering design being ranked first and second, respectively, for inclusion on an instrument to measure the engineering design process and outcomes. In addition, a total of 141 indicators were identified across all constructs. The indicators identified provide a useful list of measures for technology and engineering teachers. Selected indicators can be used by math, science, technology, and engineering education teachers as they coordinate the teaching of STEM concepts and collaborate in designing the project-based activities they engage students in solving.
Keywords: assessment; EbD™; engineering design process; PLTW; problem-based learning; project-based learning; STEM.
Introduction
Problem-based learning (PBL) promotes deep thinking and problem-solving skills (Woods, 1996). It has proven to be an effective way to learn subject knowledge, and in most PBL programs, “the goal is to empower the students with the task of creating the learning objectives that are important to them” (Woods, 2000, p. 2). Students are confronted with a scenario constructed around real-life problems, which by their nature are ill-structured, open-ended, and ambiguous; the scenario launches students’ inquiry as they collaborate to find solutions (Banks & Barlex, 2014; Woolfolk, 2013). Project-based learning is commonly used in technology and engineering education. Because project-based learning shares many instructional, multidisciplinary traits with PBL, the terms are often confused or used interchangeably (Honey, Pearson, & Schweingruber, 2014).
According to Banks and Barlex (2014), the difference between project-based learning and PBL is that:
PBL has tended to be a way of configuring the curriculum and relating what the students know to actual, real-world problems which in turn leads them to find out new knowledge and skills to bring to bear on the problem. Rather, project-based learning has been more about a pupil choosing an extended activity that they are interested in and using it as a vehicle for demonstrating their current capabilities, but also including demonstrating their abilities in researching and investigating new knowledge and acquiring skills as required. (p. 141)
Project-based learning can also be built on authentic, real-world situations or problems (Edström, Soderholm, & Knutson Wedel, 2007). As they work in groups, students are not restricted in where they may look for answers. In a review of the research on project-based learning, Thomas (2000) articulated five criteria that characterize projects:
- “Projects are central, not peripheral to the curriculum,”
- “focused on questions or problems that ‘drive’ students to encounter (and struggle with) the central concepts and principles of a discipline,”
- “involve students in a constructive investigation” (p. 3),
- “student-driven to some significant degree,” and
- “realistic, not school-like” (p. 4).
“PBL incorporates real-life challenges where the focus is on authentic (not simulated) problems or questions and where solutions have the potential to be implemented” (Gordon, 1998; as cited in Thomas, 2000, p. 4).
Assessment refers to the process of determining the extent to which students are achieving the intended learning outcomes (Gronlund, 1998). In PBL and project-based learning, assessment should emphasize problem solving, critical thinking, and reasoning skills. Creating problems that are similar to tasks accomplished in real life (authentic tasks) is a key principle of assessment in both paradigms of instruction. However, repeatable assessment techniques are very challenging to develop because of the subjective nature of PBL and project-based learning. In fact, McCracken and Waters (1997) believed that the requirement for authentic tasks in problem solving conflicts with the requirement for assessments to be repeatable, because authentic tasks are themselves ill-structured and difficult to assess objectively.
Assessments in Technology and Engineering Education
The inclusion of engineering design as a part of the Standards for Technological Literacy has resulted in more curricula promoting engineering design activities. Engineering design is a systematic and often iterative approach to designing objects, processes, and systems to meet human needs and wants (National Research Council, 2012). The Standards for Technological Literacy define engineering design as “the systematic and creative application of scientific and mathematical principles to practical ends such as the design, manufacture, and operation of efficient and economical structures, machines, processes, and systems” (International Technology Education Association, 2007, p. 238). Engineering design problems are project-based activities during which students use the engineering design process to solve the problem while working in groups.
Traditional engineering education programs at the college level use a variety of methods to collect evidence that students are achieving intended learning outcomes. These include written and oral questions, performance ratings, product reviews, journals, portfolios, and other self-reports such as inventories and questionnaires. Written assessments include multiple-choice and other closed items, calculations, and open-ended questions. Oral questions, on the other hand, enable teachers to uncover students’ misconceptions; they require students to think on their feet and speak coherently. Journals and portfolios provide records of students’ individual and collaborative efforts in design projects. “They reveal students’ critical thinking and reasoning skills, and record the steps students followed in an engineering [design] process” (Gray, 2007, p. 161). Performance ratings can be used to assess students’ processes and products in engineering design. Rating scales that define degrees of quality, along with rubrics (which list the qualities of a performance, process, or product), are used to assess the student. Self-report measures allow “students to reflect on their learning experiences” and help “them to see more clearly the connections among the concepts they have learned, as well as the applications of these concepts to new situations” (Gray, 2007, p. 161).
Addressing the infusion of engineering design at the K–12 level, some researchers have indicated that there are still open issues in assessment (Kelley, 2008; Lewis, 2005; Wicklein, 2005). Technology educators face these issues or challenges when they seek to implement engineering design in their curriculum. For example, the past few years have seen more school districts in Idaho adopting either the Engineering by Design (EbD™) or the Project Lead the Way (PLTW) curriculum in their technology and engineering education programs. Some teachers and administrators, including the program director for the Technology and Engineering program in the State of Idaho, have expressed the need to explore a repeatable assessment tool to measure engineering design outcomes, irrespective of whether a school uses the EbD™ curriculum or the PLTW curriculum. Other teachers, including science and math teachers in some smaller school districts who also teach technology education, think that it would be helpful to have an instrument that guides them in their assessment. As a starting point to explore this issue, the authors decided to identify indicators to measure the constructs associated with the engineering design process and outcomes, particularly at the Grades 9–12 level, the level at which the EbD™ and PLTW curricula are primarily used in Idaho.
The Framework
The conceptual framework for this study drew on the work of Childress and Rhodes (2008) in which they examined engineering design content that should be taught in high school curricula. They articulated a framework to define the engineering design curriculum content. The seven categories were identified through a modified Delphi approach. They are:
- “Engineering design . . . [that] emphasizes the importance of creativity in designing engineered solutions to problems . . . [as well as] design iteration . . . and tradeoffs” (p. 7).
- “Application of engineering design . . . [that] includes outcomes related to specific design activities . . . [including] experimentation, prototyping, and reverse engineering” (pp. 7–8).
- “Engineering analysis . . . [that] includes using mathematics to optimize solutions, and . . . emphasizes the use of mathematics and science in the engineering design process” (p. 8).
- “Engineering and human values . . . [that consists of] the interaction of engineering design and society . . . [for instance,] safety and the environment versus costs and ethics” (p. 8).
- “Engineering communication . . . [that includes] all sorts of communications important to the engineering design process” (p. 8).
- “Engineering science . . . [that] includes many of the traditional engineering ‘sciences’ such as statics and dynamics . . . [as well as] material processes, ergonomics, energy power, etc.” (p. 8).
- “Emerging fields of engineering . . . [that includes] nanotechnology . . . [and] genetic engineering” (p. 8).
In this research study, we used six of the seven categories. The seventh category, emerging fields of engineering, was not used because it related mainly to nanotechnology, which is not covered in the high school curriculum.
Purpose of the Study
The purpose of this study was to identify indicators for each of the constructs identified by Childress and Rhodes (2008) that can be used by STEM teachers in Idaho as a guide when they are assessing design outcomes of students in high schools, irrespective of whether the curriculum in use is EbD™ or PLTW. In addition, these indicators can provide researchers in STEM with items that can be used in the development of an instrument for assessment in engineering design at the high school level. The research questions that guided this study are:
- How are the constructs identified by Childress and Rhodes (2008) ranked by professional engineers and educators in terms of criticality for inclusion on an instrument to measure engineering design outcomes in high schools in Idaho?
- What are the key indicators associated with each of the constructs identified by Childress and Rhodes (2008) to measure engineering design outcomes in high schools in Idaho?
Research Design
A case-study design was used. Case-study research involves the study of a case within a real-life, contemporary context or setting. It is a qualitative approach in which the investigator explores a real-life bounded case over time, using detailed data collection (Yin, 2009). A letter was sent to the program manager for the Technology and Engineering Education program requesting permission for the two schools’ participation.
The Cases
Two cases were examined: one school in northern Idaho that uses the EbD™ curriculum and another in southern Idaho that uses the PLTW curriculum. The school in northern Idaho had its own unique way of organizing and supplementing the EbD™ course material. The Fundamentals of Technology course is taught in Grade 9 and is offered for only one semester. The technological design curriculum is taught in Grade 10 and covers topics such as career search, sketching, toy design (which the instructor uses for teaching shop safety, power tools, and finishing), Logo design concepts, mousetrap cars, SolidWorks™ for bridge building, CO2 cars, and an additional design problem. The curriculum emphasizes the engineering team concept and encourages creative design for all students. The Advanced Design Applications class, taught in Grade 12, uses a material science curriculum developed by Energy Concepts Inc. that includes solid materials, metals, polymers, ceramics, and composites. The emphasis is on the importance of materials engineering to the manufacturing process. The engineering design courses included more SolidWorks™, robotics, and the VEX curriculum, as well as total quality management. Each course requires students to complete a project.
The school in southern Idaho uses the PLTW curriculum. Introduction to Engineering is taught in Grade 9 and focuses on the design process and its application. Principles of Engineering is taught in Grade 10 and introduces major concepts that students encounter in postsecondary engineering courses, such as mechanisms, statics, materials, and kinematics. There are five specialization courses within PLTW: Aerospace Engineering (AE), Biotechnical Engineering (BE), Civil Engineering and Architecture (CEA), Computer Integrated Manufacturing (CIM), and Digital Electronics (DE). Digital Electronics and Aerospace Engineering are taught in Grade 11. Engineering Design and Development (EDD) is taught in Grade 12. This is the capstone course, in which students work in teams to design and develop solutions to a problem by applying the engineering design process.
Procedure
Data Collection
Data were collected in two stages. In the first stage, a content analysis of the PLTW and EbD™ curricula was conducted to identify the indicators associated with the six constructs identified by Childress and Rhodes (2008). In the second stage, the constructs and key indicators or concepts were placed on a survey form and sent to experts, who rated each indicator’s importance for assessment and difficulty to assess.
Content analysis
A qualitative content analysis of selected courses from the EbD™ and PLTW curricula used by each school was conducted to identify concepts of engineering design that were associated with the constructs identified by Childress and Rhodes (2008). These concepts are referred to as indicators in this study. Content analysis is a research tool in which researchers quantify and analyze the meanings and relationships of words and concepts within a text (Busch et al., 2012; Krippendorff, 2004). Content analysis enables researchers to sift through large volumes of data in a systematic fashion with relative ease. It also allows inferences to be made that can then be corroborated using other methods of data collection. The curriculum materials analyzed from the PLTW and EbD™ curricula are displayed in Table 1.
Table 1
Curriculum Materials Analyzed for PLTW and EbD
Grade | PLTW Curriculum Materials | EbD Curriculum Materials |
---|---|---|
10 | Principles of Engineering: Lessons, Activities, Projects, PowerPoints, Assessments, Teacher Notes, Student Resources, ABET Concepts, National Science Education Standards, Standards for School Mathematics, Standards for the English Language Arts, Standards for Technological Literacy, and the Principles of Engineering PLTW textbook | Technological Design: Lessons, Activities, Projects, Assessments, Teacher Notes, and Student Resources |
11 | Digital Electronics: Lessons, Activities, Projects, PowerPoints, Assessments, Teacher Notes, Student Resources, ABET Concepts, National Science Education Standards, Standards for School Mathematics, Standards for the English Language Arts, Standards for Technological Literacy, and the Digital Electronics PLTW textbook. Aerospace Engineering: Lessons, Activities, Projects, PowerPoints, Assessments, Teacher Notes, Student Resources, ABET Concepts, National Science Education Standards, Standards for School Mathematics, Standards for the English Language Arts, and Standards for Technological Literacy | Advanced Design Applications: Lessons, Activities, Projects, Assessments, Teacher Notes, Student Resources, and Material Science Textbooks |
12 | Engineering Design & Development: Lessons, Activities, Projects, PowerPoints, Assessments, Teacher Notes, Student Resources, ABET Concepts, National Science Education Standards, Standards for School Mathematics, Standards for the English Language Arts, and Standards for Technological Literacy | Engineering Design & Robotics: Lessons, Activities, Projects, Assessments, Teacher Notes, Student Resources, and Robots program materials by Intelitek |
Coding. Two coders assigned codes to the six constructs (see Table 2). The coders then examined the curricula to identify engineering design concepts and categorized each concept under one or more of the constructs of Childress and Rhodes (2008). To ensure intercoder reliability, each coder was given a copy of the Grade 10 curriculum materials for both PLTW and EbD™. The researcher provided instructions to the coders prior to the coding process. The coders independently highlighted words and phrases relating to engineering design concepts in the Grade 10 curriculum materials of both PLTW and EbD™. The coders then met to review and discuss their findings. Discrepancies were discussed and resolved. The process was repeated until an intercoder reliability of 87% was obtained. Krippendorff (2004) indicated that, in order to assure that the data under consideration are at least similarly interpreted by two or more coders, it is customary to require an intercoder reliability of 80% or more; therefore, the intercoder reliability for this study was well within acceptable levels. The coders then proceeded to perform a content analysis of the remaining sample of curriculum materials for both PLTW and EbD™.
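The study reports simple percentage agreement rather than a chance-corrected coefficient. As a minimal sketch of how such a figure can be computed, the following Python snippet (the phrases and code assignments are hypothetical, not taken from the study’s data) treats a highlighted phrase as an agreement only when both coders assigned it the same construct code:

```python
def percent_agreement(codes_a, codes_b):
    """Simple percent agreement between two coders.

    codes_a, codes_b: dicts mapping each highlighted phrase to the
    construct code (e.g., 'ED-CIT') that a coder assigned to it.
    A phrase highlighted by only one coder counts as a disagreement.
    """
    all_phrases = set(codes_a) | set(codes_b)
    agreements = sum(1 for p in all_phrases if codes_a.get(p) == codes_b.get(p))
    return agreements / len(all_phrases)

# Hypothetical excerpt from one pass over the Grade 10 materials
coder_1 = {"design iteration": "ED-CIT", "prototype": "ED-EPR", "ethics": "ED-HV"}
coder_2 = {"design iteration": "ED-CIT", "prototype": "ED-EPR", "ethics": "ED-C"}
print(f"{percent_agreement(coder_1, coder_2):.0%}")  # 67%
```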
Table 2
Constructs and Codes
Construct | Code |
---|---|
Engineering design that emphasizes the importance of creativity in designing engineered solutions to problems, as well as design iterations and tradeoffs | ED-CIT |
Application of engineering design that includes outcomes relating to design activities, experimentation, prototyping, and reverse engineering | ED-EPR |
Engineering analysis that includes mathematics in optimizing solutions and the use of both science and math in the engineering design process | ED-MSO |
Engineering and human values that consists of the interactions between engineering design and society, such as safety and the environment versus costs and ethics | ED-HV |
Engineering communication that includes all sorts of communications important to the engineering design process | ED-C |
Engineering science that includes the traditional sciences such as statics and dynamics as well as material properties, energy, power, etc. | ED-ESD |
After words relating to concepts of engineering design were identified, similar concepts were grouped together. The total number of words relating to engineering concepts identified by the coders amounted to 711, of which 618 were common to both coders. Some of the words were derivatives of the same root word, so the list was reduced to a final, manageable descriptive frequency list. This was done by keeping the highest-frequency word found within each group of similar words. For example, one group of words found by the coders was communicate, communication, and communications; the final word selected was communication because it had the highest frequency. A part of the final frequency list of words is shown in Table 3.
Table 3
Part of Final Frequency List
Word | Frequency |
---|---|
activity | 1612 |
addition | 106 |
aerospace | 206 |
aircraft | 285 |
airfoil | 79 |
airplane | 32 |
analysis | 185 |
analyze | 239 |
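To make the reduction step concrete, the following Python sketch keeps the highest-frequency word in each group of derivatives as the group’s representative. The communication counts are hypothetical; the analysis and analyze counts come from Table 3, and those two words are grouped here only to show the mechanics (the study’s final list in Table 3 actually retains them as separate entries):

```python
from collections import Counter

# 'analysis' and 'analyze' counts are from Table 3; the
# 'communicat-' counts are hypothetical placeholders.
raw_counts = Counter({
    "communicate": 41, "communication": 97, "communications": 12,
    "analysis": 185, "analyze": 239,
})

# Groups of derivative words (built by hand here; a stemmer could automate this)
groups = [
    ["communicate", "communication", "communications"],
    ["analysis", "analyze"],
]

# Keep the highest-frequency word in each group as its representative
final_list = {}
for group in groups:
    representative = max(group, key=lambda w: raw_counts[w])
    final_list[representative] = raw_counts[representative]

print(final_list)  # {'communication': 97, 'analyze': 239}
```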
After the final frequency list was identified, the curriculum material was again examined by the coders to better understand the context in which the words were used and to determine which of the constructs they related to (Busch et al., 2012). Words that appropriately related to a construct were coded using the codes identified in Table 2; the constructs thus served as categories. Brief statements containing a verb, an object, and sometimes a modifier were then written to better capture the meaning of the concept or the context in which it was used. These statements became the indicators. For example, for the word communication, which was coded as ED-C, an examination of the meaning and context produced the statement Communicating knowledge professionally.
Survey
The survey instrument used was a modification of the Task Verification instruments used by Norton (1999). In the instrument Norton used, duty statements of a job or occupation are stated, and the task statements relating to each duty are listed below the duty statement. Expert workers were asked to (a) indicate whether they perform the task, (b) rate the importance of the task on a Likert scale of 0–5, with 0 meaning No Importance and 5 meaning Great Importance, and (c) rate the difficulty of the task on a Likert scale of 0–5, with 0 being Extremely Easy and 5 being Extremely Difficult. The criticality of each task was determined by multiplying the importance index by the difficulty index.
The instrument developed by the researchers replaced the duty statements with the six constructs of Childress and Rhodes (2008) :
- Engineering design,
- Application of engineering design,
- Engineering analysis,
- Engineering communication,
- Engineering and human values, and
- Engineering science.
The indicator statements replaced the task statements on Norton’s (1999) task verification instruments. The instrument asked expert participants to examine the indicators for each construct and rate each indicator on a 5-point Likert scale with 1 representing Strongly Disagree and 5 representing Strongly Agree for their (a) importance for assessment and (b) difficulty to assess engineering design process and outcomes. The criticality index for each indicator was determined by multiplying its importance score and its difficulty score. The criticality index for each construct was determined by multiplying the averaged importance score and the averaged difficulty score for the key indicators of that construct.
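The index arithmetic is simple enough to show directly. This minimal Python sketch uses the six engineering and human values ratings from Table 5 and assumes, as the published values suggest, that the mean importance and difficulty scores are rounded to one decimal place before being multiplied:

```python
def construct_criticality(ratings):
    """Criticality index for a construct.

    ratings: (importance, difficulty) pairs, one per key indicator.
    Means are rounded to one decimal before multiplying, which
    appears to match how the published tables were derived.
    """
    mean_importance = round(sum(i for i, _ in ratings) / len(ratings), 1)
    mean_difficulty = round(sum(d for _, d in ratings) / len(ratings), 1)
    return round(mean_importance * mean_difficulty, 1)

# (importance, difficulty) pairs for engineering and human values (Table 5)
human_values = [(4.8, 3.0), (4.3, 3.7), (4.2, 3.7), (4.2, 3.5), (4.0, 2.7), (3.8, 3.3)]
print(construct_criticality(human_values))  # 13.9, matching Table 4

# For a single indicator, criticality is just importance x difficulty
print(round(4.8 * 3.0, 1))  # 14.4
```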
The survey was pilot tested by sending it to two teachers to fill out. Simple grammatical errors were corrected, and the survey was then sent to six experts. The experts were chosen for their experience in teaching engineering education in high school and at the college level and in practicing engineering in industry. The expert team consisted of two technology and engineering education teachers from two high schools in Idaho with over 30 combined years of teaching, two engineers from industry in Idaho with a combined 45 years of working experience, and two engineering education faculty from two universities with a combined experience of over 15 years in teaching and research.
Results
The results obtained from an analysis of the data are presented with respect to the two research questions posed at the beginning of the study. The first question was: How are the constructs identified by Childress and Rhodes (2008) ranked by professional engineers and educators in terms of criticality for inclusion on an instrument to measure engineering design outcomes in high schools in Idaho? Childress and Rhodes’s (2008) framework consisted of seven constructs, six of which were used for this study. The criticality index for each construct was derived by multiplying the indicators’ average importance index by their average difficulty index. The constructs were then rank ordered from the highest criticality index to the lowest. As indicated in Table 4, engineering and human values had the highest criticality index and so was ranked first, and engineering science had the lowest criticality index and was ranked sixth.
Table 4
Criticality Ranking of the Six Constructs
Construct Category | Mean Importance | Mean Difficulty | Criticality Index |
---|---|---|---|
Engineering and Human Values | 4.2 | 3.3 | 13.9 |
Application of Engineering Design | 4.0 | 3.0 | 11.9 |
Engineering Communication | 4.1 | 2.9 | 11.8 |
Engineering Design Concepts | 4.0 | 2.9 | 11.6 |
Engineering Analysis | 3.8 | 2.7 | 10.3 |
Engineering Science | 3.5 | 2.3 | 8.3 |
The second research question was: What are the key indicators associated with each of the constructs identified by Childress and Rhodes (2008) to measure engineering design outcomes in high schools in Idaho? The category engineering and human values had six indicators (see Table 5). Five of the six indicators were rated high in importance, receiving scores ranging from 4.0 to 4.8. Three of the indicators were perceived to be difficult to assess.
Table 5
Key Indicator Results for Engineering & Human Values
Engineering & Human Values | Mean Importance | Mean Difficulty |
---|---|---|
Participate in teams | 4.8 | 3.0 |
Assess the effect of technology on the environment | 4.3 | 3.7 |
Understand ethical implications | 4.2 | 3.7 |
Determine product’s safety in function | 4.2 | 3.5 |
Apply the relationship between voltage, current, & resistance | 4.0 | 2.7 |
Understand relationships among technologies | 3.8 | 3.3 |
Average Mean Value | 4.2 | 3.3 |
For the construct application of engineering design , 12 indicators were identified (see Table 6). Eleven of the 12 indicators were rated high in importance, receiving scores ranging from 4.0 to 4.8.
Table 6
Key Indicator Results for Application of Engineering Design
Application of Engineering Design | Mean Importance | Mean Difficulty |
---|---|---|
Provide accurate documentation | 4.8 | 3.0 |
Calculate forces | 4.7 | 2.7 |
Understanding measurements | 4.7 | 2.7 |
Troubleshoot errors | 4.3 | 3.5 |
Modify design | 4.2 | 3.5 |
Use experimentation to make decisions | 4.2 | 3.2 |
Apply constraints | 4.2 | 2.8 |
Construct/evaluate working prototypes | 4.2 | 2.5 |
Explore functions of systems | 4.0 | 3.5 |
Participate in team activities | 4.0 | 3.0 |
Identify manufacturing processes | 4.0 | 2.7 |
Utilize flight simulators | 2.0 | 2.1 |
Average Mean Value | 4.0 | 3.0 |
The construct engineering communication had 20 indicators (see Table 7). Fifteen of the 20 indicators had importance ratings of 4.0 and above. Interestingly, the indicator utilize brainstorming methods was scored 4.5 for importance but 4.2 for difficulty to assess, the highest difficulty score for this construct.
Table 7
Key Indicator Results for Engineering Communication
Engineering Communication | Mean Importance | Mean Difficulty |
---|---|---|
Communicate knowledge professionally | 4.7 | 2.8 |
Utilize modeling software | 4.7 | 2.7 |
Communicate the design solution process | 4.5 | 3.0 |
Engage in problem-based learning | 4.5 | 3.0 |
Apply standards | 4.5 | 3.0 |
Utilize brainstorming methods | 4.5 | 4.2 |
Engage in project-based learning | 4.5 | 3.3 |
Develop skills in using tools | 4.3 | 3.2 |
Utilize presentation software | 4.3 | 1.8 |
Develop sketches | 4.3 | 2.3 |
Evaluate feedback | 4.2 | 3.3 |
Solve design problems | 4.0 | 3.5 |
Create/deliver formal presentations | 4.0 | 2.5 |
Communicate using symbols | 4.0 | 2.3 |
Understand the importance of project management | 4.0 | 3.3 |
Understand communication technologies | 3.8 | 3.2 |
Create detailed flow charts | 3.5 | 1.8 |
Improve design process & outcome | 3.3 | 3.5 |
Use symbols in communicating processes | 3.3 | 2.5 |
Utilize automation system programming functions | 3.2 | 2.3 |
Average Mean Value | 4.1 | 2.9 |
For the construct engineering design concepts , 16 indicators were identified from the content analysis (see Table 8). Eleven of the indicators had importance scores ranging from 4.0 to 4.8. Each of these 11 indicators had difficulty-to-assess scores ranging from 2.3 to 3.3, indicating that they are not difficult to assess in class.
Table 8
Key Indicator Results for Engineering Design Concepts
Engineering Design Concepts | Mean Importance | Mean Difficulty |
---|---|---|
Use creativity in solving problems | 4.8 | 3.3 |
Document project’s progress in engineering notebook | 4.7 | 2.3 |
Understand attributes of a design process | 4.5 | 3.5 |
Understand core concepts of technology | 4.5 | 2.5 |
Develop models | 4.5 | 3.0 |
Conduct research | 4.3 | 3.5 |
Create portfolios in documenting work | 4.0 | 2.3 |
Understand material & equipment requirements | 4.0 | 2.5 |
Optimize design solutions | 4.0 | 3.3 |
Employ strategies | 4.0 | 2.8 |
Understand system energy requirements | 4.0 | 2.5 |
Use construction technologies | 3.8 | 2.5 |
Use the method of joints strategy to determine forces in a truss | 3.7 | 2.7 |
Create system control programs | 3.5 | 2.8 |
Create new systems/processes | 3.2 | 3.5 |
Justify discoveries or innovations | 3.2 | 3.0 |
Average Mean Value | 4.0 | 2.9 |
The construct engineering analysis had 30 indicators (see Table 9). Thirteen of these indicators had importance scores ranging from 4.0 to 5.0. The indicator utilize mathematics to solve problems received the highest importance score.
Table 9
Key Indicator Results for Engineering Analysis
Engineering Analysis | Mean Importance | Mean Difficulty |
---|---|---|
Utilize mathematics to solve problems | 5.0 | 2.7 |
Utilize mathematical formulas to solve design problems | 4.7 | 2.8 |
Use mathematical concepts in design | 4.7 | 3.0 |
Know to calculate a moment | 4.5 | 2.3 |
Develop solutions to problems | 4.5 | 3.7 |
Understand quantitative data | 4.5 | 2.8 |
Conduct testing | 4.3 | 3.2 |
Evaluate design solutions | 4.2 | 3.2 |
Use assessment techniques | 4.0 | 2.8 |
Use decision matrix for design problems | 4.0 | 2.7 |
Evaluate output work of mechanisms | 4.0 | 2.5 |
Describe basic logic functions | 4.0 | 2.3 |
Understand criteria in assessment rubrics | 4.0 | 3.5 |
Determine angles | 3.8 | 2.5 |
Identify magnitude, direction, & sense of a vector | 3.8 | 2.2 |
Understand mechanical advantage ratios | 3.8 | 2.3 |
Calculate mean, median, & mode | 3.8 | 2.0 |
Calculate gear ratio | 3.8 | 2.0 |
Weigh tradeoffs | 3.6 | 3.2 |
Calculate drive ratios of mechanisms | 3.5 | 2.0 |
Choose appropriate input devices of technological systems | 3.3 | 3.0 |
Apply statistics | 3.3 | 2.8 |
Choose appropriate output devices of technological systems | 3.2 | 3.3 |
Differentiate flow rate and flow velocity | 3.2 | 2.5 |
Calculate probability | 3.2 | 2.2 |
Perform competitive product analyses | 3.0 | 3.0 |
Locate the centroid of structural members | 3.0 | 2.3 |
Understand matrix & reinforcement in composite materials | 2.8 | 2.0 |
Evaluate input work of mechanisms | 2.7 | 2.7 |
Average Mean Value | 3.8 | 2.7 |
The last category or construct on the instrument, engineering science , had 61 indicators (see Table 10). The importance rating data indicated that 20% of the indicators were rated at 4.0 or above, which means that 12 of the 61 key indicators were rated high in importance for inclusion in an engineering design assessment tool. Six of the indicators were rated below 3.0 for importance. Only three indicators were rated at 3.0 or above in their difficulty to assess. The two indicators scored as least difficult to assess were calculate velocity and differentiate digital & analog systems.
Table 10
Key Indicator Results for Engineering Science
Engineering Science | Mean Importance | Mean Difficulty |
---|---|---|
Calculate mechanical advantage | 4.5 | 2.3 |
Identify material properties | 4.3 | 2.5 |
Use computers to organize & communicate data | 4.3 | 2.3 |
Understand static equilibrium of bodies | 4.3 | 2.3 |
Calculate mechanical efficiency | 4.2 | 2.3 |
Develop technological knowledge | 4.2 | 3.3 |
Calculate velocity | 4.0 | 1.8 |
Calculating speed | 4.0 | 2.5 |
Apply the relationship between voltage, current & resistance | 4.0 | 2.3 |
Understand properties of metals | 4.0 | 2.2 |
Distinguish between the six simple machines | 4.0 | 2.0 |
Calculate mass | 4.0 | 2.0 |
Use scientific concepts in design | 3.9 | 2.8 |
Understand characteristics of technology | 3.8 | 3.0 |
Understand compound machines | 3.8 | 2.3 |
Applying thermodynamic principles | 3.8 | 2.8 |
Differentiate the basic properties of materials (electrical, magnetic, etc.) | 3.8 | 2.2 |
Design, build, & test truss designs | 3.8 | 2.2 |
Differentiate digital & analog systems | 3.8 | 1.8 |
Calculate material properties using a stress strain curve | 3.7 | 2.3 |
Construct simple & compound gear systems | 3.7 | 2.3 |
Identify properties of elements | 3.7 | 2.2 |
Calculate torque ratio | 3.7 | 2.0 |
Understand characteristics of lever systems | 3.7 | 2.0 |
Calculate stress | 3.7 | 2.0 |
Calculate circuit resistance, current & voltage | 3.7 | 1.8 |
Identify science concepts | 3.7 | 2.8 |
Understand electrical circuits | 3.7 | 2.7 |
Understand electrical energy | 3.7 | 2.5 |
Understand thermal energy transfer | 3.7 | 2.7 |
Identify impacts of energy | 3.5 | 2.8 |
Design, create, & test hydraulic devices | 3.5 | 2.8 |
Understand the advantages & disadvantages of circuit design | 3.5 | 2.5 |
Understand electronics | 3.5 | 2.5 |
Define types of power | 3.5 | 2.0 |
Understanding inclined plane systems | 3.5 | 2.0 |
Employ kinematics equations | 3.3 | 2.2 |
Identify properties & characteristics of solids | 3.3 | 2.2 |
Identify & categorize energy sources | 3.3 | 2.0 |
Identify components & functions of fluid power | 3.3 | 2.0 |
Identify characteristics of composites | 3.3 | 2.3 |
Identify engineering disciplines | 3.3 | 2.3 |
Provide technical feasibility | 3.2 | 3.3 |
Work with electronic assemblies | 3.2 | 2.8 |
Design, create, & test pneumatic devices | 3.2 | 2.2 |
Design/create/& test pulley systems | 3.2 | 2.2 |
Understand recycling technology | 3.2 | 2.2 |
Conduct tensile testing | 3.2 | 2.2 |
Understand fuel cell technology | 3.0 | 2.5 |
Classify properties of polymers | 3.0 | 2.5 |
Use transportation technologies | 3.0 | 2.5 |
Design/create/& test sprocket systems | 3.0 | 2.0 |
Experiment with solar hydrogen systems | 2.8 | 2.5 |
Understand chemical properties | 2.8 | 2.5 |
Create a simple airfoil | 2.8 | 2.2 |
Understand basic aircraft design | 2.7 | 2.5 |
Understand aerospace materials & structures | 2.7 | 2.0 |
Differentiate ceramic materials in industry | 2.5 | 2.0 |
Average Mean Value | 3.5 | 2.3 |
Discussion
Engineering and human values was ranked with the highest criticality for inclusion on an instrument to measure engineering design outcomes. Not only do the experts see this construct as important, but they also see it as difficult to assess. Childress and Rhodes (2008) refer to engineering and human values as the big picture when it comes to the interaction of engineering design and society, which includes the weighing of limitations in decisions about safety and the environment versus costs and ethics. The expert participants therefore believe that this construct should be given priority in assessment. Engineers are often required to work in diverse, interdisciplinary teams to solve complex problems that may have local, regional, and global consequences, and in doing so, they have to be cognizant of the ecological impact of their designs. Good engineering thus goes beyond technical competence; it also involves understanding and making judgments about the moral implications of designs. Lau (2013) points out that engineers are largely responsible for the artifacts of the modern world and that this constructed world has both risks and benefits, ranging from obvious safety and health issues to issues of equity and environmental degradation. Engineers therefore need to “have an understanding of how their activity affects progress, and how to do that benevolently” (p. 1). In addition, Lau indicated that the process of solving ethical problems has many similarities to the engineering design process.
Engineering analysis and engineering science received the two lowest rankings. This might reflect a perception that engineering analysis and its associated sciences should not be the predominant emphasis of engineering design at the high school level. Overall, however, the experts think that students’ engineering outcomes should be determined by their performance on several key indicators relating to mathematical computation and the sciences, such as calculate mechanical advantage , identify material properties , and know how to calculate a moment . It should also be noted that indicators such as utilize mathematics to solve problems , utilize mathematical formulas to solve design problems , and use mathematical concepts in design received some of the highest importance scores, emphasizing these experts’ perception that students should be able to apply mathematics as part of engineering design outcomes in high school. This is consistent with the opinions of other experts in science and engineering. As the National Research Council (2012) noted in their framework:
Although there are differences in how mathematics and computational thinking are applied in science and in engineering, mathematics often brings these two fields together by enabling engineers to apply the mathematical form of scientific theories and by enabling scientists to use powerful information technologies designed by engineers. Both kinds of professionals can thereby accomplish investigations and analyses and build complex models, which might otherwise be out of the question. (p. 65)
Assessments that are guided by indicators related to all six constructs should give technology and engineering educators who use the EbD™ and PLTW curricula in high schools in Idaho a more holistic representation of students’ performance in engineering design. Importantly, the list of indicators relating to each construct can also convey to math and science teachers the depth of students’ immersive STEM experiences when their schools use the EbD™ and PLTW curricula. This might motivate more collaboration across these disciplines. These indicators can also provide technology, math, and science education teachers with a list of items that can be included on performance rating forms to assess students’ processes and products in engineering design. Rating scales, along with rubrics, are used to assess students. Selected indicators can also be included on self-report measures that allow students to reflect on their learning experiences and help them see the connections among the concepts they have learned, as well as the applications of these concepts in new situations (Gray, 2007). It must be mentioned that many of these indicators can be broken down further into discrete actions that can provide useful measures of students’ competency in a particular design activity. In fact, the indicators that were viewed as difficult to assess (such as develop solutions to problems , understand attributes of a design process , and utilize brainstorming methods ) may need to be broken down into more discrete action statements to provide clarity for assessment.
Conclusion
This study ranked the engineering design constructs identified by Childress and Rhodes (2008) and identified indicators for each of them. The results represent preliminary work in addressing the assessment of engineering design outcomes in schools in Idaho, irrespective of the curriculum in use. Admittedly, more questions still need to be answered. For example, can an instrument be developed from the indicators that validly and reliably assesses students’ outcomes in design? What indicators should be included on such an instrument? More study needs to be done to answer these questions. The indicators identified for each construct in this study, however, provide a useful list of measures that can be used by technology and engineering teachers. Selected indicators can be chosen by math, science, technology, and engineering education teachers as they coordinate the teaching of STEM concepts and collaborate in designing the project-based activities they will engage students in solving. For now, the list provides a menu from which teachers can choose indicators that relate to their instructional objectives and use them to assess students’ learning outcomes.
References
Banks, F., & Barlex, D. (2014). Teaching STEM in the secondary school: Helping teachers meet the challenge . New York, NY: Routledge.
Busch, C., De Maret, P. S., Flynn, T., Kellum, R., Le, S., Meyers, B.,... Palmquist, M. (2012). Writing@CSU: Content Analysis . Retrieved from http://writing.colostate.edu/guides/pdfs/guide61.pdf
Childress, V., & Rhodes, C. (2008). Engineering student outcomes for Grades 9–12. The Technology Teacher, 67(7), 5–12.
Edström, K., Soderholm, D., & Knutson Wedel, M. (2007). Teaching and learning. In E. Crawley, J. Malmqvist, S. Östlund, & D. Brodeur (Eds.), Rethinking engineering education: The CDIO approach (pp. 130–151). New York, NY: Springer. doi:10.1007/978-0-387-38290-6_6
Gray, P. J. (2007). Student learning assessment. In E. Crawley, J. Malmqvist, S. Östlund, & D. Brodeur (Eds.), Rethinking engineering education: The CDIO approach (pp. 152–165). New York, NY: Springer. doi:10.1007/978-0-387-38290-6_7
Gronlund, N. E. (1998). Assessment of student achievement (6th ed.). Needham Heights, MA: Allyn & Bacon.
Honey, M., Pearson, G., & Schweingruber, H. (Eds.). National Academy of Engineering & National Research Council. (2014). STEM integration in K– 12 education: Status, prospects, and an agenda for research . Washington, DC: National Academies Press. doi:10.17226/18612
International Technology Education Association. (2007). Standards for technological literacy: Content for the study of technology (3rd ed.). Reston, VA: Author.
Kelley, T. R. (2008). Examination of engineering design in curriculum content and assessment practices of secondary technology education (Unpublished doctoral dissertation). University of Georgia, Athens, GA.
Krippendorff, K. (2004). Content analysis: An introduction to its methodology (2nd ed.). Thousand Oaks, CA: Sage.
Lau, A. (2013, January 13). Is ethics important to engineers? Rock Ethics Institute. Retrieved from http://rockethics.psu.edu/resources/documents/engineering.pdf
Lewis, T. (2005). Coming to terms with engineering design as content. Journal of Technology Education, 16(2), 37–54. Retrieved from http://scholar.lib.vt.edu/ejournals/JTE/v16n2/pdf/lewis.pdf
McCracken, W. M., & Waters, R. (1997). Assessment and evaluation in problem based learning. In Proceedings of the 27th annual Frontiers in Education Conference: Teaching and learning in an era of change (pp. 689–693). New York, NY: Institute of Electrical and Electronics Engineers. doi:10.1109/fie.1997.635894
National Research Council. (2012). A framework for K–12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press. doi:10.17226/13165
Norton, R. E. (1999). SCID handbook. SCID: A competency-based systematic curriculum and instructional development model. Columbus, OH: Ohio State University.
Thomas, J. W. (2000). A review of research on project-based learning . San Rafael, CA: Autodesk Foundation. Retrieved from http://www.bobpearlman.org/BestPractices/PBL_Research.pdf
Wicklein, R. C. (2005). Critical issues and problems in technology education. The Technology Teacher , 64 (4), 6–9.
Woods, D. R. (1996). Problem-based learning: Helping your students gain the most from PBL . Hamilton, Ontario, Canada: McMaster University.
Woods, D. R. (2000). Helping your students gain the most from PBL. In Tan, O. S., Little, P., Hee, S. Y., & Conway, J. (Eds.). Problem-based Learning: Educational innovation across disciplines . Singapore: Learning Academy, Temasek Polytechnic. Retrieved from http://www.tp.edu.sg/staticfiles/TP/files/centres/pbl/pbl_donaldwoods.pdf
Woolfolk, A. (2013). Educational psychology (12th ed.). Upper Saddle River, NJ: Pearson.
Yin, R. K. (2009). Case study research: Design and methods (4th ed.). Thousand Oaks, CA: Sage.
About the Authors
Cheryl A. Wilhelmsen ( cherylw@uidaho.edu ) is Director of the Industrial Technology Program at the University of Idaho.
Raymond A. Dixon ( rdixon@uidaho.edu ) is Assistant Professor in the Department of Curriculum and Instruction at the University of Idaho.