Examination of Assessment Practices for Engineering Design Projects in Secondary Technology Education (Second Article in a Three-Part Series)
Todd R. Kelley, Ph.D.
Purdue University
Robert C. Wicklein, Ed.D.
University of Georgia
Introduction
There is growing interest in the topic of engineering design for technology education. At the 2007 and 2008 International Technology Education Association (ITEA) conferences held in San Antonio, over 80 presentations were related to engineering topics. Further evidence of the influence and impact of engineering design content comes from the large number of well-documented curriculum projects designed to infuse engineering content into technology education, such as Engineering by Design, Project ProBase, Project Lead the Way, and Introduction to Engineering (Dearing & Daugherty, 2004). Likewise, state curriculum standards exist for the teaching of engineering design in technology education (Massachusetts Department of Education, 2001; Advisory Committee on Engineering and Technology Education in Georgia, 2008). Moreover, authors in the field of technology education have provided a strong rationale for engineering design as the focus for technology education (Hill, 2006; Lewis, 2004; Wicklein, 2006). In a very short time, the field has moved from "coming to terms" with engineering design (Lewis, 2005) to research studies suggesting that technology education teachers value this focus and are already moving toward infusing engineering design into technology education (Dearing & Daugherty, 2004; Gattie & Wicklein, 2007; Kelley, 2008). Given these efforts to infuse engineering practices within the technology education curriculum, it is now appropriate to investigate how technology education teachers assess engineering design activities in their classrooms. This research study was guided by the following questions:
- To what degree do current assessment practices of secondary technology educators reflect engineering design concepts?
- What are the similarities and differences of assessment practices of secondary technology educators when grouped by traditional and block schedules?
- Which engineering design assessment practices receive the greatest and least emphasis from secondary technology education teachers?
Related Literature
Welch (2001) indicated that research on assessment practices in technology education was sparse. Furthermore, Lewis (2005) indicated that assessment of the teaching and learning of design was still an undeveloped aspect of technology education. Arguably, design has been at the center of technology education teaching and learning for some time and therefore should also be at the center of assessment criteria. Lewis (2005) provides a strong rationale that design is the single most important category in the Standards for Technological Literacy (ITEA, 2000/2002). Design, as a subject and as a process outlined in the Standards, is the catalyst for explaining and understanding how all man-made things work, a question that falls within the domain of engineering. Lewis noted that of the twenty standards in the document, four directly address design.

Several studies in technology education have focused on the assessment of design, engineering design, and problem solving. Halfin (1973) was a pioneer in the development of a coding process to assess an individual's design and problem-solving thought processes. Halfin used biographical and autobiographical data to evaluate the intellectual processes used by ten high-level designers (e.g., Buckminster Fuller, Thomas Edison, Frank Lloyd Wright) to solve technological problems, employing the Delphi research technique to identify 17 mental processes that were universal for these expert engineers and designers. Halfin's coding process has since been used in several research studies employing an observation protocol methodology to assess students' design and problem-solving capabilities (Hill, 1997; Kelley, 2008). Similar studies have used observation assessments to evaluate students engaged in the design process, and these methods have been found to be an effective assessment technique (Lewis, Adams, Punnakanta, Littleton, & Atman, 2001). Custer, Valesey, and Burke (2001) developed and validated an instrument for assessing student learning in design and problem solving; their research was founded on the concept that problem solving can be condensed into a set of discrete, observable behaviors that can be captured using appropriate rubrics. These examples of research on assessing students' abilities in design and problem solving provide a foundation of knowledge to build upon, but there is clearly a need for more research on the assessment of engineering design thinking.
One recent study sought to identify appropriate assessment strategies for engineering design at the secondary level. Asunda and Hill's (2007) study determined the critical features of engineering design that can be incorporated within technology education learning activities, and the researchers developed a rubric for assessing these identified features. The study used a phenomenological approach, conducting semi-structured interviews with three professors of engineering education. The interview process revealed four core themes for emphasis in technology education with an engineering design focus: (a) the process of engineering design, (b) societal benefits of engineering design, (c) attributes of engineering design, and (d) assessment. Qualitative data from the interviews revealed that participants used a variety of practices to evaluate student design projects, including (a) student portfolios, (b) assessment by a panel of engineering faculty for industry-based projects, and (c) individual and group presentations. These data were used to construct an assessment rubric for evaluating the design (process and product), the communication (oral and written), and the teamwork demonstrated throughout the activity.
Methodology
This descriptive study drew a full sample of high school technology teachers from the current ITEA membership list (September 2007). The sample consisted of all high school technology teachers, regardless of whether they indicated that they were teaching engineering design in their classrooms. The identified population for this study consisted of N = 1043 high school technology education teachers. The original research design called for increasing the size of the initial survey mailing to compensate for the average success rate of an initial mailing, 48.1 percent (Gall, Gall, & Borg, 2007). However, close communication with ITEA personnel revealed that ITEA survey mailings typically yield a 20-25% rate of return (Price, personal communication). The researchers therefore determined that a full-sample mailing to all ITEA high school members was necessary. A cover letter was sent by e-mail to all ITEA members in the sample who listed an active e-mail address in the fall of 2007. The cover letter contained a URL for the on-line questionnaire, which was hosted by HostedSurvey.com and developed using the guidelines and recommendations outlined by Dillman, Tortora, and Bowker (1999). Participants were asked to return the survey by a specified date.
The researchers sent the survey to the population of 1043 high school ITEA teachers. Three days after the specified return date, which fell three weeks after the initial mailing, the researchers contacted non-respondents with a follow-up e-mail containing the URL for the on-line survey. This follow-up method has proven effective for obtaining responses from non-respondents (Gall et al., 2007).
Instrument
The results of Asunda and Hill's (2007) study provided the framework for the survey instrument used to identify appropriate strategies for secondary technology educators to use when assessing engineering design activities. The researchers used the elements of Asunda and Hill's rubric to create eight instrument items related to assessment practices for engineering design projects. See Table 1 for the complete list of the eight individual instrument items.
Table 1
Individual items of assessment practices for engineering design projects

| Item | Assessment practice |
|---|---|
| 1 | use support evidence / external research (research notes, illustrations, etc.) |
| 2 | provide evidence of formulating design criteria and constraints prior to designing solutions |
| 3 | use design criteria such as budget, constraints, criteria, safety, and functionality |
| 4 | provide evidence of idea generation strategies (e.g., brainstorming, teamwork, etc.) |
| 5 | properly record design information in an engineer's notebook |
| 6 | use mathematical models to optimize, describe, and/or predict results |
| 7 | develop a prototype model of the final design solution |
| 8 | work on a design team as a functional interdisciplinary unit |
Participants were asked to respond to each assessment practice item in two ways: (1) the frequency with which they used the assessment practice, and (2) the amount of time per typical use of the practice. A six-point (0-5) Likert-type scale was used to collect these data (see Table 2).
Table 2
Likert scale response conversions

How Often? (Frequency)

| Likert | Wording | Traditional (meets 5 days a week) | Block |
|---|---|---|---|
| 0 | Never | 0 days | 0 days |
| 1 | A few times a year | 5 days | 5 days |
| 2 | 1 or 2 times a month | 14 days (1.5 × 9.1) | 7 days (1.5 × 4.6) |
| 3 | 1 or 2 times a week | 55 days (1.5 × 36.8) | 28 days (1.5 × 18.4) |
| 4 | Nearly every day | 129 days (3.5 × 36.8) | 64 days (3.5 × 18.4) |
| 5 | Daily | 184 days | 92 days |

How Many Minutes? (Time)

| Likert | Wording | Traditional (50-minute period) | Block (90-minute period) |
|---|---|---|---|
| 0 | None | 0 min. | 0 min. |
| 1 | A few minutes per period | 5 min. | 9 min. |
| 2 | Less than half the period | 15 min. | 30 min. |
| 3 | About half the period | 25 min. | 45 min. |
| 4 | More than half the period | 37.5 min. | 67.5 min. |
| 5 | Almost all the period | 50 min. | 90 min. |

Assumptions: A traditional schedule meets 5 days a week in 50-minute periods over a 184-day school year (36.8 five-day weeks, about 9.1 months); typical A/B and 4 × 4 block schedules meet for 92 days (18.4 weeks, about 4.6 months) in 90-minute periods.
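To make the Table 2 conversions concrete, the sketch below (in Python; not part of the original study) transcribes the table's anchor values and combines a paired response into an estimate of annual assessment hours. Multiplying days per year by minutes per use is our assumption about how the two scales combine; the lookup values come directly from Table 2, while the function and variable names are ours.

```python
# Sketch: converting a paired Likert response (frequency, time) into
# estimated hours of assessment per school year, using the Table 2 anchors.
# The lookup tables transcribe Table 2; combining the two responses by
# multiplication is an assumption, not a method quoted from the article.

DAYS_PER_YEAR = {  # Likert code -> days per year the practice is assessed
    "traditional": {0: 0, 1: 5, 2: 14, 3: 55, 4: 129, 5: 184},
    "block":       {0: 0, 1: 5, 2: 7,  3: 28, 4: 64,  5: 92},
}

MINUTES_PER_USE = {  # Likert code -> minutes per typical use
    "traditional": {0: 0, 1: 5, 2: 15, 3: 25, 4: 37.5, 5: 50},
    "block":       {0: 0, 1: 9, 2: 30, 3: 45, 4: 67.5, 5: 90},
}

def annual_hours(schedule, freq_code, time_code):
    """Estimated hours per year = days per year x minutes per use / 60."""
    days = DAYS_PER_YEAR[schedule][freq_code]
    minutes = MINUTES_PER_USE[schedule][time_code]
    return days * minutes / 60

# Example: a traditional-schedule teacher assessing a practice
# "1 or 2 times a week" (3) for "about half the period" (3):
print(annual_hours("traditional", 3, 3))  # 55 * 25 / 60 ~ 22.9 hours
```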
Limitation
In order to determine statistical significance for a population of N = 1043, Krejcie and Morgan's (1970) method was used to locate the required sample size for a given population; for a population of this size, the required sample was set at 285 (Gay & Airasian, 2000). Again, the survey was sent to all secondary education ITEA members in order to increase the chances of achieving an adequate response rate. The final results of the study yielded a total of 226 respondents; therefore, the results of this study cannot be generalized to the entire population. However, the researchers compared the demographic data from this study with the demographic results of a similar national status study of technology education (Gattie & Wicklein, 2007) that did achieve a response rate sufficient to generalize to the population. The demographic results of the two studies were very similar, suggesting that these results are representative of the population. Nevertheless, the researchers acknowledge that statistical significance was not achieved in this study.
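For reference, the required sample size cited above follows from Krejcie and Morgan's (1970) formula, where $N$ is the population size, $\chi^2 = 3.841$ (the table value for one degree of freedom at the .05 confidence level), $P = .50$ (the proportion assumed in order to maximize the sample size), and $d = .05$ (the degree of accuracy):

$$ s = \frac{\chi^{2}\,N\,P(1-P)}{d^{2}(N-1) + \chi^{2}\,P(1-P)} $$

Direct computation for $N = 1043$ gives $s \approx 281$; the figure of 285 reported above appears to correspond to the nearest tabled value in commonly reproduced sample-size tables.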
Results
The top mean scores for individual items were as follows: provide evidence of idea generation strategies (e.g., brainstorming, teamwork, etc.) (mean of 2.92), develop a prototype model of the final design solution (mean of 2.69), and work on a design team as a functional interdisciplinary unit (mean of 2.53). Overall, the assessment practice category yielded relatively low mean scores on the six-point (0-5) scale; no item yielded a mean of 3 or higher. The lowest mean score belonged to use mathematical models to optimize, describe, and/or predict results (mean of 1.72), and properly record design information in an engineer's notebook also yielded a low mean (2.01). See Table 3 for the complete results of the assessment practice category.
Table 3
Means and standard deviations for assessment practices (frequency of use and time per typical use)

| Assessment practice | M (frequency) | SD (frequency) | M (time) | SD (time) |
|---|---|---|---|---|
| use support evidence / external research (research notes, illustrations, etc.) | 2.32 | 1.38 | 2.25 | 1.37 |
| provide evidence of formulating design criteria and constraints prior to designing solutions | 2.33 | 1.45 | 2.19 | 1.43 |
| use design criteria such as budget, constraints, criteria, safety, and functionality | 2.45 | 1.34 | 2.31 | 1.39 |
| provide evidence of idea generation strategies (e.g., brainstorming, teamwork, etc.) | 2.92 | 1.46 | 2.69 | 1.50 |
| properly record design information in an engineer's notebook | 2.01 | 1.76 | 1.78 | 1.64 |
| use mathematical models to optimize, describe, and/or predict results | 1.72 | 1.43 | 1.62 | 1.39 |
| develop a prototype model of the final design solution | 2.69 | 1.43 | 2.87 | 1.55 |
| work on a design team as a functional interdisciplinary unit | 2.53 | 1.50 | 2.79 | 1.60 |
| Total group mean | 2.37 | | 2.31 | |
A composite score was generated for the assessment strategies under traditional and block scheduling (see Figure 1). Computing a composite score from the mean scores for time per typical use and frequency of use provided an indicator of areas of emphasis and deficiency in the assessment practices of high school technology teachers. The researchers split the file, separating traditional and block scheduling results, in order to calculate the composite scores accurately; splitting the file was necessary because the number of school days and the period lengths differ between the two schedule types. A comparison of the total-hour composite scores for each assessment strategy across the two groups is reported in Table 4.
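The exact aggregation procedure behind the composite scores is not spelled out here, so the following Python sketch shows one plausible reading rather than the study's actual computation: each respondent's paired Likert responses are converted to estimated annual hours using the Table 2 anchors (via the annual_hours helper sketched after Table 2), averaged within the schedule group, and then expressed as a share of the total, as in Table 4. The function names and the responses data are hypothetical.

```python
# Sketch of the composite-score computation (our reading, not the study's
# code). Each respondent's (frequency, time) Likert pair is converted to
# estimated annual hours via the Table 2 anchors, then averaged within the
# schedule group. `annual_hours` is the helper sketched after Table 2.

def composite_hours(schedule, responses):
    """Mean estimated annual assessment hours for one instrument item."""
    hours = [annual_hours(schedule, f, t) for f, t in responses]
    return sum(hours) / len(hours)

def percent_of_total(item_hours, all_item_hours):
    """Share of total assessment time devoted to one item, as in Table 4."""
    return 100 * item_hours / sum(all_item_hours)

# Hypothetical traditional-schedule responses for a single item:
responses = [(3, 3), (2, 4), (4, 2)]
print(composite_hours("traditional", responses))  # ~21.3 hours
```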
Table 4
Composite scores: total hours and percentage of total hours by schedule type (T = traditional; B = block)

| Engineering design assessment strategy | Total Hours (T) | % Hours (T) | Total Hours (B) | % Hours (B) |
|---|---|---|---|---|
| use support evidence / external research (research notes, illustrations, etc.) | 8.15 | 10.18 | 7.53 | 9.75 |
| provide evidence of formulating design criteria and constraints prior to designing solutions | 6.92 | 8.65 | 9.00 | 11.66 |
| use design criteria such as budget, constraints, criteria, safety, and functionality | 9.76 | 12.19 | 9.61 | 12.45 |
| provide evidence of idea generation strategies (e.g., brainstorming, teamwork, etc.) | 18.00 | 22.47 | 18.50 | 23.96 |
| properly record design information in an engineer's notebook | 2.58 | 3.23 | 4.76 | 6.16 |
| use mathematical models to optimize, describe, and/or predict results | 1.93 | 2.42 | 2.86 | 3.70 |
| develop a prototype model of the final design solution | 18.33 | 22.84 | 13.30 | 17.22 |
| work on a design team as a functional interdisciplinary unit | 14.46 | 18.02 | 11.66 | 15.10 |
| Total hours | 80.13 | | 77.22 | |
The differences in total hours between traditional and block scheduling were analyzed to determine whether there were major differences between the two groups for each assessment strategy. The strategy of assessing the development of a prototype model of the final design solution showed the greatest total-hour difference, 5.03 hours. The strategy requiring students to use design criteria such as budget, constraints, criteria, safety, and functionality showed the greatest consensus among respondents, with a difference of only 0.15 hours: traditional-schedule teachers dedicated 9.76 hours and block-schedule teachers 9.61 hours to this strategy. The strategy focused on the use of mathematical models to optimize, describe, and/or predict results was the least emphasized assessment practice, with traditional-schedule teachers dedicating 2.42% and block-schedule teachers 3.70% of their assessment time to it. More than 40% of the time technology education teachers spent assessing students' engineering design projects was devoted to two items: provide evidence of idea generation strategies (e.g., brainstorming, teamwork, etc.), with 22.47% for traditional and 23.96% for block scheduling, and develop a prototype model of the final design solution, with 22.84% for traditional and 17.22% for block scheduling.
Conclusions
According to the results of this study, secondary technology education teachers place the lowest emphasis on assessing the use of mathematics to optimize and predict design results (traditional 2.42%, block 3.70% of assessment practice time). These results are strong indicators that the engineering analysis phase of the engineering design process receives little emphasis in assessment practices. This is a major concern, considering that a number of leaders in technology education have identified analysis and optimization as a major difference between the technological design process and the engineering design process (Hailey, Erickson, Becker, & Thomas, 2005; Hill, 2006; Gattie & Wicklein, 2007). Without a strong and consistent emphasis on the analytical process for solving technological problems, students and teachers are limited in their ability to utilize a comprehensive engineering design process and therefore default to the standard trial-and-error methodology. It can be argued that mathematical modeling and analysis are the heart of engineering design and that, without this focus, little or no actual engineering is taking place. This is an important issue to consider, especially because it has ramifications for the reputation of the technology education field. Individuals inside as well as outside the field might have grounds to accuse technology education of once again changing the name on the door without changing the practice (Clark, 1989). Sanders (2008) has observed that many technology education teachers are drawn to the appeal of integrating mathematics and science into technology education, when in reality it is rare for technology teachers to identify specific science and mathematics concepts as student learning outcomes for their lessons or activities. Sanders goes on to state that "…it is even rarer for technology teachers to assess a science or mathematics learning outcome" (2008, pp. 20-26).

Technology education teachers are still emphasizing prototype building in their assessment practices. The assessment item develop a prototype model of the final design solution just edged out the idea generation item as the top assessment strategy for traditional-schedule teachers, who dedicated 22.84% of their assessment time to prototypes; it was the second most emphasized strategy for block-schedule teachers, at 17.22% of their assessment time. Allowing students to build prototypes is an appropriate and important part of the engineering design process. However, constructing prototypes without first using mathematics and science to optimize and predict design results is not authentic engagement in the engineering design process. A strong rationale for implementing the engineering design process over other design approaches (e.g., trial and error) is that engineering design requires mathematical and scientific analysis to fully inform designers, allowing them to make educated decisions about the optimal design before prototype building begins. Technology education teachers who claim to implement an engineering design process yet do not require or assess mathematical predictions before prototyping are still using the trial-and-error method and are not truly engaging the power of the engineering design process.
Another area of lesser emphasis was the assessment of students' record keeping of design information in an engineer's notebook (mean of 2.01 for frequency of use; 1.78 for time per typical use). It is unclear whether technology educators are implementing engineer's notebooks in the classroom but simply not using them as an assessment tool. Engineer's notebooks are used not only in engineering schools at the collegiate level but also in engineering practice; therefore, technology educators who use engineering notebooks to assess students' design thinking and record-keeping skills would be implementing an authentic assessment technique. Moreover, Hill (2006) suggests that using an engineering design notebook can help students take a systematic approach to design and problem solving.
Another low-scoring item was providing evidence of formulating design criteria and constraints prior to designing solutions (mean of 2.33 for frequency of use; 2.19 for time per typical use). Identifying constraints and criteria early in the design process is an important feature of the engineering design process but is a practice not widely adopted within the field of technology education (Hill, 2006). The low mean scores for this item support that observation.
Summary
As a field, we should review the results of this study (see Figure 1) and ponder the statement by Young and Wilson: "assessment is a public declaration of what is valued" (2000, p. ii). This is an appropriate time to reflect upon the purpose of technology education. Can technology education provide a real-life context for the application of mathematics and science through an engineering design focus? Or is this approach to curriculum revision just another way to legitimize the subject of technology education by using the term engineering (Lewis, 2004)?
The researchers recognize that it is unlikely that each of the assessment practices identified in the instrument would or should receive equal emphasis from the classroom teacher. However, when the results indicate that an item such as using mathematical models to optimize, describe, and/or predict results receives less than 4% of the total yearly assessment time, it strongly suggests that this category of engineering design assessment is not widely used as an assessment criterion. For years, technology educators have encouraged students to design and build the fastest model car (LaPorte, 2005), the strongest model bridge (Volk, 1996), or the highest-reaching rocket (Hill, 2006). It is the researchers' belief that the time has come for technology educators to aspire to help students use mathematics and science to make the most educated decisions regarding their design solutions. One strong indicator that the field of technology education has truly begun to infuse engineering design into the classroom will be when students begin approaching technology teachers and saying, "According to my calculations, we are not ready to build the prototype because the current design will not work." Such statements will likely never be heard, and the field will not authentically infuse the engineering design process, unless technology educators implement and assess the use of mathematical models to predict design results and optimize students' final design solutions.
Todd R. Kelley, Ph.D., is an Assistant Professor in the Department of Industrial Technology at Purdue University. He can be reached at trkelley@purdue.edu.

Robert C. Wicklein, Ed.D., is a Professor at the University of Georgia. He can be reached at wickone@uga.edu.
References
Advisory Committee on Engineering and Technology Education in Georgia. (2008). Final report: Investigation of engineering design as a focus for Georgia technology education. Atlanta, GA: Georgia Department of Education.

Asunda, P. A., & Hill, R. B. (2007). Critical features of engineering design in technology education. Journal of Industrial Teacher Education, 44(1), 25-48.

Clark, S. C. (1989). The industrial arts paradigm: Adjustment, replacement, or extinction? Journal of Technology Education, 1(1), 1-9.

Dearing, B. M., & Daugherty, M. K. (2004). Delivering engineering content in technology education. The Technology Teacher, 64(3), 8-11.

Dillman, D. A., Tortora, R. D., & Bowker, D. (1999). Principles of constructing web surveys. Retrieved December 2, 2006, from http://survey.sesrc.wsu.edu/dillman/papers/websurveyppr.pdf

Gall, M. D., Gall, J. P., & Borg, W. R. (2007). Educational research: An introduction (8th ed.). Boston: Pearson Education.

Gattie, D. K., & Wicklein, R. C. (2007). Curricular value and instructional needs for infusing engineering design into K-12 technology education. Journal of Technology Education, 19(1), 6-18.

Gay, L. R., & Airasian, P. (2000). Educational research: Competencies for analysis and application (6th ed.). Columbus, OH: Merrill.

Hailey, C. E., Erickson, T., Becker, K., & Thomas, T. (2005). National center for engineering and technology education. The Technology Teacher, 64(5), 23-26.

Halfin, H. H. (1973). Technology: A process approach (Doctoral dissertation, West Virginia University, 1973). Dissertation Abstracts International, 11(1), 1111A.

Hill, R. B. (2006). New perspectives: Technology teacher education and engineering design. Journal of Industrial Teacher Education, 43(3), 45-63.

Hill, R. B. (1997). The design of an instrument to assess problem solving activities in technology education. Journal of Technology Education, 9(1), 31-46.

International Technology Education Association. (2000/2002). Standards for technological literacy: Content for the study of technology. Reston, VA: Author.

Kelley, T. (2008). Cognitive processes of students participating in two approaches to technology education. Journal of Technology Education, 19(2), 50-64.

Kelley, T. (2008). Examination of engineering design in curriculum content and assessment practices of secondary technology education. Unpublished doctoral dissertation, University of Georgia, Athens.

Krejcie, R. V., & Morgan, D. W. (1970). Determining sample size for research activities. Educational and Psychological Measurement, 30(3), 607-610.

LaPorte, J. (2005). Mega-projects, ticky tacky, and our team. Journal of Technology Education, 17(1), 2-5.

Lewis, C. D., Adams, R., Punnakanta, P., Littleton, J., & Atman, C. (2001, October). Assessing engineering design performance and teamwork: Cross-validating written self-reports and performance observations. Paper presented at the ASEE/IEEE Frontiers in Education Conference, Reno, NV.

Lewis, T. (2004). A turn to engineering: The continuing struggle of technology education for legitimatization as a school subject. Journal of Technology Education, 16(1), 21-39.

Lewis, T. (2005). Coming to terms with engineering design as content. Journal of Technology Education, 16(2), 37-54.

Massachusetts Department of Education. (2001). Science and technology/engineering curriculum framework. Retrieved July 24, 2007, from http://www.doe.mass.edu/frameworks/scitech/2001/standards/te9_101.html

Sanders, M. (2008). STEM, STEM education, STEMmania. The Technology Teacher, 68(4), 20-26.

Volk, K. S. (1996). Industrial arts revisited: An examination of the subject's continued strength, relevance and value. Journal of Technology Education, 8(1), 27-38.

Welch, M. (2001, April). Assessment in technology education: What, why, and how? Proceedings of the Second AAAS Technology Education Research Conference. American Association for the Advancement of Science.

Wicklein, R. C. (2006). Five good reasons for engineering design as the focus for technology education. The Technology Teacher, 65(7), 25-29.

Young, S. F., & Wilson, R. J. (2000). Assessment and learning: The ICE model. Winnipeg, MB: Portage & Main.