Volume 35, Number 3
Spring 1998


Minnesota’s Tech Prep Outcome Evaluation Model

James M. Brown
University of Minnesota
David Pucel
University of Minnesota
Cathy Twohig
University of Minnesota
Steve Semler
University of Minnesota
K. Peter Kuchinke
University of Minnesota

The importance of program evaluation and program improvement for vocational education programs in general, and Tech Prep in particular, has been well established (Bragg, 1995; Hammons, 1995; Hoachlander, Levesque, & Rahn, 1992; Pucel, Brown, & Kuchinke, 1996; Ruhland, Custer, & Stewart, 1995). Issues such as recruitment, retention, and dropout rates represent important concepts that should be addressed by efforts to evaluate Tech Prep initiatives. Bragg (1995) reviewed the progress and pitfalls of Tech Prep in recent years and concluded that

evaluation of any kind has been one of the most neglected components of Tech Prep… many of the nation’s most well developed Tech Prep efforts were found to be engaged only in routine, compliance-oriented evaluation rather than more ongoing, informative practices. (p. 23)

In a more comprehensive study that focused on student outcomes, Bragg (1997) also examined national and state-level systems for Tech Prep evaluation. Referring to formal national evaluation activities and efforts to collect data on Tech Prep student outcomes, Bragg stated

When attempting to understand how students move through the Tech Prep system, secondary to post-secondary and beyond, the number of local consortia that were able to provide student outcomes data in the area of participation and completion was so limited as to make most of the estimates meaningless. (p. 10)

Hoachlander, Levesque, and Rahn (1992) indicated that the 1990 Perkins Act requires states "to develop accountability systems that include performance measures and standards for secondary and post-secondary vocational education programs" (p. 1). To assist in this development, Hoachlander et al. outlined six basic requirements for effective outcome accountability systems that can be used as criteria for judging the usefulness of the required evaluation methodology:

  1. The desired measures and standards should be defined clearly and precisely.
  2. Outcomes should be easily and accurately measured, while minimizing data burden and cost.
  3. A manageable number of outcome measures should comprise the accountability system.
  4. Standards must have validity, justified in terms of success in either the workplace or further education and training; they must also be fair, avoiding bias by race, gender, or special needs.
  5. Data for each measure should be collected at appropriate intervals of time.
  6. The information generated by the accountability system should be routinely accessible by students, teachers, administrators, parents, employers, board members, and others interested in educational policy and performance. (p. 5)

The need for more systematic data collection, and for the use of the resulting data specifically to improve and enhance Tech Prep, is evident throughout the literature. Hammons (1995) described the lack of direction that exists for the collection and review of quantitative Tech Prep data:

The 1990 Carl Perkins Legislation requires that states address five program performance measures. These include student retention, job placement, competency gains in academics, work or job skill attainment, and vocational competency attainment. The federal legislation failed to note any standardized methodology for states to use in determining and reporting these data. (p. 6)

As states initially struggled with the 1990 Perkins Act’s Tech Prep-related mandates, the U.S. Department of Education intentionally allowed states considerable freedom to design Tech Prep systems responsive to local conditions and constraints (American Vocational Association, 1992). Minnesota’s secondary and post-secondary state agency personnel and Tech Prep consortia representatives elected to develop a flexible, decentralized Tech Prep system. Minnesota’s consortia have had the freedom to experiment with program designs felt to best fit local needs.

Minnesota developed an evaluation strategy that focuses on student-related outcomes (accountability), as well as self-assessment (continuous quality improvement). This article focuses on the accountability component of Minnesota’s Tech Prep Evaluation System; the continuous quality improvement component was discussed in a previous article (Pucel, Brown, & Kuchinke, 1996). The guiding principles for the design and implementation of the student-related component assume that the evaluators will (a) actively involve consortia and community partners in data collection efforts, (b) make use of existing data collection efforts and extant databases to as large a degree as possible, and (c) seek to avoid duplication of efforts and decrease the burden on local institutions, agencies, and personnel. Local program stakeholders were actively involved during the design phase in order to assure a sense of ownership and understanding of evaluation design concepts and implementation processes.

A subsequent review of the literature identified questions, both locally and nationally, about the selection, collection, analysis, and use of Tech Prep-related data. These questions arise from state-to-state evaluation system comparisons, as well as from comparisons of evaluation efforts at sites within states. These concerns often focus on the difficulties related to the seemingly simple process of identifying Tech Prep students. The identification problem typically consists of two primary components: (a) the lack of consistent processes for identifying Tech Prep students, and (b) the lack of specific definitions or criteria for identifying Tech Prep students. Ruhland et al. (1995) conducted a study of Tech Prep sites throughout the United States and concluded that Tech Prep student identification would be a critical factor in efforts to implement general systemic change (p. 17). While some states have fairly extensive survey instruments in place for some schools (Logan & Briscoe, 1994), procedures for Tech Prep student identification vary from school to school and do not exist at all in many schools. Therefore, no adequate, generalizable statewide data collection processes designed to identify Tech Prep students are currently known to exist.

A review of the literature also identified several studies that discussed the qualitative evaluation of Tech Prep. Studies by Hammons (1995), MacQueen (1995), and Pucel, Brown, and Kuchinke (1996) have addressed the importance and use of findings based upon informal surveys of Tech Prep sites, self-assessment of Tech Prep coordinators and sites, informal interviews, and other related practices. Such information is crucial to the further development and improvement of Tech Prep programs. It is important, however, that these qualitative methods be systematically combined with the use of quantitative data. Ruhland et al. (1995) discussed the need for this combined approach to analyzing Tech Prep.

Given the complexity and variety of Tech Prep programs, it is critical that multiple means be used to conduct program assessment. This is necessary in order to triangulate and interpret data as well as to capture the uniqueness of various local configurations. (p. 19)

Thus, the Minnesota Tech Prep Evaluation System was developed because no appropriate Tech Prep evaluation system existed. The evaluation system consists of two major components: (a) a student-related data analysis schema that can provide accountability data for funding agencies and policy makers; and (b) a consortium self-assessment process (Pucel, Brown, & Kuchinke, 1996) that provides information that local, state, and national agency and program representatives can use to improve ongoing Tech Prep initiatives. The self-assessment component was developed using principles related to continuous quality improvement (Roland, Cronin, Guberman, & Morgan, 1997). Together, these two evaluation approaches make up Minnesota’s Tech Prep Evaluation System. The remainder of this article focuses on the student-related data component of this evaluation system.

Purpose of the Tech Prep Evaluation System

The Carl Perkins Vocational and Applied Technology Education Act of 1990 (Perkins Act) stipulated that each state receiving funding under the Act is responsible for annually evaluating student progress and assessing the overall effectiveness of the state’s Tech Prep delivery systems. The primary purpose of the Minnesota Tech Prep evaluation model was to design and validate a system that would help Tech Prep consortia conduct such assessments. In addition, the information gained by performing self-evaluations and objective outcome evaluations will be of value to state and local educational decision-makers and stakeholders. Accordingly, the Minnesota Department of Children, Families, and Learning, and the Minnesota State Colleges and Universities System (MnSCU) contracted with the Minnesota Research and Development Center for Vocational Education (MRDC) to design a system that could be used successfully to address the evaluation requirements of the Perkins Act and to provide information useful within Minnesota. The conceptual framework of that system is depicted in Figure 1.

Student-Related Evaluation Data

Figure 1 depicts the key outcomes addressed by the student-related data used to produce a composite picture of each Tech Prep consortium’s processes and outcomes, as well as the self-evaluation component of the evaluation system. Data describing Tech Prep students at the secondary and post-secondary levels are combined with data from the annual Minnesota High School Follow-Up System (HSFS). The database management system containing these data will enable system administrators to produce aggregate Tech Prep outcome reports. Personnel in Tech Prep consortia, the Minnesota State Colleges and Universities System, and the Minnesota Department of Children, Families, and Learning can also use information from this evaluation system to communicate with various stakeholders throughout the state of Minnesota regarding Tech Prep program processes and outcomes.

Figure 1
Minnesota Tech Prep Consortia Evaluation System

Outcome: Student-Related Data Collection
  1. Number of Students
  2. Retention
  3. Related Job Placement
  4. Higher Education
  5. Leavers/Drop-Outs
  6. Diplomas, Degrees, & Certificates
  (Outcomes of the Process)

Outcome: Tech Prep Consortium Self-Evaluation
  Matrix of Tech Prep Systems and System Activities
  (Process Reviewed for CQI)

Note: The system includes a summative outcome component and a formative process improvement component. From Brown, J. M., Pucel, D. J., & Kuchinke, K. P. (1994). Tech Prep Consortia Evaluation System: Training Manual.

Conceptual Model for Tech Prep Evaluation

The conceptual model for the Tech Prep outcome data component of the Tech Prep Evaluation System focuses on four primary data categories: (a) consortium name, (b) career cluster name, (c) student gender, and (d) proportions of students classified as members of special population categories (see Figure 2). Key considerations that influenced the design of this system reflect the needs of agency staff at the consortium, state, and federal levels; these needs dictate what information consortia should collect and report. The data collection mechanisms determine how the information should be collected and delivered to the agencies responsible for analyzing the data. Data analysis processes and outputs enable the evaluation system to (a) integrate different sources of data, (b) minimize data entry errors and duplications, and (c) generate valid and reliable reports. Finally, this model provides a set of feedback mechanisms to help decision-makers and participants in Tech Prep consortia communicate key outcome findings internally, as well as throughout the state and nation.

Advisory Panel Input in Evaluation System Design

In accordance with the 1990 Perkins Act, the Minnesota Tech Prep Evaluation System includes both self-assessment and student-related components. Initially, an advisory panel of state agency representatives, Tech Prep consortia leaders, and higher education faculty was enlisted to guide the development of the conceptual design of the evaluation system. The development of the student-related outcome component was based on a management information systems (MIS) design process. "System driving issues" were used to indicate what outcomes the system should address. These high-level issues were used to determine specific questions that the system would need to answer. Those specific questions guided the design of data collection mechanisms that, in turn, provided the conceptual framework for the design of the database management and data analysis functions. The entire student-related data system was designed to provide clear answers to the driving issues outlined as desired evaluation outcomes in the Perkins Act.

Issues Driving the Tech Prep Evaluation System

The 1990 Perkins Act specified that states receiving federal money for Tech Prep must collect and report accountability data related to student outcomes. The context for Tech Prep evaluation is set out in 34 CFR Part 405, Subpart D (405.30), the regulations implementing the 1990 Perkins Act, which state that:

  1. Each grantee must provide (and allocate a budget for) an independent evaluation of grant activities.
  2. The evaluation shall be both formative and summative in nature.
  3. The evaluation must be based on student achievement, completion, placement rates, and project and product spread and transportability. (American Vocational Association, 1992, p. 44)

Thus, MRDC researchers used the Perkins Act evaluation specifications, listed above, to determine specific data collection requirements. To capture information about Minnesota’s Tech Prep outcomes, the evaluation system was designed to answer 17 questions. Those key questions are listed below. Definitions of key terms in these questions are provided later in this document.

  1. How many "Tech Prep Students" began commitments this year (at the beginning of grade 11) to complete four-year Tech Prep programs?
  2. What was the consortium’s total number of "Tech Prep Students" working toward completing four-year Tech Prep programs?
  3. How many "Tech Prep Participants" enrolled in the consortium’s Tech Prep courses this year, but have not made a commitment to complete the four-year Tech Prep program?
  4. How many of the consortium’s "Tech Prep Students" are members of special populations?
  5. How many of the consortium’s "Tech Prep Participants" are members of special populations?
  6. Describe the consortium’s "Tech Prep Student" population in terms of gender.
  7. Describe the consortium’s "Tech Prep Participant" population in terms of gender.
  8. What were the consortium’s "Tech Prep Student" job placement outcomes in related occupations among those completing four-year Tech Prep programs?
  9. What were the consortium’s "Tech Prep Participant" job placement outcomes in related occupations after the students left their secondary or post-secondary vocational education programs?
  10. How many "Tech Prep Students" subsequently enrolled in Higher Education programs in related career clusters after completing four-year Tech Prep programs?
  11. How many "Tech Prep Participants" subsequently enrolled in "Higher Education" programs in related career clusters after they left their secondary or post-secondary vocational education programs?
  12. How many "Tech Prep Students" became "Drop-Outs/Leavers" before completing all four years of a Tech Prep program?
  13. How many "Tech Prep Participants" became "Drop-Outs/Leavers" before completing all four years of a Tech Prep program?
  14. How many 12th-grade "Tech Prep Students" received high school diplomas?
  15. How many 12th-grade "Tech Prep Participants" received high school diplomas?
  16. How many "Tech Prep Students" received degrees, diplomas, or certificates after completing all four years of their Tech Prep program?
  17. How many "Tech Prep Participants" earned degrees, diplomas, or certificates in post-secondary vocational education or Tech Prep programs?

This set of questions established the data requirements for the student-related data component of the Minnesota Tech Prep Evaluation System. The Advisory Panel reviewed the questions for completeness and utility. After revisions and additions, the panel approved the list for use in Minnesota. Each question addresses four levels of analysis: consortium, program cluster, gender, and special populations. The four levels are depicted below in Figure 2.

Figure 2
Levels of analysis for Tech Prep evaluation

Levels of Analysis

Consortium Name: The unit of analysis for evaluation reports
Cluster Name: The Tech Prep clusters, as defined by the Perkins Act:
  • Engineering Technology
  • Applied Science
  • Mechanical, Industrial, or Practical Art or Trades
  • Agriculture
  • Business
Student Gender: The gender of Tech Prep students
Students’ Special Population Affiliation: Groups identified by the 1990 Perkins Act:
  • Individuals with Disabilities
  • Educationally Disadvantaged Persons
  • Economically Disadvantaged
  • Limited English Proficiency
  • Participants in Programs Designed to Eliminate Gender Bias
  • Individuals in Correctional Institutions

Note: Student outcomes are aggregated at four levels of analysis.

Outcome Reporting Requirements

In addition to the four levels of analysis specified in Figure 2 for the evaluation of Tech Prep consortia, the Perkins Act specified six broad process outcomes that should be addressed:

  1. Number of enrolled Tech Prep students (or participants).
  2. Retention of students or participants.
  3. Related job placement for Tech Prep program completers.
  4. Number of Tech Prep students entering higher education programs.
  5. Number of Tech Prep students or participants who left or dropped out of the Tech Prep program.
  6. Number of diplomas, degrees, and certificates awarded to Tech Prep program students or participants.

Definitions of the Outcome Measures

Some states have adapted the Tech Prep Implementation Self-Assessment Inventory (Partnership for Academic and Career Education, 1992), which identifies Tech Prep or College Prep students by using information drawn from the post-high school goals identified in students’ four-year plans. The developers of the Minnesota system, however, found it advantageous to focus on both committed and non-committed students. States that evaluate Tech Prep programs based on numbers that include all students who enroll in Tech Prep courses report distorted graduation and dropout rates for their Tech Prep participants: when every enrollee is counted, completion rates of students in Tech Prep programs tend to appear low. Conversely, excluding incidental Tech Prep enrollments (students enrolled in Tech Prep-related courses, but with no intent to complete Tech Prep programs) and counting only students committed to completing Tech Prep programs tends to under-represent the impact of Tech Prep programs, and can suggest misleadingly low dropout rates.
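A small hypothetical example makes the distortion concrete (the numbers below are invented for illustration only, not Minnesota data):

    # Hypothetical cohort: 40 committed Tech Prep Students (30 of whom
    # complete the full program) plus 160 incidental Tech Prep Participants.
    committed = 40
    completers = 30
    incidental = 160

    # Counting every enrollee deflates the apparent completion rate ...
    rate_all_enrollees = completers / (committed + incidental)  # 0.15 (15%)
    # ... while counting only committed students hides the 160 other
    # students the program served.
    rate_committed_only = completers / committed                # 0.75 (75%)
    print(f"{rate_all_enrollees:.0%} vs. {rate_committed_only:.0%}")

Tracking the two groups separately, as the definitions below do, allows both figures to be reported without conflating them.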

In an effort to avoid problems associated with the approaches used in some states (as described above), MRDC researchers designed, developed, and/or adopted the following definitions for use throughout Minnesota’s Tech Prep consortia. The use of these definitions focuses both on "committed" and "uncommitted" students enrolled in Tech Prep.

What is a Tech Prep Program? "[A] Tech Prep education program means a combined secondary and post-secondary program that: (a) leads to an associate degree or two-year certificate; (b) provides technical preparation in at least one field of engineering technology, applied science, mechanical, industrial or practical art or trade, or agriculture, health, or business; (c) builds student competence in mathematics, science, and communications (including through applied academics) through a sequential course of study; and (d) leads to placement in employment." (Department of Education, 1992)

What is a Tech Prep Student? A Tech Prep student is someone who is enrolled in a Tech Prep program and who has made a commitment to complete the entire program.

What is a Tech Prep Participant? A Tech Prep participant is someone who participates in one or more Tech Prep courses but does not make a commitment to completing the entire program.

What is a Tech Prep Drop Out (Leaver)? A Tech Prep Drop Out is someone who made a commitment to completing the entire Tech Prep program, but then left the program without completing all of its requirements. A student is classified as a Tech Prep Drop Out/Leaver if he or she does not re-enroll and complete the program within five years after first dropping out.

What is a High School Drop Out (Leaver)? A High School Drop Out is someone who left high school without completing all of the requirements to receive a high school diploma or certificate.

What is Tech Prep Retention? Retention indicates that a student stays in a Tech Prep program until he or she has completed all required components of the program. This definition accommodates students who leave a program temporarily but re-enroll within five years and subsequently complete the program.

Who is a Tech Prep Completer? A Tech Prep Completer is someone who successfully meets all the Tech Prep program requirements and completes (a) a secondary Tech Prep program, (b) a post-secondary Tech Prep program, and/or (c) an apprenticeship program. Tech Prep completion is certified by a degree, diploma, or certificate from one of the cooperating Tech Prep consortia institutions.
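Taken together, these definitions form a simple decision rule. A minimal sketch of that rule follows; the function and its inputs are hypothetical illustrations of the logic, not part of the MRDC system’s actual software:

    # A minimal sketch of the status definitions above; the inputs are
    # assumed, and the five-year window follows the Drop Out definition.
    def tech_prep_status(committed, completed, years_since_leaving=None):
        """Classify one student per the Minnesota Tech Prep definitions."""
        if not committed:
            return "Tech Prep Participant"      # enrolled, but no commitment
        if completed:
            return "Tech Prep Completer"        # all program requirements met
        if years_since_leaving is not None and years_since_leaving > 5:
            return "Tech Prep Drop Out/Leaver"  # no re-enrollment in 5 years
        return "Tech Prep Student"              # committed, still in progress

    print(tech_prep_status(committed=True, completed=False,
                           years_since_leaving=6))  # Tech Prep Drop Out/Leaver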

The definitions of key Tech Prep-related terminology in Minnesota were developed based on an examination of current Tech Prep literature and with input and feedback from the MRDC’s project advisory committee. During definition development, particular attention was given to the distinction between students who were committed to completing four-year Tech Prep programs and those who were participating in Tech Prep courses without such a commitment. The students making commitments to the full four years (Tech Prep Students) are identified when they complete forms and indicate their plans to complete a four-year sequence of study in a specific career cluster. The other group (Tech Prep Participants) comprises those who enroll in one or more Tech Prep courses but do not indicate a plan to complete the full four-year course of study. With these key terms defined, the researchers were able to develop specific data requirements to guide the design of practical data collection and analysis components that would meet the state and federal Tech Prep evaluation requirements.

Database Development

One of the greatest difficulties that researchers faced in designing the Evaluation System’s database management system was the reorganization and merger of Minnesota’s state universities, community colleges, and technical colleges in 1996. Prior to this merger, there was no centrally accessible storehouse of student data covering all three types of educational institutions. Without a central data management mechanism, each Tech Prep consortium had to collect enrollment, diploma, degree, and certificate information from its partnering technical colleges and then pass this information on to state agency personnel. The researchers concluded that a database management system with access to a statewide pool of student data would be a great asset to the Evaluation System. In addition, the Evaluation System’s use of the recently developed MnSCU database management system would minimize the data collection and reporting burden on Tech Prep consortia.

Efforts to design the student-related database were coordinated with the ongoing development of the data collection tools. Most of the information needed for Tech Prep evaluation efforts was already being collected for other purposes. The data elements from the Tech Prep Identifier Form (TPID) represent the only additional data that had to be collected to implement this Evaluation System. Each data element addresses a specific goal by (a) directly answering an evaluation question, (b) eliminating information duplication, and/or (c) providing tracking information between secondary and post-secondary systems.

Data Collection Mechanisms

Klay (1991) suggested that stakeholder involvement is a critical ingredient for leadership: "A principal assumption underlying assertions that stakeholders should be involved in evaluation, or that evaluation should be furnished to each stakeholder group, is that leadership capacity to utilize information and shape policy [exists] in each group" (p. 283). Thus, Minnesota’s Tech Prep Evaluation data collection efforts have involved stakeholders throughout the development of data collection strategies. As a result, data are drawn from previously existing data collection processes. The Tech Prep data sources consist of the Minnesota High School Follow-Up System (HSFS) (an annual statewide process), the Tech Prep Identifier Form (TPID) (a new data collection process), and post-secondary enrollment data already collected by the Minnesota State Colleges and Universities System.

HSFS is a data collection and reporting system administered by the Minnesota Department of Children, Families, and Learning. Local education agencies (LEAs) use the system to collect information from current and former students. Each LEA uses a common set of instruments and procedures. The selection of this decentralized model was based on the following assumptions:

  1. Local district participation maximizes the possibility that resulting data will be appropriate, and therefore used for [practical policy- and] decision-making efforts;
  2. Students are more likely to respond to a questionnaire from the high school, as compared to a central state agency; and
  3. The large number of school districts involved would make an additional form of centralized data collection cost prohibitive. (Seday, 1992, p. 2)

The goals of the HSFS are to:

  1. Provide information necessary to meet local, state, and federal reporting requirements;
  2. Provide data that will be useful in assisting local and state personnel in planning and evaluating educational programs;
  3. Provide a basis for comparisons of students’ plans prior to leaving high school with their actual activities one year later;
  4. Produce a high rate of response;
  5. Be of reasonable cost; and
  6. Provide consistent information across schools. (Seday, 1992, p. 2)

With a response rate consistently over 90%, the HSFS provides a reliable and consistent means of collecting data on high school students, in general, and Tech Prep students and participants, in particular.

In addition, the Tech Prep Identifier Form (TPID) was developed to augment the HSFS survey, which does not directly address Tech Prep. The primary purpose of the TPID is to identify and distinguish between Tech Prep Participants and Tech Prep Students. TPID forms are filled out by all students enrolled in courses considered part of each consortium’s Tech Prep-related curricula. The TPIDs are completed twice during each school year, in October and in March.

Information related to program outcomes of post-secondary Tech Prep students is obtained from a newly developed MnSCU database management system. Using data provided by post-secondary institutions in Minnesota and out-of-state institutions with reciprocity agreements, the Tech Prep Evaluation System tracks students throughout the post-secondary phase of their four-year Tech Prep programs. This enables state and local personnel to calculate program completion rates and to facilitate long-term research and evaluation regarding the job-related success of Tech Prep students. Such efforts can also be analyzed in terms of factors such as earning power, length of employment, and other data that may be provided by the Minnesota Department of Economic Security (DES).

Data Analysis Mechanisms

The previously described data sources provide the raw data needed to answer the original evaluation questions and to address the issues posed by the Perkins Act. Data analysis processes convert the raw data into usable information. The MRDC’s researchers were able to design a blended Tech Prep student-related database by combining information from the three data sources described above. Subsequently, procedures were specified for combining data from both secondary and post-secondary education sources.

The merger of secondary and post-secondary databases. The construction and pilot-testing of the Tech Prep student-related database had to wait until two events had occurred. The first was the completion of the MnSCU merger (which replaced the State Board of Technical Colleges), and the second was the development of its system-wide "data warehouse." Once MnSCU’s database management system was implemented, database field names and post-secondary data sources were identified. Simultaneously, the first cohort of Tech Prep students began to complete the post-secondary phase of Minnesota’s Tech Prep programs, and certificates and degrees began to be awarded to Tech Prep program completers. Beginning in June 1998, actual outcome data will be collected for the first time in Minnesota.

Pilot-tests of post-secondary student data collection efforts began early in 1997. Both the secondary and post-secondary systems use Microsoft FoxPro software, making the merger relatively easy. The merging procedure matches student social security numbers from existing database tables to create new records in the Tech Prep database. The combined data enable researchers and administrators to generate reports based on the aforementioned 17 Tech Prep evaluation system questions.
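The matching step itself is conceptually simple. A minimal sketch in Python (rather than FoxPro) follows; the table structures and the "ssn" field name are assumptions made for illustration, not the system’s actual schema:

    # Sketch of the record-matching step, assuming each source is a list of
    # dicts and that both carry a social security number field ("ssn").
    def merge_on_ssn(secondary_rows, postsecondary_rows):
        """Join secondary and post-secondary records that share an SSN."""
        post_by_ssn = {row["ssn"]: row for row in postsecondary_rows}
        merged = []
        for sec in secondary_rows:
            post = post_by_ssn.get(sec["ssn"])
            if post is not None:
                # Create one combined Tech Prep record per matched student;
                # the shared "ssn" key holds the same value in both sources.
                merged.append({**sec, **post})
        return merged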

Query analysis and report generation. Data queries and data analyses will be accomplished relatively easily because the student-related database was designed with the 17 evaluation questions in mind. Appropriate combinations of data from the Tech Prep database will be used to answer each evaluation question. The relevant data elements required to address each question are depicted in Table 1. The data fields are organized into a matrix that lists specific information regarding students’ characteristics and Tech Prep program status (see Table 2). These data matrices represent one potential format for reporting key data from the student-related database. The database management system is also designed so that ad-hoc reporting efforts, using the query generation functions of the database software, enable agency staff to answer additional questions that were not initially envisioned by the Evaluation System’s designers.
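As an illustration of how the evaluation questions map onto queries, the sketch below tallies committed Tech Prep Students (question 2) at the four levels of analysis from Figure 2; the record fields are assumed names, not the actual database schema:

    from collections import Counter

    # Hypothetical ad-hoc query: count Tech Prep Students in each
    # consortium/cluster/gender/special-population cell (see Figure 2).
    def count_tech_prep_students(records):
        counts = Counter()
        for r in records:
            if r["status"] == "Tech Prep Student":
                counts[(r["consortium"], r["cluster"],
                        r["gender"], r["special_population"])] += 1
        return counts  # one tally per reporting cell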

Feedback mechanisms. After student-related Tech Prep outcome information becomes available, staff at MnSCU and the Minnesota Department of Children, Families, and Learning will have several options for providing feedback to stakeholders (e.g., Tech Prep consortia and local school districts). Among these potential feedback strategies are research reports, summary documents to consortium coordinators, and information disseminated through the MRDC’s Tech Prep Evaluation Web Page on the Internet (http://mail.wcfe.coled.umn.edu/techprep/). There are two overall goals of the feedback component of the student-related data system. The first goal is to provide outcome evaluation information to decision-makers at the state and consortium levels. The second goal is to provide federal legislators and government agency personnel with evidence regarding the effectiveness of Tech Prep programs, as required by the Perkins Act.

Remaining Challenges

As Minnesota’s Tech Prep Evaluation System has been developed, it has encountered unexpected barriers to success. These barriers are not insurmountable, but they do pose substantial obstacles to the implementation of a Tech Prep Evaluation System. These difficulties include (a) resistance to the use of individual secondary student social security numbers, (b) duplication of data, and (c) the increasing unwillingness of some consortia to participate in the Evaluation System. In addition, Tech Prep is in the process of being reconceptualized as part of Minnesota’s larger School-to-Work initiative. During this transformation, the assessment framework and mechanisms developed for Tech Prep implementation will likely inform the assessment of School-to-Work programs.

Unique Student Record Identifier

The ability to use unique student record identifier numbers represents a key assumption that affects the ability of an evaluation system to effectively access, monitor, and evaluate student outcome data. A data element that identifies each unique individual Tech Prep student is needed to allow the secondary and post-secondary systems to track students as they make the transition from secondary education programs into post-secondary institutions. It is difficult to match data from the HSFS with Tech Prep student records without a unique identifier number for each student. In essence, a unique student record identifier code lies at the heart of any database or information system, and the Minnesota Tech Prep Evaluation System is no different.

The only currently existing unique student identifier code is each student’s social security number. Many individuals and groups within Minnesota and elsewhere, however, strenuously object to using social security numbers on the grounds of data privacy. These individuals doubt the ability (or willingness) of governmental agencies to protect information from inappropriate use. Many of those protesting the use of social security numbers also fear the implementation of computer systems that might allow public employees to view the educational, employment, health, and penal records of private citizens for any reason. This vocal faction has strong representation in Minnesota and has established policies that may restrict the ability of the Tech Prep Evaluation System to collect and analyze data using social security numbers to sort, track, and identify aggregated student outcomes.

Minnesota’s Tech Prep Evaluation System and the Tech Prep Identifier Form were originally designed to use social security numbers as a unique student record identifier code. Administrators should consider using an alternate identifier method until the conditions under which the social security number may be used have been adequately clarified. One alternate method uses a combination of data based on each individual’s first name, middle name, last name, and date of birth as a unique identifier. This approach would significantly increase the complexity, processing time, and expense required to sort and analyze data within the evaluation system. However, the use of name and birth date information may eventually prove more useful than social security numbers because approximately 40% of secondary education students do not provide social security numbers on TPID forms.
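A composite identifier of this kind can be sketched as follows; the normalization rules are assumptions for illustration, and a production system would also need to handle collisions (two students can share a name and birth date):

    from datetime import date

    # Sketch of a composite student key built from name and birth date,
    # for records that arrive without a social security number.
    def composite_id(first, middle, last, birth_date):
        names = "|".join(p.strip().upper() for p in (first, middle, last))
        return f"{names}|{birth_date.isoformat()}"

    print(composite_id("Pat", "J.", "Smith", date(1980, 5, 1)))
    # PAT|J.|SMITH|1980-05-01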

Duplicate Forms for Multiple Classes

Originally, Tech Prep students at the secondary level filled out separate Tech Prep Identifier Forms for each Tech Prep class in which they enrolled. This created duplicate records in the reporting database and led to inaccurate outcome reports. Currently, the system is being modified so that students fill out just one identifier form per semester, regardless of how many Tech Prep classes they take.
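Duplicate records already in the database can also be collapsed at load time. A minimal sketch, assuming each TPID record carries hypothetical "student_id" and "semester" fields:

    # Keep one TPID record per student per semester; later duplicates from
    # additional Tech Prep classes are discarded.
    def dedupe_tpid(records):
        seen = set()
        unique = []
        for r in records:
            key = (r["student_id"], r["semester"])
            if key not in seen:
                seen.add(key)
                unique.append(r)
        return unique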

Unwillingness of Some Consortia to Participate

One large school district in Minnesota has refused to implement efforts to collect data with the TPID, which is necessary to identify Tech Prep students and participants. If this behavior were to become widespread, the Evaluation System would be required to use another process to identify which students are benefiting from Minnesota’s Tech Prep programs.

Evolution of the School-To-Work System and Federal Funding

The funding initially provided by the Perkins Act to Tech Prep consortia has diminished substantially for existing Minnesota consortia. Congress has issued statewide system implementation grants for School-to-Work initiatives, and many educators and agency personnel in Minnesota anticipate that Tech Prep funding will be integrated into the state’s School-to-Work (STW) initiative. This has created uncertainty about the viability and continuance of Tech Prep programs, even though Tech Prep represents one of the foundation components of the STW initiative. Still, some consortia are expected to question the need to participate in Tech Prep evaluation and assessment as Tech Prep is integrated into STW. This has weakened local and state-level support for the implementation of a fully integrated Tech Prep Evaluation System even before it could begin to show how Tech Prep affects student outcomes.

Summary

During the summer of 1997, Minnesota’s first cohort of Tech Prep students will enter the work force or additional higher education programs. Minnesota’s Tech Prep system administrators now wrestle with issues of data privacy and integrity, the decreasing willingness of consortium personnel to participate in evaluation efforts, and the STW initiative’s implications for the Tech Prep Evaluation System. Despite these challenges, Tech Prep evaluation data collected during 1997 represent a valuable tool for assessing the outcomes for Minnesota’s first students to successfully complete their four-year Tech Prep programs.

A complete and comprehensive evaluation of Tech Prep in Minnesota will be achieved as the evaluation system is fully implemented and merged with the requirements of Minnesota’s (yet to be designed or developed) STW evaluation system. Minnesota’s Tech Prep evaluation efforts will be among the first to comprehensively address the requirements for effective accountability systems set out by Hoachlander, Levesque, and Rahn (1992). Minnesota’s Tech Prep Evaluation System can also provide timely and accurate information about Tech Prep programs, help lead the continuous improvement of the Tech Prep concept in Minnesota, and pave the way for an integrated School-to-Work evaluation system. The lessons learned, however, can benefit more than just the students of Minnesota. The student-related data system detailed here can help other states and territories improve their Tech Prep and STW evaluation programs as well. Only sound evaluation programs can provide valid, reliable information that reveals how well Tech Prep programs have actually served the needs of their stakeholders. The Minnesota Tech Prep Evaluation System offers one tool that educators can use to increase their understanding of how Tech Prep has impacted the students it was designed to serve.

Authors

Brown is Associate Professor, Department of Work, Community, and Family Education at the University of Minnesota, St. Paul, Minnesota.

Pucel is Professor, Department of Work, Community, and Family Education, University of Minnesota, St. Paul, Minnesota.

Twohig and Semler are doctoral candidates in the Department of Work, Community, and Family Education, University of Minnesota, St. Paul, Minnesota.

Kuchinke is Assistant Professor, Department of Human Resource Education, University of Illinois, Urbana, Illinois.

References

American Vocational Association. (1992). The Carl D. Perkins vocational and applied technology education act of 1990: The final regulations. Alexandria, VA: Author.

Bragg, D. B. (1997). Educator, student, and employer priorities for tech prep student outcomes (MDS-790). Berkeley: National Center for Research in Vocational Education, University of California at Berkeley.

Bragg, D. B. (1995). Working together to evaluate training. Performance and Instruction, 34(10), 26-31.

Brown, J. M., Pucel, D. J., & Kuchinke, K. P. (1994). Tech prep consortia evaluation system: Training manual. St. Paul, MN: University of Minnesota, Minnesota Research and Development Center for Vocational Education.

Department of Education. (1992). State vocational and applied technology programs and national discretionary programs of vocational education: Final regulations. Part 406, Subpart A, 406.5. Federal Register, August 14, 1992, 36764.

Hammons, F. T. (1995). Florida tech prep education evaluation model. Florida State Department of Education, Division of Applied Technology and Adult Education. (ERIC Document Reproduction Service No. ED 391098)

Hoachlander, E. G., Levesque, K., & Rahn, M. L. (1992). Accountability for vocational education: A practitioner’s guide (Report No. MDS-407). Berkeley, CA: National Center for Research in Vocational Education.

Klay, W. E. (1991). Strategic management and evaluation: Rivals, partners, or just fellow travelers? Evaluation and Program Planning, 14, 281-289.

Logan, J., & Briscoe, M. (1994). Kentucky’s evaluation system for tech prep programs and data reported by secondary and post-secondary Kentucky tech prep programs in 1993. Lexington: University of Kentucky, Institute on Education Reform. (ERIC Document Reproduction Service No. ED 379423)

MacQueen, A. B. (1995, February). Assessing tech prep: A Rhode Island perspective. Paper presented at Workforce 2000, the Annual Conference on Workforce Training of the League for Innovation in the Community College, San Diego, CA. (ERIC Document Reproduction Service No. ED 380155)

Pucel, D. J., Brown, J. M., & Kuchinke, K. P. (1996). Evaluating and improving tech prep: Development, validation, and results of the Minnesota self-assessment model. Journal of Vocational Education Research, 21(2), 79-106.

Roland, C., Cronin, K., Guberman, C., & Morgan, R. (1997). Insights into improving organizational performance. Quality Progress, 30(3), 82.

Ruhland, S. K., Custer, R. L., & Stewart, B. R. (1995, April). Evolving a model for evaluating tech prep implementation. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA. (ERIC Document Reproduction Service No. ED 384650)

Seday, J. (1992). Follow-up 1992: Minnesota high school follow-up class of 1991–one year later. St. Paul, MN: Minnesota Department of Education.

Reference Citation: Brown, J., et al. (1998). Minnesota’s tech prep outcome evaluation model. Journal of Industrial Teacher Education, 35(3), 44-66.