
Volume 28, Number 1
2003


Evaluating Tech Prep Education Programs: Implications for Reporting Program and Student Outcomes

Sheila K. Ruhland
University of Minnesota

Abstract

Tech Prep education programs play a vital role in the education of American youth. During the past decade, with funding from the Perkins Act, Tech Prep consortia have consolidated and developed programs for students. The Carl D. Perkins Vocational and Technical Education Act of 1998 required each state to identify performance levels relevant to career and technical education. Seven essential program elements and four core indicators provide the foundation for evaluating Tech Prep education programs. This work is based upon a review and synthesis of state and local Tech Prep evaluation efforts. This paper identifies an evaluation model to guide those responsible for the planning, data collection, and analysis of Tech Prep program and student outcomes data.

Introduction

The Carl D. Perkins Vocational and Technical Education Act provides federal funds "…to help provide vocational-technical education programs and services to youth and adults" (U.S. Department of Education, n.d., How is the Perkins Act administered by the education department, ¶ 2). Funds from the Perkins Act are awarded to state education agencies and have supported the development and expansion of Tech Prep education programs. Nationally, between 1991 and 1997, more than 1,000 Tech Prep consortia were created, covering approximately 70% of secondary school districts and serving about 90% of all U.S. high school students (Hershey, Silverberg, Owens, & Hulsey, 1998). The 1998 reauthorization of Perkins required states to assess their effectiveness in achieving the goals outlined in their state plans. For most states, this meant developing an evaluation plan and identifying the data to be collected to account for the use of federal funds.

The planning, data collection, and analysis of Tech Prep data at the local, state, and national levels have been minimal. Research conducted by Bragg (1997) with state and local Tech Prep coordinators indicated that time, resources, and turnover of local Tech Prep coordinators are factors that impede efforts to collect and report Tech Prep data on an annual basis. Bragg further reported that recent evaluations of Tech Prep education programs identified many promising trends and challenges:

Of nearly 50% of all local Tech Prep consortia in the United States, 40% reported they had not even begun to implement formal evaluations of their Tech Prep programs. Another 30% indicated their consortia were in the planning stage of evaluation, showing only a minority of Tech Prep consortia were actively implementing formal evaluations, and most of these were very preliminary. (p. 7)

Research indicates that, although Tech Prep program implementation has been widespread, the reporting of student outcomes is unclear or limited (Bragg, Puckett, Reger, Thomas, Ortman, & Dornsife, 1997; Silverberg, Hulsey, & Hershey, 1997). A four-year longitudinal study began in January 1998 to better understand the relationship between Tech Prep program implementation and the reporting of student outcomes (Bragg, 2001). Data collected from eight Tech Prep consortia assessed Tech Prep initiatives and how they influenced students' educational experiences and outcomes. "On average, Tech Prep enrolled about 15 percent of the high school students in these selected consortia during the 1996-97 academic year, and have undoubtedly grown more since that time" (Bragg, 2001, p. ix). Results indicated that at least 65% of the Tech Prep participants enrolled in postsecondary education within one to three years of high school graduation.

Most consortia report plans to develop comprehensive student databases, but thus far they have not implemented them (Silverberg et al., 1997). Research conducted by Brown, Pucel, Twohig, Semler, and Kuchinke (1998) identified two major problems related to Tech Prep evaluation efforts: (a) a lack of specific definitions or criteria to identify a Tech Prep student, and (b) a lack of consistent processes to identify a Tech Prep student. A national Tech Prep study conducted by Ruhland, Custer, and Stewart (1995) concluded that Tech Prep student identification is a critical factor in implementing systemic change in Tech Prep education programs. Without a Tech Prep student definition, consortia are unable to identify the data required to report and evaluate Tech Prep program and student outcomes data. This leads to the inability to evaluate and report Tech Prep education program results within secondary schools and two-year colleges.

The Tech-Prep Education Act does not provide a definition of a Tech Prep student, concentrator, or completer (U.S. Department of Education, 1998). Developing a state definition would provide consistent and useful Tech Prep program and student outcomes data. When a range of Tech Prep definitions is used, difficulties emerge. Barnett (2002) states, "…when data from students identified under different systems is compiled together, the resulting information is not very useful" (p. 61). The task of defining a Tech Prep student has not been easy, and most states continue to struggle with developing a definition.

The National Association of Tech Prep Leadership (NATPL) developed definitions for Tech Prep secondary and postsecondary students by surveying state Tech Prep coordinators. The NATPL Executive Committee and Research Committee (C. Jurgens, personal communication, November 14, 2000) provided the following definitions. A Tech Prep secondary student has indicated a Tech Prep career pathway and is enrolled in a Tech Prep course of study that: (1) includes a technical component; (2) consists of a minimum of two years secondary and two years of postsecondary study; (3) is carried out under a written articulation agreement; (4) may allow the student to earn postsecondary credit while in secondary school; and (5) leads to a specific postsecondary two-year certificate, degree, technical diploma, apprenticeship, or baccalaureate degree.

A Tech Prep postsecondary student is enrolled in a two-year certificate, degree, technical diploma, or apprenticeship program and has participated in a secondary Tech Prep course of study that: (1) included a technical component, (2) consisted of a minimum of two years at the secondary level, (3) was carried out under a written articulation agreement, and (4) may have allowed the student to transfer in postsecondary credit earned at the secondary school. NATPL defines a Tech Prep completer as a student who has participated in both the secondary and postsecondary portions of the recognized education plan and has received an appropriate postsecondary two-year certificate, degree, technical diploma, or apprenticeship license.
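To illustrate how a consortium might operationalize the NATPL definition within a student information system, the following minimal sketch encodes the secondary-student criteria as a simple eligibility check. The record fields and their names are hypothetical, not drawn from any state system; criterion 4 (the option to earn postsecondary credit while in secondary school) is permissive rather than required, so it is not tested.

```python
from dataclasses import dataclass

@dataclass
class StudentRecord:
    """Hypothetical consortium record; field names are illustrative only."""
    declared_pathway: bool          # has indicated a Tech Prep career pathway
    technical_component: bool       # course of study includes a technical component
    secondary_years: int            # years of secondary Tech Prep study
    postsecondary_years: int        # years of planned postsecondary study
    articulation_agreement: bool    # carried out under a written articulation agreement
    credential_goal: str            # credential the course of study leads to

# Credentials named in the NATPL definition
RECOGNIZED_CREDENTIALS = {
    "two-year certificate", "degree", "technical diploma",
    "apprenticeship", "baccalaureate degree",
}

def is_tech_prep_secondary(record: StudentRecord) -> bool:
    """Apply the NATPL criteria for a Tech Prep secondary student."""
    return (
        record.declared_pathway
        and record.technical_component
        and record.secondary_years >= 2
        and record.postsecondary_years >= 2
        and record.articulation_agreement
        and record.credential_goal in RECOGNIZED_CREDENTIALS
    )

print(is_tech_prep_secondary(StudentRecord(True, True, 2, 2, True, "degree")))  # True
```

Applying one such check uniformly across a consortium's secondary schools is one way to address the inconsistent identification processes noted by Brown et al. (1998).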

Purpose

The primary purpose of the work described herein was to identify and recommend a Tech Prep evaluation model to assist state and local Tech Prep personnel with the evaluation of Tech Prep education programs. Because state and local Tech Prep evaluation requirements vary, consortia have considerable flexibility in designing an evaluation model to meet their individual needs. A secondary purpose was to identify Tech Prep program outcomes data to assist with the evaluation of the Tech Prep seven essential program elements, and Tech Prep student outcomes data to assist with the evaluation of the Perkins four core indicators, in order to meet the accountability reporting requirements of the Carl D. Perkins Vocational and Technical Education Act of 1998.

Literature Review

The literature review presents an analysis of four evaluation models, describes the Tech Prep seven program elements, and defines the Perkins four core indicators for performance reporting.

Evaluation Models

Evaluation is defined as "a systematic study of a particular program or set of events over a period of time in order to assess effectiveness" (Hitchcock & Hughes, 1989, p. 7). Program evaluations assess how well a program has worked in terms of its stated goals. Methods of evaluation range from individual reviews of performance to statewide assessments. Evaluation may occur at regular intervals throughout a program to measure progress (formative), or may occur at the end of a time period to summarize the results (summative) (Dutton, Hammons, Hudis, & Owens, 1994).

A critical aspect of program evaluation is designing an evaluation model. "An evaluation model not only provides the overall framework for evaluation but also gives shape to the research questions, organizes and focuses the evaluation, and informs the process of inquiry" (Conrad & Wilson, 1985, p. 19). Previous research has not identified any one model as the best approach to evaluating Tech Prep education programs. Table 1 summarizes four evaluation models, identifying the individuals who have written about each model, its primary uses, and its benefits and limitations (Worthen, Sanders, & Fitzpatrick, 1997).

Table 1
Analysis of Evaluation Models
[Table not reproduced here.]
Note. Adapted from Worthen, Sanders, and Fitzpatrick (1997, pp. 179-181).

The objectives-oriented model (Worthen et al., 1997) determines the extent to which objectives are being achieved. This model has measurable objectives and uses instruments to gather data. The management-oriented model (Worthen et al.) assists with decision-making. This model evaluates all stages of program development and is often used for accountability. The expertise-oriented model (Worthen et al.) provides professional judgments and is often used with self-study and accreditation. The participant-oriented model (Worthen et al.) responds to an audience's requirement for information. This model focuses on description and judgment, with emphasis on understanding the information collected.

An important purpose of conducting a Tech Prep evaluation is to enhance program improvement. Evaluation processes should be integrated into Tech Prep education program planning so that the results of the evaluation can be used to guide decision-making and future planning. Ultimately, this should lead to action involving "program change, innovation or improvement" (Barak & Breier, 1990, p. 58). Three strategies for linking Tech Prep evaluation results with program improvement are described below: action-planning, continuous quality improvement, and the Malcolm Baldrige criteria.

Action-planning. Action-planning helps individuals or groups follow through on what they have learned from the program evaluation. This strategy can be initiated at the state or local consortium level and involves participants from secondary schools and two-year colleges. Action-planning starts by engaging stakeholders in reflecting carefully on the results of the evaluation and putting the results into context. This process identifies the strengths and weaknesses of the Tech Prep education program and the evaluation process. Following this reflection, stakeholders prioritize the issues and set new goals, focusing on activities that have a high impact on student and program outcomes.

Once the priorities are set, the action plan can be developed. A typical action plan often describes the goals, objectives, strategies to address those objectives, potential barriers, and needed resources (human, technical, and funding). The action plan describes the state deliverables, responsibilities, and timelines for achieving the goals. Criteria and monitoring methods (e.g., who, when, how) should be specified. Ultimately, the action plan should provide the framework for achieving the Tech Prep education program goals.

An example of a consortium that has incorporated action-planning is the Mid-Minnesota School-to-Careers/Tech Prep Consortium (Schroeder, 2000). Consortium planning starts with the Tech Prep leadership team identifying goals for each of Minnesota's seven Tech Prep indicators. The leadership team then identifies strategies to assist the nine secondary schools within the consortium in developing their local plans. Each secondary school has the flexibility to identify the individuals responsible for completing the action plan. The plan is designed to meet the individual secondary school's needs. Resources are identified for each strategy. During the spring, summaries of the action-planning results are submitted to the local Tech Prep consortium director. The nine secondary schools meet to discuss their individual activities and the goals achieved.

Continuous quality improvement. Continuous quality improvement (CQI) is an approach to quality management that focuses on the process rather than the individual. CQI stems from a range of sources, including the quality movement, total quality management, and the Japanese kaizen view of quality, which focuses on the process rather than the results (Dixon & Swiler, 1990). Tech Prep consortia can apply this strategy by involving a range of consortium members when conducting the Tech Prep program evaluation.

Quality in Tech Prep may be defined by legislation or by local, regional, or state policies. Data collected to evaluate Tech Prep education programs should document state or local consortia quality indicators. The National Association of Tech Prep Leadership (NATPL, 1999, 2003) has developed a list of quality indicators to provide a consistent vision for Tech Prep education programs. The quality indicators are written for five integral Tech Prep program components: (a) accountability/sustainability, (b) student opportunities, (c) curriculum, (d) articulation, and (e) professional development. These indicators provide a benchmark for Tech Prep continuous quality improvement.

Texas has developed a minimum of two quality measures for each of the 10 review areas included in its Tech Prep consortium site visits (Texas Higher Education Coordinating Board, 2000). A site visit includes the review of each measure, which the reviewer rates as "meets standard" or "does not meet standard." As a result, measures meeting the standard can be expanded, and measures that do not meet the standard can be addressed and corrected. This process keeps the consortium in a continuous quality improvement cycle.

Malcolm Baldrige. The Malcolm Baldrige National Quality Award provides criteria that serve as a management guide for quality improvement in America (Baldrige National Quality Program, n.d.). Ross (1993) states that the common themes of the Malcolm Baldrige award are customer-driven quality, continuous improvement, measurement, participation, leadership, and management by data (rather than by experience or intuition). Within the Malcolm Baldrige National Quality Program, criteria have been identified to help U.S. education organizations improve their performance. The core values and concepts for educational excellence are embodied in seven categories: (a) leadership, (b) strategic planning, (c) student and stakeholder focus, (d) information and analysis, (e) faculty and staff focus, (f) educational and support process management, and (g) organizational and performance results. The criteria focus on five organizational performance areas: (a) student performance results, (b) student- and stakeholder-focused results, (c) budgetary and financial results, (d) faculty and staff results, and (e) organizational effectiveness results. The Baldrige Education Criteria for Performance Excellence provide a valuable framework for assessing and measuring Tech Prep education programs.

In Minnesota, a Tech Prep Self-Evaluation System model (Pucel, Brown, & Kuchinke, 1996) was designed using the Malcolm Baldrige National Quality Award criteria. The model "…was designed to gather data on program outcome measures to monitor actual consortium productivity, and to gather self-evaluation data as a basis for program improvement" (p. 82). Tech Prep stakeholders were involved in the process, and data were collected to help consortia effectively identify areas of improvement.

Tech Prep Seven Program Elements

The evaluation of Tech Prep education programs offers many benefits, including compliance with the rules and regulations of the federal act. Two key benefits are improving programs and providing accountability (Boulmetis & Dutwin, 2000; Connell & Mason, 1995; Logan, 1999). Section 204 of the Carl D. Perkins Vocational and Technical Education Act of 1998 (U.S. Department of Education, 1998) outlines the content for Tech Prep education programs. Each consortium receiving Perkins funding is required to submit, as part of its state plan, a five-year plan for the development and implementation of Tech Prep education programs. Evaluating Tech Prep for program improvement purposes will enable state and local consortia to identify both strengths and areas for improvement. Program improvement efforts can be targeted for short-term (i.e., less than one year) or long-term (i.e., more than a year) planning and implementation.

State and local consortia will need to collect program outcomes data for each of the Tech Prep seven essential program elements required in Perkins II and Perkins III. The seven elements are: (a) articulation agreement, (b) appropriate curriculum design, (c) curriculum development, (d) in-service teacher training, (e) counselor training, (f) equal access for special populations, and (g) preparatory services. A brief description of each Tech Prep program element follows.

Articulation agreements. The Perkins Act (Section 204) requires that each Tech Prep education program be carried out under an articulation agreement between the participants in the consortium (U.S. Department of Education, 1998). Articulation agreements link secondary schools with two-year postsecondary institutions through nonduplicative sequences of courses in career fields. Some states have developed one articulation agreement for all schools in the consortium (i.e., covering all pertinent courses in both secondary and postsecondary institutions), and others have developed individual articulation agreements for each course within the consortium.

Appropriate curriculum design. Section 204 of the Perkins Act (U.S. Department of Education, 1998) requires that each Tech Prep education program have appropriate curriculum design:

Consist of at least 2 years of secondary school preceding graduation and 2 years or more of higher education, or an apprenticeship program of at least 2 years following secondary instruction, with a common core of required proficiency in mathematics, science, reading, writing, communications, and technologies designed to lead to an associate's degree or a postsecondary certificate in a specific career field. (112 STAT. 3119)

Curriculum development. Section 204 of the Perkins Act (U.S. Department of Education, 1998) requires that each Tech Prep education program include the development of Tech Prep curricula for both secondary and postsecondary participants in the consortium that: (1) meet academic standards developed by the state, (2) link secondary schools and two-year postsecondary institutions, and if possible and practicable, four-year institutions of higher education, (3) use, if appropriate and available, work-based learning, and (4) use educational technology and distance learning.

In-service teacher training. Section 204 of the Perkins Act (U.S. Department of Education, 1998) requires that each Tech Prep education program include in-service training for teachers that:

(A) is designed to train vocational and technical teachers to effectively implement Tech Prep programs;
(B) provides for joint training for teachers in the Tech Prep consortium;
(C) is designed to ensure that teachers and administrators stay current with the needs, expectations, and methods of business and all aspects of an industry;
(D) focuses on training postsecondary education faculty in the use of contextual and applied curricula and instruction; and
(E) provides training in the use and application of technology. (112 STAT. 3119)

Counselor training. Section 204 of the Perkins Act (U.S. Department of Education, 1998) requires that Tech Prep education programs include training for counselors designed to enable them to more effectively:

(A) provide information to students regarding Tech Prep education programs;
(B) support student progress in completing Tech Prep programs;
(C) provide information on related employment opportunities;
(D) ensure that such students are placed in appropriate employment; and
(E) stay current with the needs, expectations, and methods of business and all aspects of an industry. (112 STAT. 3119)

Equal access for special populations. Section 204 of the Perkins Act requires that each Tech Prep education program "…provide equal access to the full range of technical preparation programs to individuals who are members of special populations, including the development of tech-prep program services appropriate to the needs of special populations" (U.S. Department of Education, 1998, 112 STAT. 3120).

Preparatory services. Section 204 of the Perkins Act requires that each Tech Prep education program provide for preparatory services that assist participants in tech-prep programs (U.S. Department of Education, 1998). Preparatory services include outreach to potential career and technical education students, career and personal counseling, and vocational assessment and testing. Preparatory services are provided to students not yet enrolled in Tech Prep, and delivery of these services occurs before the 11th grade.

Perkins Four Core Indicators

The federal act requires evaluation of Tech Prep education programs. Under Section 113 of the Carl D. Perkins Vocational and Technical Education Act of 1998 (U.S. Department of Education, 1998), each eligible agency must identify in its state plan core indicators of performance for vocational and technical education that include, at a minimum, measures of each of the following:

(i) Student attainment of challenging State established academic, and vocational and technical skill proficiencies.
(ii) Student attainment of a secondary school diploma or its recognized equivalent, a proficiency credential in conjunction with a secondary school diploma, or a postsecondary degree or credential.
(iii) Placement in, retention in, and completion of, postsecondary education or advanced training, placement in military service, or placement or retention in employment.
(iv) Student participation in and completion of vocational and technical education programs that lead to nontraditional training and employment. (112 STAT. 3087)

The Office of Vocational and Adult Education (OVAE) has developed a core indicator framework to assist with the Perkins III requirements for performance reporting. Each state plan must identify performance measures for the core indicators (OVAE, 2000). A performance measure is defined as "…the type of outcome that is considered appropriate for monitoring" (Hoachlander, Levesque, & Rahn, 1992, p. 9). For each of the Perkins four core indicators, states must establish valid and reliable performance measures that specify levels of performance which can at a minimum "…(I) be expressed in a percentage or numerical form, so as to be objective, quantifiable, and measurable, and (II) require the State to continually make progress toward improving the performance of vocational and technical education students" (U.S. Department of Education, 1998, 112 STAT. 3088).
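The statutory language above reduces to a simple computation: each performance level is a rate over a defined student population, compared year over year to demonstrate progress. The following minimal sketch shows one way a state or consortium might compute such a level; the counts and the prior-year figure are hypothetical.

```python
def performance_level(numerator: int, denominator: int) -> float:
    """Express a performance measure in percentage form, as Perkins III requires."""
    if denominator == 0:
        raise ValueError("no students in the measurement population")
    return 100.0 * numerator / denominator

def shows_progress(current_pct: float, prior_pct: float) -> bool:
    """Perkins III expects continual progress toward improved performance."""
    return current_pct > prior_pct

# Hypothetical counts: 412 of 504 Tech Prep concentrators completed high
# school graduation requirements this year; last year's level was 78.5%.
current = performance_level(412, 504)   # about 81.7
print(f"{current:.1f}% (progress: {shows_progress(current, 78.5)})")
```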

The core indicator framework provides a guideline for all career and technical education programs. Because Tech Prep is a subset of career and technical education programs, not all of the core indicators may apply. Tech Prep evaluators select the core indicators and performance measures that are relevant to their Tech Prep education program and align them with any pertinent state Tech Prep efforts. The core indicators of performance require states to report secondary and postsecondary Tech Prep student outcomes data.

Student outcomes are defined as changes that occur in individuals as a result of participation in an educational experience (Bragg, 1992). Student outcomes data can be collected for each of the Perkins III core indicators: (a) student attainment, (b) credential attainment, (c) placement and retention, and (d) participation in and completion of non-traditional programs. The student attainment indicator assesses student attainment of challenging state-established academic and vocational and technical skill proficiencies at both the secondary and postsecondary levels. Credential attainment assesses student attainment of a secondary school diploma or its recognized equivalent, a proficiency credential in conjunction with a secondary diploma, or a postsecondary degree or credential. The placement and retention indicator assesses vocational and technical education students' placement in, retention in, and completion of postsecondary education or advanced training, placement in military service, or placement or retention in employment. The non-traditional programs indicator assesses student participation in and completion of vocational and technical education programs that lead to non-traditional training and employment.

Methodology

An extensive literature review was conducted, and conversations were held with individuals known to have researched and published about Tech Prep. Conversations about Tech Prep, Tech Prep evaluation, and the reporting of Tech Prep program and student outcomes data were held by telephone and through on-site interviews between March and September 2000. Individuals who indicated in a 1997 Local Tech Prep Implementation Follow-Up survey (Bragg, 1997) that their consortium was at the "advanced stage" of Tech Prep evaluation were contacted by telephone between March and April 2000. Of the 63 individuals initially identified, 41 (65%) completed telephone interviews.

These individuals were asked questions related to Tech Prep evaluation efforts in their local consortium. Questions included: (a) What techniques are used to evaluate the Tech Prep education program?, (b) What types of data collection methods are used to collect Tech Prep program and student outcomes data?, and (c) Does your consortium have a Tech Prep evaluation plan? Individuals contacted were asked to send copies of Tech Prep evaluation documents that the consortium had developed. Fifteen (37%) individuals followed up and sent information or evaluation documents for the researcher to review.

The researcher reviewed 34 Perkins III 2000-2004 state plans at the Office of Vocational and Adult Education (OVAE) in Washington, DC, in April 2000. The state plans were reviewed to gather examples of Perkins core indicator and program performance measures submitted by states that would assist with identifying student outcomes data. In addition, state plans were reviewed to identify evaluation plans, if any, submitted by states.

In addition to the telephone interviews and the review of the Perkins III state plans, telephone contacts were made in April 2000 with state department of education personnel and individuals who had conducted research related to Tech Prep. These individuals were asked to identify consortia in their state that were making progress in the area of Tech Prep evaluation. Seven Tech Prep consortia were identified and contacted for an on-site interview. Conversations were held with state Tech Prep directors and local Tech Prep consortium directors from Florida, Minnesota, Missouri (2), Montana, Oregon, and Wisconsin between May and August 2000. Questions included: (a) What is the structure of the Tech Prep consortium?, (b) What program outcomes are essential to determine program quality, effectiveness, and goal attainment?, (c) What student outcomes are essential to determine program quality, effectiveness, and goal attainment?, (d) What methods are used to collect data for the Tech Prep seven essential program elements?, and (e) What methods are used to collect data for the Perkins four core indicators? Information collected from these conversations and the review of documents assisted with identifying a Tech Prep evaluation model and program and student outcomes data collection questions.

Tech Prep Evaluation Model

Some states have developed statewide Tech Prep evaluation models that provide a framework for evaluating Tech Prep for accountability and program improvement purposes. The descriptions that follow summarize current evaluation efforts in Connecticut, Florida, Illinois, Texas, and West Virginia. Complete copies of the evaluation documents described are available in Measuring Tech Prep Excellence: A Practitioner's Guide to Evaluation (Ruhland & Timms, 2001).

Connecticut has developed the Tech Prep Success Analysis and Measurement Indicators for state Tech Prep secondary and postsecondary participants (Connecticut State Department of Education, n.d.). The 11-item indicator analysis centers on various Tech Prep components. The analysis uses quantitative responses related to program and student outcomes. The indicators cover articulation agreements, 2 + 2 program design, student diversity, employer satisfaction, and student participation, completion, and employment.

Florida's Tech Prep Consortia Annual Report has been designed to enhance the quality, effectiveness, and achievement of Tech Prep goals for each consortium. Beginning January 1, 1993, Florida International University was awarded a project entitled Performance-Based Project for the Development of a Florida Statewide Plan for Evaluation (Hammons, 1995). The project's activities included the planning, development, and implementation of a statewide plan to evaluate Tech Prep activities. The review included collecting data to assist with the preparation of an annual report from each consortium. The annual report provides information on a consortium's accomplishments and identifies measurable benchmarks that can be used for future comparisons.

In Illinois, prior to 1998, local consortia and the Illinois State Board of Education (ISBE) carried out evaluations of Tech Prep education programs, but these activities typically addressed a distinct aspect of Tech Prep rather than an entire program. To address this void, the Tech Prep Evaluation System for Illinois (TPESI) was developed through an initiative involving the Office of Community College Research and Leadership (OCCRL) at the University of Illinois at Urbana-Champaign (UIUC), the ISBE, and the Illinois Community College Board (ICCB) (Bragg, 1998). Goals that guide the TPESI system and provide a rationale for Tech Prep evaluation include: (1) describe the status of Tech Prep implementation in Illinois, (2) identify participants in Tech Prep and describe how the participation of various Tech Prep student groups changes over time, (3) identify the benefits (outcomes) of Tech Prep for students, especially outcomes linked to student learning, (4) identify the benefits (outcomes) of Tech Prep for other stakeholder groups, and (5) discern strategies that support the continuous improvement of Tech Prep within consortia statewide and at the state level.

For each secondary school and two-year college site visit, team members rate the implementation stage and quality of the eight Tech Prep essential elements and the eight Tech Prep supporting elements as part of the School Assessment Form. The eight Tech Prep essential elements are: (a) a 2 + 2 program that leads to an associate degree, (b) articulation, (c) curriculum development, (d) in-service training for teachers, (e) in-service training for counselors, (f) equal access for special populations, (g) preparatory services, and (h) work-based learning experiences. The eight Tech Prep supporting elements are: (a) leadership, organization, and administrative support, (b) parental support, (c) business/labor/community involvement, (d) transition of students to postsecondary education, (e) secondary/postsecondary collaboration, (f) identification and accurate reporting of Tech Prep students, (g) evaluation and program improvement, and (h) integrated, contextual strategies. The Consortium Assessment Form assesses each essential element and supporting element for the consortium overall. This form includes (a) stage of implementation, (b) quality of element, and (c) additional comments and recommendations. A matrix is provided for each element that includes a description of program and student outcome measures.

Texas developed a site-based peer review process to assess each consortium in a range of areas and subareas (Texas Higher Education Coordinating Board, 2000). Review areas include program, instruction, counseling, professional development, marketing, budgeting, planning, student success, and evaluation. For each subarea, criteria are provided along with measurement statements, core standard descriptions, and recommended resources. Reviewers must assess whether the consortium does not meet, meets, or exceeds the standard, and they must provide explanatory comments.

West Virginia's Tech Prep standards are based on 20 "STARS" (Strategies That Advance Reform in West Virginia Schools) that cover areas including curricula, stakeholder support, marketing, and assessment measures (West Virginia Department of Education, n.d.). Each of the STARS identifies specific performance concepts, associated documentation data, and suggested strategies to achieve STARS standards. The documentation data section provides a list of items that consortium members can review to identify standards. All of the STARS have been compared with the National Association of Tech Prep Leadership's (NATPL) quality indicators and the Perkins III four core indicators. The STARS are rated based on the presence or absence of STARS documentation data. Consortia conduct a self-assessment of the STARS each year and submit the findings to the state Tech Prep director. Once every three years, an on-site technical review by the state Tech Prep director and a team of local Tech Prep coordinators follows this self-assessment.

The management-oriented evaluation model is recommended for evaluating Tech Prep education programs. The rationale for recommending the management-oriented model is that "evaluative information is an essential part of good decision making and that the evaluator can be most effective by serving administrators, policy makers, boards, practitioners, and others who need good evaluative information" (Worthen et al., 1997, p. 97). In a management-oriented approach, Tech Prep education programs can be evaluated following a four-step approach. An overview of the four-step approach is provided in Table 2 (Dutton et al., 1994; Fleishman, 1995; Levesque, Bradby, Rossi, & Teitelbaum, 1998). It is important to reiterate that when evaluating Tech Prep education programs, both program and student outcomes data should be collected. The four-step approach to planning and conducting an evaluation gives state and local Tech Prep personnel an evaluation model with which to begin the process of evaluating Tech Prep education programs.

Table 2
A Four-step Approach to Planning and Conducting an Evaluation
1. Identify the objectives of the Tech Prep program evaluation.
2. Choose the evaluation method.
3. Collect the data.
4. Analyze and communicate the results.

The management-oriented evaluation model is useful to guide program improvement. Each year, when local Tech Prep consortia prepare their local plans, information obtained from the Tech Prep program evaluation can identify new activities and modify existing activities. This process will also assist with the allocation of funds to support both state and local Tech Prep activities. "This evaluation approach has also been used for accountability purposes" (Worthen et al., 1997, p. 103). Using the management-oriented evaluation model will provide the data needed in response to Perkins III accountability requirements.

Tech Prep Program Outcomes Data

Each consortium receiving Perkins funding is required to submit, as part of its state plan, a five-year plan for the development and implementation of Tech Prep education programs. These plans are expected to report on the seven essential program elements required of Tech Prep education programs: (a) articulation agreements, (b) appropriate curriculum design, (c) curriculum development, (d) in-service teacher training, (e) counselor training, (f) equal access for special populations, and (g) preparatory services. Information obtained from conversations with Tech Prep personnel and review of Tech Prep documents assisted with identifying Tech Prep program outcomes data collection questions. The following list of questions serves as a beginning step for local consortia's Tech Prep program outcomes data collection efforts. Those responsible for the evaluation of Tech Prep education programs can modify any of the questions and develop additional questions specific to their local Tech Prep program goals and evaluation needs as outlined in their local plan.

Articulation agreements. Suggested Tech Prep data collection questions to evaluate articulation agreements:

1. What process is used for developing articulation agreements?
2. What evidence exists that articulation agreements reflect a minimum of a 2 + 2 program of study for each Tech Prep career pathway?
3. What is the process to evaluate articulation agreements?
4. What evidence exists that articulation agreements are being used?

Appropriate curriculum design. Suggested Tech Prep data collection questions to evaluate curriculum design:

1. How is the Tech Prep education program structured (e.g., 2 + 2, 2 + 2 + 2, or 4 + 2)?
2. What is the percent increase in the number of students completing a two-year college program within three years of initial entry compared to the previous year's baseline data?
3. How is the curriculum designed to ensure a common core of required proficiencies in mathematics, science, reading, writing, communications, and technologies that leads to an associate's degree or two-year certificate in a specific career field?
4. What is the process to evaluate curriculum design?

Curriculum development. Suggested Tech Prep data collection questions to evaluate curriculum development:

1. Is there a decrease in the number of students in remedial courses who enroll in a two-year college program the semester following high school graduation compared to the previous year's baseline data?
2. Is there an increase in the percentage of students enrolled in work-based learning experiences linked to industry skills standards and state-issued skill certificates compared to the previous year's baseline data?
3. How are secondary faculty and two- and four-year college faculty working together to plan, develop, and implement a Tech Prep education program of study?
4. How are career exploration and planning courses made available to students?

In-service teacher training. Suggested Tech Prep data collection questions to evaluate in-service teacher training:

1. What have teachers learned as a result of participating in staff development activities?
2. How do new or substantially revised academic courses emphasize contextual learning?
3. What internship opportunities are provided to inform teachers and administrators of industry work sites and labor force expectations?
4. What is the process to evaluate in-service teacher training activities?

Counselor training. Suggested Tech Prep data collection questions to evaluate counselor training:

1. What staff development activities have been provided to counselors to assist with the counseling and advising of Tech Prep students in secondary schools and two-year colleges?
2. Is there an increase in the number of students enrolling in a two-year college the semester following high school graduation compared to the previous year's enrollment in two-year colleges?
3. What are the indicators to show an increase in Tech Prep awareness among high school and two-year college counselors?
4. Do students in grades 9 to 12 prepare a written career plan that outlines high school work and/or high school to two- or four-year education plans leading to future employment?

Equal access for special populations. Suggested Tech Prep data collection questions to evaluate equal access for special populations:

1. How do promotional items for Tech Prep marketing reflect educational equity for special populations?
2. How is Tech Prep serving special populations?
3. What Tech Prep experiences have benefited special populations in secondary schools and two-year colleges?
4. How are services provided to allow equal access for special populations?

Preparatory services. Suggested Tech Prep data collection questions to evaluate preparatory services:

1. What types of services does the consortium provide to assist students in secondary schools in the selection of, or preparation for, an appropriate Tech Prep education program of study?
2. Does the consortium have a Tech Prep marketing plan? How is the marketing plan implemented and evaluated?
3. What is the process to evaluate preparatory services?
4. What are the promotional activities for students in grades 8 to 12, parents, businesses, and community members?

Tech Prep Student Outcomes Data

In the literature review, the Perkins III four core indicators accountability requirements were described. The four core indicators are: (a) student attainment, (b) credential attainment, (c) placement and retention, and (d) participation in and completion of non-traditional programs. Based upon the conversations with Tech Prep personnel and review of Tech Prep documents, the following student outcomes data collection questions are suggested. Each example specifies the type of quantifiable measure (e.g., percentage, number), a timeline (e.g., two semesters, six months, one year), and the sample population (e.g., Tech Prep student, completer, concentrator); the measure and/or timeline values have been left blank, to be completed based upon a state or local consortium's benchmarks or standards. Items labeled "S" report secondary student outcomes data, and items labeled "P" report postsecondary student outcomes data. These examples demonstrate different approaches to measuring the Perkins four core indicators (a sketch of how one such measure might be computed follows the examples). The student outcomes data can be analyzed, summarized, and submitted as part of the state plan for vocational education and the reporting of Perkins III accountability requirements.

Core indicator 1: Student attainment. Suggested Tech Prep data collection questions to evaluate student attainment:

1S1. ___ percent of Tech Prep concentrators will complete the high school graduation requirements.
1S2. The Tech Prep student score on a licensure or certification examination (for those fields in which licensure or certification is required), industry-endorsed competency examination, or state-recognized test will increase by ___ percent by ___ (insert year).
1P1. ___ percent of matriculated postsecondary Tech Prep students who enrolled in the fall of each year in academic and career and technical courses will successfully complete the courses as measured by credits earned at the end of the semester.
1P2. ___ percent of postsecondary Tech Prep students will have attained a degree, a certificate, apprenticeship, or industry certification two years following enrollment in the degree program.

Core indicator 2: Credential attainment. Suggested Tech Prep data collection questions to evaluate credential attainment:

2S1. The rate at which secondary Tech Prep concentrators become completers will be ___ percent for _____(insert year) and ___ percent over four years.
2S2. ___ percent of Tech Prep students who graduate with a high school diploma will equal or exceed the statewide graduation rate each year.
2P1. ___ percent of the postsecondary Tech Prep students will obtain an associate degree or technical certificate within three years of enrolling in the degree program.
2P2. The rate at which postsecondary Tech Prep concentrators become completers will be ___ percent for ___ (insert year) and ___ percent over four years.

Core indicator 3: Placement and retention. Suggested Tech Prep data collection questions to evaluate placement and retention:

3S1. Within one year of high school graduation, at least ___ percent of Tech Prep concentrators will matriculate into a postsecondary education program or registered apprenticeship.
3S2. ___ percent of Tech Prep concentrators who respond to the follow-up survey will still be engaged in postsecondary education and/or employment within one year of graduation.
3P1. The number of Tech Prep students who obtained employment directly related to their postsecondary degree has increased by ___ percent.
3P2. ___ percent of postsecondary completers (two-year) articulated credits to a four-year institution and are pursuing baccalaureate degrees.

Core indicator 4: Participation in and completion of non-traditional programs. Suggested Tech Prep data collection questions to evaluate participation in and completion of non-traditional programs:

4S1. At least ___ percent of Tech Prep students in underrepresented gender groups will be enrolled in courses that have been identified as leading to nontraditional employment for that gender.
4S2. The number of nontraditional Tech Prep students who enrolled in and completed a career and technical education program within industry clusters will be ___.
4P1. ___ percent of postsecondary Tech Prep students participating in a nontraditional career and technical education program will be from underrepresented gender groups.
4P2. The percentage of postsecondary Tech Prep students by gender graduating from nontraditional degree programs during the most recent academic year will increase by ____ percent.
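As referenced above, the following sketch illustrates how a consortium might compute one of these measures, 3S1 (the percent of Tech Prep concentrators matriculating into postsecondary education or a registered apprenticeship within one year of graduation), by linking secondary graduation records with postsecondary enrollment records. The record layouts and field names are hypothetical; in practice, the two sectors would need to agree on a shared student identifier, which speaks directly to the record-matching difficulties noted in the conclusions.

```python
from datetime import date, timedelta

def measure_3s1(graduates: list, enrollments: list) -> float:
    """Percent of Tech Prep concentrators who matriculate into postsecondary
    education or a registered apprenticeship within one year of graduation.
    Record layouts are hypothetical illustrations only."""
    enroll_date_by_id = {e["student_id"]: e["enroll_date"] for e in enrollments}
    concentrators = [g for g in graduates if g["tech_prep_concentrator"]]
    if not concentrators:
        raise ValueError("no Tech Prep concentrators in the graduating class")
    matriculated = sum(
        1 for g in concentrators
        if g["student_id"] in enroll_date_by_id
        and enroll_date_by_id[g["student_id"]] <= g["grad_date"] + timedelta(days=365)
    )
    return 100.0 * matriculated / len(concentrators)

# Hypothetical records: two concentrators graduate; one enrolls that fall.
grads = [
    {"student_id": "A1", "tech_prep_concentrator": True, "grad_date": date(2001, 6, 1)},
    {"student_id": "B2", "tech_prep_concentrator": True, "grad_date": date(2001, 6, 1)},
]
enrolls = [{"student_id": "A1", "enroll_date": date(2001, 9, 4)}]
print(f"3S1 = {measure_3s1(grads, enrolls):.0f}%")  # 3S1 = 50%
```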

Conclusions

As with any new initiative, those promoting change must be careful to educate and gain the commitment and involvement of stakeholders. Including stakeholders in the evaluation of the Tech Prep education program can facilitate this. As the evaluation process begins, communication should occur frequently among stakeholders to allow for questions to be asked and information to be distributed (Dutton et al., 1994; Fleishman, 1995; Levesque et al., 1998).

Due to the variability of state and local Tech Prep education programs, the evaluation model selected should meet the reporting requirements of the local and/or state plan. This paper recommends the management-oriented evaluation model to evaluate Tech Prep education programs. This model involves four steps: (1) identifying the objectives of the Tech Prep program evaluation, (2) choosing the evaluation method, (3) collecting the data, and (4) analyzing and communicating results. With the increased accountability requirements at the federal level, this model provides the information that policymakers will need to support the reauthorization of Perkins III.

Those responsible for Tech Prep evaluation should not anticipate undertaking a major evaluation effort the first year, but rather focus on two or three evaluation objectives that will provide data related to the Tech Prep program goals. These objectives will lead to additional objectives that will assist with evaluation occurring at regular intervals throughout the program ( Dutton, et al., 1994 ). After the Tech Prep evaluation is completed, it is important to review the results, identify future Tech Prep education program plans, and set new goals for program improvement. Tech Prep consortia should build systems and structures to support and promote continuous improvement within the Tech Prep education program. State leadership is critical if we are to optimize the return on investments of federal funds and to support the reauthorization of Perkins III.

Those evaluating Tech Prep education programs will continue to face challenges. The lack of common definitions for secondary and postsecondary Tech Prep students, concentrators, and completers continues to limit the usefulness of the data. Efforts need to continue to support common definitions for data collection and reporting. If all states were required to use common definitions, Tech Prep data could be reported, analyzed, and summarized from a national perspective. This would eliminate one of the persistent criticisms from the federal level: that the data collected do not represent a national perspective.

As this paper has reported, efforts to collect and report Tech Prep program and student outcomes data have been minimal. States report that communication between secondary and postsecondary schools is minimal and that Tech Prep students are often lost track of at matriculation. Further, the reported lack of data collection systems at the postsecondary level has resulted in minimal data with which to report program and student outcomes. State Tech Prep directors and local Tech Prep coordinators need to begin by identifying what data must be collected to report program and student outcomes, and discussions need to take place with those who can assist with the data collection process. An initial step may be to revise the graduate follow-up survey to include the Tech Prep program and student outcomes questions suggested in this paper. Agreeing upon at least one Tech Prep data collection question, and deciding how to collect the data, is a major step in beginning the Tech Prep evaluation process. If all states took this approach, reporting of Tech Prep data at a national level could provide the information policymakers need to support reauthorization of Perkins III, including continued funding for Tech Prep education programs.

References

Baldrige National Quality Program. (n.d.). Education criteria for performance excellence. Retrieved April 17, 2003, from http://www.quality.nist.gov/education_criteria.htm

Barak, R. J., & Breier, B. E. (1990). Successful program review. San Francisco: Jossey-Bass.

Barnett, E. (2002). Counting Tech Prep students. Techniques, 77(1), 60-61.

Boulmetis, J., & Dutwin, P. (2000). The ABCs of evaluation: Timeless techniques for program and project managers. San Francisco: Jossey-Bass.

Bragg, D. D. (1992). Alternative approaches to outcomes assessment for postsecondary vocational education (MDS-239). Berkeley: University of California at Berkeley, National Center for Research in Vocational Education.

Bragg, D. D. (1997). Educator, student, and employer priorities for Tech Prep student outcomes (MDS-790). Berkeley: University of California at Berkeley, National Center for Research in Vocational Education.

Bragg, D. D. (1998). Tech Prep evaluation system for Illinois (TPESI). Champaign: University of Illinois at Urbana-Champaign.

Bragg, D. D. (2001). Promising outcomes for Tech Prep participants in eight local consortia: A summary of initial results. St. Paul: University of Minnesota, National Research Center for Career and Technical Education.

Bragg, D. D., Puckett, P. A., Reger, W. M., IV, Thomas, H. S., Ortman, J., & Dornsife, C. J. (1997). Tech Prep/School-to-work partnerships: More trends and challenges (MDS-1078). Berkeley: University of California at Berkeley, National Center for Research in Vocational Education.

Brown, J. M., Pucel, D., Twohig, C., Semler, S., & Kuchinke, P. (1998). Minnesota's Tech Prep outcome evaluation model. Journal of Industrial Teacher Education, 35(3), 44-66.

Connecticut State Department of Education. (n.d.). Tech Prep success analysis and measurement indicators. Middletown, CT: Author.

Connell, T. J., & Mason, S. A. (1995). School to work transition: Issues and strategies for evaluation and program improvement. Paper presented at the annual meeting of the American Educational Research Association, San Francisco. (ERIC Document Reproduction Service No. ED 383 905)

Conrad, C. F., & Wilson, R. F. (1985). Academic program review: Institutional approaches, expectations, and controversies (ASHE-ERIC Higher Education Report No. 5). Washington, DC: Association for the Study of Higher Education.

Dixon, G., & Swiler, J. (Eds.). (1990). Total quality handbook: The executive guide to the new way of doing business. Minneapolis: Lakewood.

Dutton, M., Hammons, F., Hudis, P., & Owens, T. (1994). Evaluating your Tech Prep program. Waco, TX: Center for Occupational Research and Development.

Fleishman, H. L. (1995). Is it working? Self-help guide for evaluating vocational and adult education programs. Washington, DC: Office of Vocational and Adult Education.

Hammons, F. T. (1995). Florida Tech Prep evaluation model (Document Request Number GE 334 BK 94). Tallahassee: Florida Department of Education. Available from http://www.fiu.edu/~xiwc/techprep_eval.htm

Hershey, A. M., Silverberg, M. K., Owens, T., & Hulsey, L. K. (1998). Focus for the future: The final report of the national Tech Prep evaluation. Princeton, NJ: Mathematica Policy Research.

Hitchcock, G., & Hughes, D. (1989). Research and the teacher. New York: Routledge.

Hoachlander, E. G., Levesque, K., & Rahn, M. L. (1992). Accountability for vocational education: A practitioner's guide (MDS-407). Berkeley: University of California at Berkeley, National Center for Research in Vocational Education.

Levesque, K., Bradby, D., Rossi, K., & Teitelbaum, P. (1998). At your fingertips: Using everyday data to improve schools. Walnut Grove, CA: George Lithograph.

Logan, J. P. (1999). Kentucky's Tech Prep evaluation system: A five-year review. Paper presented at the annual meeting of the American Educational Research Association, Montreal. (ERIC Document Reproduction Service No. ED 431 098)

National Association of Tech Prep Leadership. (1999). Tech Prep program quality indicators. Author.

National Association of Tech Prep Leadership. (2003). Tech Prep program quality indicators. Author.

Office of Vocational and Adult Education. (2000). Core indicator framework. Washington, DC: U.S. Department of Education.

Pucel, D. J., Brown, J. M., & Kuchinke, K. P. (1996). Evaluating and improving Tech Prep: Development, validation, and results of the Minnesota self-assessment model. Journal of Vocational Education Research, 21(2), 79-106.

Ross, J. E. (1993). Total quality management: Text, cases and readings. Delray Beach, FL: St. Lucie.

Ruhland, S. K., & Timms, D. M. (2001). Measuring Tech Prep excellence: A practitioner's guide to evaluation. St. Paul, MN: National Research Center for Career and Technical Education.

Ruhland, S. K., Custer, R. L., & Stewart, B. R. (1995, April). Evolving a model for evaluating Tech Prep implementation. Paper presented at the annual meeting of the American Educational Research Association, San Francisco. (ERIC Document Reproduction Service No. ED 384 650)

Schroeder, D. E. (2000). Local school implementation committee Tech Prep action plan. Hutchinson, MN: Mid-Minnesota School-to-Careers/Tech Prep Consortium.

Silverberg, M. K., Hulsey, L. K., & Hershey, A. M. (1997). Heading students towards career horizons: Tech Prep implementation progress, 1993-1995. Washington, DC: U.S. Department of Education.

Texas Higher Education Coordinating Board. (2000). Tech Prep consortium site evaluation. Austin, TX: Community and Technical Colleges Division.

U.S. Department of Education. (1998). The Carl D. Perkins Vocational and Technical Education Act, Public Law 105-332. Retrieved April 17, 2003, from http://frwebgate.access.gpo.gov/cgi-Bin/getdoc.cgi?dbname=105_cong_public_laws&docid=f:pub1332.105.pdf

U.S. Department of Education. (n.d.). The Carl D. Perkins Vocational and Technical Education Act, Public Law 105-332. Retrieved April 17, 2003, from http://www.ed.gov/offices/OVAE/CTE/perkins.html

West Virginia Department of Education. (n.d.). West Virginia's "Stars". Charleston, WV: Author.

Worthen, B. R., Sanders, J. R., & Fitzpatrick, J. L. (1997). Program evaluation: Alternative approaches and practical guidelines (2nd ed.). New York: Longman.

Notes

This article has been extracted from "Measuring Tech Prep Excellence: A Practitioner's Guide to Evaluation" as part of the program of work of the National Research Center for Career and Technical Education at the University of Minnesota funded by the U.S. Department of Education, Office of Vocational and Adult Education.

SHEILA K. RUHLAND is Assistant Professor in The Department of Work, Community, and Family Education, University of Minnesota, 420 Vocational and Technical Education Building, 1954 Buford Avenue, St. Paul, MN 55108. e-mail: ruhla006@umn.edu