Journal of Vocational Education Research

Volume 26, Number 2
2001


Implementing an Assessment Plan to Document Student Learning in a Two-Year Technical College

Sheila K. Ruhland
University of Minnesota
Jerrilyn A. Brewer
Western Wisconsin Technical College

Abstract

During the past several years, reports have pointed to the inadequate skills of graduates and to the changing demands of the workplace, prompting colleges to develop assessment plans that determine whether students have mastered the knowledge, skills, and abilities business and industry require. This research study describes a process to engage faculty and administration in the development and implementation of an assessment plan to document student learning in a two-year technical college setting. The college's assessment plan emerged with faculty involvement and was not driven solely by administration. Student learning outcomes and assessment measures were identified for 55 associate degree and technical diploma programs at Western Wisconsin Technical College. The most common student learning outcome, identified by faculty for 25 of the 55 programs at Western, was "demonstrate effective communication skills." The most common assessment measures identified by faculty included performance tasks, portfolios, and checklists.

As a result of increased calls for institutional accountability in general and continuing emphasis on assessment of student learning in particular, higher education, including two-year technical colleges, has focused its efforts on implementing effective assessment programs. Student learning outcomes are rapidly taking center stage as the principal gauge of higher education's effectiveness. Student learning outcomes are measures of how students' college experiences supported their development as individuals (Frye, 1999). Outcomes include the knowledge, skills, and attitudes that determine what students know now that they did not know before their college experience. Assessing student learning outcomes denotes any process used to gather data in order to make a judgment about student learning (Glatthorn, 1999). Ewell (2001) suggests, "assessment of student learning outcomes is most appropriately defined…as the processes that an institution or program uses to gather direct evidence about the attainment of student learning outcomes, engaged in for purposes of judging (and improving) overall instructional performance" (p. 7).

Assessment measures are designed to determine whether student learning outcomes have been achieved. Academic units use a variety of assessment measures; the measures identified reflect differences among program areas, faculty members and their experiences, and constraints within a department or college. In the case of two-year technical colleges, the need to identify student learning outcomes and assessment measures comes partly from policymakers, who believe it is essential to measure the degree to which students have mastered the knowledge, skills, and abilities necessary to obtain successful employment after graduating from a two-year college program (Stecher et al., 1997). "Employers and elected officials have never been clearer in their demand that the graduates of America's colleges and universities should possess an increasingly specific set of higher order literacies and communication skills" (Ewell, 2001, p. 1).

Purpose and Research Questions

This research study describes the process Western Wisconsin Technical College (Western), a two-year, associate degree-granting institution, undertook to develop and implement an assessment plan designed to document student learning. The assessment plan's key components included three phases: identification of student learning outcomes for each program; identification of appropriate assessment measures and criteria by which student learning could be judged; and use of the assessment information to improve programs.

College leaders believed that the quality of student learning at Western could be improved by identifying measurable student learning outcomes; identifying and using multiple measures to assess those outcomes; collecting and interpreting data from those measures; disseminating the results in structured feedback loops; and using the information derived from the assessment of student learning to make pedagogical and curricular changes. This, in turn, would enhance the college's commitment to student success. Two essential elements of this assessment plan were administrative support and faculty involvement. Assessment of student learning was considered an integral component of the college's continuous quality improvement process and a strategy to meet the college's goals.

The process researched in this study aimed to address the third criterion the North Central Association of Colleges and Schools Commission on Institutions of Higher Education (NCA) requires for accreditation: "the institution is accomplishing its educational and other purposes" (North Central Association of Colleges and Schools Commission on Institutions of Higher Education, 1997, p. 65). NCA examines each criterion through patterns of evidence, that is, areas of institutional activity or concern related to satisfying the criterion. Within this criterion, colleges must document the assessment of appropriate student learning in all of their programs. Three kinds of student evidence are required:

  1. Proficiency in skills and competencies essential for all college-educated adults;
  2. Completion of an identifiable and coherent undergraduate level general education component; and
  3. Mastery of the level of knowledge appropriate to the degree granted. (NCA, 1997, p. 65)

Specifically, this research study attempted to address the following research questions:

  1. How did Western engage faculty and administration to develop and implement its assessment plan?
  2. How were student learning outcomes identified within divisions and across all four divisions at Western?
  3. How were assessment measures identified within divisions and across all four divisions at Western?

Literature Review

Student Learning Outcomes

Focusing on student learning outcomes creates a paradigm shift from a teacher-centered model to a student-centered model. According to Glenn (2000), a student-centered focus means "creating objectives that identify what students need to do to demonstrate learning rather than ones that identify what they need to understand" (p. 12). A student-centered environment involves letting go of traditional structures that have long been in place in higher education. To do this, Cross (1998) suggests, "students and their learning should become the focus of everything we do…from the instruction that we provide, to the intellectual climate that we create, to the policy decisions that we make" (p. 1).

Student learning outcomes must be clearly stated so that students know what is expected of them and what they will be able to do after completing the activities designed to achieve those outcomes. The key word is do, which requires the use of action verbs when developing learning outcomes. Outcomes encompass a range of student activities and abilities and are written in the cognitive, affective, and psychomotor domains (Bloom, 1956).

There are many benefits to identifying learning outcomes. Jenkins and Unwin (1996) assert that learning outcomes:

  1. Help students learn more effectively. They know where they stand, and the curriculum is made more open to them.
  2. Make it clear what students can hope to gain from following a particular course or lecture.
  3. Help instructors to design their materials more effectively by acting as a template for them.
  4. Help instructors select the appropriate teaching strategy.
  5. Help instructors tell their colleagues more precisely what a particular activity is designed to achieve.
  6. Assist in setting examinations based on the materials delivered.
  7. Ensure that appropriate assessment strategies are employed.

The League for Innovation identified 21st century skills that students will need to acquire by graduation (Wilson, Miles, Baker, & Schoenberger, 2000). These include (a) communication, (b) computation, (c) community, (d) critical thinking and problem solving, (e) information management, (f) interpersonal, (g) personal, and (h) technology skills. The authors argued that community colleges should define student learning outcomes and assessment measures to certify skill attainment.

Copa and Ammentorp (1997) identified learning expectations for two-year institutions that could be written for the college as a whole or as student learning outcomes for a specific program. Learners exiting a two-year institution would have competencies in the context of work, family, and community, including the ability to (a) function in a diffuse and complex environment, (b) work independently and collaboratively, (c) make decisions, (d) use information, (e) communicate ideas, (f) solve problems and take advantage of opportunities, (g) produce results in an area of endeavor, and (h) manage one's own continuous learning.

For the purpose of this study, student learning outcomes are defined as a culminating demonstration of learning as applied in the workplace: what is expected of the learner who successfully completes all of the course work and learning experiences that are part of a technical program (Ruhland, Samson, Brewer, & Hague, 1997). Student learning outcomes are (a) identified by faculty, (b) validated by employers and advisory committee members, (c) communicated to students, and (d) assessed at program completion (Ruhland, 1998).

Assessment Measures

Following the development of student learning outcomes, the next step is to identify appropriate assessment measures to determine whether students have achieved the outcomes. Identifying these measures requires faculty to work together, set common goals and standards, devise methods of assessment, interpret the results, and use the results to improve and coordinate teaching (Holyer, 1998). Assessment has been variously interpreted as (a) a means of improving student learning; (b) accountability for the quality of learning; (c) traditional and authentic measures of student learning; and (d) measures that show students have mastered the knowledge, skills, and abilities essential for employment.

In 1995, Bragg identified outcomes assessment practices in two-year postsecondary institutions, defining outcomes assessment as an "evaluative process that determines the results of education at any level, i.e., student, program, or institutional" (p. 23). The primary outcomes assessed were "academic attainment, work skill attainment, and employer satisfaction with the job performance of current students or graduates" (Bragg, 1995, p. 30). The most common assessment measures and methods included portfolios, capstone (senior) projects, exams, and faculty and student surveys.

Using direct and indirect assessment measures and methods enables teachers to determine whether students have achieved the student learning outcomes. From a student's perspective, assessment measures tell learners whether they have mastered the expected knowledge, skills, and abilities, and they hold students accountable for achieving the student learning outcomes. Assessment measures range from written tests to performance tasks to cumulative portfolios (Stecher et al., 1997). Direct assessment measures of student learning include:

  1. Tests (e.g., commercial; standardized; locally-developed; oral examinations; and national licenses, certification or professional examinations).
  2. Competency-Based Measures (e.g., performance appraisals and simulations).
  3. External Reports (e.g., senior projects, student exhibitions, performances, interviews, portfolios, capstone experiences, and classroom observations). (Assessment Measures and Methods, n.d.; Christopher Newport University, n.d.)

Tests may be given at the beginning of a course or program (pre-test) or at its completion (post-test). Tests are a common method of assessing student learning because they take little time to administer, and the results are simple to report and understand. Competency-based measures assess the demonstration of acquired skills. This type of assessment measure is appropriate for "professional training programs with well-defined skill development" (Christopher Newport University, n.d., p. 3). Performance appraisals and simulations provide external and internal validity of skill assessment. External reports are comprehensive and provide feedback for both student and program evaluation; they are examples of alternative or authentic measures. "These assessment measures require students to 'perform' in some way: by writing, demonstrating, explaining, or constructing a project or experiment" (What are promising ways to assess student learning, 1996, para. 1).

Archival records, surveys and questionnaires, and interviews are indirect assessment measures. Indirect measures typically gather data about a student's educational experience rather than the knowledge, skills, and abilities acquired by completing a degree. Examples of archival records are retention rates, graduation rates, and transfer rates. Surveys and questionnaires obtain feedback from stakeholders regarding the institution. Interviews provide immediate feedback and are useful for obtaining sensitive information. Indirect measures are sources that "enhance the information gathered from direct measures of a student's academic achievement" (Assessment measures and methods, n.d., p. 1).

It is important for faculty to understand the different types of direct and indirect assessment measures before they select the measures that will be used to collect data on whether student learning outcomes have been achieved. "The best assessment at the classroom level generally consists of a combination of several different types of measures that capitalize on the strengths and compensates for the weaknesses of each measure" (VanHuss, 1996, p. 9). Academic units within a college will use different assessment measures; these differences reflect the kinds of knowledge, skills, and abilities students must master and the variations among programs.

Forces Driving Assessment Planning

Assessment Planning

Most institutions of higher education still struggle to create an assessment plan that works. The assessment plan should demonstrate the quality of education the institution is providing, and students today want a learning experience that is relevant and stimulating. California State University (CSU), Chico implemented a student learning outcomes assessment plan in 1995 (Jacob, 1998): "We came to see assessment as an important tool for measuring the success of student learning in our units, and hence the quality of our programs, as well" (p. 4).

A college's assessment plan should emerge with faculty involvement and not be driven solely by administration. Three questions formed the basis of CSU's assessment plan:

  1. What does it mean to be in a major, and what should a student in this major know?
  2. To what learning processes should the student be exposed to learn those things?
  3. How would you assess whether or not the student learned those things? (Jacob, 1998, p. 6)

As a result of the three questions, program faculty refined learning objectives, submitted curriculum changes to support the learning objectives, and identified appropriate assessment measures related to the desired student learning outcomes. CSU's assessment plan invited faculty members and departments to take ownership of the assessment process and recognize the value that student learning outcomes would bring to their programs. According to Jacob (1998):

The most successful assessment plans are those which are dynamic in nature. They should include "feedback loops" that identify problems based on measured results, and implement corrections. The assessment plan should remain one in the process of ongoing change and improvement. (p. 9)

Accrediting Agencies

Regional accrediting agencies continue to encourage the development and implementation of student learning outcomes and assessment measures. In 1989, NCA approved an assessment initiative requiring its member institutions to submit plans for assessing student learning (Lopez, 1997). Institutions affiliated with NCA began to develop assessment plans to meet this initiative. The North Central Association of Colleges and Schools Commission on Institutions of Higher Education (1997) states:

The program to assess student learning should emerge from and be sustained by a faculty and administrative commitment to excellent teaching and effective learning; provide explicit and public statements regarding the institution's expectations for student learning; and use the information gained from systematic collection and examination of assessment data both to document and improve student learning. A strong assessment program is founded on a plan that is widely accepted and routinely updated; it is ongoing, and it is related to other planning and budgeting processes. (p. 42)

Lopez (1998a) contends that assessment of student learning is key to (1) improving student learning, (2) enabling an institution to verify that it is being accountable to its internal and external stakeholders, and (3) documenting for the general public and interested parties the value of investing in higher education. Lopez suggests that three levels of implementation characterize an institution's assessment plan: Level One, Beginning Implementation of Assessment Programs; Level Two, Making Progress in Implementing Assessment Programs; and Level Three, Maturing Stages of Continuous Improvement. Each level represents a stage in an institution's developmental approach to implementing an assessment plan focused on the assessment of student learning.

With this policy statement and NCA's subsequent requirement that institutions assess their quality by directly assessing student learning, higher education leaders began working with faculty. Professional development opportunities were provided to help faculty identify student learning outcomes, measure student learning of those outcomes, and use the results for improvement.

Other regional accrediting bodies have responded to sustained public dissatisfaction with higher education by increasing their emphasis on institutional integrity and accountability as conditions of accredited status. The Middle States Commission on Higher Education (1996) developed a framework for outcomes assessment, which is reviewed as part of the accreditation process; the framework explains the purposes and contexts for the assessment strategies institutions may choose. The New England Association of Schools and Colleges (NEASC) (2001) has eleven standards for accreditation. Standard Four, Programs and Instruction, requires that (a) academic planning and evaluation enhance the achievement of program objectives and (b) all existing programs include an assessment of their effectiveness and continued need (p. 6). NEASC requires institutions to have a student outcomes assessment plan aimed at assessing student achievement; results from the plan are used for institutional improvement. The Southern Association of Colleges and Schools (1997) includes institutional effectiveness as one of six criteria for accreditation: "Each member institution is expected to document quality and effectiveness by employing a comprehensive system of planning and evaluation" (p. 17). Institutions must develop guidelines to evaluate the quality of student learning and employ a variety of assessment methods. Standard Three of the Western Association of Schools and Colleges (n.d.) addresses institutional effectiveness. Within this standard, Section C, Institutional Outcomes Assessment, requires institutions to (a) define outcomes and clearly document their achievement, (b) use planning activities to communicate quality assurance, and (c) review evaluation processes to determine their ongoing utility for assessing institutional effectiveness (p. 3).

Faculty at institutions of higher education were given the task of developing strategies to assess student learning at the completion of a program. NCA found that faculty and academic administrators typically have needed considerable encouragement, training, and assistance with the following components (Lopez, 1998a):

  1. Developing measurable objectives of student learning;
  2. Identifying and utilizing multiple measures of student learning to assess those objectives;
  3. Collecting and interpreting the data from those instruments;
  4. Disseminating the information about the results of assessment in structured feedback loops that provide faculty and administration with timely and useful information on which to base recommended changes; and
  5. Using the information derived from the assessment of student learning to ascertain desirable pedagogical and curricular changes and to introduce and evaluate the effectiveness of the changes. (p. 5)

Each of these accrediting agencies has developed criteria based on experience, research, and consultation with member institutions. These initiatives will place pressure on accredited institutions to demonstrate that students have learned what is promised or implied by the catalog's descriptions of academic programs and that continual improvement of student learning has become an institutional priority (Lopez, 1998b). Ewell (2001) suggests, "a range of other forces have stimulated…accreditation's interest in examining student learning outcomes. Foremost among them are rapidly changing modes of instructional delivery and a burgeoning competency movement in corporate training" (p. 3).

Workforce Development Skills

In addition to regional accrediting agencies, researchers have produced numerous reports that define and discuss the skills students need for the workforce and the jobs of the future. The skills identified go beyond technical skills, which continue to be important. Employers are expressing concern about an increasing need for workers with problem-solving, decision-making, and teamwork skills (Conrad, 1999). Educational institutions need to partner with business and industry to ensure that future employees have the essential skills required for today's workforce.

The Secretary's Commission on Achieving Necessary Skills (SCANS) (1991) examined the changes in the world of work and their implications for learning. For students preparing for work, the commission identified five competencies and a three-part foundation of basic skills, thinking skills, and personal qualities. The five competencies were (a) resources, (b) interpersonal skills, (c) information, (d) systems, and (e) technology (Secretary's Commission on Achieving Necessary Skills, 1991). These eight components were considered integral to every student's school life. The competencies and foundation skills can be used when developing student learning outcomes and, when measured, will indicate whether the student has acquired the knowledge, skills, and abilities required in today's workplace.

Imel (1999) conducted a review of research related to workforce preparation. The skills mentioned most frequently included knowing how to learn; competence in reading, writing, and computation; effective listening and oral communication skills; adaptability; self-esteem and initiative; interpersonal skills and the ability to work in teams; leadership; and basic technology skills. The importance of these skills for workers at all levels of employment reinforces the need for student learning outcomes and assessment measures in two-year colleges across the United States.

Conceptual Framework

The development and implementation of an assessment plan at Western was based on a broader conceptual framework grounded in the theory and practice of Continuous Quality Improvement (CQI). The concept of CQI is connected to the principles of Total Quality Management (TQM): "TQM views outcomes assessment as having a place in determining quality" (Bryce, 1991). TQM focuses on improvement and looks at the effectiveness of the whole system. Data from assessment measures tied to student learning outcomes can be fed back into the system for overall improvement.

Inherent in CQI theory and practice are certain basic principles that Western already practiced: strong administrative leadership; institutional support for identified priorities; faculty leadership and involvement; and collaborative decision-making. These essential components served as the impetus and provided the focus for implementation of the college's assessment plan and for faculty and administrative involvement. Because the college had been engaged in CQI efforts for over ten years, faculty did not view implementing the assessment plan as additional work. They saw assessment as part of the commitment they make to student success and to continuous improvement.

Identification of student learning outcomes supports overall program effectiveness and documents student learning. Essential for any CQI process is a commitment to continuous improvement at all levels of the organization. Western used as its conceptual framework for assessment a model that delineated different levels of effectiveness (see Figure 1): institutional, program, course, and classroom effectiveness. Central to the successful implementation of an institution's assessment plan is the belief by faculty and administrators that assessing student learning is not an "add on" but an essential component of the teaching/learning process. Appropriate assessment measures can directly examine and judge a student's actual performance on significant and relevant tasks.

Figure 1
Levels of effectiveness

Methodology

The research methodology used in this study was a case study that included "multiple units of analysis" (Yin, 1994, p. 39). Data for this study were collected using direct observation and document analysis; documents, along with observation of individuals, can be used in a case study (Fraenkel & Wallen, 1993).

Western is one of sixteen publicly funded technical colleges in the Wisconsin Technical College System. Its main campus is located in La Crosse, Wisconsin, and it has five extended campuses located in smaller communities throughout the Western district. A 1998 follow-up of graduates reported that 97% were employed within six months of graduating and that 79% of those employed were working in a program-related field. The college offers 37 associate of applied science degree programs, 18 technical diploma programs, 4 certificates, and 7 special certificate programs. Western serves approximately 3,500 full-time equivalent students and employs approximately 200 full-time faculty, who are represented by a collective bargaining unit. The college has been preparing students for the employers of the district since 1917, has been actively involved in continuous quality improvement efforts since 1987, and has a high level of faculty involvement in decision-making and work teams.

Faculty and administrators participating in Western's college in-service programs were observed, and documents were analyzed from the 55 associate degree and technical diploma programs. Programs were representative of Western's four divisions and included 15 Business programs, 5 Family and Consumer Science programs, 18 Health and Human Services programs, and 17 Trades and Industrial Education programs. Faculty teams from each of the program areas identified the student learning outcomes and assessment measures and subsequently provided feedback about program improvement and curriculum changes as a result of Western's assessment plan.

Procedures

Multiple observations and document analysis were used to collect data for this study (Yin, 1994); using multiple methods helped establish construct validity and reliability. Direct observation was used at the college's August 1998 in-service, where researchers observed how faculty within each program team responded to the activity, how well they understood the student learning outcomes, and how engaged they were in the overall process. At the college's January 1999 in-service, the researchers observed the program teams to listen to the conversations and the decisions about which assessment measures would be identified for each student learning outcome. Direct observation was chosen because it was the method least likely to affect the actions of the program teams being observed. Direct observation provided data for research questions 1, 2, and 3.

Documents analyzed included: (a) Western's assessment plan, (b) agendas and minutes from assessment committee meetings, (c) agendas, handouts, and minutes from the college's August 1998 and January 1999 in-services, and (d) 55 program documents that identified student learning outcomes and assessment measures. An analysis of each program document provided the common student learning outcomes and assessment measures for research questions 2 and 3.

Results

The first research question asked how Western engaged faculty and administration in the process of developing and implementing an assessment plan. Initial development of the assessment plan was a direct response to NCA's mandate that an assessment plan be submitted. However, the approach used to engage faculty and administration was consistent with Western's approach to all its major initiatives: strong administrative leadership and broad faculty involvement. These two essential components made implementation of the assessment plan seem like a natural outgrowth of the faculty's previous work to enhance student success, a critical emphasis of the college.

To begin the process, Western formed an assessment steering committee representing faculty from each of the four divisions within the college: Business, Family and Consumer Science, Health and Human Services, and Trades and Industrial Education. The committee was composed of 12 faculty members and four administrators; the Dean of Business and the Dean of General Education co-chaired the committee. NCA states in the Handbook of Accreditation that "the program to assess student learning should emerge from and be sustained by faculty and administrative commitment to excellent teaching and effective learning" (NCA, 1997, p. 42). For faculty to embrace the assessment plan early on, they needed to be engaged in the process, which also ensured the plan's legitimacy.

The assessment steering committee met during Spring 1998 to identify Western's core abilities. Western defines core abilities as broad student learning outcomes, skills, or purposes that are addressed throughout a course and are essential for students to succeed both in the classroom and on the job. Identifying core abilities allowed program faculty to focus on the student learning outcomes specific to their programs. The six core abilities were:

  1. Demonstrate technical and other occupational skills needed to obtain employment and be successful in the workplace.
  2. Demonstrate literacy skills essential to the requirements of the workplace.
  3. Demonstrate mathematical skills essential to the requirements of the workplace.
  4. Think critically in solving problems and applying knowledge.
  5. Demonstrate self-awareness and interpersonal skills needed to be a good citizen and work collaboratively with a diverse range of people.
  6. Use information technology effectively.

Following this task, the committee began planning the college's August 1998 all-day in-service. Western uses an in-service approach to provide additional education, training, and workshops for its faculty and staff. This format is familiar to staff and has been used successfully when the college needs blocks of time to involve faculty and staff in activities, and it was the model used to implement the college's assessment plan. Time was set aside to work on assessment during in-service days and during division meetings in the 1998-1999 academic year.

Infrastructures had been developed to facilitate the ongoing work of the assessment steering committee: release time for faculty to work on assessment activities, use of in-service days to provide training and assistance to faculty, and support for the entire effort provided by one department. This combination of factors and a collaborative environment helped implement the assessment plan.

The second research question focused on determining how faculty and administration identified student learning outcomes for each of the 55 programs at Western. This process also identified the common student learning outcomes within divisions and across all four divisions. The college's August 1998 in-service had two major goals: (a) to identify student learning outcomes for Western's 55 programs, and (b) to identify methods to validate those student learning outcomes. The in-service agenda included (a) an introduction to and definition of student learning outcomes, (b) examples of student learning outcomes, (c) an overview of the validation process, and (d) an activity to engage faculty and administration in developing student learning outcomes. Members of the assessment steering committee facilitated the work groups, and a template was provided to list the student learning outcomes for each program area.

Faculty obtained copies of Bloom's (1956) taxonomy of educational objectives, and emphasis was placed on writing student learning outcomes at the most complex levels, which are represented by the top-level verbs in each domain (e.g., evaluation, internalization, origination). The cognitive domain includes verbs categorized as application, analysis, synthesis, and evaluation; the affective domain includes valuing, organization, and internalization; and the psychomotor domain includes guided response, mechanism, complex overt response, adaptation, and origination. Using action verbs from the top levels ensures that outcomes describe the most complex learner actions rather than the most basic ones; for example, "evaluate a care plan" demands more of the learner than "list the elements of a care plan." Following the college in-service, program teams had until October 1998 to identify their student learning outcomes and submit them to the assessment steering committee for review.

After the student learning outcomes were reviewed by the assessment steering committee and discussed with each program team, validation meetings were scheduled to be completed by June 1999. The validation process involved program advisory committee members and employers who met to discuss the student learning outcomes and to make additions to or deletions from the existing list. Following the validation, student learning outcomes were communicated to Western's stakeholders.

The total number of student learning outcomes varied across the four divisions and 55 programs at Western. The Computer Information Systems (CIS)-Microcomputer Technician and the Diesel and Heavy Equipment Technician diploma programs each reported six student learning outcomes, while the Human Resource-Business Administration associate degree program reported 20. The average number of student learning outcomes per program was 15 in the business division, 16 in the family and consumer sciences division, 10 in the health and human services division, and 11 in the trade and industrial education division.

Of all the student learning outcomes reported, only 12 were common across the 55 programs at Western. Table 1 lists these common student learning outcomes by division and for the college as a whole, in order of frequency for the total college. Student learning outcomes identified by only one program are not reported.

"Demonstrate effective communication (oral and written) skills" was the most common student learning outcome for business programs (73%), family and consumer sciences programs (100%), and health and human services programs (39%). "Establish a safe work environment, adhere to safety procedures" was the most common student learning outcome for trade and industrial education programs (47%). "Demonstrate effective communication (oral and written) skills" was the most common student learning outcome identified by 45% of Western's programs.

Table 1
Common Student Learning Outcomes by Division and College
Student Learning Outcome | B % | FCS % | HHS % | TI % | College %
Demonstrate effective communication (oral and written) skills. | 73 | 100 | 39 | 12 | 45
Demonstrate use of computer tools, software, and appropriate applications. | 67 | 40 | 11 | 6 | 27
Apply legal and ethical principles to personal, social, and professional behavior. | 67 | 40 | 11 | 6 | 27
Function as a team member. | 47 | 0 | 12 | 6 | 20
Establish a safe work environment, adhere to safety procedures. | 0 | 0 | 17 | 47 | 20
Think critically in solving problems and applying knowledge. | 27 | 0 | 17 | 12 | 16
Exhibit professionalism. | 0 | 0 | 11 | 35 | 15
Formulate a professional development plan. | 33 | 20 | 6 | 0 | 13
Apply mathematical skills. | 7 | 0 | 0 | 22 | 9
Demonstrate effective presentation skills. | 27 | 0 | 0 | 0 | 7
Apply effective leadership skills. | 27 | 0 | 0 | 0 | 7
Use time management techniques effectively. | 20 | 0 | 0 | 0 | 5

Note: "B" is Business, N = 15 programs; "FCS" is Family and Consumer Sciences, N = 5 programs; "HHS" is Health and Human Services, N = 18 programs; "TI" is Trade and Industrial Education, N = 17 programs; and total college programs N = 55.

Some divisions reported student learning outcomes that might be considered important to their programs less frequently than expected. Only 39% of the health and human services programs and 12% of the trade and industrial education programs reported "demonstrate effective (oral and written) communication skills" as a student learning outcome. The lower percentages may reflect Western's core abilities, which are addressed throughout every course: faculty may have felt that oral and written communication skills were covered by a core ability and therefore saw no need to include them as a separate student learning outcome for their program.

Programs within a division that reported no common outcomes may reflect the specialization of those programs: the student learning outcomes expected of students completing an Accounting program (Business division) will not be the same as those for students completing the Interior Design (Family and Consumer Sciences), Health Unit Coordinator (Health and Human Services), or Mechanical Design Technology (Trade and Industrial Education) programs. At the college's August 1998 in-service, faculty were encouraged to identify student learning outcomes that represent a culminating demonstration of learning specific to their program. The validation of the student learning outcomes likewise indicated specialization by program; thus, most programs would not be expected to identify common outcomes.

The ability to "demonstrate effective written and oral communication skills" was the most common student learning outcome reported by the 55 programs at Western. This finding is consistent with national reports identifying the skills students need (Copa & Ammentorp, 1997; SCANS, 1991; Wilson et al., 2000). Of the 12 common student learning outcomes identified, four align with the SCANS three-part foundation: (a) basic skills (demonstrate effective communication skills; apply mathematical skills); (b) thinking skills (think critically in solving problems and applying knowledge); and (c) personal qualities (apply legal and ethical principles to personal, social, and professional behavior). Four of the student learning outcomes align with three of the five SCANS competencies: (a) resources (use time management techniques effectively); (b) information (demonstrate use of computer tools, software, and appropriate applications); and (c) interpersonal skills (function as a team member; apply effective leadership skills).

Results from the third research question describe how faculty and administration identified assessment measures for each of the 55 programs at Western. This process also identified the common assessment measures within each division and across all four divisions at Western. A January 1999 in-service day was scheduled to inform faculty and administration about types of assessment measures and to identify assessment measures for each of the student learning outcomes previously identified during the college's August 1998 in-service. The college in-service agenda included: (a) an update of student learning outcomes for Western's 55 programs, (b) examples of direct and indirect assessment measures, and (c) an activity to engage faculty and administration to identify assessment measures for each student learning outcome.

Table 2
Type of Assessment Measures
Assessment Measure | B % | FCS % | HHS % | TI % | College %
Accreditation exam | 0 | 0 | 6 | 0 | 2
Capstone/clinical experience | 13 | 20 | 28 | 35 | 25
Case analysis/presentations | 40 | 0 | 6 | 0 | 13
Checklist (lab/performance) | 67 | 60 | 22 | 18 | 36
Clinical evaluation forms | 0 | 0 | 39 | 0 | 13
Competency performance sheet | 0 | 0 | 28 | 12 | 13
Employer survey | 0 | 0 | 17 | 0 | 5
Graduate follow-up survey | 0 | 0 | 28 | 0 | 9
Instructor observation | 40 | 20 | 28 | 23 | 29
Internship | 7 | 20 | 0 | 6 | 5
Journals | 7 | 0 | 6 | 0 | 4
Paper-and-pencil test (written exam) | 53 | 40 | 28 | 18 | 33
Peer/self evaluation | 7 | 0 | 6 | 0 | 4
Performance tasks and student exhibition | 53 | 60 | 44 | 82 | 58
Personal interviews | 0 | 0 | 6 | 0 | 2
Portfolio | 53 | 40 | 33 | 35 | 40
Practicum teaching | 0 | 40 | 0 | 0 | 4
Professional association/licensure exam | 0 | 0 | 56 | 18 | 24
Program exit exam | 0 | 0 | 0 | 6 | 2
Simulation (project) | 40 | 20 | 0 | 6 | 15
Student completion of program requirements | 0 | 0 | 6 | 0 | 2
Technical report/project (written) | 47 | 40 | 6 | 0 | 18

Note: "B" is Business, N = 15 programs; "FCS" is Family and Consumer Sciences, N = 5 programs; "HHS" is Health and Human Services, N = 18 programs; "TI" is Trade and Industrial Education, N = 17 programs; and total college programs N = 55.

Faculty and administration were reminded that effective assessment measures inform students how well they have mastered the knowledge, skills, and abilities and hold them accountable for achieving learning outcomes. The assessment measures discussed included types of tests, competency-based measures, and external reports, with particular attention to portfolios as a form of authentic assessment. Faculty already using authentic assessment measures presented examples from their classrooms; copies of these measures were compiled in three-ring notebooks and placed in the college library for future reference following the January 1999 in-service. Program teams had until May 1999 to identify assessment measures for each student learning outcome.

The types of assessment measures identified varied within the four divisions and across the 55 programs; they are listed alphabetically in Table 2. Two of the 22 assessment measures identified were indirect measures: employer surveys (17%) and graduate follow-up surveys (28%), both reported by programs in the health and human services division. Of all the assessment measures identified, a majority of programs within each division reported using at least one authentic assessment measure (e.g., checklist, instructor observation, performance tasks and student exhibition, portfolio, practicum teaching, or simulation).

The most common assessment measure varied by division: "checklist (lab/performance)" was reported by 67% of the business programs and 60% of the family and consumer sciences programs; "professional association/licensure exams" by 56% of the health and human services programs; and "performance tasks and student exhibition" by 82% of the trades and industrial education programs. "Performance tasks and student exhibition" was the most common assessment measure overall, reported by 58% of Western's programs.

Summary and Implications for Further Research

The results of this study illustrate how a college can meet accreditation agency requirements to implement an assessment plan that includes the identification of student learning outcomes.

At the college's August 1998 in-service, time was allocated to engage faculty and administration in a process to have the 55 programs at Western identify student learning outcomes. At the college's January 1999 in-service, faculty and administration engaged in a process to identify the assessment measures for each of the student learning outcomes that were identified at the August 1998 in-service. The assessment measures would be used in the future to collect data regarding student learning. It was this process that engaged faculty and administration at Western to develop and implement an assessment plan and fostered the identification of student learning outcomes.

Results from Western's assessment efforts over the past three years place the college's assessment of student learning at Level Two, Making Progress in Implementing Assessment Programs (Lopez, 1998a), as discussed in the "Accrediting Agencies" section above. Faculty identified measurable student learning outcomes and assessment measures, and in the future they will evaluate the achievement of those outcomes. At Level Two, administration supports the plan and recognizes faculty efforts to implement it. Feedback loops are being developed at Western to ensure that assessment results are used to improve student learning. Continued assessment efforts will move the college to Level Three, Maturing Stages of Continuous Improvement, resulting in changes in pedagogy and curriculum and linking assessment results to academic program review. The assessment plan is reviewed and updated annually. When data have been analyzed for all programs, the next step will be to create benchmarks for comparison purposes. It will be imperative to document program improvements and the resources needed to continue this effort.

Findings from this study indicate that faculty are moving toward authentic assessment measures for assessing student learning outcomes, although common assessment methods consistent with Bragg's (1995) study were also identified. Paper-and-pencil tests were used by 33% of the programs. However, faculty were also using authentic assessment measures, including performance tasks (58%), portfolios (40%), and checklists (36%), to assess student learning outcomes. When using authentic assessment measures, faculty can directly examine and judge the student's actual performance on significant and relevant tasks.

This study focused on describing the process of engaging faculty and administration in the implementation of an assessment plan, specifically the process used to help faculty identify student learning outcomes and assessment measures. Further research should concentrate on two areas: the feedback received about the achievement of student learning outcomes, and the pedagogical and curricular changes implemented as a result of assessing student learning.

Additional research should compare the assessment results with the college's employer satisfaction survey results to determine the degree of alignment between faculty and employer perceptions of student success. Program advisory committee members should also be surveyed to determine their perceptions of whether students are achieving the student learning outcomes. Finally, student learning outcome and assessment measure data should be collected from other technical colleges to compare and contrast findings at the state level.

Developing student learning outcomes and assessment measures should be a central part of a teacher's professional ethics. As Holyer (1998) argued and this study substantiated, the assessment process requires more than faculty time and additional committee work; it will also involve structural changes in American higher education and in the professional self-understanding of faculty.

References

Assessment measures and methods. (n.d.). Retrieved October 22, 2001, from the California State University at Northridge Web site: http://csbs.csun.edu/webpages/assessment.html

Bloom, B. S. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York: David McKay.

Bragg, D. D. (1995). Assessing postsecondary vocational-technical outcomes: What are the alternatives? Journal of Vocational Education Research, 20(4), 15-39.

Bryce, C. R. (1991). Quality management theories and their applications. Part I. Quarterly, 30, 25-28.

Christopher Newport University. (n.d.). Measures of student learning. Retrieved October 22, 2001, from the Christopher Newport University Web site: http://www.cnu.edu/admin/assess/resources/Assessment_Measures.htm

Conrad, C. A. (1999). The American workforce in the new millennium. Washington, DC: Joint Center's Corporate Council.

Copa, G. H., & Ammentorp, W. (1997). New designs for the two-year institution of higher education. Berkeley, CA: National Center for Research in Vocational Education, University of California.

Cross, K. P. (1998). What do we know about students' learning and how do we know it? Retrieved October 21, 2001, from the American Association for Higher Education Web site: http://www.aahe.org/nche/cross_lecture.htm

Ewell, P. T. (2001). Accreditation and student learning outcomes: A proposed point of departure. Washington, DC: Council for Higher Education Accreditation.

Fraenkel, J. R., & Wallen, N. E. (1993). How to design and evaluate research in education (2nd ed.). New York: McGraw-Hill.

Frye, R. (1999). Assessment, accountability, and student learning outcomes (Dialogue No. 2). Bellingham, WA: Western Washington University, Office of Institutional Assessment and Testing.

Glatthorn, A. A. (1999). Performance standards and authentic learning. New York: Eye on Education.

Glenn, J. M. (2000). Teaching the net generation. Business Education Forum, 54(3), 6-14.

Holyer, R. (1998). The road not taken. Change, 30(5), 41-43.

Imel, S. (1999). Work force education: Beyond technical skills. Trends and Issues Alert. Retrieved November 2, 2001, from the ERIC Clearinghouse on Adult, Career, and Vocational Education Web site: http://ericacve.org/docs/tia00069.htm

Jacob, J. E. (1998). A simple and effective student learning outcomes assessment plan that works. Retrieved October 26, 2001, from the California State University at Chico Web site: http://www.csuchico.edu/bss/plan.html

Jenkins, A., & Unwin, D. (1996). How to write learning outcomes. Retrieved October 21, 2001, from the National Center for Geographic Information & Analysis Web site: http://www.ncgia.ucsb.edu/education/curricula/giscc/units/format/outcomes.html

Lopez, C. L. (1997). Opportunities for improvement: Advice from consultant-evaluators on programs to assess student learning. Chicago, IL: North Central Association of Colleges and Schools Commission on Institutions of Higher Education.

Lopez, C. L. (1998a). The Commission's assessment initiatives: A progress report. Paper presented at the 103rd annual meeting of the North Central Association, Chicago, IL.

Lopez, C. L. (1998b). Making progress on programs to assess student learning: Why we need to succeed and how we can. Paper presented at the Council of North Central Two-Year Colleges, Rapid City, SD.

Middle States Commission on Higher Education. (1996). Framework for outcomes assessment. Retrieved November 6, 2001, from the Middle States Commission on Higher Education Web site: http://www.msache.org/msafram.pdf

New England Association of Schools and Colleges. (2001). Standards for accreditation. Retrieved November 6, 2001, from the New England Association of Schools and Colleges Web site: http://www.neasc.org/cihe/stancihe.htm

North Central Association of Colleges and Schools Commission on Institutions of Higher Education. (1997). Handbook of accreditation (2nd ed.). Chicago, IL: Author.

Ruhland, S. K. (1998, August). Identifying program outcomes. Paper presented at the Fall College Day at Western Wisconsin Technical College, La Crosse, WI.

Ruhland, S. K., Samson, H. E., Brewer, J. A., & Hague, D. G. (1997). Marketing program outcomes: The building blocks of an associate degree program. Madison, WI: Wisconsin Technical College System.

Secretary's Commission on Achieving Necessary Skills. (1991). What work requires of schools: A SCANS report for America 2000. Washington, DC: U.S. Department of Labor.

Southern Association of Colleges and Schools. (1997). Criteria for accreditation. Retrieved November 7, 2001, from the Southern Association of Colleges and Schools Web site: http://www.sacscoc.org/criteria.asp

Stecher, B. M., Rahn, M. L., Ruby, A., Alt, M. N., Robyn, A., & Ward, B. (1997). Using alternative assessment in vocational education (MDS-946). Berkeley, CA: National Center for Research in Vocational Education, University of California.

VanHuss, S. H. (1996). Assessment: Key to effective learning. Business Education Forum, 50(4), 7-10.

Western Association of Schools and Colleges. (n.d.). Standards for accreditation. Retrieved November 6, 2001, from the Western Association of Schools and Colleges Web site: http://www.accjc.org/Standard.htm

What are promising ways to assess student learning? (1996). Improving America's Schools: A newsletter on issues in school reform. Retrieved from http://www.ed.gov/pubs/IASA/newsletters/assess/pt3.html

Wilson, C. D., Miles, C. L., Baker, R. L., & Schoenberger, R. L. (2000). Learning outcomes for the 21st century: Report of a community college study. Mission Viejo, CA: League for Innovation in the Community College.

Yin, R. K. (1994). Case study research: Design and methods (2nd ed.). Thousand Oaks, CA: Sage.

Authors

SHEILA K. RUHLAND is Assistant Professor, Department of Work, Community and Family Education, University of Minnesota, 1954 Buford Avenue, St. Paul, MN 55108 [E-mail: ruhla006@umn.edu]. Her research interests include two-year technical and community colleges, teacher retention, evaluation and assessment, and Tech Prep education programs.

JERRILYN A. BREWER is Director of Educational Support Services at Western Wisconsin Technical College, 304 North Sixth Street, La Crosse, WI 54601 [E-mail: brewerj@wwtc.edu]. She chairs the college's Assessment Team, providing leadership for developing and implementing the Student Outcomes Assessment Plan. Her work includes the integration of continuous quality improvement in higher education using the Baldrige Framework for Educational Excellence.