The aim of this paper is to assess the impact of the Advanced Certificate in Education (ACE) in-service technology training program on technology teachers’ knowledge and understanding of technology. The training of technology teachers is an initiative toward teachers’ professional development within the mathematics, science, and technology education (MSTE) sphere. ACE is a two-year training program that technology teachers in the Gauteng and Mpumalanga Provinces (South Africa) attended during 2008 and 2009. The program attendees were senior phase teachers, of whom a few taught in the Further Education and Training band of education (certain high schools begin with grade 8). The research problem that the study addressed is stated in terms of the following hypothesis: There is no statistically significant difference between the pre- and post-knowledge and understanding survey scores for teachers attending the ACE professional development program in technology education. A survey questionnaire collecting biographical and technological input was administered to the teachers in attendance, once at the beginning of training in 2008 and again at completion of the program in 2009. The aim of the quantitative study was to evaluate whether the ACE-Technology training had a statistically significant impact on technology teachers’ knowledge and understanding of technology. In total, 304 completed questionnaire responses were included in the study. The results indicated improvements in the teachers’ technological knowledge and understanding, suggesting that teachers benefited from the ACE-Technology training.
Keywords: Advanced Certificate in Education, in-service training, technology teachers, technology teaching capabilities, professional development programs, MSTE education, nonparametric Kruskal-Wallis test, Wilcoxon test.
The introduction of outcomes-based education (OBE) in the form of Curriculum 2005 (C2005) was a huge educational reform in the history of South Africa. C2005 was reviewed twice and became consecutively known as the Revised National Curriculum Statement (RNCS) and then the National Curriculum Statement (NCS); currently it is the Curriculum and Assessment Policy Statement (CAPS). It is envisaged that CAPS will be implemented in 2012 (Department of Education [DoE], 2005, 2010). The reviewed versions have not lost their OBE flavor per se, as the present version is still undergirded by the curriculum principles rooted in the South African Constitution’s “Preamble,” which motivated the transformative OBE curriculum approach (DoE, 2003, 2010).
The OBE approach to the curriculum created a gap between the requirements of the OBE and the training the majority of teachers previously received (Ono & Ferreira, 2010). Because the pedagogic practice of the OBE differs from previous practice, intensive, continuous professional teacher development is imperative to prepare teachers for the implementation of the revised curriculum. The urgency of the matter becomes even more apparent considering that the training of underqualified and unqualified teachers is still incomplete and a reality to be dealt with (Jansen & Christie, 1999; Taylor & Vinjevold, 1999). Although the qualifications of many teachers in the country have improved, the majority of teachers have not been sufficiently equipped to meet the changing educational needs of modern society (DoE, 2006). Two of the most important factors in determining whether teachers are adequately equipped to teach technology successfully are content knowledge (subject matter) and pedagogic skills (Aluko, 2009). Most studies in teacher development have found that many teachers seriously lack the pedagogic skills for supporting individual differences among students (Kent, 2004; Laine & Otto, 2000). Insufficient pedagogical skills may be attributed to current teacher education and development practices for both pre-service and in-service teachers (Kent, 2004).
The previous literature references emphasize that continuous professional development is crucial for teachers who work in an environment of school curriculum changes. Adequate time should be made available for teachers to study and plan if they are to effectively and successfully implement the curriculum (Laine & Otto, 2000). Literature indicates that most school districts in America usually provided too little time for professional development (Kent, 2004). The outcome of such a situation is that teachers may not be in a position to pass sound judgment regarding learners’ needs.
The research described in this article is approached from the perspective of a teacher’s professional development. According to Villegas-Reimers (2003, p. 11), professional development is broadly defined as “the development of a person in his or her professional role.” More specifically, teacher development is explained as “the professional growth which a teacher achieves as a result of gaining increased experience and examining his or her own teaching systematically” (Glatthorn in Villegas-Reimers, 2003, p. 11). According to Villegas-Reimers (2003), professional development includes formal experiences such as reading professional publications, watching TV documentaries related to an academic discipline, and attending cluster meeting workshops. It is broader than career development, staff development, or in-service training, and it includes a long-term development process.
Teacher training is pivotal to the success of curriculum change (Brown, Sithole, & Hofmeyr, 2000). Thus, the challenge to the system is to help teachers to become change agents and thereby enable them to lend impetus to transformation (Brown et al., 2000; DoE, 2006) through creative approaches (Castellano & Datnow, 2000; Kent, 2004). Many studies have shown that teacher competence in pedagogic and content knowledge is crucial for student achievement (Borko, Elliott, McIver, & Wolf, 2000; Darling-Hammond, 2000; Kent, 2004; Pikulski, 2000; Rivers & Sanders, 1996). In an attempt to develop teachers professionally, the DoE in South Africa proposed new professional qualifications for teachers, namely the four-year Bachelor of Education (BEd) degree, with a senior certificate, that is matric plus four years’ qualification as prerequisite; and a Postgraduate Certificate in Education (PGCE) (Aluko, 2009). However, in reality in South Africa about 40% of practicing teachers are either unqualified or underqualified (DoE, 2009) and hold outdated teachers’ diplomas, such as the Primary Teachers Diploma, the Senior Primary Diploma in Teaching, the Junior Primary Teachers Diploma, or the Senior Teachers Diploma (Welch, 2009). According to a survey undertaken by the Human Science Research Council (HSRC), only about 18% of currently practicing teachers are professionally qualified.
The introduction of technology education has triggered an urgent and fervent need for in-service technology teacher training as part of teachers’ professional development. This need was exacerbated by the fact that technology education was introduced as a relative newcomer at the inception of C2005 (Gumbo, 2003; Maluleka, Wilkinson, & Gumbo, 2006). There were no trained or qualified technology teachers at this stage. When technology education was rolled out with C2005 in 1998, teachers qualified in other subject fields were asked by the DoE to volunteer to teach technology. They thus started teaching technology with a very limited pedagogical content knowledge background. Similar developments were reported internationally. Reference can be made to China, where teachers “floor-crossed” from other disciplines, with different knowledge backgrounds, into technology education (Feng & Siu, 2009). Feng and Siu (2009) view in-service teacher education, based on the China experience, as a crucial factor in technology curriculum development.
As part of a formal two-year qualification to address the technology teacher training backlog, the DoE decided on the ACE qualification to fast-track teacher training, particularly as an in-service training course. The ACE program enables teachers to upgrade from a Matric + 3 (matric plus three years’ qualification) to an M + 4. The ACE furthermore provides the option for practicing teachers either to qualify in a new subject/learning area or to specialize in a subject/learning area that they are currently teaching (DoE, 2000). The admission requirements for entry into the ACE program include a professional qualification, which may be either a three-year teachers’ diploma or a BEd degree (Aluko, 2009). When the program is completed, it is envisaged that teachers will be highly competent in terms of knowledge, skills, and didactics relevant to the subject. This is in keeping with the principles of the National Policy Framework for Teacher Education and Development in South Africa, which states that “a teacher should be a specialist in a particular learning area, subject or phase” (DoE, 2006, p. 5).
Since 2002, in-service training workshops sponsored by the DoE for Higher Education Institutions have been held for teachers during school holidays and on Saturdays. In this regard, Potgieter (2004) accounted for some 137 teachers who participated in workshops that he facilitated, and some 950 teachers who enrolled with the University of South Africa in 2002 for the ACE program. However, despite these initiatives, according to Ndahi and Ritz (2003), the supply of technology teachers is still minimal and should continue to receive attention. The DoE (2006, p. 16) furthermore stated that “both conceptual and content knowledge and pedagogical knowledge are necessary for effective teaching.”
The nature of technology is project based and problem driven. This means that students design and make projects to solve identified problems involving structures and electrical and mechanical systems. In line with this nature of technology, the ACE training program reported in this article covered the following topics: (1) technological processes and skills, which include investigating, designing, making, evaluating, and communicating; that is, designing, making, and evaluating technology prototypes or artifacts to solve technological problems while incorporating a range of other technological processes; (2) technological concepts and content knowledge, which include structures, material processing, electrical/electronic control systems, and mechanical systems; and (3) indigenous technology, which concerns the impact and biases that technology has on society and the environment. These topics were integrated in the practicals that teachers conducted within structures and electrical and mechanical systems.
The research under discussion was designed as an exploratory study, and a quantitative research approach was used to this effect. A mixed-methods approach, in which qualitative results (e.g., from interviews) complement quantitative deductions, could have enriched the findings. Time and funding were, however, restricting factors in this study; therefore, it was argued that once the impact of technology training had been verified, future studies should incorporate additional aspects, such as the length of the training presented, countrywide representation of respondents, and qualitative methods to strengthen research findings.
The formal hypothesis of the research question on the impact that the ACE-Technology training program has on the technology teaching knowledge of practicing teachers can be formulated as follows: There is no statistically significant difference between the pre- and post-knowledge and understanding survey scores of practicing teachers who completed the ACE professional development program in technology education. The environment in which the research reported on in this article was conducted is briefly discussed in this section.
The first two authors were involved as facilitators in an ACE-Technology training program offered during 2008 and 2009. The program was a collaborative project between the Tshwane University of Technology and the Vaal-Triangle University of Technology, and it trained senior phase technology teachers of the Gauteng (Sebokeng, Johannesburg North, Soweto, and Tshwane West Districts) and Mpumalanga Provinces (Bushbuckridge District). The authors were keen to assess the impact of subject-specific (technology education/TLA) training on technology teachers’ knowledge and understanding of the subject of technology education. Hence, the researchers integrated their training with the research project under discussion and undertook a quantitative survey design study.
A questionnaire was designed and administered to technology teachers at the beginning of the ACE-Technology training program in 2008 to determine the status of teachers’ technological knowledge and understanding. The same questionnaire was administered to the same teachers when they completed the program in 2009 to determine the impact that the training had had on the teachers’ perceptions of their technological knowledge and understanding. The questionnaire included 14 questionnaire statements on teachers’ perception of their knowledge and understanding of technology education subject matter and interpretation (see Tables 2 and 3). The statements were scored according to a five-point Likert agreement rating scale. Questions on biographical attributes included training background, qualifications, and qualifying institutions, as well as present and past teaching experience.
Ethical research aspects were addressed by acquiring permission to conduct the research from the DoE and participating parties. ACE organizing officials and senior DoE staff who visited the training sites were approached for ethical clearance. Survey participation was voluntary, and the purpose of the study was explained to the participating teachers.
The target population of the study was practicing technology teachers, and the population was sampled purposively since several logistical problems initially compromised random sampling: teachers interpreted the registration procedure incorrectly due to poor communication by the DoE; other teachers enrolled late because school managements granted permission at a late stage; in other cases teachers attended classes infrequently; teachers who had been selected to attend did not attend; and some teachers switched between the ACE specialization fields of mathematics, science, and technology. As a result, 304 teachers who were conveniently available participated in the questionnaire survey.
One-way frequency distributions on respondents’ biographical attributes were calculated to provide a descriptive background of the sampled population and to verify the representativeness of the sample in terms of target population attributes. These frequency distributions were furthermore used to determine whether biographical attributes could be further investigated for their effect on teachers’ technology knowledge acquisition — over and above the effect that the ACE-Technology education training program had on the acquisition and understanding of technology knowledge.
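As an illustration of the descriptive step above, a one-way frequency distribution can be sketched in a few lines of pandas. This is not the authors’ code; the item name and counts are hypothetical, chosen only to mirror the 58.75%/41.25% previous-training split reported in the results.

```python
# Illustrative sketch (not the study's actual analysis code): one-way
# frequency table (fi, %, Cum fi, Cum %) for a hypothetical biographical item.
import pandas as pd

# Hypothetical yes/no responses: 188/320 = 58.75%, 132/320 = 41.25%
responses = pd.Series(["no"] * 188 + ["yes"] * 132, name="previous_training")

freq = responses.value_counts()                                  # fi
pct = responses.value_counts(normalize=True).mul(100).round(2)   # %
table = pd.DataFrame({"fi": freq, "%": pct,
                      "Cum fi": freq.cumsum(), "Cum %": pct.cumsum()})
print(table)
```

The cumulative columns reproduce the “Cum fi”/“Cum %” layout used in the paper’s frequency tables.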
Composite frequency tables on the 14 knowledge and understanding perception rating statements, which respondents rated before and after completing the ACE-Technology education training, were also calculated to provide a general overview of respondents’ perceived knowledge and understanding prior to and after the ACE training (Table 2). The difference between pre- and post-ACE technology knowledge and understanding perception scores was also calculated for each respondent and statement. The analysis strategy argued that if training did not affect teachers’ perception of their technological knowledge and understanding, perception ratings prior to and after ACE-Technology training would be more or less the same. This no-effect assumption would imply that the difference between pre- and post-training scores would be close to zero. A nonparametric Wilcoxon signed-rank test was conducted on all difference ratings, combined over all the questionnaire statements, to test the hypothesis that the general mean difference rating score was zero. Separate tests were also calculated on the 14 sets of difference scores for each knowledge statement, to assess whether teachers perceived themselves to have significantly improved their knowledge and understanding of each aspect of technology tutoring that the questionnaire probed.
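The difference-score logic just described can be sketched as follows. The ratings are simulated (the study’s raw data are not available), and `scipy.stats.wilcoxon` stands in for whatever statistical package the authors used; it tests the null hypothesis that the median pre/post difference is zero.

```python
# Sketch of the pre/post Wilcoxon signed-rank strategy on simulated
# 5-point Likert ratings (1 = no experience ... 5 = extensive experience).
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
pre = rng.integers(1, 4, size=300)                 # mostly low pre-training ratings
post = np.clip(pre + rng.integers(0, 3, size=300), 1, 5)  # shifted upward post-training

diff = post - pre                                   # per-respondent improvement scores
# H0: median difference = 0 (no effect of training); zero differences are
# dropped by the default zero_method, matching standard Wilcoxon practice.
stat, p = wilcoxon(pre, post)
print(f"W = {stat:.1f}, p = {p:.4g}")
```

With a genuine upward shift in the simulated ratings, the test rejects the no-effect hypothesis, mirroring the pattern the paper reports in Table 3.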
Once the impact of ACE-Technology training had been validated, the analysis strategy investigated the effect that biographical attributes, such as previous technology tutoring experience and previous technology training, could have had on the expected increase in technology knowledge and understanding (over and above the effect of the ACE-Technology training itself). Nonparametric analysis of variance (the Kruskal-Wallis one-way analysis of variance) was used to investigate this aspect of the research. The analysis strategy was duly followed, and the analysis results are presented in the next section.
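A Kruskal-Wallis comparison of improvement scores across biographical groups can be sketched like this; the group labels, sizes, and score distributions are hypothetical, and `scipy.stats.kruskal` is used here purely for illustration.

```python
# Sketch of the Kruskal-Wallis test on improvement (post minus pre) scores,
# split by a hypothetical biographical attribute: previous training (no/yes).
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(1)
# Hypothetical improvement scores: teachers without previous training are
# assumed to show larger perceived gains than those with previous training.
no_prev_training = rng.normal(1.5, 1.0, 180)
prev_training = rng.normal(0.6, 1.0, 120)

# H0: both groups come from the same distribution of improvement scores.
h_stat, p = kruskal(no_prev_training, prev_training)
print(f"H = {h_stat:.2f}, p = {p:.4g}")
```

A significant H statistic indicates that the biographical attribute is associated with the size of the perceived improvement, which is exactly the kind of “additional role player” the analysis strategy screens for.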
Table 1 presents the frequency distributions of the biographical variables probed in the questionnaire.
These distributions describe the sample as teachers of whom approximately equal proportions were selected for, or voluntarily attended, the ACE technology education program (50.34% and 48.65%), and of whom a larger proportion had no previous formal training in technology teaching (58.75%) than had such training (41.25%). The figures show that previous training was most commonly gained from week-long workshops (51%) and that previously acquired technology education qualifications mostly consisted of an attendance certificate (58.14%). Approximately half of the respondents had taught technology previously, and 73% of teachers were currently teaching technology.
Table 1 (fragment). Frequency distributions of biographical variables (fi = frequency; Cum = cumulative). Only the recoverable rows of the original table are shown.

| Item | Category | fi | % | Cum fi | Cum % |
|---|---|---|---|---|---|
| How were you selected? (missing = 8) | DoE inst. urban | 92 | 31.08 | 92 | 31.08 |
| | School sent me | 57 | 19.26 | 149 | 50.34 |
| Qualifying institution (missing = 217) | Secondary Sch | 13 | 14.94 | 13 | 14.94 |
| | Tech/College | 17 | 19.54 | 30 | 34.48 |
| Type of training (missing = 171) | Tech Ed qual. | 22 | 16.54 | 127 | 95.49 |
| Type of qualification (missing = 218) | Attendance cert. | 50 | 58.14 | 50 | 58.14 |
| | H/M/Doctorate | 2 | 2.33 | 86 | 100.00 |
| Grade currently teaching (missing = 216) | Grade 8 | 86 | 97.73 | 86 | 97.73 |

(The table also covered the items: Training background in Technology education? (missing = 1); Taught technology prior to ACE? (missing = 5); Grade taught previously (missing = 226); Workshop duration (missing = 187); Currently teaching technology (missing = 4). The rows for these items are not recoverable from the source.)
The deduction could thus be made that the sample appropriately represented the target population of the research, namely, practicing technology teachers with limited formal teaching qualification.
Once the adequacy of the sample had been verified, the authors’ attention was turned first to an exploratory overview of technology knowledge and understanding perception trends and next to a formal validation of these observed improvement trends. A summary of the respective analysis results is presented in Tables 2 and 3.
In Table 2, the total frequency distribution row before onset of the ACE program indicates that the majority of perception responses fell in the no-experience to limited experience categories (61%). If the perception rating scale of no experience to extensive experience is interpreted as “a substantial lack of knowledge” to “substantial knowledge or confidence,” then the deduction can be made that respondents seemed to lack the general academic knowledge and understanding to teach technology before they started the program.
On the other hand, when the participants completed the program the total row frequency distribution indicates that respondents in general felt more relaxed about their academic knowledge and understanding of technology once they had undergone ACE-Technology training, since the majority of responses now fell in the moderate to the more than average experience perception categories (91%). This shift seems to indicate that respondents felt more confident about their technology knowledge and insight once the ACE-Technology training program had been completed.
Table 2. Frequency distributions of knowledge and understanding perception ratings before and after ACE-Technology training (cells show pre / post frequencies; frequency missing = 80 (pre), 149 (post)).

| Statement | No experience | Limited experience | Moderate experience | Above average experience | Extensive experience | Totals |
|---|---|---|---|---|---|---|
| 1. Meaning of technology | 64 / 3 | 108 / 10 | 87 / 143 | 38 / 134 | 6 / 6 | 303 / 296 |
| 2. Meaning of tech. education | 86 / 3 | 108 / 12 | 72 / 140 | 31 / 133 | 3 / 6 | 300 / 294 |
| 3. Learning outcomes, ass. stds | 89 / 3 | 68 / 11 | 83 / 101 | 58 / 172 | 3 / 7 | 301 / 294 |
| 4. Grasp/apply design process | 106 / 4 | 71 / 9 | 76 / 125 | 46 / 152 | 5 / 6 | 304 / 296 |
| 5. Identify problems/needs/wants | 87 / 3 | 94 / 16 | 74 / 132 | 39 / 137 | 3 / 2 | 297 / 290 |
| 6. Structures/strengthening techniques | 77 / 3 | 77 / 11 | 84 / 122 | 51 / 153 | 3 / 5 | 292 / 294 |
| 7. Priorities, selecting materials | 75 / 3 | 100 / 16 | 78 / 119 | 46 / 152 | 2 / 3 | 301 / 293 |
| 8. Systems and control | 90 / 5 | 99 / 26 | 72 / 144 | 27 / 112 | 0 / 4 | 288 / 291 |
| 9. Design, completing projects | 96 / 4 | 88 / 22 | 84 / 148 | 28 / 116 | 3 / 4 | 299 / 294 |
| 10. Identify and apply resources | 78 / 3 | 90 / 21 | 91 / 132 | 34 / 134 | 2 / 3 | 295 / 293 |
| 11. Tech. lesson planning | 87 / 3 | 91 / 21 | 88 / 137 | 30 / 127 | 1 / 4 | 297 / 292 |
| 12. Tech. methods and strategies | 97 / 5 | 103 / 20 | 66 / 151 | 32 / 114 | 1 / 4 | 299 / 294 |
| 13. Tech. assessment | 104 / 5 | 96 / 26 | 71 / 154 | 27 / 106 | 1 / 3 | 299 / 294 |
| 14. Implement NCS Tech Grade 8 | 99 / 4 | 92 / 27 | 77 / 146 | 31 / 114 | 2 / 1 | 301 / 292 |
| Percentage of pre-/post-totals | 30% / 1.4% | 31% / 6% | 26% / 46% | 13% / 45% | 0.1% / 1.6% | |
Table 3. Signed-rank tests of the null hypothesis of no overall improvement in knowledge once ACE-Technology training is completed (H0: Mu0 = 0).

| Knowledge aspect | N | Signed-rank S-statistic | Probability associated with S-statistic | Mean (std. dev.) | Skewness | Kurtosis |
|---|---|---|---|---|---|---|
| Overall knowledge improvement# | 4047 | 2195144.00 | < 0.0001*** | 1.19 (1.00) | 0.05 | -0.63 |
| Signed-rank test results on the 14 knowledge statements assessed in the questionnaire: | | | | | | |
| Meaning of technology | 296 | 11420.00 | < 0.0001*** | 1.08 (0.93) | 0.14 | -0.29 |
| Meaning of technology education | 290 | 12472.00 | < 0.0001*** | 1.27 (0.98) | -0.10 | -0.48 |
| Learning outcome, assessment standards | 291 | 9401.00 | < 0.0001*** | 1.21 (1.12) | 0.12 | -0.55 |
| Grasp and apply design process | 296 | 11301.50 | < 0.0001*** | 1.28 (1.12) | -0.01 | -0.86 |
| Identify problems, needs and wants | 287 | 11218.00 | < 0.0001*** | 1.20 (1.01) | -0.01 | -0.78 |
| Structures, strengthening techniques | 283 | 10027.50 | < 0.0001*** | 1.14 (1.00) | 0.05 | -0.84 |
| Priorities, selecting material | 291 | 11038.50 | < 0.0001*** | 1.17 (1.02) | 0.11 | -0.80 |
| Systems and control | 278 | 11542.50 | < 0.0001*** | 1.19 (0.90) | 0.13 | -0.54 |
| Design, completion of projects | 290 | 10953.00 | < 0.0001*** | 1.18 (0.99) | 0.01 | -0.62 |
| Identify and apply resources | 286 | 11065.50 | < 0.0001*** | 1.13 (0.94) | 0.06 | -0.65 |
| Lesson planning, technology | 287 | 11385.00 | < 0.0001*** | 1.18 (0.99) | 0.14 | -0.68 |
| Technology methods and strategies | 290 | 11489.50 | < 0.0001*** | 1.22 (1.00) | -0.03 | -0.60 |
| Implementation tech assessment | 291 | 12501.50 | < 0.0001*** | 1.22 (0.95) | -0.05 | -0.64 |
| Grasp, implement Grd R9 NCS Tech | 291 | 11863.50 | < 0.0001*** | 1.15 (0.97) | -0.01 | -0.59 |

#: The differences for the overall knowledge improvement variable were calculated by subtracting the rating scores given prior to course commencement from the scores given on program completion, for each of the 14 subquestions for each respondent; the total number of responses considered was therefore 14 x 304 = 4256. For the individual rank tests, only data of respondents who completed the same question on both questionnaires could be included in the various analyses; therefore varying totals are reported.
These initial indications of technology competency shifts were further explored and statistically validated by means of nonparametric Wilcoxon signed-rank tests. The tests were conducted on the combined difference data set as well as on the 14 individual subsets of pre-/post-difference scores for each of the 14 questionnaire statements for all respondents. The null hypotheses evaluated in all instances state that the ACE program did not statistically significantly improve technology competencies in any respect (14 competency aspects and a general trend), as opposed to the alternative hypotheses of a statistically significant effect of the ACE-Technology intervention on technology competency. Table 3 summarizes the results of the Wilcoxon tests.
Highly significant test statistics were associated with all Wilcoxon signed-rank tests (column 4, Table 3). The tests therefore verify initial indications of positive shifts in perceived technology competency. The general test in Table 3 verified that teachers perceived the ACE-Technology program to have statistically significantly improved their technology teaching competencies; more specifically, teachers perceived that all 14 aspects of their technology knowledge and understanding probed in the questionnaire were statistically significantly enriched by the ACE-Technology intervention.
The results derived from Tables 2 and 3 thus answered the main concern of the study: the ACE-Technology intervention had a positive impact on teachers’ perceptions of their technology teaching competencies. To enrich these findings, the researchers also investigated whether other factors might have played a role in teachers’ perceived improvement in technology teaching capabilities. Separate nonparametric Kruskal-Wallis analyses of variance were conducted on the overall set of differences between pre- and post-rating scores of respondents to evaluate how respondents’ perceptions of their improved technology training competency were affected by the following:
The results of these analyses are summarized in Table 4, and the factors investigated are listed in column 1 of Table 4.
The analysis results identified the following biographical factors as statistically significant additional role players (over and above the ACE-Technology intervention) affecting perceived general positive change in teachers’ technology knowledge and understanding, namely:
The study in this paper aimed to assess the effect that the ACE-Technology training programs had on technology teachers’ professional development regarding their knowledge and understanding of technology. The findings revealed that teachers overwhelmingly benefited from the training in terms of their knowledge and understanding of technology. This held true for the overall perception of improved technology knowledge and understanding competency once ACE-Technology training had been completed, as well as for the 14 specific aspects of technology knowledge and understanding probed in the research. The initial indications of improved competency indicated in the exploratory frequency analyses were neatly statistically confirmed in the advanced statistical analysis.
Table 4 (fragment). Kruskal-Wallis one-way analyses of variance on the effect of biographical factors on the perceived change in technology knowledge and understanding. Only the recoverable rows of the original table are shown.

| Factor | Kruskal-Wallis Chi-sq statistic | Probability (chi-sq statistic value) |
|---|---|---|
| Previous technology training | 13.3835 | < 0.0001*** |
| Type of previous technology training exposure | 93.40 | 0.0001*** |
| Type of technology qualification obtained | 22.76 | < 0.0001*** |
| Institute where the previous qualification was obtained | 44.59 | < 0.0001*** |
| Province where respondent taught | 106.70 | < 0.0001*** |
| Previous teaching experience in technology | 81.78 | < 0.0001*** |

Descriptive statistics survive only for the factor “Type of previous technology training exposure”:

| Group | N Obs | Mean | Std Dev | Maximum | Minimum | N |
|---|---|---|---|---|---|---|
| Tech education qualifications | 308 | 0.5776 | 0.8838 | 3.0000 | -1.0000 | 277 |
| Other training exposure | 84 | 1.5256 | 0.9359 | 3.0000 | 0.0000 | 78 |
Other contributing factors, expressed as biographical attributes of teachers, presented some noteworthy perspectives on the findings. Teachers who had had no previous exposure to technology education training perceived that they benefited more from the ACE-Technology program than did their peers who had previous exposure to technology training (confirmed in deduction (i) of the analysis results and interpretation section). This finding may be related to these teachers’ heightened determination to learn more from the training to fill their technology knowledge gap. These findings strengthen the conclusions of the DoE (2006), Taylor and Vinjevold (1999), Jansen and Christie (1999), and Aluko (2009), who found that the training of many teachers (underqualified and unqualified) is still incomplete. Furthermore, the results indicated (deduction (ii) in the analysis results and interpretation section) that teachers who had received previous technology training at colleges perceived that they benefited less from the ACE-Technology training than did those whose previous exposure came through workshops. The longer, institutionally based technology training of the college-trained teachers might explain why they perceived themselves to have benefited less from the ACE-Technology program: they most probably gained more knowledge and understanding of technology during their college training. An interesting finding that was not expected in the research (deduction (v) in the analysis results and interpretation section) is that the Mpumalanga respondents experienced a significantly greater positive change in technological knowledge after ACE-Technology training than did their Gauteng Province colleagues.
This may be attributed to their higher level of commitment to acquiring technology teaching capabilities because of the rural environment where they work and the assumption that their rural setting is “technologically poor.” Deduction (iii) in the analysis results and interpretation section furthermore indicated that teachers who held attendance certificates in technology perceived their technology competency to have improved significantly more on ACE-Technology completion than did teachers with an Honors or second degree obtained prior to ACE-Technology training. Teachers with an Honors degree were most probably more knowledgeable at the onset of the training. Deduction (vi) of the analysis results and deductions section also indicated that teachers who had not previously taught technology perceived themselves to have benefited significantly more from the ACE-Technology training than did those who had taught technology previously. The latter group most probably had ample exposure to technology prior to ACE-Technology training and could therefore already identify with what the training covered. These findings confirm that the benefit that these categories of teachers derived from the training, given their biographical attributes, is in keeping with the DoE’s (2006) intention with ACE programs: for teachers to become specialists in their subject areas.
Technology teacher training should be preceded by profiling teachers and analyzing their needs so that strategic decisions can be made to vary the depth and nature of the training based on the profile and specific needs; otherwise, the training may not benefit all. Teachers without any training background in technology should preferably receive intensive training in all content knowledge and pedagogical areas. For those with some training background, only the specific gaps identified in their needs survey should be addressed in the training. Furthermore, because the quantitative research approach followed in the current research might have limited insight into the dynamics of the perceived benefits gained from ACE-Technology training, researchers should consider mixed-methods approaches in future assessments of the impact of training on technology teachers. Such an approach will enable triangulation. The current exploratory study was also restricted to only two provinces in South Africa. In the future, a study of this nature should be extended to other provinces so that findings can be generalized to South Africa as a whole.
This paper reported the findings of a study that inquired into the effect of ACE-Technology training on teachers’ knowledge and understanding of technology. In terms of the research question and the hypothesis stated, the main finding of the study is that the ACE training in technology education enhanced teachers’ knowledge and understanding of technology. This is an important finding considering that technology education is a relatively new learning area/subject and that there is a dire need to train teachers to offer it to learners. Furthermore, the training of teachers in the field should be seen to make a difference in their knowledge of technology and the methodologies of presenting it to learners. It is hoped that teachers who underwent this training are now serving their learners in schools by implementing what they have acquired.
Dr. Mishack Gumbo is a senior lecturer in the Science and Technology Education Department, College of Education at the University of South Africa.
Dr. Moses Makgato is an associate professor in the Department of Educational Studies at Tshwane University of Technology, South Africa.
Heléne Müller is a senior research support consultant in the Department of Interdisciplinary Research, College of Graduate Studies at the University of South Africa.
Aluko, R. F. (2009). The impact of an Advanced Certificate in Education (ACE) program on the professional practice of graduates. International Review of Research in Open and Distance Learning, 10(4), 1-15.
Borko, H., Elliot, R. L., McIver, M. C., & Wolf, S. A. (2000). “That dog won’t hunt”: Exemplary school change efforts within the Kentucky reform. American Educational Research Journal, 37(2), 349-393.
Brown, C., Sithole, V., & Hofmeyr, R. (2000). South Africa: Teacher training in the sky. TechKnowlogia, 53-54.
Castellano, M., & Datnow, A. (2000). Teachers’ responses to Success for All: How beliefs, experiences, and adaptations shape implementation. American Educational Research Journal, 37(3), 775-799.
Darling-Hammond, L. (2000). Teacher quality and student achievement: A review of state policy evidence. Education Policy Analysis Archives, 8(1), 1-49.
Department of Education. [DoE]. (2000). Norms and Standards for Educators. Pretoria: Government Printers.
Department of Education. [DoE]. (2003). Revised National Curriculum. Pretoria: Government Printers.
Department of Education. [DoE]. (2005). Revised National Curriculum. Pretoria: Government Printers.
Department of Education. [DoE]. (2006). The national policy framework for teacher education and development in South Africa: More teachers; better teachers. Pretoria: Government Printers.
Department of Education. [DoE]. (2009). Teacher Qualification Survey: Draft report prepared by Human Sciences Research Council for Department of Education.
Department of Education. [DoE]. (2010). Curriculum and Assessment Policy Statement (draft document). Pretoria: Government Printers.
Feng, W., & Siu, K. W. M. (2009). Professional development for technology teachers in Mainland China and Hong Kong: Bridging theory and practice. Paper presented at the PATT22 conference, held 24-28 in Delft.
Gumbo, M. T. (2003). Indigenous technologies: Implications for a technology education curriculum. (Unpublished PhD thesis). Pretoria: Vista University.
Jansen, J., & Christie, P. (Eds.). (1999). Changing curriculum: Studies on outcomes-based education in South Africa. Pretoria: Juta & Co Ltd.
Kent, A. M. (2004). Improving teacher quality through professional development. Retrieved from http://findarticles.com/p/articles/mi_qa3673/is_3_124/ai_n29092860.
Laine, S. W. M., & Otto, C. (2000). Professional development in education and the private sector: Following the leaders. Oak Brook, IL: North Central Regional Educational Laboratory.
Maluleka, K., Wilkinson, A., & Gumbo, M. (2006). The relevance of indigenous technology in Curriculum 2005/RNCS with special reference to the Technology Learning Area. South African Journal of Education, 26(4), 501-513.
Ndahi, H. B., & Ritz, J. M. (2003). Technology education teacher demand, 2002-2005. The Technology Teacher, 27-31.
Ono, Y., & Ferreira, J. (2010). A case study of continuing teacher professional development through lesson study in South Africa. South African Journal of Education, 30, 59-74.
Pikulski, J. J. (2000). Increasing reading achievement through effective reading instruction (Tech. Rep. No. 19716). Newark, DE: University of Delaware, School of Education.
Potgieter, C. (2004). The impact of technology education, as a new learning area, on in-service teacher education in South Africa. International Journal of Technology and Design Education, 14, 205-218.
Rivers, J. C., & Sanders, W. L. (1996). Cumulative and residual effects of teachers on future student academic achievement. Knoxville: University of Tennessee Value-Added Research and Assessment Center.
Taylor, N., & Vinjevold, P. (1999). Getting learning right: Report of the president’s education initiative project. Johannesburg: The Joint Education Trust.
Villegas-Reimer, E. (2003). Teacher professional development: An international review of the literature. Paris: International Institute for Educational Planning.
Welch, T. (2009). Teacher education qualifications: Contribution from Tessa Welch to working paper for teacher development summit — 27 May 2009.