The origins of continuous professional development for teachers in India
The piece by Subir Shukla discusses the origins of CPD for teachers in India, in the way we understand it now, by tracing the path back to the systems and processes instituted by DPEP (District Primary Education Program) in the mid-1990s.
The origins of continuous professional development (CPD) for teachers in India, as we know it now, may be traced to the District Primary Education Program (DPEP). Though there were a few ‘smaller’ Education For All (EFA) projects at the time, such as the Basic Education Project (BEP), UPBEP, and AP-PEP, it is DPEP that has had the greatest influence. DPEP began in late 1994, as part of India’s EFA endeavor.
Several features of the program, and its background, influenced how in-service teacher training evolved. In this article I trace the evolution of “in-service training,” as it was commonly referred to at the time: the context, the early steps and struggles, and the form it eventually began to take. Though there is much to say on developments after DPEP, I do not explore those here, partly for reasons of space, and partly because these aspects of DPEP are no longer as well known.
I was Chief Consultant (Pedagogy and Curriculum), DPEP-MHRD, from 1995 to 1998, and later Educational Quality Advisor, SSA-MHRD, from 2009 to 2011, leading the development of the Quality Framework for the RTE. As I was closely involved in the processes described, this is necessarily a subjective perspective, and others might well have different views. I have tried, however, to describe the period as objectively as possible, using the documentation available to me as a basis.
The evolution of in-service training of teachers in DPEP
The poor status of school education was an impetus: NPE 1986 emphasized “learning through exploration, activity, and projects” in a “warm and welcoming atmosphere.” However, that was far from happening in schools. Baseline studies had shown limited access and low levels of learning, and revealed that the quality of school processes was responsible for about 40% of dropout cases.
Apart from high pupil-teacher ratios (PTR), there were issues of teacher absenteeism, teacher isolation, the absence of teaching-learning materials (TLM), textbooks that burdened children and teachers alike, and exam-based evaluation leading to a prevalence of rote learning. “Inspectorial” supervision was the norm, and teacher training tended to be lecture-based and ritualistic.
Under the EFA projects, efforts were made on several fronts: improving enrolment, enhancing community involvement, initiating the data system (which eventually became UDISE), strengthening contextual educational planning capabilities, renewing curricula and textbooks, and, of course, strengthening in-service teacher training.
Institutional limitations: Institutions such as NCERT and the SCERTs tended to be understaffed and poorly equipped, and were thus unable to deliver committed programs, especially in-service training. To get DPEP off the ground and functioning, the MHRD therefore created a group of professionals at the national level to provide resource inputs into various aspects of the program. Especially in the early years of DPEP, this body, known as the Technical Support Group, provided critical inputs to implementation at both the national and state levels.
Studies were conducted on the status and capacities of SCERTs and DIETs, and substantial provision was made to strengthen these as well. However, states were reluctant to invest in these institutions, unsure whether they would be left with the liabilities.
Most states began by relying on NCERT and their SCERTs to deliver in-service training. However, these were repeatedly unable to deliver in time, leading to funds lapsing. The states then created State Resource Groups (SRGs), whose members were carefully selected through tests and workshops. This led to a major shift in how in-service training was developed and conducted.
Very quickly, the mode of implementation began to move away from expert committees to empowered resource groups comprising practicing teachers, CRC-BRC personnel, trainers, and resource persons. This had its own issues: DPEP was accused of creating a “parallel structure.”
The effect of the funding mechanism: State activities tended to be constrained by central funds being passed through the state treasury, leading to delays in fund flow. Hence, under DPEP, autonomous state implementation societies were created to receive funds instead. This freed states to conduct large-scale programs within time limits.
More critically, the funding guidelines restricted construction costs to 24% and management costs to 6% of the overall investment; the remaining 70% was to be spent on quality-related issues, the bulk of which comprised expenses on statewide in-service teacher training. Without implementing it, states could not construct schools or even fund salaries in the state project offices. That is why, in later years, even when training was not entirely necessary, it continued to be held, leading to “training fatigue.”
Sorting out the pedagogy: DPEP tried to move away from the prevalent “pedagogy of punishment” toward an activity-based, learner-oriented pedagogy, in which children would be encouraged to use their own minds in interesting and meaningful circumstances. Apart from “traditional” pedagogy, there were still vestiges of “joyful” learning programs, which often resulted in “joy without the learning,” or a set of random activities implemented in the classroom with little rigor.
The prevalence of this “song and dance” notion of child-centeredness had to be overcome too. One had to move toward more balanced, carefully planned learning strategies, which would involve children, be more meaningful for them, and ensure learning.
National workshops on pedagogy and training methodology were held with state representatives. Early on, it became apparent that key implementers needed to experience this pedagogy for themselves and to articulate the classroom processes and outcomes it would lead to. These workshops, dubbed “visioning” workshops, were also conducted at the state level, and contributed strongly to the development of SRGs and in-service training.
There was also a considerable emphasis on identifying and using TLMs from the environment to the extent possible. The intention was to reduce the time teachers spent on creating materials from thermocol, paper, etc. “TLM Melas” were held across the country to help teachers showcase their innovations.
Multi-grade teaching was an important issue. Large classes in several states were also of concern. Interestingly, India sent a delegation of state and DIET personnel from around 18 states to Colombia for exposure to Escuela Nueva (Spanish for “new school”), a model of learning that focuses on understanding rather than memory and gives importance to interpersonal skills. Many members of this exposure visit later initiated small projects in their own districts.
A national workshop was held on this theme after fieldwork by different states, especially Gujarat, where 50 schools representing different categories of multigrade situations were worked with. It was agreed that academic solutions should not be sought for administrative problems, and that Multi-Grade Teaching (MGT) efforts would focus on small-class MGT and multi-level learning. Multi-Grade Multi-Level (MGML) became a theme for a few years.
Training methods and materials: Like the typical classroom, in-service training had consisted of authoritative lectures, delivered mostly by non-practitioners. DPEP sought to move toward a more participatory, experiential, hands-on process. This was to be backed up in the field through CRCs and BRCs. The term “in-service training” was understood as “workshops + on-site support visits + peer meetings + provision of needs based materials periodically.” In the initial years, in some of the states, administrative and supervisory staff were also included in training, along with teachers.
In terms of training methodology, the visioning workshops showed how an experiential and reflective process could be effectively implemented in our circumstances. After a “training methods” workshop in Kerala, the state made changes to its in-service training, to which teachers responded: “If this is the pedagogy we want, our teachers’ handbooks are all wrong! Please change them.” When the handbooks were changed, this led to the demand: “If our state can make such good handbooks, why do we have such bad textbooks!” And this led the state to change the curriculum and textbooks over 1997-99, followed by intensive in-service training on the new components. It was after this that Kerala started figuring in the top three in national surveys of learning achievement.
A major issue being struggled with was that of the “cascade approach.” A cascade approach was one where a group at the state level trained master trainers, who trained trainers, who in turn trained teachers. This was commonly adopted as the number of teachers was large. The cascade approach often led to “transmission losses” and distortion down the line.
Different models were examined and tried out, including one in Assam involving “lateral movement” of resource persons. However, over the next few years, the one that worked best was the “reconstruction model,” which emerged in Karnataka. Here, resource persons would be trained over 8-10 days to be able to conduct, say, 40 sessions, whereas the actual training they would conduct had only 24 sessions. They would have to choose and sequence the sessions they wanted, defend their choices before peers, and work out how to conduct them in a holistic manner. It was this ownership that led to a high degree of quality even across a state as large as (unified) UP.
It was also realized that instead of ‘modules,’ a training ‘package’ would serve our needs better. This too first emerged in Karnataka, and included the following four components.
The first was a teachers’ booklet, containing thematic papers on an activity-oriented classroom. Teachers could take this away for reference; it also had spaces where they could write their own comments and thus personalize it.
The second was the trainer’s booklet. It contained detailed hints, practical advice, and notes on what a participatory training program is, along with support material on conducting such programs. It was intended to enable trainers to be independent and to continue growing as trainers on their own.
The third was a flexible training design, which provided options and was envisaged to be adaptable to different circumstances. Trainers were to use it to construct their own designs.
The fourth was an activity bank, containing activities and learning experiences that both teachers and trainers could use. It was intended to keep growing with use.
The trainers’ booklet and activity bank needed to be produced only once every few years. The teachers’ booklet would need to be issued only if something new was being added. The training design would be developed for each round of training, building on identified needs.
On-site academic support structures: Several quality improvement efforts in school education have floundered, generally for lack of on-site support mechanisms to help teachers undertake the complex endeavor of effecting changes in their classrooms. It was to provide this support that the Cluster Resource Centre (CRC) was conceptualized.
Typically, a school was designated as a CRC. It would then be strengthened with the provision of space for meetings and a library useful for teachers. It would serve around 40 teachers, from 6-10 schools located within a distance of a few kilometers of the center. The full-time coordinator would undertake support visits to the schools, organize monthly meetings, and make the center available as an academic resource for teachers.
The CRCs, in turn, were facilitated by the Block Resource Centre (BRC). Located at the block level, the BRC was a sub-district training center. It was envisaged to provide in-service training to teachers of the block, and academic monitoring and supervision. This was at a time when DIETs had little infrastructure and capability.
Both the CRC and the BRC facilitated grassroots implementation while providing on-site, ongoing academic support to teachers. They increased the reach of the program, enabling it to connect with hundreds of thousands of teachers every month through school visits, meetings, and the dissemination of materials. These centers also facilitated community involvement with schools and acted as an outreach medium.
It can be said that these centers were the cutting edge of DPEP, and in the initial years they worked well, contributing greatly to enrolment, community mobilization, and teacher motivation. The ‘Gath Sammelan’ of Maharashtra, the innovative selection measures adopted in Assam (with finalists interviewed in front of each other and all the teachers of the cluster), and the interesting building designs for BRCs in many places (hexagonal training rooms, wide doors to allow quick entry and exit of a large group to save time) were among the many innovative measures taken.
There were also initial efforts to implement “needs-based training” at the block level, and “job charts” for CRC and BRC coordinators (CRCCs and BRCCs) were discussed in many states. It was recognized that CRC meetings could easily become ritualistic, so yearly agendas, combined with topics covered in in-service workshops, were worked out.
This was also the first structure within DPEP to be derailed. As State Project Directors (SPDs), secretaries, or governments changed, they questioned the roles of CRCs and BRCs or tried to appoint coordinators from the open market (usually with disastrous results). As the data system was being set up, bureaucrats saw the CRCs and BRCs as a vehicle to address the pressure they faced from the center on this front. This inverted their role completely: instead of serving children, communities, and teachers, these structures ended up serving the needs of the leadership. This represents a dark chapter in the history of teacher professional development (TPD) in India.
The four-step process for holistic quality improvement: Within a few years it was clear that curriculum, textbooks, assessments, and in-service teacher training needed to be seen holistically. In the initial years, institutional departments would not confer with each other, and components were developed in isolation, leading to mixed messages. A four-step process was therefore worked out and implemented in over 12 states, with good results and solid improvement.
First, a common group of resource persons was identified to work as the “Quality Improvement Team.” This team went through a visioning workshop followed by school placements, where they tried to implement what teachers were expected to do.
Then an “underpinnings workshop” was organized, where beliefs and assumptions related to children, learning, teachers, equity, and the aims of education were debated and documented, and their implications worked out. This too was followed by fieldwork.
As a third step, an “approaches workshop” took place, where the approach to knowledge in general, and to the subjects in particular, was deliberated upon. Implications for quality-related components were also worked out at this stage.
The outputs of the above three were written up as “base papers,” which provided a guiding framework for the development of curriculum, textbooks, and teacher training. The teams used these to review existing components and plan changes, and then worked separately. However, because of the “foundational discussions,” there continued to be commonality across all strands of action.
Interestingly, Kerala used the outcomes of these workshops to create a tool used by its “Internal Academic Review Missions.” This helped assess the effect of training. UP used these to develop a long-term “perspective plan” for teacher development.
Post DPEP
Many of those involved in the processes described above refer to that period as the “golden days,” mainly because it was a time full of enthusiasm and innovation: different echelons were being empowered, and the system was experiencing provisioning and processes that had been missing. However, as DPEP transitioned to SSA, several structural changes affected what happened in in-service training.
The SRGs were made ex-officio, instead of being selected on performance. The institutions became the “competent authorities,” yet they had received (and continued to receive) only limited strengthening inputs. They evolved into organizers of in-service training development rather than academic leaders.
Budgets were built around norms rather than around outcomes or the shifts to be delivered in the classroom. Module development became a fairly routine process, training fatigue was commonly reported, and teachers very often ended up attending training on topics similar to those covered before.
An interesting phase came in 2005-07, with the nationwide launch of a process to develop performance standards and indicators for teachers, trainers, CRCs-BRCs, DIETs, and SCERTs, known as ADEPTS (Advancement of Educational Performance Through Teacher Support). A joint initiative of MHRD and UNICEF, it later led to the development of PINDICS, Shala Siddhi, and the current National Performance Standards for Teachers.
The period post-RTE saw a push for child-oriented teaching and learning, along with an emphasis on continuous and comprehensive evaluation (CCE). A little later, as CSR-funded organizations proliferated, the nature of inputs to teachers began to change, and a CPD perspective began to be discussed. However, as state systems began to take external help, multiple programs were implemented, often with perspectives that were not aligned.
The introduction of technology too led to a shift in how CPD was conceptualized and delivered. It is difficult to know how effective this has been. We can get data on how much teachers have learnt. However, we do not yet know how much of this is implemented in the classroom. And if implemented, whether it is leading to improvements in learning levels.
A project implemented in UP, TELOS (Targeted Enhancement of Learning Outcomes through Supportive Supervision), successfully tied an indicator-based CPD process to demonstrated improvement in learning outcomes. It involved supportive supervisors from DIETs, BRCs, and clusters. However, it was discontinued after the Covid-19 pandemic lockdowns.
A few reflections
Because the initial efforts were seen as an “educational change,” it went unrecognized that they also represented a change in power relations. Empowered teachers and resource persons who asked questions were troublesome. Many officials actually worried that if the children of the poor received a good education, their own children would not get jobs.
Though ineffective, the institutions were often led by those with “connections,” who worked behind the scenes to undermine or take over CPD initiatives, with corruption too as a motivation.
Despite substantial investments, both the center and the states were unable to bring about the needed transformation in DIETs and SCERTs. These institutions are now better resourced and more active than before. However, they function largely by outsourcing their academic work to NGOs and others, and often lack ownership and continuity, as these arrangements are liable to change from year to year.
On their part, NGOs tend to be driven by donor requirements as much as by children’s needs. They do not see that their work replaces that of the institutions (and thus weakens them), and they generally have no exit strategy. On the contrary, every small act, such as participating in a meeting, is highlighted on social media as an achievement. Overall, this runs the danger of fracturing and diminishing CPD efforts across the country.
Key aspects such as pedagogy or the programs’ academic direction remain hostage to the whims and vagaries of leadership. An effort was made to create a protocol wherein new SPDs could not uproot ongoing projects without an evaluation. However, with the change in national government, this did not go forward.
The recent data-driven approach still does not measure teacher performance; it tends to use student learning as a proxy. It needs to examine classroom processes and teacher empowerment far more.
Interestingly, with technology, it has become easier to pass on instructions to teachers. Overall, teacher autonomy appears to be far less than it was in the early years of DPEP. Teachers are now burdened with all kinds of data to be uploaded. They are also instructed on what to teach on which day, irrespective of their context. As was the case with CRCs and BRCs, teachers too are now spending more time serving the needs of those above them in the hierarchy instead of those for whom they exist – children and communities.