Archive for February, 2017

In the traditional system of education, there are many fundamental incongruities. For example, let us take certification of progress or advancement.

The output of an academic level (a degree, a year) is a certification of progression. This certification, awarded by the institution, indicates the achieved levels of learning and performance. The perceived value of that certification is either implicitly understood through common sense or popular conceptions of what that level should mean (“She is an engineer!”), or made explicit through rubrics codified in standards or through formalized benchmark tests (“She maxed the SAT!”). The certification is generally agreed to signify a common understanding of the underlying competency.

As a consequence, it is also assumed that the education system is organized (within the constraints of policy) so that the general meaning of the certification remains the same. That is, the system self-organizes in such a way as to maintain a fixed correlation between certification of progress and competence.

On closer scrutiny, this can hardly be an exact or specific relationship. No two institutions share everything in common; it is a genuinely complex environment. Many moving parts contribute to the perception of competence or academic achievement: the specific curriculum, the quality of teaching and infrastructure, institutional brand, the ability of students, and the rigor of assessments. An MBA from Wharton could be very different from an MBA offered by a local college in India. The treatment of a subject like school Science can vary between the Common Core in the US and the CBSE in India. Even two neighboring schools may differ altogether in how they conduct and certify progression, even within a shared bureaucratic practice.

All we can say, in general, is that we could expect some competencies to be demonstrable at a specific level, and that that set of competencies would also vary by the observer’s own frame of reference. But we cannot specifically and objectively prove that there is causality between the design of the education system and its putative outcomes.

This is what the design of our education systems predicates today. Whether at a higher level of education or at a professional entry-level certification, the system contrives a certain trust, within and across institutions and with external stakeholders: a trust based literally on bias and subjective interpretation of competency or progress, an almost incestuous behavior that feeds and reproduces from within.

This is achieved because of the nature of the system itself. Rules are codified in order to set the parameters of behavior and performance at institutional levels, and all stakeholders follow this way of being.

Similarly, the bureaucratic form of organization is adopted to address scale. But scale destroys the ability of a bureaucracy to focus on what is being organized.

By expecting self-replication of practices at all levels, policies and processes get constrained by the needs and abilities of the lowest common denominators. In fact, the popular approach to change initiatives uses the language of the system itself: create more institutions (and thereby more bureaucracy) to address those aspects. When these institutions are created, they inherit the same shortcomings, reducing their ability to apply innovation, however brilliant, at scale. Order begets more Order.

This is an untenable system of education, because it is by design reductionist and deeply hypocritical. It tries to eliminate complexity, and in the process gives rise to incongruous and undesirable outcomes.


Shaken, not stirred

The events of the past few years following the creation of the National Curriculum Framework (2005) have come to a head.

In my reading, the constructivist efforts to systemically shake up the system in its aftermath, through the Continuous and Comprehensive Evaluation (CCE) scheme, the Open Text Based Assessments (OTBA) and the Problem Solving Assessment (scrapped earlier), have been altogether stopped, and we have returned to a pre-NCF era. The final scorecard reads NCF: 0, System: 1.

The CCE now seems defunct, Class X board exams are back, the OTBA has gone away, and even CBSE-i has now breathed its last, changing the lives of about 18,000 affiliated schools. What is also very disturbing is that schools are no longer formally required to hold physical evidence of data or learning artifacts for more than a few months, unless questioned. For most schools, that will mean throwing away years of insights and destroying the student portfolios collected over the past few years, in the absence of e-portfolios and (in many cases) performance record-keeping software.

And the culmination hasn’t stopped at the Central Board of Secondary Education (CBSE); it has also permeated the Council for the Indian School Certificate Examinations (CISCE), which runs the ISC and ICSE curricula. The CISCE just announced a curriculum revision more in line with preparing students for competitive exams, so far dominated by the CBSE’s hegemony, affecting its 2,157 schools, which have been reeling under declining student numbers and slow growth in affiliations.

Nor has it stopped at the faltering implementation of the Right to Education, enacted into law a few years back, or at other state schemes that have rigorously attempted to raise the GER (such as the SSA and RMSA), whose impact has not been adequately backed by improvements in effective demand, by better supply conditions, or by changes downstream in the HE system.

Shaken, but not stirred. It will be many years before the Boards are again induced to change their practices. One can hope for a new NCF that is more acceptable and still carries some of the new ideas, but the system has won, for today.

This is really a story worth learning from. IMHO, although the change itself was perhaps in the right direction, the educators miscalculated the extent of resistance and inertia. They perhaps also did not quite understand the mindset of students, parents and teachers, or that of school owners and heads of the Boards. It was a case of policy trying to drive change, a top-down effort that did not reflect the realities of the system and, ironically, exposed the deficit of planning itself.

One could argue that change must start somewhere and that this was a useful experiment that will enable us to plan more realistic experiences for our students and teachers going forward. That may well be true, and it is good to hope for a better future. But a few considerations may really help take the next version forward.

  1. For large-scale strategies, special care must be taken to create the appropriate conditions for viral growth and adoption. Here technology can act as a disruptor, both in terms of information dissemination and in terms of tools and standards, and care must be taken to systemically enable its deployment. But equally important factors must be addressed in parallel: downstream impacts (viz. how the downstream systems of higher and further education need to adapt, starting with the entrance examinations that bridge school and higher education), teacher and leader education, ownership by parents as well as the school system, greater choice for students, and a re-look at existing bureaucratic practices.
  2. We must have more, not less, detail about how we do things, in the most appropriate directions. For example, had we tied teacher career progression to implementation, we would have had to work on a scalable strategy for teacher education, not allowed the NCTE to destabilize it or SWAYAM to take so long to get going, even as we leveraged the national networks and infrastructure of our universities and distance education providers. For example, IGNOU did not launch a single course on the CCE that would give certification to teachers (to be fair, nor did the NCERT).
  3. There must be a way to measure key elements of the transformation and adapt on a continuous basis, led by an organization that is invested not from a Board or Standards perspective but purely from a planning and implementation perspective. We don’t have a structure in place to do this, except perhaps NUEPA. Data-driven insights would have helped implement these changes in a much more objective and efficient manner.
  4. We must introspect further on where we really want to facilitate our students to be. In my (elearning) mindset, that means paying attention to the finer details of implementation and supporting it to the fullest extent. For example, our competency frameworks must evolve to a much finer level of detail, with supporting materials and systems created to serve those outcomes. It is not enough to state that debate could be a technique that promotes critical thinking and communication skills without providing details of the rubrics behind that instrument in the educational context.

I have no doubt that we have the intellectual bandwidth and the support of many interested experts worldwide, nor do we face a paucity of funding. We are just not piecing it together. We need to stir the system continuously to provoke change, tweaking it to find those small changes that will have a chaotic long-term effect.

Above all, we must perhaps reconcile another world view when we conceptualize our system of education itself – that of complexity and complex adaptive systems.

