
Posts Tagged ‘learning 2.0’

Update (Aug 6): IIT Roorkee has decided to re-admit the expelled students, on certain conditions. They have taken a lenient view, considered the situation again and accounted for the impact of the expulsion on the students' future. #inanity-of-it-all


IIT Roorkee, a premier engineering institute of India, recently expelled several first-year students for not meeting the requisite grades. Predictably, there is a backlash both outside and from within the IIT communities themselves, although there have been several such incidents in the IITs in the past. There are also insinuations that the decision, by affecting mostly students from disadvantaged backgrounds, is discriminatory in nature.

Many important issues in our education system are laid bare by this unfortunate event. As the author of one of the articles asked, why is the teaching not being questioned? Or the academic practices? Or counseling and remediation? Where are the voices of students in decision making? What legal and educational recourse do students have in the face of such orders? Why is the evaluation and grading system designed the way it is? Why expel at all, anyway?

It makes me question why we take our education system so seriously. It also proves a thesis I have evolved. For generations we have believed that the education system transforms students, with each class level and exam signifying one step in that direction. But if that were really true, in general, we would be living in a far more equitable, happier, more sustainable and prosperous world.

Instead, I have come to believe that the student, far from being transformed, represents a form of organized labour which, along with academic and administrative labour and the capital inputs of buildings and infrastructure, actually manufactures certain outputs – marks and degrees. These marks and degrees then become commodities used to transact production downstream – either more degrees or formal employment. All funding, policy, standards, school practices and the like are subservient to this production process.

This is not learning. This is production. And production by any means possible – even those that cannot ever pass for anything close to academic excellence, much less the delight and joy of learning. So we see ministers with fake degrees, grace marks in standardized exams, teachers or school leaders with no qualifications, schools with no infrastructure and research that is non-existent – but still reports that our children have completed school levels or have got into the IITs in droves, as evidence that the system really, really works.

The system works, but it is not learning, it is production of a different kind altogether. And this system of production, at scale, can have no other ways to work – it knows nothing about people and learning, but a lot about numbers and certificates.

People, though, are another thing. People are resilient. They understand the value of the system in transacting the business of living, and accept it as yet another fact they have to deal with, and carry on. That single fact pushes the system through, from generation to generation, from shocking fact to abysmal deception. And people do succeed, some due to and some despite the system.

But it does not need to be this way. There is great joy and reward in learning and sharing. A well-thought-out educational system could genuinely deliver social outcomes of equity with growth. Such a system would have none of the trappings of the production organization that education is today.

For the countless people who have been rejected or denied education, both outside and inside the current system, there is hope that things will change. Or else they shall have to be made to.

In solidarity, then!


Not without books. Books are great. I mean textbooks as they are academi-factured (if that can be a word to denote academic manufacturing) and used now – the written word that becomes the gospel truth for 250 million students and millions of teachers in schools in India today.

Seriously, the textbooks we produce are perhaps the greatest barrier in the system to fostering capable and autonomous learners. The fact that something is written in the textbook becomes the gospel truth that children cannot but recite.

There is the fact that most teachers cannot deviate from the text, cannot reward imaginative, researched answers to questions given in assignments and tests. Many teachers would have neither the motivation nor the passion to understand such deviations.

Then there is the length of the written text, often verbose, and sometimes too simplistic or inadequate under even slightly deeper questioning. The sheer length of the discourse simply limits the extent of engagement a student can have with the topic.

Compounding this litany of problems is the obsession with facts, so microscopic and so many that you wonder later in life why you were ever expected to remember them, particularly when you could get on the net and answer them in a jiffy.

Ironically, TV shows that demonstrate the greatest failures, like the one that asks adults questions to check if they have actually passed the 5th grade, become the subject of great popular mirth and unconscious intellectual debauchery.

Then, as a result of the enlightenment that our students are not learning, the authorities introduce new modes of assessment that actually end up spawning (to the publishers' delight) new textbooks. And the whole cycle starts again.

There are umpteen examples from our system of textbooks that demonstrate these problems. CCE (Continuous and Comprehensive Evaluation) is a mechanism that was supposed to induce children and teachers to think more and memorize less. But, like the 21st Century Skills curriculum in the USA, this got reduced to textbooks and project guides. The travails of the CCE in the end resulted in diagnostic tests to check the problem-solving skills of students through the PSA (Problem Solving Assessment), which has again become the subject of many textbooks (almost like a separate subject).

Again, the system of gradually exposing students to a topic, step by step in each successive grade, leans in exactly the opposite direction from the non-linearity of learning through discovery, problem solving and peer negotiation: it limits the extent to which one can explore any topic and restricts students, in effect, to the contours of the author's creative and intellectual boundaries.

My sincere apologies to the experts, but remember you were children once. In fact, it is a cruel testament to time that you follow the same general methods you were steeped in, perhaps with a liberal dose of buzzwords that you choose to believe make a crucial difference to the way children learn now.

Perhaps it is time to stop treating children as dim-witted morons who will be developed into fine holistic individuals by textbooks and allied means, however utopian and unrealistic the alternative may seem at present.

So, let us imagine a school without textbooks.

It will be a load off the shoulders, literally. Everyone will breathe a sigh of relief. And then will come the true revolution.

Perhaps then the students will be introduced to a world of themes, which they desire to investigate, alone or in groups. The themes that they choose will be their personal journey into the world, trying to decipher its working to the extent they can, facilitated by not just the teacher, but every adult or peer who can contribute.

Along the way, they will leave a trail of learning and sharing. Themes may span across multiple years, result in multiple explorations and projects, depending upon interest and guidance. In short, the curriculum will be a co-creation, the syllabus a much wider canvas to draw on, and the assessments driven by the capability to learn and master different dimensions and levels of technical complexity.

At all times, the focus will be on fostering skills that promote autonomy, openness, collaboration, scientific temper, values and logic, and on seeing their application to the theme. It will celebrate curiosity and wonder, aesthetics, sensibilities, discovery, inference, deduction and a host of skills that will define the individual.

The spaces of learning will become a celebration of coming alive.

And we will have done what is expected of us – we will have given our children not the right to education, but the right to learn. Amen.


For some time now, I have been writing about what I propose as the evolution from the CBT and the WBT – the NBT, or Network Based Training. NBTs provide a framework for organizations that want to adopt Web 2.0 and networked learning (the connectivism way) in their systems. The main components of the NBT are both process based and tool based.

The NBT consists of the following components:

  • a learning process that emphasizes learner participation prior to the course in setting up goals and sequences
  • definition of an agreed-upon sequence of focus areas and learning events over time
  • agreed upon rules/structures of participation with weakly or strongly defined compliance
  • defined initial roles for participant and educator (and others) that are consonant with a networked learning strategy
  • initially defined ecology of 2.0 tools (blog, wiki, discussion forum, live conference events, other collaboration techniques etc) to be enmeshed in the course
  • choice of appropriate collaboration techniques, e.g. the Delphi method or shared maps
  • if required, avenues for structured peer review (possibly at multiple levels) and group work; and, if so required, an expert review
  • resource repository that captures suggested content for review and discussion; could include documents or web collaboration resources
  • collaboration using techniques specifically suited for the context of the course; e.g. grouped concept maps if a goal is to create a resource base
  • policy for sharing; e.g. if sharing with a wider audience is agreed upon, some way of sharing blog posts, discussions with personal blogs or social network could be explored
  • statistics for the facilitator role to judge quantitatively and tools for analysis based on qualitative criteria
  • setting up of a default network for the participants of the course (as more people join, a historian role is defined that brings them up to speed using a special mechanism for navigating the content, maybe through learner contributed summaries or commentaries)
  • post assessment of learning experiences to evolve the learning ecology
  • some way of integrating and reporting on the experience in both directions – organizational and personal learning environments
  • norming of participants on how to use the ecology, and overcoming barriers to use

These would define an ecology within which much learning could happen. One possible view is that each NBT could become a “slice” of learning that could be linked to the PLE. Several such slices could be linked and could potentially inter-mesh to allow cross-disciplinary or cross-network linkages to promote diversity.
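To make this slightly more concrete, here is a minimal sketch – in Python, with entirely hypothetical names and fields, not a reference implementation – of how one such NBT "slice" and its agreed-upon components might be represented so that it could later be linked into a PLE:

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class Role:
        name: str                                       # e.g. "facilitator", "participant", "historian"
        responsibilities: List[str] = field(default_factory=list)

    @dataclass
    class NBTSlice:
        title: str
        focus_areas: List[str]                          # agreed-upon sequence over time
        participation_rules: Dict[str, str]             # rule -> "weak" or "strong" compliance
        roles: List[Role] = field(default_factory=list)
        tools: List[str] = field(default_factory=list)  # blog, wiki, forum, live events...
        sharing_policy: str = "course-only"             # or "personal-blog", "public"
        repository: List[str] = field(default_factory=list)  # suggested content URLs

    # A hypothetical course definition using the structure above
    course = NBTSlice(
        title="Networked Learning 101",
        focus_areas=["goal setting", "tool norming", "peer review", "reflection"],
        participation_rules={"weekly reflection post": "strong", "comment on two peers": "weak"},
        roles=[Role("facilitator", ["seed discussion", "review analytics"]),
               Role("historian", ["summarize past weeks for late joiners"])],
        tools=["blog", "wiki", "discussion forum"],
    )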

Obviously, from a technology point of view, one could go in two directions. One, allow loosely coupled 2.0 service integration. Two, create generic tools to store localized data and build bridges so that this information can be ported to available 2.0 services. The first allows for easy extensibility when a new 2.0 service or app comes along. The second encourages careful selection of appropriate learning tools (not just mashing up anything with anything irrespective of the impact on learning – if something is indeed effective, one would rather build it into the system in a generic fashion, gaining far more control).
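As a rough illustration of that trade-off – again only a sketch, with hypothetical class and client names – the first direction amounts to writing thin adapters over external 2.0 services, while the second keeps a generic local store and exports through bridges when needed:

    from abc import ABC, abstractmethod

    class PostChannel(ABC):
        """Anything that can publish a learner's post."""
        @abstractmethod
        def publish(self, title: str, body: str) -> None: ...

    # Direction one: a thin adapter per external 2.0 service (easy to extend).
    class ExternalBlogAdapter(PostChannel):
        def __init__(self, api_client):
            self.api = api_client            # hypothetical client for a hosted blog service
        def publish(self, title, body):
            self.api.create_post(title=title, content=body)

    # Direction two: a generic local store, with bridges that export later (more control).
    class LocalStore(PostChannel):
        def __init__(self):
            self.posts = []
        def publish(self, title, body):
            self.posts.append({"title": title, "body": body})
        def export_to(self, channel: PostChannel):
            for p in self.posts:
                channel.publish(p["title"], p["body"])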

From a learning process orientation, specifically a connectivist orientation, it will be necessary to position the NBT somewhere along the range between individuals and groups, connectives and collectives, in an attempt to engender the greatest possibilities for leveraging the power of networked learning, collaboration and innovation. The prime challenges and constraints will lie in shaping policy – between openness and protection of IP, for instance.


When I think of the term under siege, it reminds me of Steven Seagal, a master chef on board a US Navy battleship taken over by terrorists in the 1992 movie of the same name. Of course, he fights back and defeats the terrorists. Doubtless somewhat of a stretch of the imagination and completely unrelated, but I think that Instructional Design, as we traditionally know it, is under siege.

I wrote a post on eCube on Indian Education, contrasting the challenges in developing countries such as India with the remarkable developments in social learning worldwide. In it I refer to George Siemens' article on the changing role of the instructional designer in the new milieu. From being an expert in applying design techniques to a body of content for a specific kind of learning experience and target audience, the designer is seen more as a guide and facilitator who brings animate (human) and inanimate (computer, device) networked knowledge closer to the learner and fosters learning through active reflection and search, in addition to relying on traditional design activities such as content sequencing.

What becomes of the carefully and painstakingly created user learning experiences, with their emphasis on language, defined control imposed by corporate styles and standards, exclusion of irrelevant content, step-by-step elaboration, elaborate understanding of the target audience, pilot evaluations, focus group feedback et al?

How does the social learning experience address these aspects of design? By its very definition, the network is autonomously constituted, with no formal controls and little or no accountability for ensuring adequate coverage (or, at this point, quality) of any piece of the curriculum – but one where, potentially, the benefits of active reflection, learning engagement, expediency of learning and scale of community participation may far outweigh the traditional system. A designer who can simply point to or piece together these resources may be compelled to discard entirely useful contributions to knowledge just because the form is not conducive to presentation, or because there is too much redundancy between two critical but related articles. Obviously, without these interventions, research and reflection may take too much time to prove useful in situations of learning immediacy (read: workflow learning). One thing that may work, perhaps, is for the designer to provide the tools and frameworks that allow for an ever-growing landscape of content, in ways that she can make intelligible for her learners in a participative manner.


There have been some huge developments in information and communication technologies (ICTs), especially those around the internet and the way we learn. The "X" in "X.0" represents "fault lines" or tensions between local and global, groups and networks, structure and chaos, homogeneity and diversity, teacher-led vs facilitated, and simple vs complex. With each tension comes a lot of hard work and experimentation, sometimes building on existing paradigms, sometimes with novel approaches. This post tries to summarize the different X.0s in one place.

X.0

The X in X.0 represents versions or generations of thinking and capacity. Not unlike versions in software or documents, X.0s represent change and a philosophy of transformation. While Generation 1.0 showed us the power of visualization, of search over aggregated knowledge, of 3D immersion and of multimedia-based learning, Generation 2.0 has given us the power to network and to leverage collective insight through social networks, learning 2.0 styles, collaboration and ever-growing new forms of media. Generation 3.0 is still very nascent and further improves on Generation 2.0 by adding ubiquity and context to the teaching-learning process.

The caveat, and there is always one, is that the internet is a powerful medium, but only for those to whom it is accessible. We must leverage alternate forms for audiences that are either not empowered or not reached by these technologies. In the end, technology is an enabler – not the knowledge itself, not our relationships with others in a network, and not our own little hype that we believe is the only solution. We have to learn how to use these tools effectively and judiciously, and it equally behoves those who have this access to extend it to others who do not.

Generation 1.0

Gen 1.0 has shown us the power of web and computer based learning. This generation of learning has become popular and has an established art, craft, business and science. What is this generation? It started with small computer programs that teachers used to explain and simulate educational problems. These evolved into computer based training, or CBT, modules that became richer and more appealing with the advent of multimedia and the evolution of the personal computer. But an inflection point came with the birth of the Internet towards the end of the last century. Suddenly a new accessible medium and a common presentation language enabled us to create web based training – training that could be placed (hosted) on a server on the Internet and used across the world. This shift brought immense economies of scale to corporations, which were able to save the costs of training logistics and precious travel time. As bandwidth improved, video conferencing evolved to provide immersive situations for collaboration and communication.

For teachers and students, all this marked the beginning of a change in the way instruction was designed and delivered. No longer did we have the flexibility of a classroom, board and chalk. We did not even have the chance to know a student by name and look her in the eye. We lost conversational ability and had to strive to ingeniously incorporate that ability within WBTs using third-person role-plays and scenarios. We also tried to reinforce and replicate the same fundamental ways of teaching – only magnified in scale through a global platform – and kept the expert, now a mix of multiple, more rigorously defined skills such as instructional and graphic design, at the centre, rather than the learner. The limitations of the WBT were sought to be overcome in part by virtual classrooms and satellite based video conferencing. The teacher could at once scale to multiple locations via a global classroom with the help of technology, using simple, rapid elearning tools such as PowerPoint (somewhat misplaced – no learning is that rapid to create, deliver or experience). These also brought with them the reinforcement and perpetuation of systems that kept the teacher at the core of the learning experience.

Even bigger innovations brought together learning theory and technology to create real-life immersive simulations and a high level of engagement through gaming and virtual reality. In parallel, systems for managing learners and administering learning programs and content (LMS/LCMS) evolved to manage the huge amount of training content and delivery that was created. Industry, government and academia got together to build standards such as SCORM.

The benefits were enormous. There were huge improvements in terms of standardization and quality of presentation of content. The space became more specialized and verticalized in terms of both skills and solutions.

Improvements were largely innovation-led through advances in pedagogy and technology. Elements such as 3D graphics, simulations and gaming are still high-cost, esoteric and time intensive to create.

But still, such a lot of effort, suffering from tensions of art vs science, autonomous vs teacher-led, local vs global, has left an entire generation dissatisfied!

And the main reasons for this dissatisfaction are not hard to find! Cost is one factor. Learner engagement is another. Lack of personalization is yet another key cause. Teacher awareness and skill and sharing of best practices have been challenges. Key challenges such as learner retention, visualization and real-life immersion are the learning domain’s own unique and continual challenges.

Generation 2.0

Then the internet changed. Fundamentally. The next generation is radically different, both in its core technology and in its application to learning. This next generation of the Web was christened Web 2.0. The most fundamental elements of this new generation are user-generated content, social networking, mashups and remixable data sources. Let us examine these elements in greater detail.

User generated content

The web was deemed "read-only" for the vast majority of users. This meant that you needed specialized expertise to author and publish content on the web. This is different from e-mail, which is easily used to communicate one-to-one or one-to-group. It was the process of creating something to share with the global community that was esoteric. Some of us embraced that technology readily, while a lot of us struggled with even the most basic tools, let alone being capable of generating highly sophisticated elearning.

With Web 2.0, these barriers to the creation and sharing of content have been significantly reduced. Anybody can contribute – all it requires is a web browser, an internet connection and lots of ideas and experiences. Blogs, for example, provide a channel through which anyone can share content with the global community. The web has become writeable. Not only can you write textual content, you can also author and share other forms of content such as pictures, audio, pictures with audio and many other continuously emerging new forms of media. As a result, the amount of content generated over the past 2-3 years has been many thousands of times the amount ever created by man in physical form. This sudden explosion has been facilitated by advances in software, hardware and networking – very specifically, by advances in storage, processing power, network technology and virtualization.

Social Networking

But being able to author content on the web is not enough. The real power lies in being able to share it. As humans, we have an innate need and desire to communicate with each other. We build relationships, we create networks, whether they be of friends, family, colleagues or just about anyone else. We learn through these networks by sharing and communicating thoughts, ideas and experiences. Web 2.0 enables us to create digital social networks, virtual communities of people irrespective of who and where they are. These networks have the potential to grow virally and have seen tremendous growth in the past few years.

What does that do for us? It enables us to draw upon the shared thoughts, ideas and experiences of people globally. The internet is suddenly not a website anymore. Rather, it is an open space for dialogue, debate, collection of information and critical thinking. It is a space that can help us leverage collective insight. It can help build and grow relationships and reduce the asymmetries of knowledge and information. Correspondingly, it provides tools to search and source knowledge from millions of different sources.

An element of this generation is the ability to create one's own classification or interpretation of knowledge. A name or place or visual can evoke different associations for different people. This means that if it is classified using a particular standard taxonomy, as in libraries or directories, it may never be found by someone who brings a different taxonomy or interpretation to it. This new way of classifying information, the personalized or group taxonomy, is called folksonomy (more popularly known as social bookmarking or tagging). A fundamental change brought about in this generation is not only the ability to tag but also the ability to share these tags with your communities.
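A folksonomy is, at its simplest, a shared many-to-many mapping between free-form tags and resources. A minimal sketch (hypothetical data, no particular service's API) might look like this:

    from collections import defaultdict

    # tag -> set of bookmarked URLs; each user contributes their own tags
    folksonomy = defaultdict(set)

    def tag(url: str, *tags: str) -> None:
        for t in tags:
            folksonomy[t.lower()].add(url)

    tag("https://example.org/connectivism-intro", "learning2.0", "connectivism")
    tag("https://example.org/cop-primer", "communities", "learning2.0")

    # Anyone who shares the tag "learning2.0" discovers both resources
    print(sorted(folksonomy["learning2.0"]))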

Mashups

The third fundamental element of this new generation is the mashup. Prior to this, software applications such as an order-and-pay ecommerce application were standalone islands that, architecturally, were not built to inter-operate (hence standards such as X12 and EDIFACT) or share their data with other software applications (at least not easily). Today it has become easy even for novice users to create more complex views of information (e.g. with Dapper), for example by combining pollution indices with geo-spatial maps. Web services now provide the glue through which this can happen. It has become very easy to "plug in" and integrate pieces of functionality from multiple sources into your own application or portal – skills that were, until now, the domain of skilled programmers. For example, Yahoo! Pipes and RSS combined can place the knowledge of your interest area at your disposal.
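In the same spirit as a simple Pipes-style mashup, here is a minimal Python sketch that merges a couple of RSS feeds and filters them by a keyword; the feed URLs are placeholders, and it assumes the third-party feedparser library is installed (pip install feedparser):

    import feedparser

    FEEDS = [
        "https://example.org/learning/feed.xml",   # placeholder feed URLs
        "https://example.org/edtech/feed.xml",
    ]
    KEYWORD = "assessment"

    items = []
    for url in FEEDS:
        parsed = feedparser.parse(url)
        items.extend(parsed.entries)

    # Keep only entries whose title mentions the keyword – a tiny mashup view
    for entry in items:
        if KEYWORD in entry.get("title", "").lower():
            print(entry.title, "->", entry.get("link", ""))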

Remixable data sources

The power of this fourth fundamental element lies in the ability to look at the internet as a large database system. The world's data, in this view, becomes a set of inter-related structures (not unlike an RDBMS), with elements semantically related to each other through defined and dynamic associations. As Sir Tim Berners-Lee believes, the semantic web is something we can use very intelligently to perform a lot of tasks triggered by these associations. Over time, these tasks could be handled by agents without the need for human intervention, prompting futurists like Ray Kurzweil to talk about a future half-machine, half-human social form.
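To give a flavour of what "data as inter-related structure" looks like in practice, here is a tiny sketch using the third-party Python rdflib library (the resource URI is made up); it states two machine-readable facts and then reads the associations back:

    from rdflib import Graph, Literal, URIRef
    from rdflib.namespace import FOAF, RDF

    g = Graph()
    learner = URIRef("http://example.org/people/asha")   # hypothetical resource

    # Two semantic associations: this resource is a person, and her name is "Asha"
    g.add((learner, RDF.type, FOAF.Person))
    g.add((learner, FOAF.name, Literal("Asha")))

    # Any agent can now traverse these associations without human intervention
    for subject, predicate, obj in g:
        print(subject, predicate, obj)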

Learning 2.0

Consequent to this fundamental transformation, and aided by continued frustrations with the existing teaching-learning process and the evolving behaviour of digital social networks of new-age digital learners, is the push towards the next generation of learning. Founded on an epistemological framework that defines knowledge as emergent, adaptive and composed of connections and networked entities (Stephen Downes, 2006), George Siemens posits connectivism as a learning theory which suggests that the act of learning is largely one of forming a diverse network of connections and recognizing the attendant patterns (Siemens, 2006).

Stephen Downes is widely credited with the term Learning 2.0. According to him, learning is not the negotiation of an organized repository of knowledge but is, like electricity or water, available through networks – on tap. This is a fundamentally new view, representing an entirely new way of learning, steeped in the belief that networks can produce reliable gains in knowledge more effectively than traditional systems. Learning 2.0 enables a digital generation to connect, collaborate and co-create knowledge and collective insight through relationships and identity in a network.

Changing Roles of learners, teachers and learning managers

Learners are changing from passive receptors of information and training to active participants in their own learning. This is a viral change, so it is really fast. Today's digital learners are part of communities. They share their interests with members of their community. They twitter. They blog. They rake in RSS feeds and bookmark their favourites on del.icio.us. They share photos on Flickr and videos on YouTube. They share knowledge on Slideshare, Learnhub or Ning. They share ideas. They grow by meeting and engaging peers and gurus alike on LinkedIn or Facebook. On their laptops and on their mobile phones.

Traditional instructors are moving from being trainers to being facilitators, guides and coaches in a collaborative teaching-learning space. Instructors need not treat their learners as passive receptors; rather, they can actively shape, by dialogue and discovery, the nature of their learning.

Learning Managers, though, have perhaps the biggest challenge. Undisputedly, an organization that has both the vision and a demonstrable culture of continuous learning, collaboration and improvement, will benefit natively from the formalization of this style and the adoption of the available tools. This kind of an organization worries about functional excellence and the ability to transform the domain in which they operate through leveraging individual and collective insight.

Several metaphors for the educator have emerged. John Seely Brown posits the notion of studio or atelier learning, portraying the educator as a master artist in an art studio who observes student activities, points out innovation and uses the activities of all participants to guide, direct and influence the work of each individual. Clarence Fisher talks about the teacher as a network administrator who helps students construct personal networks for learning. Curtis Bonk talks about the educator as a concierge who directs learners to appropriate resources they may not be aware of. George Siemens suggests educators must act as curators – experts and guides who encourage exploration and create learning spaces or ecologies. And this participative pedagogy is a dramatic change, or reform, for the existing system.

Emergence of new media forms and collaborative learning

Learning 2.0 has spurred interest in collaborative learning and new forms of media. Immersive collaborative learning, which is really an immersion of the self within a networked learning ecology, has been very evocatively drawn out by solutions such as Second Life. The practice of teaching and learning can now benefit greatly from these and from structured techniques for collaborative learning such as collaborative online brainstorming, voice and video blogs, VoiceThread-type learning triggers, life threads (that follow an individual online) and so on. Communities of Practice, I believe, will be an important source of new media forms. CoPs provide an open space for collaboration around a specific interest area, and because of that, new types of collaboration artefacts stand a good chance of being created that become knowledge points in the learning experience.

Generation 3.0

The latest X.0 is the third generation of the web and of learning. What seem to be emerging as the unique characteristics of this web generation are ubiquity, context awareness, location awareness and mobility.

By ubiquity we mean an omnipresent network, connecting devices and humans alike to each other and blurring the man-machine interface.

By context and location awareness, we mean that our networks will increasingly be aware not only of what we need but also of where we need it. For example, teaching in class is a context and location combination that should trigger a lot of support relevant to a teacher's activity within the classroom.

If we add the temporal aspect, technology could become even more useful in channelling the right knowledge to us, in the right form. This might become very useful because, for example, a teacher's timetable could be synchronised with the frequency of her RSS feed from Yahoo! Pipes, or could become a trigger for analytics to be fed in from the world on the common problem areas in the topic she is teaching.
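As a toy illustration of such a temporal trigger – a hypothetical timetable and placeholder feed URLs, reusing the feedparser library from the mashup sketch earlier – the current period in a teacher's timetable could decide which feed gets pulled:

    from datetime import datetime
    import feedparser

    # Hypothetical timetable: hour of day -> (topic, feed URL)
    TIMETABLE = {
        9:  ("fractions", "https://example.org/maths/fractions/feed.xml"),
        11: ("photosynthesis", "https://example.org/science/photosynthesis/feed.xml"),
    }

    def feed_for_now(now: datetime):
        slot = TIMETABLE.get(now.hour)
        if slot is None:
            return []                      # no class this hour, nothing to fetch
        topic, url = slot
        return [(topic, entry.title) for entry in feedparser.parse(url).entries[:3]]

    print(feed_for_now(datetime.now()))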

Mobility is the other key aspect of the 3.0 Web. By this we mean devices – geared towards specific types of work, or serving as generic tools – that travel with the learner wherever she goes. Examples include ongoing research on wearable headsets that provide the power of your PC, your social network and the internet wherever you go.

It is then not inconceivable to think of the next generation of learning – Learning 3.0. This generation of learning is considered to be ambient – residing in our environment and ready for us to access when we need it. Pundits of this learning technology forecast that it will do for our world what electricity did for the industrial world.

In summary

And there will be more X.0s to come as we grapple with the fundamental transformation of our digital lives. There will always be competing approaches. The challenge for all of us is to be open and receptive to this change, critical in what we accept, and ready to experiment.


I came across an interesting set of concepts that quite predate the Learning 2.0 proclamation. Building upon Lave and Wenger's communities of practice, Brown and Duguid developed the concept of the Network of Practice. Ranging from communities of practice to electronic or virtual communities, and differentiated from formal work teams, it focuses on how individuals come together to learn and collaborate in the context of their daily practice or tasks.

With a network defined as a set of individuals connected by social relationships (strong or weak ties), and practice as the common area of focus or substrate that links those individuals together, a network of practice is differentiated from other types of networks (such as photo-sharing networks) in that it is based on a practice area where individuals engage in conversation to ask and share in order to perform their work.

Networks of Practice (NoPs) range from communities of practice (where ties are strong and face-to-face interaction is predominant) at one end of the spectrum to electronic networks of practice (typically virtual/electronic communities held together by weak ties) at the other.

NoPs differ from formal work teams primarily in the way they are structured and by their control mechanisms. They also differ in terms of their size (they can get very large) and by restrictions on membership. I think, most importantly, they are differentiated by the expectations about participation from members.

I also found Eva Schiffer's blog talking about an interesting activity that she coordinated: taking a community and mapping out the networks its members formed in pursuance of their practice. There is also an interesting read at Building new social machines.


The book of the same name by C.K. Prahalad and M.S. Krishnan has much to offer us in the learning industry. There is a fundamental transformation in the way we do business, and it is critical for companies to negotiate two fundamental pillars of this change – co-created experiences and access to (rather than ownership of) global resources.

If we look at traditional distinctions, such as between products and services or hardware and software, these distinctions are getting blurred. Rather, a new order is emerging that conforms to global standards yet is locally responsive. This change can be seen in companies like Bridgestone and Goodyear, traditionally thought of as product (tire) companies. Both companies now offer their customers an experience rather than a product. This experience is based on a revenue model built around actual usage rather than the product itself. The relationship with the customer becomes an ongoing one, and from just a business-to-business interaction it starts focusing on the consumer directly. The service model is simple – provide guarantees, support and services on the usage of the tires (say, in a fleet management scenario) rather than transactionally on just the tire. The experience then includes additional services such as fleet management, sensors in the tire that send real-time usage information to the company, and training for tire users to manage their investment better.

A similar experience is being brought to us by TutorVista, an online tutoring service. TutorVista provides its customers with the ability to choose what they want to learn, when they want to learn it and for what duration they need tutoring. The student can decide she needs tutoring in a particular area, go to TutorVista and determine the exact training fit for her requirements.

This is different from mass customization where customers have preset choices or combinations thereof. It also moves away from the heavily used tools and techniques for market segmentation. The market segment consists of one consumer at a time. Personalized yet scalable, affordable and high quality. This is what they call N=1. The locus of value is seen to be shifting from products and services to experiences.

Making this happen means the firm has to be very flexible. Operationally, it must be able to plan based on needs and trends, i.e. the ability to reconfigure resources is key. Complexity increases in an N=1 world because we are dealing with more and more analytic or consultative selling rather than information-based selling. Simplicity of the customer interface also becomes critical, along with the ability to initiate and grow a dialogue with the customer. This also requires a new level of IT sophistication.

N=1 involves a new approach to the access and use of resources. The authors term this R=G. We need to move away from owning resources towards co-opting access to them. There are two big advantages to this, and one necessity. The advantages are that the firm can rapidly scale based on the expectations and needs of customers, and that each resource is an independent entity capable of providing innovations that can percolate to your customers (innovation arbitrage vs. traditional cost arbitrage). The necessity is that no one firm can even attempt to own all the different resources it would need for creating new experiences for the customer in an N=1 world.

Business processes and associated analytics will be the key enablers of an innovation culture. Firms should move from a cost-based to a value-based model, operating as a nodal enterprise in a complex network of global resources. N=1 and R=G need not be costly to create; rather, it should be possible to create the social and technical enterprise infrastructure to support them in an affordable manner.

What does this augur for us in learning? We have seen outsourcing and leveraging a global vendor base as a trend and a necessity in most situations for large global firms. I would also believe that R=G is thriving in the learning industry, and that innovation arbitrage is a key factor alongside rapidly shrinking, performance-sensitive training budgets. But N=1 is not, and that, I think, is the challenge facing the learning industry as well – how to co-create effective learning experiences for learners. To take an analogy from the tire example above, WBTs/ILTs etc. become the "tires" – the products produced by the firm to train employees and partners. But the end experience for each customer is very personal; learning needs a personal touch. In a socially networked world, this can become a reality because the network is high touch and based on relationships. This could imply that firms start breaking down content and instruction into manageable, transformable forms. Or it may imply that L&D needs to play a more active role in ensuring learning happens, by facilitating it more strongly in a 2.0 manner.


Discussion Thread: This post << Part 3 << Part 2 << Part 1

Before we go on to start detailing formal methodologies, we must make concrete the business case, context and critical success factors for these methodologies.

As organizations struggle to understand how they can leverage Learning 2.0, and vendors bring in their own interpretations of what Learning 2.0 really is, I think we need to start defining how and why this new style should be implemented at the workplace.

Because that is what Learning 2.0 really is – an emergent learning style with a strong basis in social constructivist learning theories that holds out the promise of making the learning curve steeper, creating a learning & sharing culture within the organization (and beyond), lowering costs and increasing effectiveness.

Learners are changing from passive receptors of information and training to active participants in their own learning. This is a viral change, so it is really fast. Today's digital learners are part of communities. They share their interests with members of their community. They twitter. They blog. They rake in RSS feeds and bookmark their favourites on del.icio.us. They share photos on Flickr and videos on YouTube. They share knowledge on Slideshare, Learnhub or Ning. They share ideas. They grow by meeting and engaging peers and gurus alike on LinkedIn or Facebook. On their laptops and on their mobile phones.

Traditional instructors are moving from being trainers to being facilitators, guides and coaches in a collaborative teaching-learning space. Instructors need not treat their learners as passive receptors; rather, they can actively shape, by dialogue and discovery, the nature of their learning.

Learning Managers, though, have perhaps the biggest challenge. Undisputedly, an organization that has both the vision and a demonstrable culture of continuous learning, collaboration and improvement, will benefit natively from the formalization of this style and the adoption of the available tools. This kind of an organization worries about functional excellence and the ability to transform the domain in which they operate through leveraging individual and collective insight.

Organizations that have not reached that stage (perhaps the majority worldwide), will need Learning Managers to step up and leverage these new developments to foster that culture. They are the ones who are responsible for implementation of formal methodologies for Learning 2.0 at the workplace.

Inevitably, their role must transform. They must orchestrate learning rather than just be responsible for the creation of the learning content itself. They must be able to bring out functional excellence and a culture of sharing and continuous learning. Their goals and measures must be community-led and guided by the organization's needs. This will directly result in performance improvements, because the community can be made responsible for those improvements.

The role of the vendors or internal teams they manage must also change and evolve. For example, vendors, internal development teams and instructors need to play a more active role in building that culture. These teams have a great understanding of the content, and organizations have literally paid millions to train and induct them. They have interfaced with engineering/domain experts, instructional design and style guidelines, and perhaps even directly with the learners. They are a logical, key component of this new space and equal partners in fostering that culture, and some could take on responsibility for goals in functional excellence.

All this will reduce costs. First of all, the onus of learning and teaching through sharing will start getting distributed across the organization. Secondly, the steeper learning curves that can be fostered through community interaction will make “training” more cost efficient. Thirdly, informal interaction through these spaces will reduce the remediation training requirements. Fourthly, the need and scope for physical instructor led learning will reduce. Fifthly, user generated peer reviewed content will start replacing large parts of content creation teams, whether internal or vendor.

It will increase effectiveness because there is no one-size-fits-all approach in the new 2.0 style. As learners start sharing their knowledge with the overall purpose of bringing others up to speed, they will also translate their own learning styles when they teach. All of a sudden, it will be easier to find content that is taught/shared in a way that resonates with a specific learner’s style of learning. It will also increase effectiveness because it will start manifesting in the organization culture and appetite for learning. It will make learning more fun if you have special methodologies for motivating entire communities. It will engage the communities even more because they will feel aligned and geared for the organizational goals and be seen as active participants in achieving those goals. And finally, it will be real because you are learning with practitioners as well as theoretical experts.

So what would be the critical success factors for implementing Learning 2.0-led approaches? First, organizational initiative is key – without the mission of transforming your organization into a learning and performing organization, these initiatives will meet with limited success. Second, roles must be redefined to accommodate the new solutions. Third, group dynamics must be researched and customized for your organization; this is key because each group or community will need a formal process of norming and mentoring by the organization's functional or business leaders. Fourth, formal 2.0 methodologies and tools must be instituted and a process created around some of the transition, maintenance and integration areas. Fifth, we need to identify technology systems, measures and other supporting infrastructure to manage these implementations.

