
Archive for October, 2010

By far the most definitive week of discussions around PLENK for me, this week’s Friday discussion is worth multiple rounds of further investigation and discussion. I came in a trifle late and it was difficult to catch up without a starting context, so I caught up with the recording later. Here are my notes and deductions – my version of the conversation between George and Stephen. As always, please feel free to add to or correct me in case there is anything I missed or misinterpreted.

Stephen Downes

  • There is a tool-selection or dashboard approach to classifying the technology experience
  • There is no great application out there that allows me to read and write the way gRSSHopper does. This is the workflow approach. We need to model workflow and provide end-to-end functionality, and that is the most daunting piece.
  • Should we be looking at a theory of everything (like an atlas in geography, or a set theory of everything)? Technology will evolve over time, but the core patterns of use may not (in fact, they may).
  • Is there a way to hide the modalities, so that we focus on the core? What are these core ideas? Personal autonomy, distributed knowledge and social learning. There are frameworks like the 21st century skills frameworks, but these are very widely fragmented. I would add pattern recognition as a fundamental skill – is the optimal tool one that would be based on network theory and pattern recognition?
  • Machine analysis can give us a syntax. The human side would give us semantics.
  • Can we figure out, in technological terms, how humans do it – derive meaning? In the neurological sense, it is a very organic process that evolves over time, not intentional or deliberate, each new experience creating more understanding.
  • Is the tool of everything going to be a pattern recognition tool?

George Siemens

  • First-time adoption of tools is difficult, not because of the tools, but because of concepts. This is where companies like MS or Facebook helped, by aggregating functionality and establishing common ways of completing standard tasks
  • Tools are available, but the level of integration is too low at this point. With connective specialization, it is a case of each to her own preference, which, at the point of adoption, adds to the confusion.
  • Do we need a tool of everything or do we need a way to build capacity?
  • The theory of everything: maybe with a combination of critical literacies and attributes or ideas of the disciplines?
  • The hiding of modalities is important.
  • There are two dimensions to pattern recognition – technological and human. The technological example would be reading through a mass of data vs. navigating a structured analysis of the mass of data. On the human side, Learning Analytics tools provide valuable patterns of use. That is what computing can do and visualization is going to be very important.
  • That does not mean that technology will be able to model personal or network use of the resources, but technology can help.
  • We need to have a balance between what a computer does well and a human does well (form vs. meaning).
  • Experts and novices think differently – experts think in patterns and novices think sequentially, or (Cris2B) plan ahead vs. plan backwards. Conceptually, once some patterns – some context – are built up, we are able to recognize more complex patterns.

My 2 cents.

I think that we must first start with presentation and analysis (as best as the computer can visualize in a simple way) and let humans and our networks derive the meaning. This is what I hope an NBT will achieve.

Maybe at some point, the insight from how humans use that information for semantics, through reflection and practice, will start becoming progressively templatized as we understand or build tools and processes that can model how humans function – how we evolve from novices to experts in an area. I call this Native Collaboration and see it permeating every function in learning.

The discussions are fast evolving to a stage where formal models of Native Collaboration (which attempts to model, functionally and technologically, how we learn) and the NBT (my term for Network Based Training – an evolution from Web Based Training, or WBT) will emerge, where the NBT environment encapsulates the modalities in a fairly standardized manner while allowing personal autonomy, and includes specific connectivist techniques for Native Collaboration. This is really exciting!




I was going through TechCrunch’s coverage of Jeff Jonas, Chief Scientist of the IBM Entity Analytics group, and his concept – Big Data. I think this is a wonderful way to look at how we make sense of the overabundant flow of information around us.

Addressing the question of what data is, Jeff says organizations are as smart as what they know. And what they know comes from data – structured or unstructured – which in turn forms their perceptions. The smartest any enterprise can be is the net sum of its perceptions, he says (one could argue it should be connections, not data). He includes observations as data and talks about the insight that becomes possible when these observations are linked and analyzed.

This piece was interesting. He says almost all data starts out as unstructured; it requires humans to place it into a structure. That may seem obvious, but it is useful to take a second look and ask whether the way we structure data actually creates the problems we have today. It is genuinely hard to piece together conversations on Twitter, for example. But Twitter structured it in a particular manner, echoing what it thought was a monologic mode of interaction with a network. I compose a status message and tweet it. I retweet a message. I place a hashtag to organize it. I make a list to follow. I reply to a message. All of these are essentially one-way communications, like email.
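To make the point concrete, here is a toy sketch (every field and function name is invented for illustration, not Twitter’s actual data model) of why conversations must be reconstructed after the fact: each tweet is stored as a one-way message, and the only thread information is a reply pointer that has to be walked backwards.

from dataclasses import dataclass
from typing import Optional

# Hypothetical, simplified tweet record: a stored monologue.
@dataclass
class Tweet:
    tweet_id: int
    author: str
    text: str
    in_reply_to: Optional[int] = None  # the only thread information kept

def thread_of(tweet_id: int, tweets: dict) -> list:
    """Walk reply pointers back to the root – the best we can do to
    piece a 'conversation' out of one-way messages."""
    chain, current = [], tweets.get(tweet_id)
    while current is not None:
        chain.append(current)
        current = tweets.get(current.in_reply_to) if current.in_reply_to else None
    return list(reversed(chain))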

He also talks about why more data makes us ignorant. Data is growing fast as computing power, connectedness and bandwidth continue to rise exponentially, far exceeding enterprises’ ability to process it. This is a widening gap. And the impact is even more crucial because data is segmented into silos, making it nearly impossible to connect multiple sources of data to make valid inferences that increase efficiency and adaptability. One important thing he highlighted is data amnesia (it happens to me very regularly) – a bank he was already doing business with called him six times to get him to sign up; they did not know, as an organization, that he was already a customer. With the explosion in data there is a decreased ability to analyse and act on analyses – there are fewer and fewer humans, in comparison, who can do this.

On being asked if Google isn’t really already the king in that space, he said he thinks Google is a giant pixel sorter (every searchable object is a pixel…). In the sorting, it misses things such as local context. Maybe we need services layered on top of search. And that is the thing I really like – I call it reversing the search: pushing context to identify relevant information rather than the other way around.
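A minimal sketch of what reversing the search could mean – a standing context profile scores items as they flow past, instead of the user pulling results out of an index. The scoring function and the item shape are my own assumptions for illustration, not anything Google (or Jonas) actually does.

def relevance(item_terms: set, context_terms: set) -> float:
    """Jaccard overlap between an incoming item and the user's context."""
    if not item_terms or not context_terms:
        return 0.0
    return len(item_terms & context_terms) / len(item_terms | context_terms)

def push_filter(stream, context_terms: set, threshold: float = 0.2):
    """Yield only the items whose overlap with the standing context clears the bar."""
    for item in stream:
        if relevance(set(item["terms"]), context_terms) >= threshold:
            yield item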

What is most interesting is how he talks about Big Data – technologists are building systems to analyze individual puzzle pieces (or pixels). But what would you do with a puzzle piece? The real task is fitting that puzzle piece into the puzzle. The last few pieces of a puzzle are easier to place than the first few. This is context accumulation – context is hugely important to accumulate in systems that analyze data and make better predictions.
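Jonas’s systems are far more sophisticated, but a toy sketch of context accumulation might look like this: each new observation is matched against the entities assembled so far, and every merge enriches the context available for the next match – which is exactly why the last puzzle pieces are easier to place than the first. All the record fields below are invented.

def matches(observation: dict, entity: dict) -> bool:
    """Toy rule: two records match if they agree on any shared attribute."""
    shared = set(observation) & set(entity)
    return any(observation[k] == entity[k] for k in shared)

def accumulate(observations: list) -> list:
    entities = []
    for obs in observations:
        for entity in entities:
            if matches(obs, entity):
                entity.update(obs)   # accumulated context grows
                break
        else:
            entities.append(dict(obs))  # a brand-new entity
    return entities

# The bank's data amnesia, in miniature: three records, one customer –
# but only if context is accumulated across them.
records = [
    {"name": "J. Jonas", "phone": "555-0101"},
    {"phone": "555-0101", "account": "A-42"},
    {"account": "A-42", "email": "jj@example.com"},
]
print(accumulate(records))  # one merged entity instead of three strangers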

On privacy and freedom: all this data collection and the predictions that follow may offer better experiences, but at the same time, awareness of what the customer is giving up in the choices she makes about privacy needs to be raised significantly.

This is not the semantic web or linked data. It is not so much about linking data or describing data (or relationships) as it is about trying to view data in context.


PLE/N Tools

A really nice collection of links for this week’s #PLENK2010 discussions. I especially liked Wilson’s Patterns of personal learning environments. Wilson looks at patterns of use of, and activity in, personal learning tools and learning networks, revising a previous approach which was very functional and tool-specific.

One of the ongoing challenges I have is with the constant comparison between the LMS and the PLE, which, I happen to think, is an apples-to-oranges comparison. They serve different needs and sit at different points on the spectrum between self-learning and managed-learning (if there is such a phrase). The MOOC and the LMS are comparable, as are NBTs (which I define as Network Based Training, the natural networked-learning successors to WBTs) and PLEs.

Let us picture this. The LMS is used to launch a WBT course. The course pops up in a player, which is really a navigation shell that acts as a data conduit between the WBT and the LMS. Suppose the LMS is aware of learning networks and personal learning tools (with blog, wiki, Flickr, connection hub-bing, discourse monitoring and other affordances provided by whatever mechanism – web services, XCRI…), and the WBT is just base reference material, not quite unlike the Daily in this MOOC.

The player could then be programmed to act as a conduit between the WBT and the network or personal learning tools (people, resources, context, conversation, bookmarking services). Sort of a SCORM for networked learning environments.
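Purely as a sketch of the idea – every class and method name below is hypothetical, not an existing standard or API – the player-as-conduit might expose something like this, relaying events out to the learner’s network services and pulling conversation back in:

from typing import Protocol

class NetworkService(Protocol):
    """Anything the learner brings along: blog, Diigo, a discourse monitor…"""
    def publish(self, learner: str, event: dict) -> None: ...
    def pull(self, learner: str, topic: str) -> list: ...

class Player:
    """The navigation shell, acting as a data conduit in both directions."""
    def __init__(self, learner: str, services: list):
        self.learner = learner
        self.services = services

    def report(self, event: dict) -> None:
        # outbound: a bookmark, a blog reflection, a completed activity
        for svc in self.services:
            svc.publish(self.learner, event)

    def gather(self, topic: str) -> list:
        # inbound: people, resources and conversation relevant to the current unit
        return [item for svc in self.services
                     for item in svc.pull(self.learner, topic)]

The point of the interface is that the player does not care what sits behind it, just as a SCORM player does not care which LMS it reports to.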

What would you call the WBT then? An NBT.

Would the PLE look similar to an NBT? Yes, the NBT would resemble a slice of the PLE – a workspace which we organize around a theme that interests us. Similarly, the NBT could be conceived of as a combination of slices of many different PLEs – in fact, as many as there are learners enrolled in the NBT.

But the NBT would necessarily be a more constrained, parameterized environment, designed or seeded by a network weaver, an atelier – the new avatar of the teacher, and adapted and grown by the learners, present and past. The PLE would grow unfettered, the whole being greater than the sum of individual slices.

Most of the discussion, even in Wilson’s paper, focuses on the tools in the end. What can tools do to present the solution to a pattern? In fact, almost every solution is expressed in technological terms (notice how many times the word “tool” appears in the first line of the solution).

It is almost as if technology is the master problem solver for every pattern of learning, but that may just be me.

I would rather focus on Critical Literacies. On having reasons. Just as I would not count an NBT operating in an LMS environment as a true NBT – one truly architected as a networked-learning-aware solution from the ground up, rather than pasted onto a WBT as a quick fix.

And that is perhaps why I would choose to take a radical stand – PLE/N tools do not yet exist. I would like to take you back to how PLEs were defined in Week 1:

A Personal Learning Environment is more of a concept than a particular toolset – it doesn’t require the use of any particular tool and is defined by general concepts like: distributed, personal, open, learner autonomy.

and for PLNs:

A PLN is a reflection of social and information networks (and analysis methods).

We are confusing our current lists of PLE/N tools with the concept or the method – like trying to measure the length of a moving snake with a tape measure, or the volume of a gust of air with a sudden clenching of our fist.

By far the most important attribute of the toolset for a PLE/N, if you can call it that, would be its complete invisibility. It would be implicit for learners in the way it has been designed. It is then that we will be able to project our personal aura onto it and make it personal – as open as we are, as connectedly aware as we want to be (or can be), and as autonomous as we will allow it to be.

And that will also take a fundamental rearchitecture of the way we conceive of learning resources, away from resources as objects or networks, to living and dynamic forms that reflect our social and information networks.

More of a hard left than a gentle meandering this one, would you say?


New Literacies

I was listening to Will Richardson’s session on PLENK2010 this past Wednesday. He brought up the NCTE’s definition of critical literacies:

  • Develop proficiency with the tools of technology
  • Build relationships with others to pose and solve problems collaboratively and cross-culturally
  • Design and share information for global communities to meet a variety of purposes
  • Manage, analyze and synthesize multiple streams of simultaneous information
  • Create, critique, analyze, and evaluate multi-media texts
  • Attend to the ethical responsibilities required by these complex environments

He said that tools become the central point of the conversation without worrying about the context. And in fact, he did not think the high school graduates he knew would make the grade if these were the certifying literacies.

Reminds me of a vision statement that I read recently. This is from the NAE (Committee on the Engineer of 2020, Committee on Engineering Education, National Academy of Engineering). Describing the Engineer of 2020, the enlightened authors write:

What attributes will the engineer of 2020 have? He or she will aspire to have the ingenuity of Lillian Gilbreth, the problem-solving capabilities of Gordon Moore, the scientific insight of Albert Einstein, the creativity of Pablo Picasso, the determination of the Wright brothers, the leadership abilities of Bill Gates, the conscience of Eleanor Roosevelt, the vision of Martin Luther King, and the curiosity and wonder of our grandchildren.

Yeah. Right.

Jenny started the discussion on ethical responsibilities. I think it is important to evaluate ethical responsibilities in the traditional context as well, not just for the issues that tools such as Facebook are creating in the context of the Internet. The traditional system throws up just as many horrific examples of the violation of ethical responsibilities.


George Siemens bemoans the emerging trend that “higher education is not in control of its fate as it has failed to develop the capacity to be self-reliant in times of change”. Referring to a dilution of the stance against corporatization, and the way external innovation is driving change at the academy, George may just be right.

In my experience with universities, the pace of change is extremely slow. Maybe it is the product of excessive intellectual rigor, but it is slow. This is surprising, given that education is a prime objective of these organizations and one would expect them to be ahead of the curve in anything a corporate might throw at them.

It is not that there is no innovation or that some universities are not ahead of the curve already, but it is a general sentiment that I share from experience.

What are the causes? 

I think (in my experience so far, and this may not be very generalizable) a main cause is systemic – the bureaucratic processes of governance, coupled with a healthy dose of ignorance, contribute to the extremely slow pace of reaction and action. The second cause is an arrogance that what exists is the best way to be. The third cause is a blind mistrust of everything for-profit, as if employing intellectual property created by for-profits is something to be totally suspicious about (or looked down upon). The fourth cause is the ability to stifle innovation through politics and the threat of conformance, sometimes hiding behind procedure. The fifth cause is not enough thinking (and expertise) about the teaching process; the focus is more on each individual subject/discipline.

The for-profits have not done enough to allay the fears either. Over-promising and under-delivering, delivering problems more than solutions, salivating at size and recurring revenues – these are accusations that can be made.

Education is not only a public domain. To the extent that it is a State responsibility, the State must ensure that capacities are built up to navigate new terrains. What I see around me is lip service when it comes to teacher training and a three-letter word called ICT. To the extent that it is a corporate goal, for-profits must realize they need to build far greater credibility and demonstrate far greater responsibility to the domain and all that they impact.



But by far, apart from the student, the most important stakeholder today is the teacher. The teacher forms, whether in a for-profit or otherwise, the core of the learning experience and is the most visible force in its implementation. Only by enabling students, by continuously evolving, by exploring the connectedness of knowledge and by deliberating openly on the technologies, processes and techniques that impact the learning experience will teachers be able to drive change.

I don’t think teachers realize the power they hold to change the learning experience. Looking at what is happening closer to home, I see that they are content using their power to stay in the same place, unmindful of the explosive pace being generated around newer ways of teaching and learning (and even of developments in their chosen field/discipline!). Both for-profit and non-profit organizations are exploiting that non-use of power.

This must change.


Day 2 Power of Ideas #ETPOI

Market Sizing and Competition Analyses

Vijay Shukla

Competitive Analysis

You may not be all that unique. Understanding the primary need and the business is crucial and fundamental. Competition can be identified once you identify the need that you are addressing. And competition may be visible or invisible. Invisible competition is more dangerous because we don’t know who or where they are. Remember, you may get the feeling of uniqueness only because competitors have done the research and did not find your space/offering to be profitable. Put some rigor into your understanding of the business.

Note: A one-stop shop doesn’t bring the same premium in India. Companies are making more through specialization, so focus on the advantage/specialization.

Market Sizing

Don’t get caught in your own trap: “We want one, so everyone wants one.” At an early stage you extrapolate from your own experience, but that should only be an added factor in starting the business. Disassociate yourself from the idea, so you can start looking at it objectively.

What’s the market? What’s my total addressable market? Do the research before you meet the potential market. Are they willing to experiment, can they take decisions, have they worked with smaller players, do they have budgets for your segment – use these filters to figure out whom to go after at the beginning. These filters are really your business insight. Remember, the buying behaviors of clients do not change just because you are starting up.

Make sure you validate data by asking commonsense questions. Check for counter sources and filters that prune the final numbers. Be accurate rather than precise.

Customers are different from users. Customers pay. They are your market. There are influencers – key opinion leaders. There may be many other influences, and you have to take them into account.

Differentiation

Would it make a difference if you did not exist? That is an important question.

Why would they not care?

  • Too many options that appear the same to customers
  • Demand is less than supply
  • Low entry barriers
  • Partnership with jobbers – balance shifts away from you
  • What is advertising doing for you?

How does brand matter in these conditions? Good businesses do hard labor to meet competition – to sustain their brand. Good businesses in these conditions become more efficient with the use of processes and technology.

There are many things that can be recommended, but that will depend upon a host of factors.

Business Models

Rahul Aggarwal

“A business model describes the rationale of how an organization creates, delivers, and captures value”

Value Creation – building blocks

  1. Value conception – who conceptualizes and who controls
    1. Proprietary
    2. Collaborative (partnership/JV, crowdsourcing, co-creation)
  2. Value composition – who composes the value
    1. Form – product, service, hybrid
    2. The product vs. service distinction is getting blurred – the differences are usually time to market, infrastructure requirements and the growth curve, as well as the timeframes for realizing revenue
    3. The challenge is to collapse the time to market
  3. Value production – who produces, who controls production assets?
    1. Internal
    2. partner/JV
    3. Outsourced
    4. Crowd-sourced
    5. Pay for service (BOOT etc.)
    6. Platform
  4. Value targeting
    1. Forecast demand – push model (car company)
    2. React to demand – pull model (Dell, BTO supply chain)
    3. Activate latent demand or create new demand (space tourism?)
  5. Value pricing
    1. Amount, time, currency, payer, ownership
    2. Traditional buy
    3. Fractional Ownership
    4. EMI
    5. Rental
    6. Cross-subsidization
    7. Lottery
    8. Auction
    9. Free for the user
    10. Equity
  6. Value delivery
    1. Physical, virtual, hybrid
    2. Ownership
    3. Aggregation (info vs. look and feel vs. delivery vs. service)

Business Model Innovations – Blue Oceans: unexplored markets that unlock hidden value and are not easy to copy.

What will lead to consumers demanding/consuming more (Microfinance, JustBooks)? What will lead to customers’ latent demand getting activated? What will unleash totally new value for the customer? How can I capture more value?

What will differentiate your company from the rest? What is a defensible competitive position (what will tie in the customer? For example, Mahindra knows its demand exactly)? What will enhance your influence and control in the ecosystem?

How linear is the relationship between resources and revenues (you should be looking at non-linear relationships)? What competencies will you need? For established companies, to what extent will systems and processes need to change? To what extent are you dependent on third parties?


In case you did not know already, I am at the Indian Institute of Management, Ahmedabad, India, for an 8-day workshop that brings together 74 of the top ideas from among more than 16,000 submissions to the Power of Ideas initiative. The first day has been interesting so far, and I like the fact that I am returning to an academic environment after a long break, albeit for a short while.

Sandeepan Budhiraja’s first session, on Identifying Customer Needs and Market Research, was interesting, particularly when he talked about focusing on heavy users as the primary concern of business. When I asked whether this was applicable to all markets and economies, referring to the Long Tail and Power Laws as an opposite source of focus, the consensus was that long-tail consumers make incidental (a word I used inappropriately) purchases and that this could not be the basis of product strategy/positioning.

This ignores pretty much everything that is happening in companies like eBay and Google, and in the media industry, where distribution costs are negligible! In fact, one opinion was that monetization of the long tail is either the preserve of very large companies or, at best, a more long-term outcome – both conclusions I would disagree with. Some links I found:

The other two sessions, by Akshat Rathee and VC Karthic, were interesting. They shared insights on business planning components, presentation skills and key points to keep in mind when talking to Venture Capital. Here is a short set of notes from those encounters.

First, translate your vision – where you were, where you are, where you want to be.

Putting together the Information Memorandum (IM)

  1. Build pitch
    1. Pitch is contractual – this is what I provide with a difference from others
      1. Question: Making a pitch involves propositions that need to be commonly understood. E.g. “I will provide you a GFX Super accelerator for half the cost” is probably not a good idea because it is too technical
      2. Let the pitch give rise to questions; it should intrigue. Make a statement that can be tuned at a second step to things like the target segment, how it will be delivered, etc.
  2. Build scale up vision while maintaining low-cost and high profitability
  3. What do we need money for? Investment is for growth. Asking for money for marketing is a bad idea; asking for capital and working capital is a good idea. Split the money between debt and equity, state your internal accruals and include CAPEX.
  4. Exit
    1. How do we repay debt or show high returns to the investor (33% margin = 2X, meaning you have to grow by 100% every year if we don’t think of a sale of shares)?
    2. Who are the people who would like to buy this company x years from now, giving the investor an exit option? What would be the total cash dividend the investor would get on exit? Give specific names and numbers. Consider an IPO as an exit (strategic/PE investor).
    3. From the venture perspective:
      1. They want a team that can execute this the best way
      2. They want billion dollar market
      3. Want 5-20X return
      4. A good exit in 2-3 years
      5. Ways in which VCs can exit: sell their stake, sell the company, go IPO
      6. Keep pitch simple, limited vocabulary
      7. Need to feel comfortable at a personal level, exploit any prior connects
      8. Identifying the pain; attacking the business thoroughly demonstrates the market research
  5. Things they look out for
    1. Competitive sustainable advantage
    2. Patent – adds a guarantee
    3. Show the demo
  6. Business Plan – how you are going to make money
  7. State assumptions: they (the analysts) will check the formulas and then the assumptions. The plan is built ground-up, not on desire. Make assumptions that follow from what we know about the industry and the business. They must intuitively make sense. Quote the exact source. Back them up with verifiable data. Make sure you know the conflicting sources. Know how to defend each assumption.
  8. Now we are ready to make a business plan.
  9. Put in a FAQ or sections for team, capability etc.

Would welcome any corrections from the other participants! I think we have a really cool bunch of people here.


The debate at the Oxford Union this Wednesday on informal learning was very interesting, more so because some wonderful people on Twitter were relaying it blow-by-blow, and also because I was testing my multi-tasking skills by juggling the Twitter conversation and the PLENK session!

The motion was: The House believes that technology based informal learning is more style than substance.

Dr. Allison Rossett, Nancy Lewis, Mark Doughty argued for:

Informal learning is appealing and exciting but it has no authoritative voices, no assessments or guidance, and therefore no substance. The motion isn’t about us and how we like to learn. It’s about our need to know that the organisations and people we trust know what they are doing. Informal learning doesn’t provide that. It has no thermostat or control. We all love technology, but on the scale of substance and style, it’s still all about style. If you care about organisations, be they of pilots, doctors or cooks, if you care about performance then we urge you, support the motion.

Prof. Dutton, Jay Cross and David Wilson argued against:

Informal learning is not trivial; it is in every corner of institutions. People in the room are using technology to check facts as we speak. Technology-based informal learning enhances information and reach. It makes experts more accountable and raises the bar. And for parts of the developing world it is the only learning available. Therefore, we urge you to vote against the motion.

The main arguments were:

For the motion (more style than substance):

  • gets viewed by managers as a cheaper alternative, but offers learning managers no measure of formal effectiveness
  • need assurance that our doctors are medical experts and our pilots can fly a plane
  • formal gets things done
  • not well-researched
  • no north star to guide, no common understanding of what it is
  • does not work when failure is not an option (mission critical)

Against the motion (substance and style):

  • Internet has become a first recourse for information
  • institutions need to learn to harness the network
  • (Cross) co-exists with formal learning on a continuum; the separation is only visible inside school learning
  • Not the tools but the collaborative activities that will sustain and evolve
  • it is part of work, we do not need to separate it

The ones against won comprehensively and there is an online vote if you want to add your weight. I think there are some important pieces to this debate.

One, learning does occur informally, whether with the use of technology or without it.

Two, by definition it is informal: loosely (if at all) structured, not premeditated or goal-driven (“let me go to the water cooler to get agreement on the next strategic shift in business”). It is a space where data is not as important as the intelligence in the conversation, as the alignment between connections.

It is a space where in principle decisions may occur or new ideas may emerge or new connections may be made. It is a space that can trigger a lot of formal work. And since it is informal it may not always be serious.

Three, the separate categories of formal and informal make sense only when one is trying to push out the other as an equally or more effective way of learning. To make that claim, informal learning will have to defend itself against vague arguments about mission-criticality, dissipated theorizing and non-existent assessment methods.

I say vague arguments because saying a doctor trained by informal methods (if any are identified) will fail to become a medical expert (or succeed) is an improperly framed, populist argument.

It assumes distinct categories for formal and informal. It assumes that informal learning is all non-serious, undirected chatter which depends on serendipitous events to become, or be considered, meaningful. It assumes, on the other side, that formal learning undisputedly generates medical experts or pilots – that every site of formal learning is serious, directed and purposeful.

It also throws out any chance of even considering that informal learning plays a huge role in the organization or in school learning. In fact, the argument that informal learning does not work when failure is not an option precludes the very idea of allowing mistakes to happen during formal learning (as Sir Ken Robinson argues in his TED Talk, How Schools Kill Creativity).

I would vote against the House on this one and also chasten it for selecting the motion the way it stands, more to provoke extreme reactions than to promote constructive debate.


With a little help from Jatinder, a kindred soul in the making of simulators that happen to attract Brandon Hall Awards, I tried to visualize a model of PLEs operating in a connective environment. It started with a reply I made to Janet and Carmen on what I think should be:

…let us contrast the MOOC environment with an LMS. Can we think of this environment as self-configuring instead of being configured by an administrator? How about this: when a person wants to join a “course”, she gives rights to the MOOC server to “pull” content she wants to share in the context of the course into the course environment…the content stays with her, but instead of (or in addition to) the LMS “pushing” some general stuff, it configures a learning space based on the expertise and contributions of its members?

If I join a space or a conversation, I bring not only my personal self but also my blog, my Zotero collection, my Diigo links, my tweets, my network etc. – and I decide to bring in a relevant “slice” of these and other influences to the course or research I am taking on. Maybe such environments understand a shared semantic vocabulary for the subject, so that they can quickly organize the combined course network without my explicit instructions. Wouldn’t this be a self-organizing, emergent ecology, more in line with Connectivism, and a way to differentiate against an LMS?
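Purely as a thought experiment, that “pull” configuration could be sketched like so – the feed and tag conventions are invented for illustration, and no existing MOOC server works this way:

def pull_slice(member: dict, course_tag: str) -> list:
    """Collect just the items a member has tagged for this course."""
    items = []
    for feed in member["feeds"]:          # blog, Diigo, Zotero, tweets…
        for item in feed["items"]:
            if course_tag in item.get("tags", []):
                items.append({"owner": member["name"], **item})
    return items

def configure_course_space(members: list, course_tag: str) -> list:
    """No pushing by the 'LMS': the space is simply the union of member slices."""
    space = []
    for member in members:
        if member.get("granted", False):   # explicit rights, as in the quote above
            space.extend(pull_slice(member, course_tag))
    return space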

The first visualization I thought of was that of puddles and rain. Simply put, when rain falls, puddles of water form. Some puddles mix with other puddles, self-organizing to form streams; some stay quietly content to remain aloof and disconnected. Depending upon how much it rains and what surfaces receive the rainfall, we will see patterns. There may be a point of bifurcation when the entire surface gets covered. When the rain stops and the puddles start drying, a pattern of decay forms, quite unlike the pattern of growth, which was an emergent, complex pattern to start with.

So replace puddles with PLEs, the surface and environment with the network (a super-PLE?) ecology and the rain with a certain eventedness (a MOOC?) and you have my picture of what goes on in connective learning. Weird idea? I sincerely hope not.
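The metaphor can even be run: a minimal toy simulation, assuming raindrops on a grid and the classic union-find structure, where touching puddles merge into streams:

import random

def find(parent, x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path compression
        x = parent[x]
    return x

def rain(width=40, height=40, drops=600, seed=7):
    random.seed(seed)
    wet, parent = set(), {}
    for _ in range(drops):
        cell = (random.randrange(width), random.randrange(height))
        wet.add(cell)
        parent.setdefault(cell, cell)
        x, y = cell
        for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nb in wet:  # touching puddles merge
                parent[find(parent, nb)] = find(parent, cell)
    streams = {find(parent, c) for c in wet}
    return len(wet), len(streams)

cells, puddles = rain()
print(f"{cells} wet cells self-organized into {puddles} puddles/streams")

Turn the number of drops up and the count of separate puddles collapses – a crude way of watching the point of bifurcation mentioned above.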

So I thought I would bring about a better visualization with Jatinder’s kind help. Picture this (disclaimer: no connection is intended between the names of the various people from my network on the visual and the social connotations of the word butterfly – it comes more from the effect of a butterfly flapping its wings…):

(Images courtesy of various artists on the web, and in particular the incredible post here – did you know the Fibonacci sequence appears in the sunflower!)

This could instead be an environment unlike the above, with cacti and barren deserts – a metaphor, perhaps, for rigid institutional environments. The point is that each of the elements will feed on the others in complex, uncontrollable ways, still with distinct patterns. Of course, Stephen invoked the knowledge-as-a-plant, meant-to-be-grown metaphor when talking about connectionist networks. I am not suggesting that one plant is altogether separate from another and that knowledge is silo-ed; they will have dependencies and some common roots. But each plant will have a tapestry of complex patterns to reveal – strands of knowledge – and butterflies will cross-pollinate.

But it is a picture where PLEs are an extension of the self – disembodied, but in many ways a natural extension, making us a distributed entity operating as a singularity(?). I like this way of thinking (although the quickly engineered visual may not make the grade), and I think this way of visualizing gives us credible alternatives to the way LMSs are built today.

As always, would love to know what you think!


I missed Janet Clarey’s great interactive talk this Wednesday but caught up with the recording. I think it was a great session on many counts. Janet brings her great experience in corporate learning development research at Brandon Hall into the sessions she leads. Thanks, Janet!

The main questions that she addressed were:

  1. What are Web 1.0/2.0 learning models/trends? Which theories are they informed by? What data do they collect and manage?
  2. How can innovations like Augmented Reality and Foursquare be used to support learning?
  3. Can informal learning really work in the face of regulatory requirements or mission critical situations?
  4. Take a social learning and networking enabled LMS like SABA. How is it really different from what we are doing in the open MOOCs?
  5. Can there be a hybrid model spanning eLearning 1.0 and 2.0?

Very interesting questions, and even more interesting responses from participants. Let’s back up a bit. Responding to a July 2010 discussion around Critical Literacies and the eXtended Web, I looked at what my starting points for a PLE would be and why we need to look closely at what the PLE architecture should be based upon. More recently, as George mentioned, there is an extremely interesting discussion going on in the Learning Analytics Google Group – I do recommend that you go through the bibliography and Rebecca’s summary of the discussions.

As background, as well, there is an interesting discussion I had with Rob Wilkins and Janet Clarey on LMSs, assessments and RoI early last year, after Janet’s set of great interviews with leading LMS providers, in which I argue that LMSs can’t be social as an add-on (a keep-up-with-the-trends thought, or doing eLearning 2.0 the eLearning 1.0 way) and that current LMS metrics are woefully inadequate to provide any strong indicator of learning or performance.

Back to Janet’s talk and the first question. Her slide on eLearning 1.0 emphasizes technology as a support for most of the eLearning dimensions in use today – courses (self-paced and virtual instructor-led), certification, LMS/LCMS, authoring tools etc. Participants responded to her “Informed by what theory?” question by evoking concepts and theories such as cognitivism, constructivism and constructionism, and characterizing eLearning 1.0 as “sage-on-stage”, a body of knowledge, etc.

I have made this point before, but it is hard for me to think of LMSs in the 1.0 era as anything but tools for learning automation, which was the pressing need then as organizations started adopting technology to manage learning. For this reason, it is also a little superficial to ask what theories informed eLearning 1.0 supportive technology. The theories influenced the way content was designed and instruction delivered, rather than how the LMS or virtual classroom was built. I would instead put LMSs such as Moodle and LAMS, and platforms such as Mzinga’s Omnisocial, in the eLearning 2.0 category as supportive tools informed by theory. Janet’s consequent question of what data we are collecting, reporting and analyzing in the 1.0 world evoked the standard responses – time spent, scores etc.

eLearning 2.0. I had problems with putting disruptive technology at the core of eLearning 2.0. While it may be an important factor, it can’t be the only thing at the core. I am not sure that blended learning, mobile learning, P2P, 3D-immersive environments and “search learning” (whatever that is) would fall under eLearning 2.0 – which she also characterizes as “Self-serve. Social. Mobile.” – at least not the way we have been talking about it.

What theories inform eLearning 2.0? To my utter surprise, nobody put Connectivism up there (connectionist was the closest)! I think the data aspect, where I did get to see artifacts and relationships, would have benefited from some discussion around intelligent data (George got to it later in the session).

Next came a few slides on network maps, augmented reality and location-aware apps. I thought it was a good idea to provoke thought on how these tools could be used as part of the learning process. There are perhaps hundreds of ways to do that, and conjoining them with existing theories and design approaches is not very difficult. In my belief, Linked Data will play a massive role in terms of distributed connective knowledge (but that is another story), as will serious gaming and simulation combined with these new technologies. Obviously, data acquisition and capture will also be enhanced (and there are privacy and ethical concerns around this).

George referred to the Semantic Web and Web 3.0. It is interesting to note the title of a post that Stephen wrote about three years back: “Why the semantic web will fail“. As for what theories inform the eXtended Web, participant responses included marketing, monetization models, authority, self-watching vs. crowdsourcing, surveillance (someone suggested sousveillance) and personal learning. Steve LeBlanc asked for a list of differentiating characteristics; I would respond that these are the subjects of the PLENK2010 discussions – PLEs, MOOCs, Connectivism, intelligent data, the semantic web, Linked Data, the extension of the Internet into an Internet of Things. Again, I think Connectivism would form an important influencing theory of the eXtended Web.

For me there are two important aspects to the data side of the eXtended Web: data trails (George) and sliced PLEs, and new forms of collaboration leading to new learning analytics (like Connection Holes) that can replace the traditional 1.0 methods and tools.
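“Connection Holes” is my own coinage, so treat the following as just one possible reading (it is close to what Burt calls structural holes in network theory): for a given learner, find the pairs of her contacts who are not themselves connected – each open triad is a candidate hole where a new connection, or a broker, could add value. A minimal sketch, assuming a simple adjacency-set picture of the network:

from itertools import combinations

def connection_holes(graph: dict, node: str) -> list:
    """Pairs of a node's contacts that are not connected to each other."""
    contacts = graph.get(node, set())
    return [(a, b) for a, b in combinations(sorted(contacts), 2)
            if b not in graph.get(a, set())]

network = {
    "me": {"george", "stephen", "janet"},
    "george": {"me", "stephen"},
    "stephen": {"me", "george"},
    "janet": {"me"},
}
print(connection_holes(network, "me"))
# [('george', 'janet'), ('janet', 'stephen')] – where brokering could happen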

Can informal learning work in mission-critical situations or in situations that demand proof of regulatory compliance? For the former, yes, absolutely. Where informal and connective models for learning and performance really succeed is in realizing that knowledge (and expertise) is distributed and that problem solving is essentially a process of connection-making.

For both, there is a larger question – what are we measuring? Regulatory compliance – organizations proving that employees spent time on and obtained passing scores in key topics such as sexual harassment at the workplace – is built at cross purposes with the aim of the regulations (say, that employees reflect on and practice sensitivity to, and abstinence from, sexual harassment at the workplace, and that companies don’t have to submit proof of deviation, just as you have to let a software vendor know if you are not license-compliant). Maybe the parochial measures prescribed by the legislation need to change, rather than claiming that traditional formal eLearning provides an accurate measure and meets the objectives of the legislation.

The argument is carefully articulated by Stephen in his post Having Reasons, where he states:

The whole concept of ‘having reasons’ is probably the deepest challenge there is for connectivism, or for any theory of learning. We don’t want people to simply react instinctively to events, we want them to react on a reasonable (and hopefully rational) basis. At the same time, we are hoping to develop a degree of expertise so natural and effortless that it seems intuitive.

I think the question – although someone did answer it from one perspective – “will the ability to repair a nuclear reactor emerge from the water cooler?” is a horrifying and irresponsible one, intended to discredit the concept of informal learning. What if I flipped the question and asked “will the ability to repair a nuclear reactor come from learning online at your own pace?” – which discredits WBTs as a possible solution altogether. It is not a new question, and I think Jay Cross has defended against it somewhere too. It trivializes both the problem and the solution.

Janet also showed a learner home page in SABA and immediately compared that “technology” to the “technology” in the MOOC, asking how this is really different. I think that is where the disconnect is – you cannot put technology and the affordances of tools at the core, whether disruptive or not. It is also the reason I continuously state that current LMSs are building social learning add-ons, not rethinking from the ground up. Theory will inform not only how the technology works but also how learning happens. I know Stephen would have a mouthful to say on this as well (pity he was not there).

On the discussion of whether the two generations can give rise to a hybrid, there are mixed opinions. Connectivism is a very young theory. Even before it started, the challenge was to put an implementation (practice) face on the theory. These pressures – to generate a pedagogy, an instructional design approach or practical guidance, among others – may prompt us to jump to a hybridization of the concepts.

But in a sense, we need to let this discussion evolve – the debate that my earlier post on constructivist and connectivist PLEs generated shows us a healthy state on the road to resolving these practice challenges. As in the response on sense-making, among other comments on the PLE post (which I still have to respond to), Stephen is perhaps correct in assuming a pure, unadulterated stance on what connectivism and connective knowledge are and how they can change what we believe and practice in learning.

I struggle with it all the time, but I think a pure stance, with occasional intolerance, is much needed if it is to evolve to a state where it can widely inform practice.

