Archive for October, 2010

By far the most definitive week of discussions around PLENK for me, this week’s Friday discussion is worth multiple rounds of further investigation and discussion. I came in a trifle late and it was difficult to catch up without a starting context, so I caught up with the recording later. Here are my notes and deductions – my version of the conversation between George and Stephen. As always, please feel free to add to or correct me on anything I missed or misinterpreted.

Stephen Downes

  • There is a tool-selection or dashboard approach to classifying the technology experience
  • There is no great application out there that allows me to read and write the way gRSSHopper does. This is the workflow approach. We need to model workflow and provide end-to-end functionality, and that is the most daunting piece.
  • Should we be looking at a theory of everything (like an atlas in geography, or a set theory of everything)? Technology will evolve over time, but the core patterns of use may not (though, in fact, they may).
  • Is there a way to hide the modalities, so that we focus on the core? What are these core ideas? Personal autonomy, distributed knowledge and social learning. There are frameworks like the 21st century skills frameworks, but these are very widely fragmented. I would add pattern recognition as a fundamental skill – is the optimal tool one that would be based on network theory and pattern recognition?
  • Machine analysis can give us a syntax. The human side would give us semantics.
  • Can we figure out, in technological terms, how humans do it – derive meaning? From the neurological sense, it is a very organic process that evolves over time, not intentional or deliberate, each new experience creating more understanding.
  • Is the tool of everything going to be a pattern recognition tool?

George Siemens

  • First-time adoption of tools is difficult, not because of the tools, but because of concepts. This is where companies like MS or Facebook helped, by aggregating functionality and establishing common ways of completing standard tasks.
  • Tools are available, but the level of integration is too low at this point. With connective specialization, it is a case of each to her own preference; at the point of adoption, this adds to the confusion.
  • Do we need a tool of everything or do we need a way to build capacity?
  • The theory of everything: maybe with a combination of critical literacies and attributes or ideas of the disciplines?
  • The hiding of modalities is important.
  • There are two dimensions to pattern recognition – technological and human. The technological example would be reading through a mass of data vs. navigating a structured analysis of the mass of data. On the human side, Learning Analytics tools provide valuable patterns of use. That is what computing can do and visualization is going to be very important.
  • That does not mean that technology will be able to model personal or network use of the resources, but technology can help.
  • We need to have a balance between what a computer does well and a human does well (form vs. meaning).
  • Experts and novices think differently – experts think in patterns and novices think sequentially, or, as Cris2B put it, plan ahead vs. plan backwards. Conceptually, once some patterns – some context – are built up, we are able to recognize more complex patterns.
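George’s technological example – a machine reading through a mass of data and surfacing a structured view of it for humans to interpret – can be sketched in a few lines. Everything below (the log format, the learner and resource names, the co-occurrence heuristic) is invented purely for illustration; real learning-analytics tools are of course far richer:

```python
from collections import Counter
from itertools import combinations

# Hypothetical access log: (learner, resource) pairs from a learning network
log = [
    ("amy", "wilson-paper"), ("amy", "daily"), ("ben", "daily"),
    ("ben", "wilson-paper"), ("cal", "daily"), ("cal", "forum"),
    ("amy", "forum"), ("ben", "forum"),
]

# Group the resources each learner touched.
by_learner = {}
for learner, resource in log:
    by_learner.setdefault(learner, set()).add(resource)

# Count how often two resources are visited by the same learner --
# a crude co-occurrence "pattern" a visualization tool could then render.
pairs = Counter()
for resources in by_learner.values():
    for pair in combinations(sorted(resources), 2):
        pairs[pair] += 1

for (a, b), n in pairs.most_common(3):
    print(f"{a} <-> {b}: seen together for {n} learner(s)")
```

The machine supplies the syntax (the counts and the structure); what those co-occurrences mean is still left to the humans reading the visualization – which is exactly the balance discussed above.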

My 2 cents.

I think that we must first start with presentation and analysis (as best as the computer can visualize it in a simple way) and let humans and our networks derive the meaning. This is what I hope an NBT will achieve.

Maybe at some point, the insight from how humans use that information for semantics, through reflection and practice, will start becoming progressively templatized as we understand or build tools and processes that can model how humans function – how we evolve from novices to experts in an area. I call this Native Collaboration and see it permeating every function in learning.

The discussions are fast evolving to a stage where formal models of Native Collaboration (which attempts to model, functionally and technologically, how we learn) and the NBT (my terminology for Network Based Training – an evolution from Web Based Training, or WBT) will emerge, in which the NBT environment encapsulates the modalities in a fairly standardized manner while allowing personal autonomy, and includes specific connectivist techniques for Native Collaboration. This is really exciting!


I was going through Techcrunch’s coverage of Jeff Jonas, Chief Scientist of the IBM Entity Analytics group, and his concept – Big Data. I think this is a wonderful way to look at how we make sense of the overabundant flow of information around us.

Addressing the question of what data is, Jeff says organizations are only as smart as what they know. And what they know comes from data – structured or unstructured – which in turn forms their perceptions. The smartest any enterprise can be is the net sum of its perceptions, he says (one could argue it should be connections, not data). He includes observations as data and talks about the insight that is possible when these observations are linked and analyzed.

This piece was interesting. He says almost all data starts out as unstructured; it requires humans to place it into a structure. That may be fairly obvious, but it is useful to take a second look and ask if the way we structure data actually creates the problems we have today. It is actually a problem to piece together conversations in Twitter, for example. But Twitter structured it in a particular manner already, echoing what it thought was a monologic mode of interaction with a network. I compose a status message and tweet it. I retweet a message. I place a hashtag to organize it. I make a list to follow. I reply to a message. All these are essentially one-way communications, like email.

Talking about why more data makes us ignorant: data is growing fast as computing power, connectedness and bandwidth continue to rise exponentially, far exceeding the enterprise’s ability to process it. This is a widening gap. And the impact is even more crucial because data is segmented into silos, making it quite impossible to connect multiple sources of data to make valid inferences that increase efficiency and adaptability. One important thing he highlighted is data amnesia (it happens to me very regularly) – a bank he was already doing business with called him six times to get him to sign up; they did not know, as an organization, that he was already a customer. With the explosion in data, there is a decreased ability to analyse and act on analyses – there are fewer and fewer humans, in comparison, who can do this.

On being asked if Google isn’t really already the king in that space, he said he thinks Google is a giant pixel sorter (every searchable object is a pixel…). In the sorting, it misses things such as local context. Maybe we need to have services layered on top of search. And that is the thing I really like – I call it reversing the search – pushing context to identify relevant information rather than the other way around.

What is most interesting is how he talks about Big Data – technologists are building systems to analyze individual puzzle pieces (or pixels). But what would you do with a puzzle piece? The real thing is how to fit that puzzle piece into the puzzle. The last few pieces in a puzzle are easier to do than the first few. This is context accumulation – context is hugely important to accumulate in systems that analyze data and make better predictions.
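Context accumulation, as described here, can be caricatured in a few lines: each new observation either attaches to an entity it shares an attribute with, or starts a new one, so that later puzzle pieces fit more easily than earlier ones. This is a toy sketch with invented names and a deliberately naive matching rule – not Jonas’s actual algorithms:

```python
# Toy "context accumulation": observations about entities accumulate, and
# each new observation is matched against known entities by overlapping
# attribute values. Purely illustrative.

entities = []  # each entity is a set of attribute values

def accumulate(observation):
    """Attach the observation to the first entity sharing any attribute,
    or start a new entity. Returns the entity the observation joined."""
    for entity in entities:
        if entity & observation:      # overlap = the piece "fits"
            entity |= observation     # the entity gains context
            return entity
    entities.append(set(observation))
    return entities[-1]

accumulate({"name:j.jonas", "phone:555-0101"})
accumulate({"phone:555-0101", "email:jj@example.org"})   # links via phone
accumulate({"name:a.other", "phone:555-0199"})           # no overlap: new entity

print(len(entities))  # 2 entities accumulated from three observations
```

The second observation only resolves because the first already accumulated the phone number – the more context an entity carries, the more future observations can snap onto it, which is the point being made about the last puzzle pieces being the easiest.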

On privacy and freedom: while all this data collection and the resulting predictions may offer better experiences, awareness of what the customer is giving up in the choices she makes about privacy needs to be significantly raised.

This is not the semantic web or linked data. This is not so much about linking data or describing data (or relationships) as about trying to view data in context.


PLE/N Tools

A really nice collection of links for this week’s #PLENK2010 discussions. I especially liked Wilson’s Patterns of personal learning environments. Wilson looks at patterns of use and activity in personal learning tools and learning networks, revising a previous approach which was very functional and tool-specific.

One of the ongoing challenges I have is with the constant comparison between the LMS and the PLE, which, I happen to think, is an apples-to-oranges comparison. They serve different needs and are located differently on the spectrum between self-learning and managed-learning (if there is such a phrase). The MOOC and the LMS are comparable, as are NBTs (which I define as Network Based Training, the natural networked-learning successors to WBTs) and PLEs.

Let us picture this. The LMS is used to launch a WBT course. The course pops up in a player, which is really a navigation shell that acts as a data conduit between the WBT and the LMS. Suppose the LMS is aware of learning networks and personal learning tools (with blog, wiki, Flickr, connection hub-bing, discourse monitoring etc. affordances being provided by whatever mechanism – web services, XCRI…) and the WBT is just base reference material, not unlike the Daily in this MOOC.

The player could then be programmed to act as a conduit between the WBT and the network or personal learning tool (people, resources, context, conversation, bookmarking service) – a sort of SCORM for networked learning environments.
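A minimal sketch of that conduit idea – the player receiving events from the course content and fanning them out to registered network services. All class, function and event names here are invented for illustration; this is not SCORM or any real specification:

```python
# Sketch: a "player" that forwards content events to pluggable
# network/personal-learning services, instead of only to an LMS.

class NetworkPlayer:
    def __init__(self):
        self.services = []  # blog, wiki, bookmarking service, etc.

    def register(self, service):
        """Plug in a network service (any callable taking event, payload)."""
        self.services.append(service)

    def dispatch(self, event, payload):
        """Forward a content event (e.g. "bookmarked", "annotated")
        to every registered service and collect their responses."""
        return [service(event, payload) for service in self.services]

def bookmarking_service(event, payload):
    # One hypothetical service: it only reacts to bookmark events.
    return f"bookmark: {payload['url']}" if event == "bookmarked" else None

player = NetworkPlayer()
player.register(bookmarking_service)
results = player.dispatch("bookmarked", {"url": "http://example.org/daily"})
print(results)
```

A real conduit would, of course, need a standardized event vocabulary and data model agreed on by the tools at both ends – that is the SCORM-like part.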

What would you call the WBT then? An NBT.

Would the NBT look similar to the PLE? Yes, it would resemble a slice of the PLE – a workspace we organize around a theme that interests us. Similarly, the NBT could be conceived of as a combination of slices of many different PLEs, in fact as many as there are learners enrolled in the NBT.

But the NBT would necessarily be a more constrained, parameterized environment, designed or seeded by a network weaver, an atelier – the new avatar of the teacher, and adapted and grown by the learners, present and past. The PLE would grow unfettered, the whole being greater than the sum of individual slices.

Most of the discussion, even in Wilson’s paper, focuses on the tools in the end. What can tools do to present the solution to a pattern? In fact, almost every solution is expressed in technological terms (notice how many times the word “tool” appears in the first line of each solution).

It is almost as if technology is the master problem solver for every pattern of learning, but that may just be me.

I would rather focus on Critical Literacies. On having reasons. Just as I would not count an NBT operating in an LMS environment as a true NBT – one truly architected as a networked-learning-aware solution from the ground up, rather than pasted onto a WBT as a quick fix.

And that is perhaps why I would choose to take a radical stand – PLE/N tools do not yet exist. I would like to take you back to how PLEs were defined in Week 1:

A Personal Learning Environment is more of a concept than a particular toolset – it doesn’t require the use of any particular tool and is defined by general concepts like: distributed, personal, open, learner autonomy.

and for PLNs:

A PLN is a reflection of social and information networks (and analysis methods).

We are confusing our current lists of PLE/N tools with the concept or the method, like trying to measure the length of a moving snake with a tape measure, or the volume of a gust of air with a sudden clenching of our fist.

By far the most important attribute of the toolset, if you can call it that, for a PLE/N would be its complete invisibility. It would be implicit for learners in the way it has been designed. It is then that we will be able to project our personal aura on it and make it personal, as open as we are, as connectedly aware as we want to be (or can be) and as autonomous as we will allow it to be.

And that will also take a fundamental rearchitecture of the way we conceive of learning resources, away from resources as objects or networks, to living and dynamic forms that reflect our social and information networks.

More of a hard left than a gentle meandering this one, would you say?


New Literacies

Listening to Will Richardson’s session on PLENK2010 this past Wednesday. He brought up NCTE‘s definition of critical literacies:

  • Develop proficiency with the tools of technology
  • Build relationships with others to pose and solve problems collaboratively and cross-culturally
  • Design and share information for global communities to meet a variety of purposes
  • Manage, analyze and synthesize multiple streams of simultaneous information
  • Create, critique, analyze, and evaluate multi-media texts
  • Attend to the ethical responsibilities required by these complex environments

He said that tools become the central point of the conversation without worrying about the context. And in fact, he did not think the high school graduates he knew would make the grade if these were the certifying literacies.

Reminds me of a vision statement that I read recently. This is from the NAE (Committee on the Engineer of 2020, Committee on Engineering Education, National Academy of Engineering). Describing the Engineer of 2020, the enlightened authors write:

What attributes will the engineer of 2020 have? He or she will aspire to have the ingenuity of Lillian Gilbreth, the problem-solving capabilities of Gordon Moore, the scientific insight of Albert Einstein, the creativity of Pablo Picasso, the determination of the Wright brothers, the leadership abilities of Bill Gates, the conscience of Eleanor Roosevelt, the vision of Martin Luther King, and the curiosity and wonder of our grandchildren.

Yeah. Right.

Jenny started the discussion on ethical responsibilities. I think it is important to evaluate ethical responsibilities in the traditional context as well, not just for the issues that tools such as Facebook are creating in the context of the Internet. The traditional system throws up just as many horrific examples of violations of ethical responsibilities.


George Siemens bemoans the emerging trend that “higher education is not in control of its fate as it has failed to develop the capacity to be self-reliant in times of change”. Referring to a dilution of the stance against corporatization, and the way external innovation is driving change at the academy, George may just be right.

In my experience with universities, the pace of change is extremely slow. Maybe it is the product of excessive intellectual rigor, but it is slow – surprising, given that education is a prime objective of these organizations and one would expect them to be ahead of the curve in anything a corporate might throw at them.

It is not that there is no innovation or that some universities are not ahead of the curve already, but it is a general sentiment that I share from experience.

What are the causes? 

I think (in my experience so far and may not be very generalizable) a main cause is systemic – the bureaucratic processes of governance coupled with a healthy dose of ignorance contribute to the extremely slow pace of reaction and action. The second cause is an arrogance that what exists is the best way to be. The third cause is blind mistrust of everything for-profit, as if employing intellectual property created by for-profits is something to be totally suspicious about (or looked down upon). The fourth cause is the ability to stifle innovation through politics and the threat of conformance, sometimes hiding behind procedure. The fifth cause is not enough thinking (and expertise) about the teaching process; the focus is more on each individual subject/discipline.

The for-profits have not done enough to allay the fears either. Over-promising and under-delivering, delivering problems more than solutions, salivating at size and recurring revenues – these are accusations that can fairly be made.

Education is not only a public domain. To the extent that it is a State responsibility, the State must ensure that capacities are built up to navigate new terrains. What I see around me is lip service when it comes to teacher training and a three-letter acronym called ICT. To the extent it is a corporate goal, for-profits must realize they need to build far greater credibility and demonstrate far greater responsibility to the domain and all that they impact.

But by far, apart from the student, the most important stakeholder today is the teacher. The teacher forms, whether in a for-profit or otherwise, the core of the learning experience and is the most visible force in its implementation. Only by enabling students, by continuously evolving, by exploring the connectedness of knowledge and by deliberating openly on the technologies, processes and techniques that impact the learning experience will teachers be able to drive change.

I don’t think teachers realize the power they own in changing the learning experience. Looking at what is happening closer to home, I see that they are content using their power to stay in the same place, unmindful of the explosive pace being generated around newer ways of teaching and learning (and even of developments in their own chosen field or discipline!). Both the for-profit and non-profit organizations are exploiting that non-use of power.

This must change.


Day 2 Power of Ideas #ETPOI

Market Sizing and Competition Analyses

Vijay Shukla

Competitive Analysis

You may not be all that unique. Understanding the primary need and the business is crucial and fundamental. Competition can be identified once you identify the need that you are addressing. And competition may be visible or invisible. Invisible competition is more dangerous because we don’t know who and where they are. Remember, you may get the feeling of uniqueness perhaps because other competitors have done the research and did not find your space/offering to be profitable. Put some rigor into your understanding of the business.

Note: A one-stop shop doesn’t bring the same premium in India. Companies are making more through specialization, so focus on the advantage/specialization.

Market Sizing

Don’t get caught in your own trap: “We want one, so everyone wants one.” At an early stage you extrapolate from your experience, but that should only be an added factor for starting the business. Disassociate yourself from the idea, so you can start looking at it objectively.

What’s the Market? What’s my total addressable market? Do the research before you meet the potential market. Are they willing to experiment, can they take decisions here, have they worked with smaller players, do they have budgets for your segment – use these filters to figure out who to go after at the beginning. These filters are really your business insight. Remember, buying behaviors of clients do not change just because you are starting up.

Make sure you validate data by asking commonsense questions. Check for counter sources and filters that prune the final numbers. Be accurate rather than precise.

Customers are different from users. Customers pay. They are your market. There are influencers – key opinion leaders. There may be many other influences, and you have to take them into account.


Would it make a difference if you did not exist? That is an important question.

Why would they not care?

  • Too many options that appear the same to customers
  • Demand is less than supply
  • Low entry barriers
  • Partnership with jobbers – balance shifts away from you
  • What is advertising doing for you?

How does brand matter in these conditions? Good businesses do hard labor to meet competition – to sustain their brand. Good businesses in these conditions become more efficient with the use of processes and technology.

There are many possible recommendations, but they will depend upon a host of factors.

Business Models

Rahul Aggarwal

“A business model describes the rationale of how an organization creates, delivers, and captures value”

Value Creation – building blocks

  1. Value conception – who conceptualizes and who controls
    1. Proprietary
    2. Collaborative (partnership/JV, crowdsourcing, co-creation)
  2. Value composition – who composes the value
    1. Form – product, service, hybrid
    2. The product vs. service distinction is getting blurred – they usually differ in time to market, infrastructure requirements and growth curve, as well as in timeframes for getting the revenue
    3. The challenge is to collapse the time to market
  3. Value production – who produces, who controls production assets?
    1. internal
    2. partner/JV
    3. Outsourced
    4. Crowd-sourced
    5. Pay for service (BOOT etc.)
    6. Platform
  4. Value targeting
    1. Forecast demand – push model (car company)
    2. React to demand – pull model (Dell, BTO supply chain)
    3. Activate latent demand or create new demand (space tourism?)
  5. Value pricing
    1. Amount, time, currency, payer, ownership
    2. Traditional buy
    3. Fractional Ownership
    4. EMI
    5. Rental
    6. Cross-subsidization
    7. Lottery
    8. Auction
    9. Free for the user
    10. Equity
  6. Value delivery
    1. Physical, virtual, hybrid
    2. Ownership
    3. Aggregation (info vs. look and feel vs. delivery vs. service)

Business model innovations – Blue Oceans: unexplored markets that unlock hidden value and are not easy to copy.

What will lead to consumers demanding/consuming more (Microfinance, JustBooks)? What will lead to customers’ latent demand getting activated? What will unleash totally new value for the customer? How can I capture more value?

What will differentiate your company from the rest? What is a defensible competitive position (what will tie in the customer? For example, Mahindra knows the demand exactly)? What will enhance your influence and control in the ecosystem?

How linear is the relationship between resources and revenues (you should be looking at non-linear relationships)? What competencies will you need? For established companies, to what extent will systems and processes need to change? To what extent are you dependent on third parties?


In case you did not know already, I am at the Indian Institute of Management, Ahmedabad, India for an 8-day workshop that brings together 74 of the top ideas from among more than 16,000 submissions to the Power of Ideas initiative. The first day has been interesting so far, and I like the fact that I am returning to an academic environment after a long break, albeit for a short while.

Sandeepan Budhiraja’s first session on Identifying Customer Needs and Market Research was interesting, particularly when he talked about focusing on heavy users as the primary concern of a business. When I asked if this was applicable to all markets and economies, referring to the Long Tail and power laws as an opposite source of focus, the consensus was that long-tail consumers make incidental (a word I used inappropriately) purchases, and that this could not be the basis of product strategy/positioning.

That ignores pretty much what is happening in companies like eBay and Google and in the media industry, where distribution costs are negligible! In fact, one opinion was that the monetization of the long tail is either the preserve of very large companies or, at best, a longer-term outcome – both conclusions I would disagree with. Some links I found:

The other two sessions, by Akshat Rathee and VC Karthic, were interesting. They shared insights on business-planning components, presentation skills and what matters when talking to venture capital. Here is a short set of notes from those encounters.

First, translate your vision – where you were, where you are, and where you want to be.

Putting together the Information Memorandum (IM)

  1. Build pitch
    1. The pitch is contractual – this is what I provide, and how it differs from others
      1. Question: making a pitch involves propositions that need to be commonly understood. E.g. “I will provide you a GFX Super accelerator for half the cost” is probably not a good idea because it is too technical
      2. Let the pitch give rise to questions; it should intrigue. Make a statement that can be tuned in a second step to things like the target segment, how it will be delivered, etc.
  2. Build a scale-up vision while maintaining low cost and high profitability
  3. What do we need money for? Investment is for growth. Asking for money for marketing is a bad idea; asking for capital or working capital is a good idea. Split the money between debt and equity, state your internal accruals and include CAPEX.
  4. Exit
    1. How do we repay debt or show high returns to the investor? (A 33% margin = 2X, meaning you have to grow by 100% every year if we don’t think of a share sale.)
    2. Who are the people who would like to buy this company x years from now, giving the investor an exit option? What would be the total cash dividend the investor would get on exit? Give specific names and numbers. Consider an IPO as an exit (or a strategic/PE investor).
    3. From the venture perspective:
      1. They want a team that can execute this the best way
      2. They want a billion-dollar market
      3. Want 5-20X return
      4. A good exit in 2-3 years
      5. Ways in which a VC can exit: sell their stake, sell the company, go IPO
      6. Keep pitch simple, limited vocabulary
      7. Need to feel comfortable at a personal level, exploit any prior connects
      8. Identifying the pain; attacking the business thoroughly demonstrates the market research
  5. Things they look out for
    1. Competitive sustainable advantage
    2. Patent – adds a guarantee
    3. Show the demo
  6. Business Plan – how you are going to make money
  7. State assumptions: they (the analysts) will check the formulas and then the assumptions. The plan is built ground-up, not on desire. Make assumptions that follow from what we know about the industry and the business. They must intuitively make sense. Quote the exact source. Back up with verifiable data. Make sure you know the conflicting sources. Know how to defend each assumption.
  8. Now we are ready to make a business plan.
  9. Put in a FAQ or sections for team, capability etc.
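On the return expectations in the notes above (5–20X over a 2–3 year exit horizon), a quick back-of-envelope helps show how steep they are. Assuming the company’s value compounds at a constant annual rate – a simplification, not how any VC actually models it:

```python
# Required annual growth rate (CAGR) for a target return multiple over
# n years, assuming valuation compounds at a constant rate.
def required_cagr(multiple, years):
    return multiple ** (1 / years) - 1

# Even the low end (5X in 3 years) implies roughly 71% annual growth;
# 20X in 3 years implies roughly 171%.
for multiple in (5, 20):
    for years in (2, 3):
        rate = required_cagr(multiple, years)
        print(f"{multiple}X in {years} years -> {rate:.0%} per year")
```

Which is one way of seeing why the notes insist on billion-dollar markets and non-linear relationships between resources and revenues.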

Would welcome any corrections from the other participants! I think we have a really cool bunch of people here.


