Is E-Learning 2.0 For Real?

Stephen Downes

Knowledge, Learning, Community

Jan 27, 2006

I have been participating in an online forum called the "Learning 2.0 Summit," a discussion that is (ironically) invisible to the wider web. Participants include people like Jay Cross, Robin Good, Mark Oehlert, Howard Rheingold, George Siemens, and a couple dozen more. This article collects some of my posts to date in that discussion.

Re: How can we create cultures that value learning?

I would caution against using the Marines as an example. The Marines are a very atypical organization, with an unusual management structure and very different work objectives. They also get a lot more funding than typical organizations.

As to the main question in this thread - I would move the context of learning from the organization to the individual. I know there is going to be a tendency to talk about things like 'the learning organization' or 'how organizations value learning', etc. I think this is a mistake. Learning is something that happens in an individual, and while it may be influenced, either helpfully or otherwise, by an organization, it is in fact the entire environmental surround of an individual that will hold such sway.

As for the idea of learning being 'something we sell' - again, there are two perspectives here. One is the perspective of the learning provider. It is important to note that while the intent of the learner is (presumably, though not necessarily) to learn something, this learning is not what is sold: rather, what the vendors vend is a variety of assistance and aids to reach that objective (similarly, farming suppliers don't sell farmers 'crops' even though that is of course what farmers want, they sell things that will lead to crops, such as tractors and seeds).

From the point of view of the learner, the expectation is either that (i) they can buy some indicator of learning, such as a degree, which will satisfy some unrelated objective, or (ii) they want to learn something. These are two very different cases. It is unlikely that people who want the first will be easily converted into people who want the second (yet the laws of economics dictate that we increasingly sell to them, promising more and more indicators of learning (get your degree fast!) for less and less effort). This detracts from the objectives of those seeking the latter, who are not looking for some get-rich-quick aid, but who want to actually learn something. It is no coincidence that such a person will seek out a learning vendor as a last resort, rather than the first.

The 'creation' of a culture (as if) that values learning lies, I believe, mostly in leaving behind the desire to sell to people who want simply to purchase the indicator, and to provide genuine support to those wishing to learn. But who is going to leave behind such a lucrative (and gullible) market as the get-rich-quick set? Anyone here?

Re: What's the ideal way to maintain an electronic portfolio of one's accomplishments?

Personally I think that the design of the portfolio is one of (or reflects) the accomplishments of the individual. There probably isn't a 'best way' to do it (why do we always seek a 'best way'? what's the 'best way' to get to San Francisco?).

Should more formal approaches to learning to take advantage of web 2.0 tools be made available?

I do not think 'haphazard' is necessarily a bad thing. People sink or swim in everything (including swimming). There are dangers in the attempt to guarantee success.

That said - my inclination would be to put the learning in the tool. A wiki that guides you in the productive use of the wiki. A blogging software that leads you to information on how to successfully blog. Social bookmarking that tells you how to successfully socially bookmark.

Now that I think about it - that help is already there, isn't it? Perhaps learning to use these tools is not so haphazard after all! Perhaps the fact that the tool comes with the learning explains the runaway success of the tools. Certainly, there is no shortage of literature from, say, the arena of game design describing how the game should teach the user how to play as the game is played. The same appears to be inherent in the web 2.0 tools that have been successful.

It is easy to fall into the trap of believing that where there has been no (explicit) teaching, there is no learning. We should be cautious.

Re: Why is meta-learning - helping people to learn to learn - such a tough sell in practice?

A participant wrote, "Look at the failure of Chris Argyris or Doug Engelbart to get cognitive process improvement to work. One of the saddest things I've ever seen was Doug Engelbart's reply when I asked him, after he'd worked 60 years on essentially the same project, who was closest to implementing his vision. 'No one,' he replied. 'No one.' What's up with that?"

I taught logic and critical thinking for 8 or 9 years. Based on reports I have had from former students, nearly all of them have applied these techniques in their day-to-day lives. For some of them, it was the most significant course they ever took.

This is not because I am a great instructor, but because of the inherent value of the material. Logic works, we know it works, we can prove it works, and it is immediately applicable in daily life. The same is true for things like probability and statistical reasoning, semantic and linguistic analysis, informal reasoning and fallacies, and the like.

My own view of the field of education is that it is characterized mostly by unsupportable, and probably false, theories of learning (and indeed, I would suggest that most people in the field would agree that most of the theories taught in the field are unsupportable - there are so many theories!). In this, I think, we have the reason why 'learning to learn' has been, historically, such a failure.

Re: Where's the quid pro quo?

Why is it presupposed that people need rewards for sharing?

So far, what we have seen - whether it be in the areas of learning objects, blog posts, or whatever - is that far more content is produced than could ever possibly be shared, and that in environments where reuse is being encouraged (such as in learning) the problem appears to be more in how to get people to reuse what has been produced instead of to produce that which will be reused.

My observation is that people will create, not just a bit, but a torrent of interesting and useful content. Sometimes finding it is a challenge, sometimes repurposing it is a challenge, but rewarding the creators has never been a challenge. And - from what we have seen in surveys of academics - if any reward is required, it is nothing more than recognition and attribution. Giving credit where credit is due.

* * *

Let us suppose, for the sake of argument, that people will share freely when they can (certainly there is evidence that this is the case, but adducing this would be a side-argument).

The situation in corporations, then, is reminiscent of Rousseau's "Man is born free, but everywhere he is in chains." In other words, while people would share freely, the constraints imposed by the corporate environment - shortages of resources, an exclusively reward-driven culture, and the like - prevent them from doing so.

But instead of asking what it is that is dysfunctional about corporate culture that would preclude the free sharing of resources - even when it is empirically demonstrable (and argued over and over) that such sharing is healthy for both individuals and the corporation - we are being asked to design some reward structure that leaves the (dysfunctional) corporate structure intact.

As for Murray's second point - that there are alternative, non-monetary rewards (I am reminded of Maslow) - let me point out that the existence of a reward does not entail the necessity of a reward (no more than does the existence of crime in our society prove that we need crime in order to have a society).

Re: Is Learning 2.0 for real?

I have already posted this link elsewhere, but it is relevant here. Here is my take on E-Learning 2.0.

* * *

That's a lot of baggage, especially in the blog post, but I'll get to the heart of Valerie's post:

"The reason organizations *have* training departments is so that learning needs can be identified and means for assisting workers in meeting those needs be assembled."

This is based on the premise that having better-educated (or better-trained) employees is better for the company. In this we both agree, I think. In any organization - whether it be a nation, a company, or even a curling club - the more knowledge and skills the members possess, the better for the organization.

The same is true for individuals. It is widely known that people with greater knowledge and skills are more likely to advance, are more likely to earn higher incomes (unless they are crooks, which explains CEOs), and are even more likely to be healthier and to live longer. Most people know this, a certain percentage take an active interest in expanding their own capacities, and everyone, to a greater or lesser degree, continues to learn simply through doing and experiencing.

The discussion, therefore, boils down to some more particular questions:
- how is it determined what a person (in an organization) learns, and
- how is it determined how a person learns.

The proposition cited above, therefore, resolves into the following assertion:

"The organization (and more specifically, designated entities in the organization, such as managers and training departments) are better able to determine for an individual in that organization both what to learn and how to learn."

Of course, the use of 'better' presupposes some goal or end-state, which will more likely or more rapidly be reached using one alternative than the other. Looking at this, though, we can easily see that there are alternative, possibly competing, end points, and specifically:
- the end-state desired by the organization, and
- the end-state desired by the individual

Now of course it is true that the end-state desired by the individual often matches that desired by the organization (or to paraphrase a cynically stated but accurate observation from another thread, people want what the boss wants). This is rational, especially insofar as the members depend on the organization for income and other supports.

But, of course, this is not true in all cases. Sometimes the interests of the individuals vary from those of the organization. This is especially the case when the organization is badly run, when the individual's term with the organization is limited, or when the individual's interests (such as raising a family) differ from the organization's (which is to make money).

All of that said, the proposition cited above also asserts the following: "Learning in the organization ought to serve (exclusively?) the end-state desired by (representatives of?) the organization."

It is my argument that both assertions are false, and specifically:

First, that it is not demonstrable, and most likely false, that the organization is better able to determine how and what to learn than is the employee. This is the case no matter what end-state (the individual's or the organization's) is being sought.

Second, that it is not demonstrable, and most likely false, that the end-state desired by the organization ought to be favoured over the end-state desired by the individual, even in cases where the learning occurs within, and may even be paid for by, the organization.

The first argument effectively runs counter to the assertion that organizations are most effectively run in a hierarchical manner, with all decisions flowing from the top. There is substantial evidence and argumentation against this view, which I won't detail here.

The second argument flows simply from the observation that the relation between an individual and an organization is more or less contractual (some people dispute Rawls, so I'll hedge here), and at the very least an exchange of mutual value. It would also be in the organization's interest not to pay its employees at all (which was once the practice, and is effectively the practice in some developing nations), but demonstrably, it is better to serve the individuals' interests (by paying them) than to merely serve the needs of the organization.

In other words, my argument boils down to two propositions:

- democracy is better than dictatorship, and
- freedom is better than slavery

From this, I derive what is the substantial statement of my position: organizations (nations, corporations, curling clubs) that operate in a free and democratic fashion are more effective than organizations that are not, and hence the mere (and accidentally historical) fact that corporations do not function as free democracies does not offer support for the practices employed by those corporations.

In other words, the mere assertion that 'corporations do such and such' does not stand as an argument (and glib blog assertions that I consider only the 'academic' view do not change that). My assertion is that if corporations do in fact want to improve their return, their productivity and their efficiency, then their practices will in fact become more democratic and more free. And specifically, in this context, that they will become smarter, faster, if they let individuals manage their own learning.

* * *

A commentator wrote, "One could frame this as a hierarchical approach (the dictatorship/slavery side of the equation), but one could also frame it in terms of another analogy: team sports v. individual competition. To be effective, a sports team trains together. They learn the basics of the game, but they also learn the special skills required in their position, and then to become excellent, they learn to work together as a unit."

No, this is changing the subject. There is no reason why individuals cannot choose to work with, and learn how to work with, a team, and do so freely. I would also point out that there is no reason why individuals cannot choose to be coached or given directions.

It's like the case of a tourist visiting Rome. When you get off the train, you have an array of options. You can simply wander about on your own. You can board a scheduled bus. You can take a taxi. You can use a map. You can ask for directions. You can have someone lead you to where you want to go. You can travel alone. You can join a group of travelers. All these are your options.

What you are saying is something like, "You have to join a tour group and be told where to go." And you are defending this by saying, "Otherwise, you would simply wander on your own and be lost." But the latter is a mischaracterization of my position. What I am arguing for is the choice, and not some particular outcome of the choice. A person could choose to join a tour group - and many do. But many don't - just look! - and the recommendation that people need to be told to join a tour group seems odd and unwise. Or as I would characterize it, undemocratic and unfree.

It is interesting that you brought up the subject of sports teams. Perhaps things are different in the U.S., but my experience of sports is that it is to a large degree a voluntary activity. People choose which sports they will play. They practice on their own, or with groups of friends they select. They may be coached, but they choose their own coach (and anyway, because coaching is expensive, this is only a small percentage of their practice). A person who eventually becomes a professional athlete does so much more as a result of individual choice (their teachers or parents would likely have recommended that they become an accountant).

The mere fact that athletes work in teams does not somehow eliminate choice from their lives. It does not automatically subject them to arbitrary discipline. Only in certain highly unusual circumstances would an athlete put his skills (and his career) entirely in the hands of others (and they usually end up regretting it).

The commentator noted, "I'd just as soon my brain surgeon hasn't gotten his degree from Wattsamatta U..."

This is a straw man. Nobody wants brain surgery from someone who hasn't studied brain surgery. But if the person had freely chosen to study brain surgery, and if Wattsamatta U. was an accredited and highly acclaimed school of brain surgery, what's the objection then?

Here's a better choice for you: would you want to have your brain surgery performed by someone who loves medicine, chose to enter the field, and who filled his time learning about how it is done? Or would you rather have your surgery performed by someone who was assigned to be a brain surgeon, who would rather be a carpenter, who was told which courses to take whether he liked them or not?

The commentator stated his position, "I think that there's a place for management-mandated skills and knowledge..."

Perhaps, but this has not been demonstrated. And to my perception, the degree to which we rely on management-mandated skills and knowledge is the degree to which we are prepared to tolerate poor and indifferent learning in an organization. If your employees are not freely choosing to engage in training which helps both themselves and the organization, then the organization has deeper problems than untrained staff, and papering this over with force-fed safety instructions or whatever is a fool's paradise (oddly enough, the employees know this; it's management that doesn't seem to get it).

The commentator concluded, "We've strayed from the original topic of this item - so if folks want to discuss whether learning 2.0 is for real..."

I don't think we have. I find it interesting and revealing that the first response to what I wrote in E-Learning 2.0 was what is to my perspective an attack on personal freedoms. Look at what Valerie says, first thing: "It sorta presupposes that all the learning via web 2.0 technologies in organizations will be individuals venturing out on their own to seek what riches can be found, though. Which is understandable, considering the intense individual orientation of these technologies."

Of course, there is no such presupposition in web 2.0. What web 2.0 gives people is choice. To suggest that web 2.0 (or e-learning 2.0) reduces to wandering around on your own or adopting an individual-oriented stance is to significantly misunderstand this technology and how it is employed. Indeed, I think the authors of the various social networking, collaborative writing, messaging and even blogging applications would be most astonished to see their work being described thus.

If we analyze each one of the web 2.0 technologies, particularly as they are applied to learning, we will see that they are each of a kind. They give people a voice. They allow people to make choices. They expand individual capabilities and capacities. They reduce the amount of direction, and increase autonomy. What we get in web 2.0 is a freer and more democratic web, one where people can read and listen to each other, at their own pace and in their own way, instead of to AOL and the New York Times (though these too remain among the choices). And, importantly, that's why people are taking to these technologies.

Understanding web 2.0 isn't about understanding how to execute an XMLHttpRequest in JavaScript. It is about understanding why you would want to execute this function, what capacities it enables, and what makes it preferable to people over a screen dump of today's newspaper. Why, indeed, would people choose to write their own encyclopedia from scratch when there is a perfectly good Encyclopedia Britannica already available to them? If you don't understand this, then in my view you don't understand web 2.0. And if your response is simply that they should be forced to read the Times or Britannica because these contain information people must read (and might not read otherwise), then in my view the response is simply not adequate.
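For readers who have never seen the mechanism being referred to, here is a minimal sketch of the classic asynchronous request that underpinned early web 2.0 ("Ajax") applications. The function name and URL are placeholders of my own, not anything from a real application; the point is only how little code is involved compared to how much its availability changed.

```javascript
// Illustrative sketch only: fetch new content from the server without
// reloading the page, using the browser-provided XMLHttpRequest object.
// "fetchUpdates" and the URL passed to it are hypothetical examples.
function fetchUpdates(url, onData) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", url, true); // third argument true = asynchronous
  xhr.onreadystatechange = function () {
    // readyState 4 = request complete; status 200 = HTTP OK
    if (xhr.readyState === 4 && xhr.status === 200) {
      onData(xhr.responseText); // hand the fresh content to the page
    }
  };
  xhr.send(null); // GET requests carry no body
}
```

A page would call something like `fetchUpdates("/posts/latest", showPosts)` whenever it wanted new material, which is what made continuously updating applications - blog readers, social bookmarking, collaborative editing - feel like conversation rather than publication.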

As to the question, is web 2.0 (and e-learning 2.0) for real, well, this is an empirical question, and I would suggest that the rise of things such as blogging, social networking, tagging, Wikipedia, and the like suggests that it is an empirical question well and truly answered by the evidence. Think about it. Web readers, in massive numbers, are choosing to read each other rather than the official sources. What does this tell us? What does this mean? To miss the importance of these questions is to miss what is genuinely different about web 2.0 and to misunderstand the direction that the web - and learning - is taking.

Or to put it more bluntly: if you see e-learning 2.0 as just another means of implementing mandated learning in the workplace, then you are not really seeing e-learning 2.0 at all, and you are simply trying to recast it in old familiar terms. And in this light, e-learning 2.0 will seem like nothing of particular interest at all. As, indeed, it wouldn't be.

* * *

George - I find it interesting (and more than a little ironic) that you are arguing for a subjectivist perspective (e.g., from their point of view, the managers may be right...). But we'll take that up on Sunday.

From my perspective - if you will - what I am sketching is a map, using a Peters projection, which takes into account the curvature of the world. From my perspective, the responses thus far have been, "yes, but how will a flat-earther use this?" and when I say that they should adopt the stance that the Earth is round, the rejoinder is, "Yes, but they will continue to operate as though the Earth is flat - what use is your map to them?"

And in the end I have to reply: none.

From my perspective, the idea that people need command-and-control managing is as archaic as the belief that the Earth is flat; web 2.0 tools are intended for those who believe in a round world, not a flat one. I believe that I can adduce tons of evidence to support my position, including evidence that speaks to the values of the flat-earthers (who, after all, still want to trade for spices from India, even if they believe there's no westward sea route to get there).

I have no doubt that no small number of people here and elsewhere will as a consequence simply dismiss me as some sort of academic or idealist or radical - goodness knows, it has happened enough before. But from where I stand, what I describe is from the evidence of my own eyes - the ships really do disappear behind the horizon, people really do perform better when they're free - and if what I see is radical, then what's happening in the world is radical, and calling me a mere academic or idealist or radical isn't going to change that (though it does make me feel bad, and I have to go commiserate with my cat, because I would much rather be valued for my observation than devalued for describing what I see).

Footnote on Terminology

Posted to Ideant on the same day:

Ulises writes, "Stephen hints at the false dichotomy between self-directed behaviour and self-interested behaviour (although I am not sure I get what he means later by 'managed behavior'). According to my argument, the difference would be inconsequential, as collective moral behavior emerges from the aggregation of self-directed/interested behavior."

First - there is in my mind a significant difference between 'self interested behaviour' and 'self directed behaviour'. People may choose of their own accord to do something that is not in their own self-interests; this happens all the time. Self interest may be a motivation for a self directed behaviour, but it is not the only one.

I am interested in self directed behaviour, something which I also characterize under the heading of 'autonomy'. When I say 'autonomy' I mean self directed behaviour, and the converse. The contrary to self directed behaviour is 'managed behaviour', that is to say, cases where actions are not autonomous. If you are given a directive with which, for whatever reason, you must comply, then this is a case of managed behaviour.

It should be noted (because people often misconstrue my view on this) that autonomous behaviour does not preclude, or contradict, group behaviour or even various types of coordinated behaviour. I can tell you 'jump three times' and you can do it: this is an example of group or coordinated behaviour, but insofar as no coercion was involved, it is nonetheless autonomous behaviour.

In order for a pattern of emergence to be identifiable in a network of behavers, the behaviour of each individual must be autonomous. It may, of course, be influenced by external individuals, but the decision of how to act must be up to each individual. This individual may choose to act in tandem with external agents, or in opposition to them, and this behaviour may be self interested or selfless.

Autonomy, on my view, is one of four necessary conditions for the possibility of emergence. The remaining three are diversity, interactivity, and openness. It should be clear that these are in fact all facets of the same condition, but I don't think that condition has a name as yet. Certainly I have no desire to give it one.



Stephen Downes, Casselman, Canada
stephen@downes.ca
