Monday, October 26, 2009

Timidity Does Not Pay Either

In the last post, I talked about the risk one takes when venturing out on one's own, following one's own curiosity, in research. Not going along with the crowd is a high-risk, high-reward situation. It is high risk because you might find that you have just wasted your time. It is high reward because, if you are successful, you may get your name associated with something. This may sound like I am recommending that one go along with the pack. I am not. There are risks there as well.

If one follows the path of well-defined research, there are risks and rewards, as there are with anything. The reward may be that you are the one to find the thing that everyone is looking for. But that reward may not be so great, and the risks are not trivial. Consider the following analogy. Let's say that pirates buried treasure somewhere along a stretch of beach several miles long. If enough people scour the beach with metal detectors, somebody will be the person lucky enough to find it. What will that person get credit for? They will get credit for being the lucky one. That's it, and they may not even be able to keep the treasure.

How different is this from someone who studied old maps, read old ship logs, and then determined where the treasure would likely be? And then, before actually looking for the treasure, acquired salvage rights. The risk with this approach is that after all that work they may not find anything. But if they do, they get prestige, recognition, and most likely can keep the treasure. This is analogous to the situation I described last week. But let us return to the beach full of beachcombers with metal detectors.

The person who finds the treasure with a metal detector is not likely to gain the prestige and respect that the person who predicted its location would. The beachcomber would be seen as a technician who was merely applying a technique and got lucky. So, one of the risks associated with going along with the pack is that you may be seen merely as a technician. One of the hallmarks of lackluster research is that it is technically solid, means little, and contributes less.

This disdain for the technician goes a long way back in the history of science. In fact, that is where the word 'scientist' came from. Prior to the mid-1800s, what we currently call science was known as natural philosophy. As more and more people began to focus on data collection and less on the larger problems to be solved, natural philosophers began to chafe at the idea of having these people included in their ranks. So, in the 1830s, William Whewell suggested the term 'scientist' (from scientia, the Latin word for knowledge) to refer to these technicians of knowledge acquisition who did not live up to the full meaning of natural philosopher. Today, of course, the term scientist is used as a term of respect rather than disdain. But it reflects the prevailing view that technicians somehow fall short of the mark.

Researchers too far embedded in the current paradigm risk being considered little more than technicians. And, like the beachcomber who got lucky, they are unlikely to get full credit for whatever they discover. Is being a technician the only risk associated with going along with the crowd? No, not at all. It may be that the whole crowd is looking on the wrong beach. And that we will consider next.

Monday, October 19, 2009

Retroactive Blessings

Sometimes, an intellectual pursuit becomes research retroactively. It receives what amounts to a retroactive blessing. Let's say that a scholar pursues an avenue of inquiry that appears to all his or her colleagues as somewhat fanciful. By fanciful we mean that it does not follow any of the commonly accepted methodologies; it is not attempting to answer any of the current questions; and it does not appear to be yielding anything of obvious value. The scholar's colleagues may dismiss this activity as not being research. They may call it an intellectual pursuit. They may even call it a legitimate intellectual pursuit. But they would probably stop short of calling it research.

Then, over time, let's say five or ten years, it begins to bear fruit. Since this is a hypothetical, we can push it a bit. So, let's say it bears fruit in a big way. It popularizes a new methodological technique, it helps answer an unanswered question, or it opens up a whole new vein of productive research. Would we consider the work done over those five or ten years to be research? I think there is no question. It would be viewed as research.

Now let's consider what would happen if it did not bear fruit. Everything else is the same: five or ten years of investigation, but they came up empty. Would it then be considered research? Probably not. So the very same activity becomes research if it pays off and is not research if it doesn't.

Scholars like to say that research does not have to pay off in order to be research. But when they say that, they are talking about fairly narrowly defined activities within the bounds of convention. So, if I set up an experiment, for example, to test a principle, then it would probably be considered research regardless of the outcome, as long as the experimental design was solid and the principle being tested was viewed as nontrivial. However, if I just follow my curiosity, wherever it takes me, it would have to pay off eventually in order not to be considered folly.

Why do we do this? Well, overwhelmingly, when people just drift off on their own, the results are not productive. So, we allow scholars to take a fair amount of personal risk in their endeavors. If their interests do pay off eventually, then they are acknowledged retroactively. If they do not, then they just have to face the fact that they wasted their time. So, is it better for researchers to stay close to the conventional? Maybe not. That approach has its risks as well.

Monday, October 12, 2009

But, Is It Research?

I am embarking on a development effort to create quest-based learning tasks in Second Life. This is part of the vein of work I have been pursuing with virtual worlds and video games. A question that faculty often have to deal with when pursuing their interests is: well, that all sounds very interesting, but is it research?

Business schools have an inferiority complex when it comes to research (and rightly so), leading faculty to question whether such pursuits are valid research endeavors. This is an incredibly important topic for business school faculty, so I thought I would digress a bit on this issue of research.

Business school research is, at best, a poorly defined concept, with definitions and criteria varying widely from school to school and among faculty members within a school. Under the most lax definitions, everything is research, and under the most stringent, nothing is. So clearly the concept needs a little clarifying.

Wernher von Braun is credited with the oft-cited observation that "research is what I am doing when I don't know what I am doing." The original quote included the modifier "basic," which changes the meaning slightly. However, this is the version you see cited most often.

If not knowing what you are doing is the criterion for research, then business faculty across this great nation and around the world are doing a great deal more research than they are getting credit for. And there is some justification for this claim. Research is the process by which we create new knowledge. And if we already know what we are doing, then we are not creating anything new.

Plato struggled with this same problem. He wondered how you could recognize a truth unless you already knew it was true. This, in turn, led to his observation that you never really learn anything new; you just remember things you already knew. This is actually an astute observation on Plato's part. But I would only make him look silly if I tried to explain it here.

Putting aside the philosophical subtleties regarding truth and knowledge, I think it is fair to say that when people do not know what they are doing, we should take it at face value. That is, it is not research. They simply do not know what they are doing. At the same time, we do not want to completely ignore the fact that in order to do research you must step away from the things you know and discover things that you don't. Just exactly how that is done can be very tricky at times.

Monday, October 5, 2009

Why Study Games?

The past few posts have been about video games, and this naturally leads to the question: why study video games? I would like to break this down into two questions: 1) why study games? and 2) why study video games? These are really two quite different questions and need to be addressed separately. In this post I will take on the first one.

In his book The Grasshopper: Games, Life and Utopia, Bernard Suits points out that in a utopian world, where work was not required for survival, most people would busy themselves with games. There is something fundamentally satisfying about games, and play is one of the few activities that people pursue for its own sake. That is, we work so we can play. But we play because we like to play.

There is an important message in that observation. If work were more like play, then we would want to work for its own sake. Or, if work were more like a game, then it would be more inherently satisfying. The same goes for education. If school were more like play, we would want to go to school for its own sake. And if school were more like a game, it would be more inherently satisfying.

Unfortunately, people like to think that work and school should be hard. They should be unpleasant. So any attempt to make them more like play or more like games would only diminish their value. But I believe that this perspective is merely a rationalization, an attempt to make the best of a bad situation. Believing that work is supposed to be unpleasant allows us to accept our fate if the work we do is unpleasant. However, history does not support this view.

Work in the industrial age was far more unpleasant than it is today. In fact, it was not only unpleasant; it was dangerous, tedious, and detrimental to one's health. In the past century and a half, great strides have been made in making the workplace safer, more pleasant, and more satisfying. You would be hard pressed to find anyone who believes that work was somehow better in the factories of Victorian England.

Education wasn't much better. At the dawn of the twentieth century, most education was some form of recitation. Students would memorize materials, then stand up in class and recite what they had mastered. It wasn't until the mid-twentieth century that schools began to focus on problem-solving skills and greater student engagement. Needless to say, school became much more interesting, and today we look back disdainfully on those days of 'rote memorization.'

The question is, with all the improvements we have seen in the workplace and in education, is this as good as it gets? I don't think so. As we gain more and more insight into the nature of games and why they are so inherently satisfying, we can apply that understanding to the workplace and to schooling. Imagine what would happen to the economy if people preferred to learn new skills and apply them rather than do anything else. If all those unproductive hours spent in front of the boob tube could be put to productive use, we might see another major improvement in quality of life.