Sunday, September 5, 2010

Did I Learn Anything?

I started this blog almost two years ago in an attempt to learn more about blogging. So, it seems relevant to ask – Did I Learn Anything? The answer is yes, absolutely yes, I learned a great deal more than I could possibly explain. However, for the sake of economy I will pare it all down to five major lessons.

1) Blogging is good even if nobody reads it – it is writing practice and it becomes part of the historical record. People used to write letters to each other and save them in the attic. Historians pore through this personal correspondence to get a ground-level view of a period from the perspective of the people living in it. In the future, historians may consult the records you create. Further, blogs are both easier to search and easier to read than personal letters. So blogs, like letters and diaries, become a rough draft of history.

2) Blogs can be used to great advantage as public research notebooks or even public journals. Maybe you don’t want to wait for historians. Maybe you have something to say today and would like to share it with others who have similar interests. You don’t have to be an academic or professional researcher. You can be a movie or music critic. You can provide opinions on beers or single malt whiskies. Maybe you have a special interest in your family history or the history of your home town. Anything that captures your interest probably captures the interest of somebody else as well. So your blog becomes the focal point of a shared conversation on a topic of interest to you and others.

3) Using a technology like blogging changes the way you look at the world. Your view of the world probably solidified sometime between your 18th and 25th birthdays. But the world continued to change. The longer you go without an update, the farther out of touch you become. Blogging can help you keep up to date in two important ways. First, just reading the blogs of others will keep you up with the way people are thinking at the moment. Bloggers are more likely to live in the present than people who don’t blog. Second, there is always a culture surrounding a new technology. Its members are more likely to buy certain products, such as smart phones, and more likely to watch certain shows, like The IT Crowd or The Guild. Exposure to this sort of thing keeps you current in your thinking.

4) The only way to really understand blogging is to blog. Like so many things, you can’t learn how to blog by reading, talking, or planning. You have to try it. If you have nothing to say, your muse is probably asleep from lack of use. Write your first post on not having anything to say. Write your next one on how hard it is to choose a topic. Just start writing. Eventually, as in conversation, the words will begin to flow.

5) If you start blogging you will start doing other things. No, blogging does not lead to crack cocaine. But, if you blog, you might start posting messages to a forum, or sending emails through “Contact Us” pages. You might start emailing more or IMing more or texting more. You might join a chat room or a Facebook group. You might even create an alter ego in Second Life. Just take that first step and other steps will likely follow.

And with those observations, I close my first blog. It is now part of the historical record and I am on to other things.

Monday, August 30, 2010

Another Academic Year Begins

How many jobs are there where you get a fresh start twice a year? You get to start over again, fix your mistakes, and put more effort into the things that worked. Well, there are not many jobs like that, but being an academic is one of them. We get two fresh starts every year. We call them semesters. This really is one of the best things about being an academic. Every time a semester ends you can review it and see what did not work out as well as you would have liked. And every time a semester begins you have a chance to try again.

Fortunately, I am happy to say, most of the things I am working on are working well. I have three major classes which are my teaching focus, and they are all advancing nicely. I used to teach far more classes and did not do nearly as good a job with them.

My research is not going as well as it has in the past. There are several reasons for that which are not really important enough to get into here. But, with my new interest in imagination, I will be slowly getting it back on track.

I have invested heavily, time-wise, over the past five years in virtual worlds. Sadly, all that effort did not pay off as well as I thought it would. There are many reasons for that, and they are probably not important enough to elaborate on. This is also one of the reasons why my research lagged. I was hoping for some productive new veins of research in virtual worlds. That did not come about.

However, I am turning my attention from virtual worlds to social interaction technologies. Hopefully, this will pan out a little better.

A new semester begins today. And with it come new initiatives, new ideas, new hopes and new directions. What more could one ask for?

Wednesday, August 25, 2010

A Progression of Ideals

It may come as a surprise to most people, but scientists do not study the things in the world; they study idealizations of them. This is a very Platonic approach, and even more surprising when one considers the fact that modern scientists probably see themselves more as the children of Aristotle than as the children of Plato.

Galileo introduced the idea of studying idealizations into physics. If you take an introductory class in physics you will learn about ideal springs, frictionless inclined planes, and free fall in a vacuum. These are all idealizations.

Max Weber introduced the concept of ideal types into social science. He said there is no such thing as a bureaucracy, but we study this idealization because it has greater value in advancing our knowledge.

In information systems in particular, and areas of technology in general, we are concerned not with idealizations of physical phenomena, nor with idealizations of social phenomena. We are concerned with idealizations of artifacts that will exist in the future. And studying artifacts that will exist in the future requires imagination.

Tuesday, August 17, 2010

The Role of Imagination in Information Systems Research

I like being propelled along in my research by the pursuit of compelling new ideas. New ideas are exciting and often provide the energy needed to make progress on older ideas once they have lost their luster. I have discussed a number of new ideas in this blog. Most new ideas don't really go anywhere of significance but some do seem to last over time. For example, it was over fifteen years ago that I started exploring the role of stories in computer ethics. And I am still working on that very rich vein of research.

I have latched on to a new idea that I will be pursuing this year, and hopefully for many years to come: the role of imagination in information systems research. This idea did not just occur to me. Rather, it coalesced out of a lot of other ideas I was working on.

Certainly the role of imagination in writing stories is a big part of the work I am doing in using stories to explore the ethics of technology. In that work, I have been increasingly emphasizing the use of the imagination. It is the imagination that allows us to consider, compare and choose between possible worlds. And choosing between possible worlds is at the heart of the ethics of technology.

As part of the justification for developing the imagination, I began to develop arguments for the importance of imagination as a business skill. My students are, after all, business students, and success in business is something they care about. As I worked on these arguments, I began to realize that most business skills can be classified as either analytical or imaginative. For example, to understand why a thing is happening you need analysis. But to figure out what you should do requires imagination.

My interest in the imagination began to creep into other areas that I was working on until, one day, it dawned on me that imagination is vastly more important than analysis in information systems research. I will develop that theme over the next few posts.

Tuesday, August 10, 2010

The Travails of Writing and Publishing

I love to write. In fact, I can't help writing. I write every day and on some days I have quite an impressive output to show for it. Writing is a way to organize your ideas and express your creativity. It feels good to write. And if I go a day without writing, it is like a day without exercising or a day disrupted by an ad hoc event or a power outage. It just knocks me out of my groove.

You would think that writing and publishing go hand in hand. You write a clever piece and a publisher gets it to market for you. But, it is nothing like that. Publishing is a business and publishers care about sales. Even academic publishers have similar concerns. They are less heavy-handed, to be sure. They care about readership and citations and reputation. But all of that translates, ultimately, into sales. If a publisher cannot get people to pay for reading their publications they cannot survive.

This, in turn, translates into a wide variety of seemingly odd behaviors. Recently, there was a discussion on a listserv that I subscribe to over whether or not it is appropriate for a journal editor to ask authors to cite papers from the target journal. Presumably, the editor did not want the author to miss an important paper on the topic of the submitted paper. While that is possible, it is more likely that the editor was concerned about exposure for the journal's offerings or even, cynically, the citation index.

I have sent book proposals to publishers only to have them rejected because they did not match the publisher's 'list'. That is, they want books of a particular kind, and that is the primary concern. This does not make sense if you believe the publisher's first concern is quality publications. However, if you realize that similar publications will be attractive to the same customer base, making marketing much easier, it does make sense.

I once submitted a proposal to a publisher for a book that I expected to be around 60,000 words. It was almost finished and that seemed like a nice round number once I had completed the revisions. The publisher said they could not sell a book unless it was 100,000 words. They encouraged me to extend the book in question to that length. I could not do it. I had said what I had to say in 60,000 words, and stretching it to 100,000 words would have required a lot of filler. Next time you are wading through a book with a lot of pointless junk in it, remember that you are probably wading through filler that the publisher requested in order to accept the book for publication.

Due to my passion for writing, I have numerous books in various stages of completion. Realizing my innate inability to accept the terms of publishers, I have decided to put them on my website and give them away for free. If you would like to see what I have, go to my webpage and click on Books in Progress. This is not as crazy an idea as it may first seem. As an academic, I write to achieve recognition for my ideas. Giving the manuscripts away for free will maximize exposure. Granted, I won't make any money on them, but few academics actually make much money on their books. And if they do make money, it usually involves some sort of compromise - it is a textbook really written by a committee or it is a popular press book which is a little thin on academic substance.

The only serious drawback is that of recognition within my own institution. If I can claim to have a book published by an academic press, that carries academic prestige. If I say I put my books on my website for free download, there is substantially less prestige accorded. However, I have never been one to care much about prestige. So, putting my books on my website for free works well for me. Now, if nobody cares enough to download them even for free, well then I do have a problem.

Monday, August 2, 2010

Looking Ahead to AY 2010-2011

When I started this blog, one of the things I intended to do was to give people a glimpse of what academic life is like. So, I thought I would shift gears for the month of August and talk a little bit about what I am thinking about for the upcoming academic year. It is now August and that means two things. Summer classes are over and I have to start working on my classes for the Fall.

There is a caricature of an old professor teaching from yellowing, faded notes that he developed when he was a newly minted PhD and hasn't updated since. If you ever find that guy, let me know. I'd like to get a few tips from him. I post my teaching materials on a website for students to download. For years I have been attempting to get them set up so I can just copy the materials from one semester to the next. I have yet to achieve that. I am always tinkering, trying out new ideas, polishing bits that didn't work as well as I would have liked, experimenting with new techniques and assignments. So, every semester I review and revise my notes and syllabus before uploading them once again. A chunk of August will be devoted to that. I like to begin the school year with all my teaching materials completed because things will get crazy as the year gets underway and I will not have time to get back to my notes later in the semester.

I will also be doing some writing. Two of the classes I teach do not have appropriate books available for them. This is because I cooked up the ideas for the classes rather than just following somebody else's work. I write a lot, but since my ideas are not mainstream I have not had a great deal of luck with publishers. So, I decided to make my books available as a free download at my website. More about that later.

I am also planning to take on a new research project this coming year. It is an outgrowth of my work in other areas. I want to examine the role of imagination in information systems research. Talk about outside the mainstream. No wonder I have trouble finding publishers.

We will also be taking some new directions at the school beginning this year. We moved into a new building a couple of years ago. And we have a new Dean, whom I have not yet met, beginning in September. The school is primed for change. It will be interesting to see which way it goes.

Monday, July 26, 2010

Role Play and Roles

The roles that I discussed in the last post can be thought of as roles defined in a bottom-up fashion. That is, you do something which indicates a role preference and then get suggestions for other things that may be suitable to someone who enjoys that role. But, bottom-up role definition has some problems.

Let's say that you are a creative, artistic, imaginative person who has for some reason been steered away from being who you are naturally. Let's say further that you have adopted a serious persona complete with a set of interests that are appropriate for that serious persona but not really suitable to who you are. If your serious persona reads only nonfiction, then it is likely that the bottom-up approach will only reinforce this mismatch. You order dull books on history and the algorithms suggest more history books. How do you break this cycle?

I think one answer may be virtual world role play. I run into endless people in Second Life who are there because they can behave in ways that feel much more natural and yet would not be acceptable for some reason in real life.

Several months back, I wrote a few posts on a concept called StrengthsFinder, which was developed by the Gallup Organization. The premise of StrengthsFinder is that there are certain things that you are hard-wired to do, and doing those things is virtually effortless for you. I think this idea can be extended beyond business strengths to roles in general. There are things you can do that are virtually effortless and feel natural. If you do those things life is easier and more satisfying. However, for any number of reasons, people often get steered away from roles that are natural for them and into roles that are not. Role playing in a virtual world allows you to explore different roles and possibly find roles that are more satisfying for you.

Monday, July 19, 2010

Facebook and Roles

Last time I explained how you adopt roles based upon purchases at places like Amazon or Netflix. People who bought (or rented) Product A also liked Product B. Next let's consider how your role becomes solidified through use of a social interaction technology such as Facebook.

Consider some of the actions you perform on Facebook. You friend people. You join groups. You use applications such as quizzes or play games such as Farmtown. And you post bits of news about yourself and read bits of news that others have posted. Each of these activities refines the role you play as yourself.

First of all, you select friends based on some similarity of interest. Some have friends only from work. Others exclude friends from work. Some select only friends that they know in real life. Others have only virtual friends. Some people will accept friend requests from anyone while others are very choosy. The point is that you are defined to some extent by the company you keep online, and the friends you select will begin to define you over time.

You join groups based upon your interests or based on the recommendation of friends. The groups will in turn affect the advertisements that you see, further reinforcing your evolving role. When you post information about your day "I had a great lunch at a new restaurant" you are suggesting possible activities for your friends. Note the similarity here between the Amazon claim "People who liked Product A also liked Product B" and "I liked Restaurant X so you might like Restaurant X." Postings, obviously, are not limited to restaurants. People post their experiences with books, movies, concerts, and all manner of activities.

This is not, of course, limited to Facebook. Your Twitter feeds or your virtual world avatars have a similar effect. Through mediated social interaction you negotiate and refine your role and the roles of others around you. Next time, I will take this a step further with role play in virtual worlds and see how that impacts your evolving roles.

Monday, July 12, 2010

Roles and Identity

I am not going to be coy about it, carefully building an argument and then springing the conclusion on you. I am going to go straight to the punchline and then go back and support it. The next era of computer applications following the Age of Information will be the Age of Roles and Identity. This next step is no more obvious given our understanding of information systems than the evolution of information systems was obvious given our understanding of automation. In fact, most people who were in the thick of things in the automation stage of computer applications simply could not grasp the changes that were about to come. And similarly, people who are in the thick of things in information systems will probably have a hard time grasping the shifting focus to roles and identity. So, let me begin with a very concrete example of how information leads to the definition of roles and identity.

Let's say you go to Amazon and order a book. Next time you show up at the website they may very well suggest another book for you based on the previous purchase. You may buy the recommended book or you may not. If you show up again, you will be offered more suggestions. Over time, assuming that you do buy another book now and then, you will have sorted yourself into a de facto category of people who like a particular cluster of books. That cluster of books, to some extent, defines you. And the longer you accept this role, the more it becomes who you are.

Another great example is Netflix. You order a DVD from Netflix and then rate it after you have viewed it. Netflix then turns around and suggests other shows you may like based upon the DVDs you have ordered and your ratings of them. Over time, as with Amazon, you sort yourself into a de facto category of people who like a particular cluster of movies. And, that cluster of movies, to some extent, defines you. And, again, as with the books, the longer you accept this role, the more it becomes who you are.

We have seen in recent years that Amazon has tried to extend this idea. People who bought this book liked this music or this video game. Your cluster becomes larger and begins to define you more fully. This is how information leads to roles and identity. Now how about the other direction? The need to define roles (or categories) more precisely will lead to a need for more information. Do people who read Piers Anthony vote in a consistent manner? No? Well how about people who read Piers Anthony, listen to the Red Hot Chili Peppers and watch The IT Crowd? If that still isn't enough, how about if they have Starbucks coffee more than twice a week and go to the gym at least once a week?
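For readers who like to see the machinery, here is a minimal sketch of the co-occurrence counting behind "people who bought Product A also bought Product B." The items and purchase histories are purely illustrative:

```python
from collections import Counter
from itertools import permutations

# Illustrative purchase histories; a real retailer has millions of these.
baskets = [
    {"Piers Anthony novel", "Red Hot Chili Peppers album", "The IT Crowd DVD"},
    {"Piers Anthony novel", "Red Hot Chili Peppers album"},
    {"Piers Anthony novel", "history book"},
]

# Count how often each ordered pair of items appears in the same basket.
co_counts = Counter()
for basket in baskets:
    for a, b in permutations(basket, 2):
        co_counts[(a, b)] += 1

def also_bought(item, n=2):
    # "People who bought this item also bought..." ranked by co-occurrence.
    pairs = [(other, c) for (i, other), c in co_counts.items() if i == item]
    return [other for other, _ in sorted(pairs, key=lambda p: -p[1])[:n]]

print(also_bought("Piers Anthony novel"))
# ['Red Hot Chili Peppers album', ...] -- the cluster that begins to define you
```

The thing to notice is that the categories are not designed in advance; they fall out of the purchase data, which is why the role feels discovered rather than assigned.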

Information gives rise to roles and roles give rise to the need for more information. But, we are nowhere near finished. Over the next few posts we will explore this further. What if you are the kind of person who doesn't like to be that kind of person? How many kinds of people are there? Matching who you are with what you are. And the value of role play. I think it will get interesting. I hope you stay tuned.

Monday, July 5, 2010

And Then There Was Information

Sometime in the late 1970's, William Kent published a book entitled Data and Reality in which he provided some philosophical grounding for the design of databases. Although it is still somewhat difficult to get through today, it was totally incomprehensible in the late 1970's. One of the Zen koans that he provided was that database design should model information rather than the way information is processed. For database designers of the day, this was no more meaningful than the sound of one hand clapping.

Databases in the 1970's were designed to support data processing systems. They were really little more than fancy files with some handy functional features like support for transactions, data independence and the like. They were storage and retrieval mechanisms used to support data processing. In fact, early databases did not even have query languages.

But a new idea was emerging in the late 1970's, and that was the idea that all the data being processed might have a secondary purpose as information about the organization. This information could, potentially, be used to better understand and more effectively run the organization. As obvious as this is today, it was a novel observation at the time and it took a while to catch on.

Of course, today well designed databases are designed primarily for information, and processing is merely the means by which the information is kept up to date. But as data processing (automation) gave way to information systems, there were a lot of entrenched views of data processing systems that needed to be overcome.

In keeping with the expanding model I am developing here, I would like to point out a few things. First, when automation was the primary usage of computer applications, it was far from obvious that the next stage would be information systems. In fact, it took quite a while for that idea to catch on. Second, as automation systems created information, giving rise to information systems, information systems in turn created a greater demand for automation. That is, when you are modeling the organization in information, you need more and more automation to provide more and more information.

Now we are in the era of information systems, and the question that I started this thread with was - can we see what the next stage of evolution will be? If the model is consistent, it will not be obvious from what we have today, and as we evolve into it, it will create an even greater demand for information. But, to find out what that next stage is, you will have to come back next week.

Monday, June 28, 2010

Next Came Automation

By the mid to late 1960's, the usage of computer systems in automating business record keeping processes was well underway. Although the machines were still called computers, most people had forgotten about the role of the computer as a computer and thought about it as a data processing machine. At the time, the term of choice for what computers were doing was data processing. In fact, one of the premier professional organizations was called the Data Processing Management Association. And many corporate departments that were responsible for these activities had the term Data Processing figuring prominently in their names.

Over the two decades from the mid 1960's to the mid 1980's, most corporate jobs changed substantially as work that had previously been done by people was picked up by the computer in automated data processing systems. And people, who had themselves been the data processing system, were now users of the automated data processing system.

Early computer systems were often justified on the basis of the number of jobs they replaced in the cost benefit analysis. "If we automate this process, we can get rid of twenty clerical positions." would be a typical claim. Although, as I remember from the time, it is unclear that any jobs were ever really lost. The notion of automation required that jobs be lost to justify it. However, the real reason why these systems were being built was that everybody else was building them. And if you didn't keep up you would surely fall behind.

We can make two observations about the relationship between computation and automation that will help us extend this model later. First, computation gave rise to automation, although that extension is not at all obvious. One would not look at a computer system and automatically see an automation system. Second, the increased usage of the computer for automation gave rise to a greater demand for computation. That is, the world demand for computers turned out to be far more than five because data processing dramatically increased the demand for computers.

Monday, June 21, 2010

In the Beginning there was Computation

There is an apocryphal quote attributed to Thomas Watson, the head of IBM at the time the computer was invented. As the story goes, when asked about the size of the world market for computers, he replied "I think there is a world market for maybe five computers". Although there is little evidence that he actually said this, the story has taken on the status of urban legend and is also revealing of our understanding of computers at the time.

The computer was not really invented in the 1940's; it was resurrected. Charles Babbage did the foundation work in the early 1800's, attempting to develop a machine that could compute the values of polynomials. That notion of a computational machine stuck in the very name of the device that we still, to this very day, call a computer. Early computers were not seen, as they would be later, as information processing machines. They were seen as computational machines. Hence the name, and the misunderstanding of the market implied by the apocryphal quote. If the computer had remained a mega calculator, there might very well have been a very limited market for it.

Fortunately, some of the engineers at IBM, at the time, had a little imagination and could see beyond the basic computational capabilities. I can imagine that this caused no end of internal conflict at IBM, as their cash cow, at the time, was the Electric Accounting Machine, which processed information on paper cards. Suggesting that the computer be used for this purpose was not only a suggestion that they replace their entire business with a new mode of processing, but it also suggested that information be encoded, not on cards which you can see, but in bits of electricity that you cannot see. So, although this seems obvious in hindsight, it was quite a leap of imagination at the time.

By the late 1960's, the use of computers in automating business record keeping systems was in full swing. In fact, computation was, at this point, a fairly minor use of computer power. The computer resources used, for example, to compute your paycheck are minor compared to the processing necessary to get the information ready for the computation and to produce the paycheck once the computation is completed. If the name of these machines had been updated along with their function, they would have been called automators rather than computers. But, that did not seem very important at the time. So the old name stuck.

The point that I would like to close this piece with is that the computational power of computers led to the use of computers for automation. And this theme of one computer usage leading to greater usage will be expanded upon in subsequent posts.

Monday, June 14, 2010

How Long Will The Information Age Last?

Alvin Toffler published a book in 1980, called The Third Wave, which caught the attention of the mass market as well as numerous academics. In this book, he attempted to explain the turmoil we were experiencing in the 1960's and 1970's as the third wave of change for human civilization. More specifically, he saw the evolution of human civilization as undergoing three major transformative revolutions. The first was the agricultural revolution. The second was the industrial revolution. And the third (the third wave) was a transition to a post industrial society. This post industrial society has become known as the Information Age.

I am neither a staunch supporter nor a staunch critic of Toffler's characterization. On the positive side, he provides a thought provoking characterization of the evolution of civilization. It is easy to understand and it got a lot of people thinking about what was going on. On the down side it is one of many, many possible characterizations and I always have a bit of trouble when authors include the present in their historical perspectives. That said, he did popularize, although not coin, the term Information Age.

I think that most people tend to accept, free of critical reflection, the claim that we are in the Information Age. It is not clear to me that this is true. But if it is, we can ask how long the Information Age will last and what will come next.

The agrarian age lasted thousands of years, while the industrial age lasted only hundreds. If progress is speeding up and we can draw on the patterns of the previous ages, then the Information Age should last only decades. If this is the case, then something new ought to be coming along very shortly. What is it?

I don't want to fall into the same trap that I just criticized Toffler for and provide a historical perspective that includes the present and the near future. I do think, however, we can look at patterns from the past to the present and provide reasonable speculations on what might be coming next. Over the next few posts I am going to look at the evolution of computer applications from the automation of the 1960's to the social interaction technologies of 2010 and see if we can find any useful patterns.

Monday, June 7, 2010

Global Warming as Apocalyptic Thinking

It is a good thing that not many people read my blog, or I would set off a firestorm with this next claim. However, I believe that the current obsession with global warming is yet another example of apocalyptic thinking. Bear with me as I make my case.

First, let me say that I agree philosophically with much of what the global warming movement is concerned about. We are wasteful of our resources. We are overly dependent on fossil fuels. We should learn to live in harmony with our environment. We are living a technologically subsidized, unsustainable lifestyle.

The problem is that these are philosophical, perhaps even spiritual, beliefs. When people try to turn them into scientific claims in support of their beliefs, I have a problem with that. It is like saying 96% of all atheists go to hell, therefore you should be a Christian. Or, in a more realistic vein, one might claim that religious people live 14% longer than non-religious people. There are lots of good reasons why one might choose to be a Christian. But, statistics about atheists are not among them.

Using scientific data to shore up philosophical beliefs is always problematic. Don't forget that 'scientific' data was used to prove racial inferiority in the 19th century. So, I am always concerned about the barrage of 'scientific' data used to support the global warming belief system. Why? I believe that underlying this movement is a subconscious vein of apocalyptic thinking. Following the subtext model from the previous post, consider this subtext of global warming:

1) We have gotten off track: We have sinned. We have allowed our greed to consume us. Our consumption is out of control. We have failed to show respect for things greater than ourselves. We have engaged in the sin of hubris, thinking we are greater than we are.

2) Things will be set right by forces beyond our control: We must pay for our sins. We will suffer. We will be punished. The earth will strike back at us and we will be exiled, once again, from Eden. Glaciers will melt. Seas will rise. Populations will die. Fertile farmland will become desert.

I don't mean to suggest that there is not a problem we should be concerned with. We have way more people on the planet than we can sustain without substantial technological leverage. If it weren't for technological advances of the past couple centuries, Malthusian cycles would have kicked in. But technological subsidies are not limitless and the Malthusian cycles will kick in at some point. Fuel efficient cars and alternative energies may buy us a little time but they do not solve the problem.

If we were serious about solving the problem we would be talking about population control rather than whipping ourselves up into an apocalyptic religious frenzy that avoids the real problem.

Well, for those of you who do read this blog and are fed up with my ranting about apocalyptic thinking, you will be happy to know that in the next post I will move on. Thanks for your patience.

Monday, May 31, 2010

The Y2K Bug as Apocalyptic Thinking

There are two key elements to apocalyptic thinking. First, we have gotten off track somehow, and, second, punishment will be doled out or things will be set right by forces beyond our control. Neither of these components holds up well under scrutiny. But, the beliefs that make up our worldview are generally unquestioned assumptions about reality that do not hold up particularly well under scrutiny. If they did hold up well under scrutiny, they would be part of our scientific knowledge and not part of our worldview. (Of course, belief in science is a part of our worldview that also does not hold up well under scrutiny. But, that is another argument for another day.)

The claim that we have gotten off track somehow suggests that there is a track that we should be on. If there is one, it is largely one of our own construction. That is, we believe a bunch of things about how the world should be, and when our behavior varies from those beliefs, we feel we have gotten off track.

Second, as discussed earlier, there is a difference between retribution and likely outcomes. If you run across a busy street there is a good chance that you may get struck by a car. To suggest that you 'deserve' to get struck by a car suggests that our moral equilibrium is being enforced by transcendental powers.

Consider the subtext of the Y2K Bug. I apologize for the dramatic wording but, I believe that it helps make the point.

1) We have gotten off track: We have sinned. We have been seduced by the idol of technology. We have given ourselves over to technology sacrificing our free will and free spirits. We rely on technology for our very being. Our worship of and reliance on technology is just modern day idolatry.

2) Things will be set right by forces beyond our control: We must pay for our sins. We will suffer. We will be punished. The coy mistress of technology will turn her back on us, leaving us vulnerable to losing all that we find holy. Banks will fail. Airplanes will fall out of the sky. Medical technology will take lives instead of saving them. As people struggle to survive, civilization may come to an end.

When the clocks ticked over to the year 2000 and nothing happened, did anyone ask - why? Did anyone try to justify their beliefs? No, because beliefs are beliefs and require no justification. Believers still believe and will point to all the effort that went into preventing the problem. Disbelievers still disbelieve and see all the spending leading up to the year 2000 as money wasted. Never the twain shall meet and it will never be resolved.

Monday, May 24, 2010

Remember the Y2K Bug?

It hasn't been that long. A little more than a decade ago, in the time leading up to the new millennium, there were dire predictions about the potential impacts of a bug in computer software known as the Y2K Bug.

The premise was very simple. Since the 1960's, years had been represented in computer programs as two-digit numbers. So, 1968, for example, was represented as 68. You could always tell which year preceded another year because the earlier year would have a lower number. 68 comes before 69, and so on. However, at the turn of the millennium, this was no longer true. The year prior to the change in millennium would be 99 and the next year would be 00. So, any bit of decision logic within a program that relied on subsequent years having higher numbers would fail.
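To make the failure concrete, here is a minimal sketch of the kind of decision logic that breaks. The payroll framing is purely illustrative:

```python
# Two-digit years: 1999 is stored as 99, 2000 as 00.
def years_of_service(hire_yy, current_yy):
    # Subtraction silently assumes both years fall in the same century.
    return current_yy - hire_yy

print(years_of_service(68, 99))  # 31  -- correct for 1968 through 1999
print(years_of_service(68, 0))   # -68 -- nonsense once the clock rolls over to 2000
```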

There were, no doubt, many software programs that contained this flaw, so I don't want to dismiss it out of hand. However, what is really hard to believe is the extent to which the implications of this flaw were extrapolated. As the clock ticked over to the year 2000, computer systems throughout the world would just shut down, crashing from the Y2K bug. Elevators would stop working. Planes would fall out of the air. Life support equipment would stop its life support functions. Even more dire consequences were batted around. It could be the end of life as we know it. It could be the end of Western civilization.

Less dire but more personal consequences were predicted. You won't get your bills on time so you can't pay them on time. Since you don't pay your bills, you will lose your house and car and your good credit score.

And, yet, none of that happened. Why not? And why did we ever believe it would?

Let me offer a few observations. I am reluctant to use the word 'facts' as 'facts' are a matter of perspective, and perspectives on this issue were all over the map.

First, the less one knew about computer systems the more likely one was to believe in the Y2K bug.

Second, most computer software, especially that written by competent software professionals, would not have this problem.

Third, if software written by professionals did have this problem, fixing it would be a fairly straightforward maintenance task (one common technique is sketched after these observations).

Fourth, the real focus of this problem was computer code written in the 1960's and 1970's by non-professionals who not only created the date problem but wrote such poor code that maintenance would be a nightmare. The underlying problem here was not the way the date was represented. The underlying problem was the lack of maintainability of the computer code.
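The straightforward fix mentioned above was, in many shops, 'date windowing': rather than widening every stored field to four digits, programs interpreted two-digit years relative to a pivot. A minimal sketch, with an assumed pivot of 30:

```python
PIVOT = 30  # assumption: no stored dates before 1930 or after 2029

def expand_year(yy):
    # Two-digit years below the pivot are read as 20xx, the rest as 19xx.
    return 2000 + yy if yy < PIVOT else 1900 + yy

print(expand_year(99))  # 1999
print(expand_year(0))   # 2000
print(expand_year(29))  # 2029
```

Windowing only postpones the problem, of course, which underlines the real point: the issue was the maintainability of the code, not the date format itself.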

So, there was a real problem but it was of very limited scope and very limited consequence. How, then, did this get extrapolated into the end of civilization as we know it? The answer, I believe, is that it is a great example of modern, secular apocalyptic thinking. And that will be expanded upon in the next post.

Monday, May 17, 2010

Examples of Apocalyptic Thinking

There is no shortage of examples of apocalyptic thinking throughout history. Certainly, the Old Testament prophets who railed against the evils of society were a classic example. Society had gotten off track somehow and the judgment would come. We see this repeated as recently as the early days of the United States, where a series of religious revivals called Great Awakenings spewed forth apocalyptic hellfire and brimstone rhetoric. The most recent of these was barely a hundred years ago. So this is not an artifact of the far distant past.

There is also no shortage of incidents where people thought the end of the world was coming. Most recently, we had a fascination with the prophecies of Nostradamus regarding the new millennium, and when that did not happen, we turned our sights to the Mayan calendar prediction of the world ending in 2012. In fact, there was a blockbuster movie of that name (2012) made to explore that theme. So, again, these ideas are not just artifacts of the distant past.

There are more striking examples of apocalyptic thinking in recent years. David Koresh, leader of the Branch Davidians, more commonly known as the Waco sect, believed that we were in the end of days. And the enormous popularity of the Left Behind series (a series of books about the rapture) suggests that apocalyptic notions still capture the interest of many.

But, are these just examples of people far from the mainstream? Or do regular, otherwise normal, people show signs of apocalyptic thinking? They do, and the examples may surprise you.

Monday, May 10, 2010

Apocalyptic Thinking in Practice

Equilibria are very common in nature. But, are equilibria enforced by laws of nature or by a higher power? If you believe that an equilibrium is enforced by a higher power and that the equilibrium is based on human values, then you are engaging in apocalyptic thinking.

Consider a few examples. Let's say that a child constantly talks back to his or her parents until the child gets sent to his or her room, or some other appropriate punishment. We might say that the child got what he or she 'deserved'. If somebody consistently performs poorly at their job until they get fired, we might also say they got what they 'deserved'. In these two cases the 'higher power' (the parents in the first case and the boss in the second) brings things back into an equilibrium based on human values (respect and hard work).

However, if we turned a glass full of water upside down and the water poured out, we would not say that the water got what it 'deserved'. The water simply followed the laws of nature. But, with people, at what point is the equilibrium enforced artificially and at what point is it merely natural law? If somebody spends their money foolishly and goes bankrupt, is that artificial or natural? If a company goes bankrupt due to mismanagement, is that natural or artificial? If an economy goes into a recession after a period of growth, is that natural or artificial?

Clearly, at some point, equilibria have natural and not artificial causes. However, if you believe that all equilibria are the result of intervention from a higher power, metaphysical or divine, you are a card-carrying apocalypticist.

Next we will look at some examples of apocalyptic thinking in history up until modern times. In the following posts, I will provide some very recent examples to show that apocalyptic thinking is still alive and well.

Monday, May 3, 2010

Elements of Apocalyptic Thinking

The roots of apocalyptic thinking are so fundamental to the way we see the world that it is difficult to see this as anything other than just the way the world is. However, it is not the way the world is. It is the way we see the world. Before attempting to justify that claim, let me lay out the elements of apocalyptic thinking.

First, there is a belief that there is a 'right' way that things should be and that they have gotten off track from it. Second, if we don't do something to set things right, forces beyond our control will set them right for us. Further, there is an element of retribution in the forces beyond our control setting things right. That is, there is some element of punishment for not having kept things on track ourselves.

Let's say that an unfortunate investor put money in a lot of risky "get rich quick" schemes and ended up losing everything as well as incurring a lot of debt in the process. Consider the following two statements.

1) He got what he deserved for trying to get rich quick.

2) The probability of losing everything increases as the riskiness of investments increases. However, there is also a chance you could win big. There are also possibilities for small losses or small gains.

The first of the two statements is more of a moral assessment than an objective assessment. People should work hard and invest prudently. If they do not, bad things will happen. The second allows for the fact that when people gamble, some people actually do win. So retribution is not built into the fabric of reality.

We would like to believe that retribution is built into the fabric of reality. Consequently, we tend to notice instances where the apparent retribution takes place, while ignoring cases where it does not. Thus, our selective data gathering tends to support what we would like to believe is the case.

There are several problems with the elements of apocalypticism. First, the 'right' way is something we impose on the world based on our human values. Doing all the 'right' things makes the human race prosper and grow. This is probably not the 'right' thing for the other inhabitants of the planet. Second, the forces of nature are, well, the forces of nature. They do not bring things back into line with human values. They just do what they do. Third, although we do see elements of equilibrium in nature, equilibrium is not retribution.

We apply this apocalyptic notion of retribution to issues of all sizes from individual to social to global. And the magnitude of the retribution ranges from small hand slaps to total destruction of human life. In the next post we will take up the range of applications of this notion.

Monday, April 26, 2010

Religious Roots of Apocalyptic Thinking

About 3,500 years ago, a Persian prophet named Zarathustra asserted that gods could not be good in some cases and bad in others, which was the case, at the time, in most polytheistic mythologies. He said good is good and evil is evil - setting the stage for later beliefs that the earth is a battleground between the forces of good and the forces of evil. This worldview, at the risk of being simplistic, allows evil to gain ground at times only to be pushed back by the forces of good. The tension between good and evil is at the heart of apocalyptic thinking. Even if one does not believe in good and evil as a religious doctrine, the underlying concept can also be found in secular thinking. People who believe in balance and harmony see imbalances or disharmonies being brought back into balance and harmony as an aspect of the natural order.

Another key religious doctrine is the concept of interventionism. This can best be understood in opposition to a well-known opposite doctrine. Deism, a religious view that became popular after the Enlightenment, claimed that God created the world and left it to run on its own. The deist view can be summed up easily in an example. Let's say you have two clockmakers. One made a clock that runs perfectly and requires no further work. The other made a clock that requires constant attention and repair. Who is the better clockmaker? Clearly the first one is. The deist objection to interventionism is that it is like the second clockmaker. Why does God have to intervene all the time? Why couldn't he have just made the world perfect and let it run?

Despite the strengths of that argument, most people are interventionist. They believe that God has a hand in things that go on in the world and that he steps in from time to time to set things right. This notion of the forces of good stepping in to set things right provides much of the religious basis for the apocalyptic view of the world.

Monday, April 19, 2010

Psychological Roots of Apocalyptic Thinking

There are two uniquely human cognitive or psychological characteristics that lead to an apocalyptic view of the world. This will not immediately make sense to the reader. But, I ask your indulgence as I develop the argument. One of these characteristics is our ability to imagine and the other is our moral sense.

Our ability to imagine allows us to, among other things, envision a world different from the one that exists. We can see a world with farms instead of chance discoveries of grain, domesticated herds rather than following wild herds, man-made shelters instead of caves. Our ability to imagine allows us to create societies, governments, economies, technologies and so on. Because we can imagine a world other than the one we live in, we sometimes see the imagined world as better and try to bring it about, or wish to have it brought about.

Our moral sense allows us to see beyond our basic needs and consider the needs of others. Others can include other people we know as well as people we don't know and future people. We sometimes even extend our moral sense to animals and nature. Our moral sense allows us to consider a world in which fairness, justice, respect, orderliness, and predictability figure in heavily.

Given our imagination and our moral sense, it is possible for us to envision a world, unlike the world in which we live, where fairness, justice and so on rule. It is also possible to desire to bring such a world about or desire to have it brought about. And it is this desire, I believe, to have a fair and just world, that provides the psychological basis for apocalyptic thinking.

Monday, April 12, 2010

Allowing Possibilities

Another way to increase your mental flexibility and improve your ability to be proactive with new ideas is to take an idea that you may normally reject out of hand and allow the possibility that it may be true - or, at least, that it may have merit. We tend to see life as black and white, and even the shades of gray do not allow for the variety of possibilities that are really out there. Is monarchy better than democracy? Would society be better if everybody was not equal? Should there be a mandatory age at which people are terminated? I choose these examples because they are ideas that most people would not even want to consider.

However, each of these ideas does have merit. And by finding the merit in these ideas, you can see that it is better to consider ideas than to reject them out of hand. I suspect that most people, after considering them, would still reject them. However, consider this: Monarchies can be more efficient than democracies. Regardless of what we might want to believe, everybody is not equal. And as people age, their maintenance costs increase while their productivity decreases. Might that money not be better spent on younger people?

This exercise is useful to see that even the most offensive of ideas still have some merit. And that raises the question - what other ideas that really do have merit have we simply rejected out of hand?

In the following few posts I am going to consider a question that I alluded to earlier. In our modern secular age, do people still believe in the apocalypse? I am going to argue that they do. I am going to argue that the children of the Enlightenment are not nearly as secular and modern as they like to think they are and that there is abundant evidence that apocalyptic thinking still dominates our modern worldview. Do you think I can convince you? Well, stay tuned.

Monday, April 5, 2010

Being Proactive with New Ideas

The alternative to being reactive to new ideas is to be proactive with them. That is, instead of accepting new ideas when there is absolutely no other choice, embrace them early and consider the possibilities and implications.

There are three primary advantages to being proactive. First, you have more time to adjust to new ideas. That is, you can work through the implications and how they will affect you before you actually have to accept them. This advance preparation lessens the psychological shock. Second, you can, potentially, take advantage of new ideas by thinking them through beforehand. And third, you are less likely to become out of touch over time, as each new idea by itself is not as threatening as an accumulation of new ideas.

There are, of course, disadvantages to being proactive with new ideas. You don't want to jump on every bandwagon that comes along. Not every new idea survives. In fact, most don't. So, if you are overly proactive with new ideas, it looks like you are just following fads. And, in fact, you may well be.

So, how do you embrace important new ideas and not jump on every bandwagon that comes along? First, you have to evaluate new ideas critically. Look at them intellectually rather than emotionally. Our emotional reactions to new ideas may not be the best assessment since the ideas are, well, new. Emotions are good judges but tend to work a little better in situations where we have some experience. We need to step back and evaluate new ideas critically rather than just reacting to them. Second, we need to reflect on our successes and failures. Did we accept ideas that didn't work out? Did we fail to accept an idea that did prove its mettle? Over time you can get better at this.

Personally, I think being proactive is the best strategy in a dynamic world. And if you are going to be proactive, you have to refine your approach.

Monday, March 29, 2010

Being Reactive to New Ideas

In the last post, I mentioned that you can face new ideas with one of two predispositions. You can be reactive or proactive. A reactive person resists new ideas until the support for them is so overwhelming that they have no choice but to accept them. The primary benefit of this approach is that you don't have to accept every silly idea that comes along. New ideas often come along like fads and die out just as quickly. The reactive person invests no time or energy in these ideas until they have clearly proven their worthiness. The problem, of course, is how much evidence is required to prove the worthiness of an idea?

At one extreme, if the person does not require much evidence before accepting a new idea, then they are really just being a proactive person, but not doing it very well. At the other extreme, they may be way out of step with the people around them, having failed to accept ideas that have become mainstream.

Personally, I would find it difficult to be a reactive person for two reasons. First, I would find it hard to know when it is time to give in. And, second, I would find it difficult to have to finally give in to an idea that I had resisted for a long time. Further, I would find the prospect of having to continually adjust and give in to new ideas wearing. But that is me, and others may be different.

I would think that being reactive would be a good approach for one living in static times. However, for one living in dynamic times, where things are constantly changing, it would be a strain. For dynamic times such as the present, being proactive with new ideas would be a lot easier on a person. And that is what I will turn to in the next post.

Monday, March 22, 2010

A Willing Suspension of Disbelief

I borrowed this phrase from Samuel Taylor Coleridge, who said that the enjoyment of fiction requires "A Willing Suspension of Disbelief". In other words, you have to believe, at some level, that the fictional representations are or could be true. If you read fiction with skepticism, you may fail to fully appreciate the literary experience. However, I would also like to apply this phrase to the advancement of knowledge, which, in my opinion, also requires a willing suspension of disbelief.

At any given moment in time, most of what most people believe is not fully true or is possibly outright wrong. We are constantly changing, updating and modifying our shared bodies of knowledge. These changes can come in huge jumps, like Newton's theory of gravity or Einstein's theory of relativity. And they can come in little hops, like the decision to exclude Pluto as a planet. Personal knowledge changes as well. Anyone can attest from their own personal experience that things they used to believe no longer seem to hold. The question is - how do we get from one position on what we believe to be true to another?

It seems to me that this can be done, generally, in one of two ways: proactively or reactively. We do it reactively when we simply cannot hold an old view any longer. We do it proactively when we allow for the fact that new evidence may arise and that we may have to change our minds about some things. If we are being really proactive, we can anticipate the implications of new information and consider what might possibly be true as a result. And if we wish to be proactive, it requires a willing suspension of disbelief.

I am not going to judge whether it is better to be proactive or reactive. This is probably a matter of personal taste, personality, disposition, flexibility and any number of other things. I can say that for me, the preference is very much in the proactive camp. I prefer to know what might be true long before it becomes established. However, for the sake of fairness, I am going to look at the pros and cons of being reactive versus being proactive. Then I will develop an example - apocalyptic thinking. Yes, it could be true.

Monday, March 15, 2010

Believing Fiction

Last week's suggestion for improving your mental flexibility was pretty straightforward - just continue to expose yourself to new ideas. This week's suggestion is going to be a little further out there.

Next time you read a work of fiction, try to convince yourself that the story is actually true. This is not too difficult if it is a romance or detective fiction. Such things could actually happen and many times are based on true stories. However, what about horror or science fiction? Could you convince yourself that Stephen King's The Stand or Michael Crichton's The Andromeda Strain really happened? Could you convince yourself that either is based on a true story?

Actually, the premise of The Andromeda Strain (a lethal microbial life form was brought back to earth on a space probe) is plausible. So is the premise of The Stand (the military experiments with a deadly virus, which escapes into the world). Since they are plausible, they could have really happened. Why wouldn't you have heard about them? Well, there are lots of reasons why such events might be covered up. As you look for explanations to support your claim that these things really did occur, you find that it isn't that difficult to come up with plausible scenarios. In fact, this is what conspiracy theorists do all the time.

The point here is not to make you paranoid or to turn you into a conspiracy theorist. The point is to show you that, with the proper motivation, it is not that difficult to convince yourself of something. And, if you managed to convince yourself of something here that you know is not true, how many of the other things you believe to be true are nothing more than things you convinced yourself of in the past for various reasons?

For any given person, much of what they believe to be true is, in fact, not true. George Washington never did cut down a cherry tree. And the people of Columbus's day did not believe the world was flat. Many of you believed, at some point, that Pluto was one of the nine planets. Similarly, for any given person, some of what they believe to be false is, in fact, true. Nobody has it all exactly right. But your mind, vulnerable to inflexibility, will lead you to believe that you do have it exactly right. Hopefully, the mental exercise described here will help you maintain greater flexibility and allow you to update your view of the world as new information comes along.

Monday, March 8, 2010

Receiving New Ideas

There is an oft-told Zen story about a student who is frustrated with his inability to grasp the ideas that his master is trying to teach him.

"I am just not getting it," the student complains to the master, "what should I do?"

The master invites the student to sit and offers him some tea. The student accepts the offer and the master begins to pour tea into the student's cup. The cup fills and begins to overflow onto the table.

"Master," the student exclaims, "My cup is full."

"That is your problem," the master replies, "your cup is full."

When your mind is full of things, you cannot receive new ideas. It is too full of the old ideas. And the longer the old ideas stay in there, the harder it is to replace them with new ones. This is a problem because the world is constantly changing and it is necessary to accept new ideas in order to keep up with it. After a while you become very rigid in your views. They become more and more inconsistent with the world around you. And you can find fewer and fewer people who would agree with you on things. So, what do you do?

Well, the answer is fairly simple, actually. You have to be diligent in your acceptance of new ideas. When you read the newspaper, for example, instead of reading it from the perspective that they are all idiots and you are the only one who really knows what is going on, read it from the perspective that there may actually be something in there to be learned. That doesn't mean you should just naively accept everything. But it does mean giving it a fair chance.

And, that is only a start. You should seek out new ideas and new ways of looking at things. Read books, take classes, explore new ideas. Watch movies and TV shows that you would not normally be attracted to. It has to be an active effort.

Just as you have to get off the couch and get some exercise, you have to get out of the valley your mind has settled into and exercise it with some new ideas. New ideas are the key to mental flexibility, and it takes effort to maintain it.

Monday, March 1, 2010

Keeping Your Mind Sharp and Agile

Over time our thinking becomes very rigid. Our minds shrink-wrap around the things we know and resist letting in new ideas. This is not good for most people, and for academics it is a disaster. It would be analogous to a professional athlete becoming soft and doughy from lack of exercise. Fortunately for professional athletes, their careers are relatively short, and when the natural effects of aging set in, they no longer have the demands of their profession to deal with. However, for academics this is not the case. Academics can continue to practice their profession well into old age. We have a comedic archetype of an aging academic reading from yellowing pages of notes, lingering for years at death's door while continuing to deliver lectures. This archetype is a bit unfair. But it is not unheard of for academics to continue to work well into their 70s or even 80s. So, the question is - how do you keep your mind nimble and sharp and resist the forces of aging?

Well, part of the answer is that many academics don't. Some begin their careers fixed in their views and some acquire the rigidity over time. However, this is unfortunate and not necessary. It is possible to remain nimble and flexible in your thinking. And it is not a great deal different from maintaining physical flexibility.

In the absence of any effort to combat inflexibility, our bodies become inflexible over time. Muscles shorten and we lose our range of motion. We combat this by stretching. And the same thing is true of mental flexibility. We combat it by stretching our minds. I would add, parenthetically, that we also become inflexible emotionally, psychologically, spiritually and in any number of other ways. The remedy there is the same - stretching. But I am going to limit my discussion to mental flexibility.

Over the next few weeks I am going to look at three ways to maintain mental flexibility: exposure to new ideas, using fiction for mental exercise, and the pursuit of unconventional ideas. Warning: this will start out exactly as you expect and get very weird before it is all done.

Monday, February 22, 2010

Is This Really The Best Way To Do Things?

The past few posts on the nature and role of the university may cause people to ask if this is really the best way to do things. The answer, I believe, is yes! If universities were any more efficient they would be dangerous. In order to explain that outrageous claim, allow me to digress for a moment.

Machiavelli, the author of the Renaissance book of realpolitik called The Prince, provided sage advice for keeping a prince in power. However, what few people know is that he had second thoughts later in life about the advice that he had given. When you think about it, a monarchy is really the most effective form of government as long as two conditions hold. First, the monarch must understand what is good for the people he or she is governing. And, second, the monarch must be competent enough to achieve what is good for the people. If the monarch is corrupt (or at least does not make the needs of the people primary), or if he or she is incompetent, then monarchy is not such a good idea. And therein lies the problem. Given what we know about people and human nature, it is unlikely that these conditions will be met. Hence we need a form of government that does not rely so much on a single person.

In steps democracy. It isn't that democracy is the most effective form of government at any given moment. It is slow, inefficient and often contrary. However, given our understanding of human nature, it is the most likely to be effective over time. It is not the most efficient. It is the most risk-free. A monarch can use his or her power to very efficiently take a nation in a very wrong direction. If a democracy goes in a wrong direction, it does so very slowly, with much debate and discussion and many opportunities for correction.

Universities are similar to democracy in that they are slow, inefficient, and often contrary. However, as the guardians of reality, that is exactly what we want. We do not want to go tooling off in the wrong direction with great efficiency. We want to make sure that if we go in a wrong direction we do so slowly with much debate and many opportunities for correction.

Indeed, if you look at the history of universities, this is exactly what happens. It is exactly what we want to have happen. And therefore it is the best way to do things over the long term.

Monday, February 15, 2010

Guardians of Reality

Our perception of reality is in a constant state of flux. I use the phrase "our perception of" to avoid philosophical arguments, although the statement is equally true without it. What appeared to be real to an ancient Egyptian was very different from what appeared real to an ancient Greek. Different again for a Medieval noble, and different again for a scholar during the Enlightenment. Our modern and postmodern views are, again, very different. We can loosely define reality as our perceptions of the physical world, our social structures and values, and our spiritual expressions. It is what we think, believe, feel, experience and so on. And it is constantly changing. The comedian Lily Tomlin once said, "What is reality, anyway? Just a collective hunch." And that is about as serviceable as any philosophical attempt to nail it down further.

Reality changes because we are changing. We try new things. We learn new things. We reject old ideas and accept new ones. We have new experiences and new concepts. If our perception of reality were static, it would, over time, fail to meet our needs. And, at the same time, if it changed too fast, we would have a hard time keeping up with it all. So, in order to maintain the stability of reality, we need a social institution that is, on one hand, tasked with advancing reality and, at the same time, responsible for maintaining its stability. And that social institution is the university. A couple of easy examples will clearly illustrate this.

First, consider the role of the university in education. On one hand, the university indoctrinates students into the corpus of existing knowledge. This is a reality maintenance function. On the other hand, university classes encourage students to think for themselves. This is the advancement function. How can you tell students on one hand to learn what you are teaching them and on the other hand to think for themselves? Well, it is just part of the role of the university in maintaining and advancing reality.

Second, consider the role of the university in research. On one hand, the university generates new ideas. H.L. Mencken once said, "There is no idea so stupid that you can't find a professor who believes it." One of the responsibilities of the faculty is to put forth and entertain new ideas. There is no tenet of our worldview that didn't start out as a stupid idea at some point. And yet, there are processes in place to keep stupid ideas from escaping out into the world of real people. There are tenure committees, peer reviews, commentary, vicious fights between differing schools of thought and so on. So the university allows reality to advance by adding new ideas, while keeping that advancement from happening too rapidly.

So when we complain about the vagaries of the university - the teachers who no longer want to teach, the researchers who no longer wish to pursue research, the administrators who came to the university to avoid administration only to find themselves in administrative roles - you have to ask: is the guardianship of reality important, and can you think of any better way to do it?

Monday, February 8, 2010

Academic Service

There is a rich assortment of academic service roles within the university, which I am going to simplify into two categories: voluntary committee work and paid administrative work. Voluntary committees usually involve some sort of policy making, while administrative roles generally involve running something. Neither holds up particularly well to scrutiny.

Most academics have some sort of voluntary committee work, and the effort required can vary greatly. There are committees that literally never meet, and these are considered plum assignments as one can meet one's obligations for service without doing anything. Other committees meet frequently and are usually addressing a problem that the committee members feel is important. The two extremes are rare; most committees meet now and then with limited attendance and limited productivity. As far as I can see, committees serve two purposes. First, they engage faculty in the workings of the university and allow faculty to meet other faculty whom they might otherwise have no way of knowing. This is a good thing because faculty tend to become rather isolated in their teaching and research. Getting to know other faculty helps develop a sense of community among the faculty. The second is that committees keep faculty engaged in the policies of the university. Again, faculty tend to become rather isolated in their teaching and research, so committees allow them to stay in touch with any changes that may be brewing. People often forget that these are the two primary purposes of committee work and think that committees should be productive; that is, they should get something done. This misses the point. If a committee gets something done, it is a byproduct of the other two objectives.

Far fewer faculty have paid administrative roles. These roles include running a department, a program, or a school, all the way up to major administrative roles within the university. As I mentioned earlier, most academics prefer life at the university to the administrative life in a corporation. So why do some faculty migrate into these roles? In fairness, I should say that some, if not many, did it reluctantly. However, many actually pursue these roles. And there are two reasons, as far as I can see, why they would do this. First, it needs to be done. That is, somebody has to do it. Faculty are an odd group of people and are reluctant to be led by someone who does not understand what they do. As a practical matter this means another academic. Academia is a culture unto itself. And one of the tenets of that culture is to only accept leaders from within the ranks. The second reason, also touched upon earlier, is that at some point most academics run out of steam for teaching and research. If they wish to remain vital and contributing, administrative service roles provide that opportunity.

People looking at the university from the outside often see the inner workings as bizarre, unproductive and often neurotic. But there are good reasons for the university being the way that it is. Universities are the "Guardians of Reality". And that will be the topic for next time.

Monday, February 1, 2010

Service: The Safety Net of the Dispossessed

One of the great ironies of academic life is that many bright young people pursue academic careers because they find the idea of administrative life in the corporate world to be less than desirable. And then they find themselves, after a productive decade or two, in the administrative life of the university. The reason for this is that it is very difficult to sustain your productivity in research over the long term. And it is equally difficult to sustain your enthusiasm for teaching.

There are three reasons why it is difficult to sustain your productivity in research. First, research requires mental energy. Mental energy declines as one ages, and one is unlikely to engage in challenging new research as one gets older. Second, research requires enthusiasm. Younger researchers are often driven by a desire to discover and be recognized for that discovery. As you publish paper after paper that few people care about, it is difficult to maintain that idealistic enthusiasm. Finally, the audience for research can be very fickle. What was a hot topic one decade can be a hard sell the next and an impossible sell after that. Since one is unlikely to embark on new avenues of research later in one's career, one finds that there is simply no audience for the papers one would like to write.

Similarly, it is difficult to sustain your enthusiasm for teaching. Initially, it is a heady experience standing up in front of an audience of students and telling them things that they want to know or need to know. It is also quite satisfying to adjust over time to their challenges. Further, it is exciting to learn new things and pass them on. However, at some point there are no new challenges in the classroom. You have been asked every conceivable question multiple times. Students fade into one another as you have difficulty remembering all the names. And you sometimes dread giving a lecture, knowing you may very well bore yourself.

This does not happen to everyone. But it does happen to an overwhelming majority. This is really the point where one should move on to other things. But, if you have been an academic all your life and know nothing else, what can you move on to? The answer, of course, is service to the university. There are any number of service roles, from voluntary committee work to well-paid administrative positions. Next time we will explore the richness of those alternatives.

Monday, January 25, 2010

The Curse of Soft Requirements

I suppose that "soft requirements" is an oxymoron. If something is required, it is a requirement. If it is not required, then it is not a requirement. However, I think most people would understand soft requirements as things that you need to do where the exact nature or criteria are unclear. As academics we have two soft requirements: keeping up with the advances in our fields and publishing our own advances in respectable outlets. Both of those soft requirements are worded vaguely on purpose, because in the real life of an academic they are equally vague.

Keeping up with the field means that as new things happen, you stay on top of them. This, however, can mean many, many different things. Is it keeping up with research, or news events, or technology, or policy changes, or current trends among practitioners? It could be any of those things. In my field of Information Technology, you could spend all your time keeping up with advances in just one area of the technology. This is unusual, however, as most fields do not have as much in the way of evolving and emerging technologies. Most people believe, somewhat naively, that an academic in a specific field will be on top of everything happening in that field. This would only be possible, of course, if there were 5000 hours in a day and you had several hundred clones. In reality, people tend to be aware of what comes across their field of vision and unaware of everything that doesn't.

Publishing in respectable outlets is also a bit squishy. In an ideal world, it would mean respected peer-reviewed journals published in paper form. However, the world is not ideal. Peer-reviewed journals often have a long lag in publication and focus on concerns largely of interest only to academics. This makes their relevance questionable. If you are in a hard-core academic discipline, this is not a problem. But in a professional school it is. On the other hand, if you go to web-based publications or widely read practitioner publications, your work is considered to be lacking in rigor. So, you can't win.

That should be the mantra of academics: you can't win. And that is the curse of soft requirements. No matter what you do, there will be a few who think it is great and many who think it is a silly waste of time. So, you can work long hours only to find that others think you are just wasting your time. And this causes many academics to simply not work long hours. It is a rational response. You can work like a fool and face criticism, or you can do nothing and face criticism. As Mark Twain would say, the wages are the same and one way is a lot easier.

After a while you find that academics who are productive can't help being productive and academics who can help it stop wasting their time. But what do they do instead? They have to do something and most turn to service, the homeless shelter for the academically dispossessed. And that will be the topic for next time.

Monday, January 18, 2010

Blessings and Curses

Now and then I like to provide some insight into life as an academic. And since I don't have a better topic on my mind this morning, I thought I would write about one of the greatest blessings of my academic professional life: I have an enormous amount of control over my time. The things that I have to do are limited. I have to meet with my classes, and when it gets down to it, that is probably about it. If you don't show up for your classes, the university could terminate your tenure. So that can be thought of as a hard requirement.

There are also firm requirements. I need to prepare for my classes. While I actually spend an enormous amount of time preparing for my classes, there is a wide variation in how much time academics spend in general. I spend a lot of time for several reasons. First, I am interested in the material that I teach, so broadening and deepening my knowledge is something I enjoy doing. Second, I like teaching, so I am constantly looking for ways to improve the delivery. Third, my field of Information Technology is constantly changing, so just keeping up with what is going on requires time (in fact, a LOT of time). It would not be unrealistic to say that I spend two to four hours in preparation for each hour I teach. However, I think I am at an extreme end of the spectrum. It is easily conceivable that once one has one's lectures nailed down, one may spend very little, if any, time in preparation. But, clearly, if one puts in no preparation time, it will eventually show up on course evaluations. So, it is best to be prepared. This is a firm rather than hard requirement because you have some wiggle room in how you pursue it. But, if it gets bad enough, you can be pulled out of the classroom with dire consequences.

If one were to think about this in terms of hours per week, the hard and firm requirements could translate into as few as four hours a week (two 2-hour classes with no preparation time) and as many as 22 1/2 hours per week (three 2 1/2-hour classes with two hours of preparation for each hour of class).
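To spell out the arithmetic behind those figures: at the low end, two 2-hour classes with no preparation come to 2 x 2 = 4 hours per week. At the high end, three 2 1/2-hour classes come to 7 1/2 contact hours, and two hours of preparation for each contact hour adds 2 x 7 1/2 = 15 hours, for a total of 7 1/2 + 15 = 22 1/2 hours per week.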

One of my colleagues once joked that the nice thing about being an academic is the flexibility - you can work any 80 hours a week that you choose. But how did we get from a range of 4 to 22 1/2 hours of hard and firm requirements to 80 hours a week? The answer is that the flexibility which is a blessing is also a curse. And as we get into the soft requirements next time, that curse will become more obvious.

Monday, January 11, 2010

Power, Wealth and Fame

I listened to a book on CD last week entitled The 48 Laws of Power by Robert Greene. It provided 48 laws (actually conflicting pieces of advice) on how to become powerful or more powerful. It was worth listening to, and most of the advice was fairly sound, although it would still take a fair amount of thought and reflection to apply it to the greatest advantage. Many reviewers of the book on Amazon were offended by the Machiavellian tone of the book, and this got me thinking about power and related goals such as wealth and fame.

Aristotle said that happiness is the only goal that we seek as an end in itself. We want to be happy simply because we want to be happy. However, other goals, such as power, wealth or fame, we want because we think they will make us happy. I think the thing that offended the reviewers on Amazon was that this book provided rules to make yourself more powerful without asking if you wanted to be powerful, how much power you really wanted or whether you wanted to pursue power as an end in itself.

We actually know a fair amount about how to achieve power, wealth and fame. The problem is that most people are unwilling to do what it takes to achieve them. Why is that? I think the problem is that these are not end goals in themselves. They are sub goals in the pursuit of happiness. If we have to do something that makes us unhappy in the pursuit of happiness then we have defeated our attempts. But, let us say for the sake of argument that we can pursue these goals without doing anything unpleasant. Is there still a problem?

Yes, there is. The problem is conversion. Money itself does not make one happy. It is the things one can do with money that increase happiness. Power does not make one happy. Again, it is the things one can do with power that may increase one's happiness. If one acquires a large amount of money, power or fame and has not figured out how to convert it into happiness, then the whole exercise has been pointless. There is nothing inherently wrong with the pursuit of power, wealth or fame as long as it is done in the context of a meaningful and satisfying life.

Monday, January 4, 2010

Ah, 2010

The New Year is officially underway. Two thousand and ten. Or, I suppose, Twenty Ten. I am not sure who gets to decide the proper way to say it. But no matter. It is upon us, and I think it is going to be a great year. There has been much talk about the past decade and how it was the decade from Hell. I have to admit that it was a rough decade, and I do think things will get better in the next one.

It is nice having these cyclical patterns to time - weeks, months, years, decades. They give us a way to structure our time and have built-in points of reflection and improvement. I have resolutions for the New Year, as I do every year. I am a big fan of reflecting on a time cycle, seeing what went right and what went wrong, and having a shot at improving it next time around. In an earlier post I mentioned how I do this each semester.

Over break, I made substantial revisions to the two classes that I will be teaching in the Spring. I also made some further progress on a book that I am writing entitled Writing Stories to Explore the Ethics of Technology. I have decided that when I have the first draft of the book completed, I will make it available on my website for free. I think more academics should do this. The whole publishing business has gone so far off track that it can only be justified as an alternative to nothing. However, making things available for free on the Internet is probably a lot closer to the original ideal of freely sharing scientific and scholarly knowledge. So, I will give it a shot and see how it goes.

I am looking forward to 2010, however you pronounce it, and will come back at the end of the year to reflect on whether or not it met my expectations.