Monday, June 29, 2009
One of the things that makes the study of complex systems difficult is the fact that we are almost always studying emergent properties. Emergent properties are properties of a system that cannot be predicted from, or even explained in terms of, the properties of its constituent parts. A simple example is water. The properties of water cannot be predicted from or explained in terms of the properties of hydrogen and oxygen. When you put these two elements together, the combination produces a whole new set of properties.
An example of emergent properties in social science would be the political systems that arise when people living together under certain circumstances require organization. But are political systems 'real'? Have they always existed? The answer is no. They are not real, and they have not always existed. Over time people noticed these emergent properties and began to group them into categories and give them names. The Romans recognized the emergent properties of people living in organized societies and called them 'the thing of the public', or res publica, the republic. Now we think of the republic, or any political system, as a real thing.
Economics came into being much later, with Adam Smith identifying a collection of emergent properties based on wealth rather than power. But who is to say that organizing these properties around wealth and power, rather than class, location, or time, is the best way? The point is that the things we are studying are constructs, and they become real over time by virtue of the fact that they are being studied.
In the same way, computer systems have emergent properties. When people or societies interact with computer systems, still other emergent properties arise. Are these things 'real'? No, they are just constructs that we create to organize our knowledge and give us ways to think about things that we are trying to understand. They are our best attempts to organize our knowledge and make sense of our experiences.
It is good to keep this in mind as we talk about virtual worlds, interactive entertainment, or any number of other phenomena that arise through the interaction of complex human systems with complex computer systems.
Monday, June 22, 2009
Interactive Entertainment
Over the past decade we have seen a dramatic increase in the popularity of massively multiplayer online role-playing games, also known by the unpronounceable acronym MMORPGs. These have largely been virtual-world video games such as World of Warcraft. However, virtual worlds such as Second Life have also provided interesting possibilities for online role playing. Some media pundits see this as the next generation of entertainment, also known as interactive entertainment.
To see the difference between traditional home entertainment such as television and interactive entertainment, consider the following scenarios. With traditional home entertainment, you select a show that you want to watch, put the television on the proper channel, plop down on a chair, and veg out while you are being entertained. It is passive and non-participatory.
Now, consider online role playing. You select a role or a scenario to participate in. You log into a virtual world. The story unfolds as you interact with the environment. You are thinking, planning and engaging. You might even interact with other role players. Instead of following a rigid script, the scenario evolves in a unique fashion based upon the actions of the player(s).
If you were to go to an automobile dealer to buy a car and there was only one model available, you would find that unacceptable. You would think the world a very dull place if everyone had to drive the same car. If you went to the store to buy a shirt and only one kind of shirt were available, you would find that unacceptable. However, if everyone watches exactly the same television show, we somehow find that perfectly acceptable. And if we watch the same show again, it is still the same show. Why is it that we find that acceptable? The answer, perhaps, is that we have come to expect it. However, online role-playing games are likely to change that.
But are online role-playing games just a new form of entertainment? Or is there more to them than that? I think there is quite a bit more, and we will turn to the various uses of online role-playing games for serious play next.
Monday, June 15, 2009
Social Interaction Technologies
Social interaction, according to Wikipedia, "is a dynamic, changing sequence of social actions between individuals (or groups) who modify their actions and reactions according to those of their interaction partner(s). In other words, they are events in which people attach meaning to a situation, interpret what others are meaning, and respond accordingly."
That is to say that social interaction is the mechanism by which people modify their (social) behavior in response to the actions of others. If you lived on an island with no other people, it is unlikely that your behaviors would change much beyond the behaviors necessary to survive. If you lived in a small tribe, any changes in your behavior would most likely be directed towards survival of the tribe as well. It seems that as the social unit becomes larger and more complex, the behaviors that you modify as a result of social interaction become more complex and of larger scope as well.
The point I am reaching for here is that social interaction technologies have made, or will make, the social unit extend to the entire planet. So, we will be interacting with each other and modifying our behaviors in ways that affect the entire human race. Some of this will undoubtedly be good. And some will undoubtedly be bad.
The question is: do we just let things unfold and see what happens, or should we take a hand in influencing the future? It is a tough call, because most people believe it is better not to risk making things worse with your actions than to make things worse in an attempt to make them better. But is this still true? Are we at a point where the impact of future changes is such that doing nothing is a greater evil than doing the wrong thing? I really can't answer that. All I can do is raise the question.
Monday, June 8, 2009
Big History
I am currently (not at this second, but at this time) listening to a wonderful lecture series from The Teaching Company called Big History. The lecturer is Professor David Christian from San Diego State University. I mention this for three reasons.
First, the lecture series from The Teaching Company are wonderfully interesting, covering a diverse range of topics, and I highly recommend them. I have listened to dozens of these lectures comprising hundreds of hours of informative enjoyment. I listen while in the car or while out walking or hiking. For the enjoyment value alone these lectures are worthwhile. But that is not all.
Second, exposure to a diverse range of ideas is very important, as you never know where important insights may come from. I can say with a fairly high degree of certainty that I have not listened to a single set of lectures that has not provided me with insights well beyond the topics of the lectures. These insights usually apply to things I am currently working on or thinking about and provide me with new ways of looking at problems.
And third, this ties in with what I was saying in my last post, so it is on point to this thread. However, that will take a little explaining. Christian's thesis is that you can unify all of history from the Big Bang to modern times, despite the vast differences in the scale of time and space, by looking at history as a process of the creation of ever greater complexity. Further, if we see this complexity as arising in steps, we can see the steps line up with our current academic disciplines: Cosmology (the Big Bang), Astronomy (stars and solar systems), Geology (planets), Chemistry (particles), Biology (life forms), Sociology (societies), and so on. This is a very clever idea, and while I have not done it justice here, I have sketched out enough to make my point.
Social interaction is the engine by which societies form and evolve. As such, it is on a par with, say, chemical reactions or biological reactions. We have a reasonably good grasp these days of chemical reactions, less of biological reactions, and less yet of social interaction. Tinkering with chemical reactions when you don't understand them puts you at risk of blowing yourself up. Tinkering with biological reactions when you don't understand them puts you at risk of poisoning yourself. Tinkering with social interaction must put you at risk of something. But we don't even understand enough to know what we might be risking. Yet we have these powerful accelerants called social interaction technologies. And we have no idea what the consequences of these technologies might be. So, context may be a little more important than we realize.
Monday, June 1, 2009
Technology versus Context
Most classes in Information Technology focus on the technology rather than on the context of the technology. For example, if one were to take a class in a programming language, say C#, just to use a current example, one would, hopefully, learn how to program in that language. This is a good thing and I don't wish to diminish it. However, I do wish to point out what is lost.
There is a history of programming languages. Over the past fifty years programming languages have changed dramatically. If you were to take someone who learned a programming language at one point and show them programming ten years later, they would probably not recognize it. In fact, today, most programming is the assembling of reusable components in an integrated development environment. That is very, very different from what I learned. In fact, it was an uphill climb for me to get used to this new paradigm.
C# is an example of an imperative programming language. There are also functional and logic languages, which were more popular when artificial intelligence was ascendant.
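To make the distinction concrete, here is a minimal sketch in C#, an illustration of my own using nothing beyond standard .NET and LINQ, not something taken from any particular course or textbook. It computes the same result two ways: an imperative loop that spells out each step, and a functional-style expression that simply declares what is wanted.

using System;
using System.Linq;

class ParadigmSketch
{
    static void Main()
    {
        int[] numbers = { 1, 2, 3, 4, 5, 6 };

        // Imperative style: say *how* to compute the result,
        // step by step, mutating an accumulator as you go.
        int imperativeSum = 0;
        foreach (int n in numbers)
        {
            if (n % 2 == 0)
            {
                imperativeSum += n;
            }
        }

        // Functional style (LINQ): say *what* you want -- the sum
        // of the even numbers -- and let the library do the iterating.
        int functionalSum = numbers.Where(n => n % 2 == 0).Sum();

        // Both print 12.
        Console.WriteLine("{0} {1}", imperativeSum, functionalSum);
    }
}

A logic language such as Prolog would go a step further still, stating only relationships between facts and leaving even the order of evaluation to the system; I mention it here only for contrast.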
Over the years the reigning programming languages have changed considerably. Languages like COBOL, PL/I, C++, and Ada were once the languages du jour, yet many current students have never even heard of them. Java is currently the language du jour, and it won't be long before it joins the rest of the pack in obscurity.
The problem here is that if you teach a person to program, they learn how to program in that one language. If you teach a person the context, they can learn new languages as they evolve. This is not generally a problem, as most students only program in the early part of their careers. By the time their skills have become out of date, they have moved on to other things like design, management, or working with clients.
One of Murphy's Laws of Technology states that if builders built buildings the way that programmers write programs, the first woodpecker that came along would destroy civilization. We have seen how shortsightedness in the financial industry can create problems. I wonder if we are not being similarly shortsighted in our technological infrastructure. Having programmers write programs that they know they will not have to maintain cannot be a good thing.
But, there is an even larger point here. Programming languages are just one instance of education in Information Technology. Overwhelmingly classes in Information Technology focus on the technology rather than the context. As I think about it, I also wonder if other disciplines don't do the same. Well, I am getting a little far afield here and should probably think about this a little more.