The All-Seeing Eye

Musings from the central tower…

What’s in a name?

This blog’s URL is “panoptical.wordpress.com.” I chose the name “panoptical” for several reasons. The first is that the inspiration for this blog was a concept I came across several months ago while studying Foucault, one I call the “panoptic model of power.” The second is that “panoptical” means “observing all,” and I intend this philosophy blog to be highly interdisciplinary: I intend, to the extent possible in my spare time, to observe all. The third is that “panoptical” gets few Google hits and is therefore a reasonably distinctive name.

The panoptic model of power merits more explanation, because I intend to delve very deeply into that subject and I have plans to use this model extensively to explain all manner of social institutions, from the free market to the public school system. Foucault made a study of Jeremy Bentham’s panopticon, a physical prison building designed, before modern surveillance techniques, to make it easy for a single observer to supervise a large number of people. The structure of the panopticon consists of a single, central tower surrounded by a large number of individual cells situated such that each cell can be seen into from the vantage point of the tower. Ideally, the inmates should be isolated from each other, so that no communication is possible. Additionally, the inmates ought not to be able to see into the central tower, so that at any given time they will not be able to determine whether or not they are under surveillance.

The proposed psychological effect of the panopticon is that the inmates exercise self-surveillance and self-discipline. Because the inmates know, at any given time, that they might be under surveillance, they will tend to watch their own behavior to ensure that it conforms to the way they would act if some authority figure were actually watching. It may also be an important aspect of the panopticon that the inmates are isolated from each other. The twin effects of isolation and self-surveillance serve to magnify the power of the central authority over the inmates.

The implication of the panopticon is that this panoptic magnification of power also takes place outside the physical structure. In other words, isolation and self-surveillance occur in individuals in our society due to various other institutions and social factors, and it may be the case that when these things come together with a perceived authority or set of norms, they govern the individual as surely as if the individual were actually in a prison cell. This is where Foucault comes in, because he re-envisioned power as the cumulative effect of every relationship and institution, rather than as the simple effect of one person ruling or dominating another.

This is all a very brief summary of a set of theories that are further-reaching in their implications than perhaps anything I’ve ever studied, so if things seem a bit unclear, don’t worry – I’ll be going over all of these issues with a fine-toothed comb. To give you an idea of just how far-reaching these implications are, I’ll say this. For about five years I adopted the political and economic philosophy of Libertarianism, studying many of its facets and related ideas, such as Objectivism, Austrian Economics, praxeology, and anarcho-capitalism. All of those systems, at their very core, assume a theory of power that Foucault may have made obsolete. The panoptic model of power and its implications could, therefore, lead me to retrace five years’ worth of steps and start over at the beginning. The scope of that project is why I felt I needed a new blog, and the inspiration, the panoptic model of power, is where this blog gets its name.


January 27, 2008 | About

Traveler’s Dilemma and Opportunity Cost

As a follow-up to my last post, I thought now would be a good time to say some things about opportunity cost, or The OC.  Please do not confuse this with any other things called The OC.

Opportunity Cost is an economic analytical tool – a measure of the cost of a missed opportunity.  It goes like this.  Let’s say you have a dollar and you’re standing on the street at a hot dog cart.  The cart is selling pretzels for $1 and hot dogs for $1.  If you buy the hot dog, you can’t buy the pretzel.  Therefore the opportunity cost of buying the hot dog is one pretzel.  Conversely if you buy the pretzel you can’t buy the hot dog.  So the opportunity cost of the pretzel is one hot dog.  Simple, right?  At first glance this seems a trivial and reductive measure for an economist to be thinking about, but in real life, when applied to more complex situations, we can see the value of considering the opportunity cost.

Let’s try another example.  You have a dollar but you aren’t hungry, and you’ve got a year.  You can keep the dollar in your pocket and at the end of the year you’ll have a dollar.  Or you can put the dollar in a bank and at the end of the year you’ll have, let’s say, $1.05.  The average person would think that if they keep the dollar in their pocket, they haven’t lost anything, and in one sense this is true.  However, they have missed something – the opportunity to earn $.05.  The opportunity cost of holding onto the dollar was five cents.  Doesn’t seem like much, but what if it’s a thousand dollars?  A million?
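
To put rough numbers on it, here is a tiny Python sketch (the 5% rate and the amounts are just the assumed figures from the example; the variable names are mine):

principal = 1000000          # try 1, 1000, or 1000000 dollars
rate = 0.05                  # the assumed annual interest rate from the example

in_pocket = principal                   # value after a year of doing nothing
in_bank = principal * (1 + rate)        # value after a year in the bank

opportunity_cost = in_bank - in_pocket  # the interest you passed up
print(opportunity_cost)                 # 50000.0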

The point is, when there’s money at stake it pays to consider the opportunities that you have when making choices, because in some sense, missing the opportunity to earn money is sort of like losing money, even if it’s money you never actually had.  An opportunity is worth something.  If you don’t believe me, play poker.  If you fold a hand, and it turns out at the end that you would have won if you had stayed in, you will feel like you have lost something.  What you’ve done is missed an opportunity, and the loss you’re feeling is the opportunity cost.  Folding may have been the right decision based on the odds, but you’ll still feel bad that you didn’t get the pot.

So what does opportunity cost have to do with the Traveler’s Dilemma?  Well, it’s another way of evaluating possible plays in the TD, and it demonstrates a major flaw in the models used by game theorists to “solve” the TD.

To recap, in the TD, game theory says that logically speaking, a player ought to play (2).  Many people intuitively feel that they should play higher, and (100) is perhaps the most common play, with (95) to (100) comprising the majority of plays in some experiments.  According to Kaushik Basu, (2) is the correct or best play, because of a game theory concept known as the Nash Equilibrium.  Further, (100) is the worst play, because it is the only play in the game that is “beaten” by every other play.  In other words, if player 1 plays (100) and player 2 plays anything else, player 2 will be rewarded more points than player 1.  If this logic is to be believed, (2) is a better play than (100), and people, if they are acting “rationally,” ought to play it.

But let’s look at the OC to see if that’s true.  Let’s say player 1 plays (100) and player 2 plays (2).  The rewards, then, are 0 points to player 1 and 4 points to player 2.  However, given player 2’s play of (2), the highest score player 1 could possibly have achieved by making a different play is 2, by playing (2).  Any play other than (2) results in a score of 0.  So, player 1 lost 2 points by not playing (2), or, to put it another way, his opportunity cost for playing (100) was 2 points.

Player 2, however, is in a much worse situation.  He played (2) and got 4 points.  Given player 1’s play of (100), the highest score possible for player 2 was 101, with a play of (99).  In other words, by playing differently player 2 could have gotten 101 points, but instead he got 4.  That means that his opportunity cost for playing (2) was 97 points.

So, in the above example, on the face of it, it seems like player 2 won – he got 4 points, while player 1 got none.  However, if you look at it a different way, player 2 lost 97 points while player 1 only lost 2.  If you consider the scale of a loss of 2 vs. a loss of 97, you see that a play of (100) is much less risky than a play of (2).

In fact, the OC of playing (100) stays small no matter what the opponent does: against an opponent’s play of (2) it is 2, against any play from (3) to (99) it is 3, and against (100) it is 1.  The OC of playing (2), however, is (opponent’s play – 3).  That means that against any opponent play of (6) or higher, the opportunity cost of playing (2) is higher than the opportunity cost the opponent incurs.
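
These numbers are easy to check by brute force.  Here is a rough Python sketch, assuming the standard TD scoring with a 2-point reward and penalty (the function names are mine, purely for illustration):

def payoff(my_play, opp_play, bonus=2):
    # My Traveler's Dilemma payoff: both players are scored on the lower play,
    # with a bonus for undercutting and an equal penalty for being undercut.
    if my_play == opp_play:
        return my_play
    low = min(my_play, opp_play)
    return low + bonus if my_play < opp_play else low - bonus

def opportunity_cost(my_play, opp_play, lowest=2, highest=100):
    # Best payoff I could have earned against opp_play, minus what I actually got.
    best = max(payoff(p, opp_play) for p in range(lowest, highest + 1))
    return best - payoff(my_play, opp_play)

print(opportunity_cost(100, 2))    # 2 (player 1's missed points in the example)
print(opportunity_cost(2, 100))    # 97 (player 2's missed points)
print(opportunity_cost(100, 100))  # 1
print(opportunity_cost(2, 50))     # 47, i.e. the opponent's play minus 3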

So the (2) player almost always loses more money than his opponent – not that the players are losing money that they actually possessed, but money that they could have – and perhaps should have – earned.  If you ask any economist or poker player, that loss can sting just as much as a loss of cold hard cash.

The thing is, if you evaluate the Traveler’s Dilemma in terms of Opportunity Cost, the definition of improving one’s position changes, and therefore so does the Nash equilibrium.  It’s a situation where gaining money and not losing opportunity are not the same thing – and this situation probably comes up fairly often in the real economy, which is why opportunity cost is important as an economic concept.  Rational choices and selfishness, therefore, cannot necessarily be evaluated successfully using only the rubric of amassing the most gain by the end of the game.  There are other measures of success, and people do use them.  Game theorists and economists alike would do well to remember that.

January 25, 2008 | Economics, Game Theory

The Traveler’s Dilemma

Now for some real content. I came across this article in Scientific American about the Traveler’s Dilemma. To explain briefly, the TD is a game in which two players are each asked to select a number within certain boundaries (2 and 100, in the example). If both players select the same number, they are rewarded that number of points. (In the example, each point is worth $1, which makes the game of more than academic interest.) If one player’s number is lower, they are each awarded points equal to the lower number, modified by a reward for the player who selected the lower number and an equal penalty for the player who selected the higher number (2 points each, in the example). So, for instance, if you choose (48) and I choose (64), you get 50 points and I get 46 points.
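
In code, that scoring rule reads something like the following minimal Python sketch (I am assuming the reward and penalty are 2 points each, which is what the example implies; the function name is mine):

def td_payoffs(p1, p2, bonus=2):
    # Returns (player 1's points, player 2's points) for plays p1 and p2.
    # Matching plays score at face value; otherwise both players are scored on
    # the lower play, plus a bonus for the low player and a penalty for the high one.
    if p1 == p2:
        return p1, p2
    low = min(p1, p2)
    if p1 < p2:
        return low + bonus, low - bonus
    return low - bonus, low + bonus

print(td_payoffs(48, 64))  # (50, 46), matching the example above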

The intuition that I had upon reading the rules of this game was that it would be “best” for both players to choose (100). That is certainly true from a utilitarian point of view: (100, 100) results in the highest total number of points being given out – 200. The runners up are (99, 99), (100, 99), and (99, 100) with 198. However, there are two small problems – here’s the dilemma part – that prevent (100, 100) from being the “best” choice: one, the players are not allowed to communicate, and two, the (100, 99) and (99, 100) plays result in one player receiving 101 points – an improvement, for that player, over a 100 point reward.

So, the reasoning goes, if player one predicts that her opponent will play (100), she should play (99) in order to catch the 101 point reward. Her opponent, however, ought to use this same strategy, and also play (99), in which case player one ought to play (98) in order to trump her opponent, and so on and so forth. This reasoning degenerates to a play of the minimum number – in the example, (2). According to Basu, the author of the article, “Virtually all models used by game theorists predict this outcome for TD.”
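
That chain of reasoning can be mechanized. The sketch below (Python again, under the same assumed 2-point reward and penalty) repeatedly replaces a play with the best response to it, and the play slides all the way down to (2):

def my_payoff(my_play, opp_play, bonus=2):
    # My score against a fixed opponent play, under the assumed 2-point bonus and penalty.
    if my_play == opp_play:
        return my_play
    low = min(my_play, opp_play)
    return low + bonus if my_play < opp_play else low - bonus

def best_response(opp_play, lowest=2, highest=100):
    # The play that maximizes my score against a fixed opponent play.
    return max(range(lowest, highest + 1), key=lambda p: my_payoff(p, opp_play))

play = 100
while best_response(play) != play:
    play = best_response(play)  # 99, then 98, then 97, and so on down
print(play)                     # 2, the mutual best response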

However, reality does not follow these models. When people are asked to play the TD, many of them choose 100. Many of them choose other high numbers. Some seem to choose at random. Very few choose the “correct” solution – (2) – predicted by game theory. Something’s up.

Basu takes this to mean that all of our assumptions about rational behavior need to be questioned. With my philosophical background, I happen to have different assumptions about rational behavior than the mainstream, and so for me the results of the TD are not surprising in any way. But perhaps the best way to explain why the results do not surprise me is that I am a gambling man.

January 24, 2008 | Economics, Game Theory

The System Of The World

Aside from being the final installment in Neal Stephenson’s excellent Baroque Cycle, The System Of The World is an important part of a metaphor for my approach to matters philosophical. It goes like this:

Picture a system of equations. Or just consider this one:

a + b = 3
2a + b = 4

It’s a very simple system with a very easy solution: a = 1, b = 2. But how does one solve this system? Well, one method is to examine one equation to try to find a relationship that can help us solve another equation. If we consider the first equation, we can discover that b = 3 – a. If we use this insight about b’s value in the second equation, we get the equation 2a + 3 – a = 4, which we can then solve for a. Once we know that a = 1, things become very easy.

So a system of equations can be solved by, essentially, cross-referencing the information in one equation with the information in the others.
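
For what it’s worth, that cross-referencing can be done mechanically. Here is a small illustrative sketch of the same substitution, assuming the SymPy library is available:

from sympy import symbols, Eq, solve

a, b = symbols("a b")
eq1 = Eq(a + b, 3)
eq2 = Eq(2 * a + b, 4)

b_in_terms_of_a = solve(eq1, b)[0]                    # from eq1: b = 3 - a
a_value = solve(eq2.subs(b, b_in_terms_of_a), a)[0]   # 2a + (3 - a) = 4, so a = 1
b_value = b_in_terms_of_a.subs(a, a_value)            # b = 3 - 1 = 2

print(a_value, b_value)  # 1 2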

This sort of action, however, is not limited to manipulation of numbers. Philosophy, I believe, works the same way. We can analyze one work of philosophy, or literature, or what have you, and use the conclusions we draw to analyze another different work in a different field, and from this cross-referencing we can derive new equations – perhaps ones with easier solutions.

Let’s work through a very prominent and easy example: the Oedipus complex. Freud looked at a dramatic and mythological character, Oedipus, and from his story drew some conclusions about human nature, which he then applied to the field of psychoanalysis to achieve new and unexpected results. We can challenge Freud’s particular assertions, his methods, etc., but we cannot challenge the fact that Freud was incredibly influential and his insights essentially generated a whole new science.

So where do we find insights like Freud’s? Insights that, regardless of their ultimate validity, help us to look at old problems in new ways? Insights that open up entire new fields of enquiry? The answer is, anywhere.

Each philosophy, each story, each insight, represents a piece of information, an equation in the System of the World. Each equation helps us decode other equations, helps us situate other ideas in reference to one another. All that is needed is for us to find relations, but the fun thing is that everything is related. Anything can be a metaphor for anything else, if creativity and thought are put into it. You might even say that every thought and image we have is a metaphor – after all, a picture of a pipe is not a pipe. And now we’re verging on epistemology and cognitive science. How, exactly, are thoughts organized in our minds? How do we form knowledge? Difficult questions, and well beyond the scope of this post. Suffice it to say that time will tell whether my methods are valid – whether the insights I am able to produce contain truth or falsehood.

January 24, 2008 | About

Hello world!

Ah, the Hello World. I can still remember my first programming class – we used QBasic, in which the Hello World program consisted of the following instruction:

PRINT "Hello World!"

Before that class I used to “program” my personal computer: A Commodore 64. That machine used plain old BASIC, and my first program reflected my priorities at the time:

10 PRINT "NEAL"

And that, I hope, may be contorted into some sort of useful metaphor concerning this blog. My first program was not the conventional, didactic, “Hello World,” but rather something that I dreamed up. I’m not claiming that it takes much intellectual horsepower to conceive of displaying one’s own name on a TV screen, which is what we used for a monitor that first year we got the Commodore. But since then I have made my own path in many more significant ways, and ventured, untaught, into many additional fields. Computer programming wasn’t the first (it was preceded, in predictable little-boy fashion, by my study of astronomy and dinosaurs), but it was the first that really stuck in my personality, that really gave me a new tool with which to analyze and communicate ideas.

And that, in a nutshell, is what this blog is about: finding, evaluating, and using an ever-expanding collection of analytical tools that will help us better understand and affect the systems around us.

I like to tell stories, and a lot of my previous writings have consisted of a story followed by the implications of that story and how they apply to some current issue. It’s much like a sermon, which tells a story from the Bible, draws a concept out of that story, and then from that concept extrapolates some lesson or advice for the congregation hearing it.

The thing is, I see stories everywhere. In economics, politics, philosophy, psychology, theatre, literature, history, cognitive science, even mathematics. When I hear a story told by an economist I want to learn a new way to look at human behavior. When I hear a story told by a philosopher I want to learn a new way to look at politics. My goal is to come up with whole new ideas – but I’ll be satisfied with new ways of looking at old ideas – that can be usefully employed to change people’s lives. That’s a tall order for a lone blogger, and I may end up, like Albert Jay Nock, writing for an unknowable potential future audience (“the Remnant”) in need of my ideas, or worse, for no one at all. But despite the risk of failure or irrelevance, I have ideas, and I might as well write them down before they go away, or else, to quote Emerson, “to-morrow a stranger will say with masterly good sense precisely what we have thought and felt all the time, and we shall be forced to take with shame our own opinion from another.”

January 24, 2008 | About