Sunday, December 31, 2006

Constructing Logical Fallacies for Fun And Profit

The Anthropic Principle is about as Odds-Are-One-y as anything outside this blog gets. We observe that there is life in the universe, ergo the universe must have evolved such that it can support life. In the possible universes where this didn't happen, nobody is hanging around the coffee shop on the corner discussing such things (because of course in universes whose physical laws don't support life, they don't drink coffee. They drink Post-Galactic, Black Hole-Warmed Plasma Beverage. Duh). You might conclude that this is the logical end of the line: there is no point in further discussing the meaning of us being here versus being not here, because in only one of those cases can there be any discussion of any kind (this is certainly the viewpoint held and frequently advocated by those of us here at OaO). But there are still interesting questions one could ask about what We being Here Now might signify.

If Earth were the only planet one were aware of, one might reasonably wonder what the odds were that this one planet happened to exist at a particular distance from a middle-aged Type-G star such that water could exist in liquid form (yes, I know the odds of this are one, as it has already happened. Be with me in this other, non-OaO place for a second...). This being the popular scientific view for most of the history of man, a reasonable scholar operating under these a prioris might have looked at them and made this seemingly entirely logical inference: "I can think of two explanations for the existence of this planet which can support life. Either it was blind luck or it's the action of an unseen demiurge. The former is incredibly unlikely, therefore by strict laws of probability, it has to be the latter."

Actually, there was a third explanation: the universe is filled with an astronomically large number of stars, an astronomically large number of planets, and has been around for 13.7 billion years or so, so the appearance of at least one planet that has liquid water is not that surprising. The more perceptive among you will notice that the above argument looks an awful lot like an application of Occam's Razor. You might therefore conclude that the flaw in its reasoning is that there was at least one possible explanation our scholar didn't think of. But this is not why the argument is flawed.

We can summarize the logic behind the inference above like this:
  • I observe phenomenon X.
  • Phenomenon X has two possible explanations: E1, which has probability P1 of occurring, and E2 which has probability P2 of occurring.
  • P1 is much lower than P2, therefore E2 is the more likely explanation for X.
First, note that this is not Occam's Razor, which (very roughly) states that a simpler explanation for an observed phenomenon is more likely than a complex one. More to the point, there is no law of logic or mathematics that states this. It's a completely fallacious construction. Here at OaO, we say it this way: once the phenomenon has been observed, any odds go out the window.
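One hedged way to see why the summarized inference isn't a law of logic: the jump from "P1 is much lower than P2" to "therefore E2" silently assumes things about how well each explanation actually predicts X. Here's a toy Bayes calculation (all the numbers are invented for illustration) in which the a-priori-unlikely explanation still wins:

```python
# Toy Bayes computation (all numbers invented) showing that comparing the
# raw probabilities P1 and P2 of two explanations settles nothing by itself:
# after observing X, what matters is prior * likelihood for each explanation.
def posteriors(prior1, prior2, like1, like2):
    """P(E1|X) and P(E2|X), assuming E1 and E2 are the only explanations."""
    joint1 = prior1 * like1   # P(E1) * P(X|E1)
    joint2 = prior2 * like2   # P(E2) * P(X|E2)
    total = joint1 + joint2   # P(X) -- which observation has already fixed
    return joint1 / total, joint2 / total

# E1 is a thousand times less probable a priori than E2 (P1 << P2),
# yet it predicts X so much better that it ends up the better explanation:
p1_given_x, p2_given_x = posteriors(prior1=1e-6, prior2=1e-3,
                                    like1=0.5, like2=1e-5)
```

The scholar's inference only goes through if you supply those extra assumptions about likelihoods, which is exactly the machinery that "strict laws of probability" never handed him.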

As I've learned from The Trouble With Physics, this is of interest to current scientific thinking because they're now asking the same question about the universe. Given that we only observe one universe, it seems rather unlikely that it would be one with physical laws that allowed the formation of stars, galaxies, and planets. Having been fooled the first time around, the popular conclusion is that therefore there must be a multitude of 'verses, all with different physical laws, that can't be detected by current means. It's a good metaphor for the Earth being just one of many planets. But that's the only argument in its favor: again, the argument for it falls into the same logical hole--the seeming incredible unlikelihood of the only universe we observe supporting intelligent life does not create any likelihood of many unseen others. There's no logical inference that would say that it does.

Next: The Fun and Profit part!

Friday, December 15, 2006

The Single

A couple of months back Salon held a song contest from their music blog. Thinking there would never be a better demographic upon which to inflict my music than Salon readers, I took Mrs. Transient Gadfly's favorite song of mine and produced the living crap out of it. I was, at the time I entered it, quite proud of my creation and full of, you know, whatever it is that people who are rock stars in their own minds are full of. Then the contest actually happened, and I neither made the finals, nor the honorable mentions, nor was there any acknowledgment that I existed on the earth or produced music from its surface--and, to make the implied rejection all the more clear, Salon featured some fairly terrible songs along the way (most of them were great, but some of them really weren't). It turned out I had produced a song that was, as far as the music bloggers at Salon were concerned, neither particularly good, nor particularly bad, nor in any way notable. It was apparently just not worthy of mention.

I understand being a professional musician to be an incredibly hard, crappy way to make your living--record labels want to screw you, promoters don't want to pay you, you live in hotel rooms, generally don't make very much money (with, obviously, a handful of very famous exceptions), and have to live in the perpetual hope that the next song or next album is going to be the one that puts you over the top. That doesn't mean I haven't lived my entire adolescent-to-adult life secretly longing to be one. It just means that I haven't ever gone after it with any amount of fervor that I couldn't later dismiss with a shrug of the shoulders saying, "oh well, I didn't really want that anyway."

I did kind of want it. A little bit.

Men Of Luggage (4:12)
(if this link doesn't work for you, try our Artist's Page on MacIdol.com)

Next: Man's inhumanity to Man!

Wednesday, December 13, 2006

A Brief Missive

Dear Jim Rutz,

Eating Soy did not make you gay. You're. Just. Gay. Totally, totally, gay.

Regards,
T.G.

(with a nod to broadsheet).
Next: An exposé on textured vegetable protein!

Tuesday, November 28, 2006

Applied Mathematics

A good article about the perils of mass screenings (for security, disease, etc.) appears in Slate today. It points out the proverbial sharpness of the other edge of the Sword of Inference (please don't strain yourself going after that metaphor. It's not worth it). There are perils in trying to make discrete inferences based on larger trends, and there are perils in trying to apply a trend as a discrete principle to large sets of data.

I took one applied math course in college, and in it I learned a bunch of things I have never since applied to anything, anywhere, ever. I really liked the professor, though; his name was Dr. Elderkin and he had these enormous hands that he would flap open and closed while he was lecturing, creating gale force winds that blew chalk dust around the room. Mrs. T.G. recently pointed out that I do this myself sometimes, so apparently the habit had quite an effect on me.

One of the useful things I learned from Dr. Elderkin is why screening for, e.g., diseases across populations is counter-productive and makes for bad social policy. Here's an example of the painfully stretched metaphor I tried to construct above.
  • Say the blood test for HIV antibodies correctly identifies the presence of the antibodies 99% of the time, and 99% of the time it will correctly tell an uninfected person that he or she is not infected. Let's further guess that one million people in the US are infected with HIV (I'm making all these numbers up, but they're reasonably close to the actual numbers).
  • We test all adults--say, 100 million people--for HIV, and again we'll estimate that 1 million people are actually HIV positive.
  • The test is 99% effective, so (.99 * 1,000,000 = ) 990,000 HIV positive people learn that they are HIV positive. But it also gives a false positive 1% of the time, so of the remaining population, (.01 * 99,000,000 = ) 990,000 are given false positive diagnoses. That's 1,980,000 positive results, half of which are wrong.
  • Your HIV test, which is quite accurate for any individual, turns out to produce positive results that are right only 50% of the time when applied across an entire population.
In actuality, the HIV test is only something like 96% accurate, so the results aren't even going to be as accurate as my example. Moral of the story? Applied math: not generally applicable.
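The arithmetic in the bullets above is small enough to check directly. Here it is as a sketch, using the same invented numbers (the only quantity with a fancy name is the last one, which epidemiologists call the positive predictive value):

```python
# The screening arithmetic from the bullets above, same invented numbers.
population = 100_000_000     # adults tested
infected = 1_000_000         # actually HIV positive
sensitivity = 0.99           # P(positive result | infected)
specificity = 0.99           # P(negative result | not infected)

true_positives = sensitivity * infected
false_positives = (1 - specificity) * (population - infected)
total_positives = true_positives + false_positives

# Positive predictive value: the chance that a positive result is real.
ppv = true_positives / total_positives
```

With these numbers the true and false positives come out nearly identical, so a positive result is a coin flip--and the worse the base rate, the worse that coin flip gets.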

Next: inference vis à vis implication!

Wednesday, November 15, 2006

This Post Is Not About Veronica Mars

Here is something about which the only response I can muster is: holy, holy God, there are stupid people in the world, and they're all apparently in government. The Missouri State legislature decreed that the reason we have a problem with illegal immigrants coming and working in the US is that we have aborted some 80,000 potential Missourians. Apparently right now these potential humans would be in their working prime, eager to snap up those positions in abattoirs and rendering plants that pay three dollars an hour with no health insurance. In addition, unlike those foreign illegals, these red-blooded Hypothetical Americans would be patriotic and not demand services of any kind for themselves or their myriad hypothetical children.

If the a prioris of the conservatives on the panel that produced this report (the panel had ten Republicans, all of whom signed the report, and six Democrats who a) didn't, and b) were apparently extremely embarrassed about its conclusions) weren't transparent enough (Abortion is bad! Illegal immigration is bad! If only there were some way we could unify the two...), they also managed to throw in there that "liberal social welfare policies" are to blame for Americans in general not working, and that all income taxes should be abolished in favor of sales taxes. Missouri: Punishing the Poor for Being Poor Since 1821.

I really can't, you know, possibly address all the flawed social policy here. Nor could I possibly ever cover all the fallacious logic that this one piece of policy analysis manages to stir up. And realistically I don't expect everyone to be able to root out flaws when trying to construct arguments that attempt to assign cause to observed phenomena by positing historical counter-factuals--those arguments are difficult at best. But Mother of God, who on Earth could possibly think that the source of America's social ills is that IT DOESN'T HAVE ENOUGH POOR PEOPLE?

</rant>

Next: Veronica Mars: The Nancy Drew of historical counter-factuals?

Sunday, November 12, 2006

The Ur Web 2.0 Post

In the comments of Tarn's blog, Dan asks apropos of nearly nothing, what Amazon EC2 is. While on the one hand, I'm slightly stunned that anybody outside a very small collective of developers has heard of Amazon EC2, Dan also says that he doesn't really understand what I talk about when I talk about Web Services. This is always true of everything I write about. But I'll try again anyway.

Most of the populace that throws around the term "Web 2.0" tends to mean things like YouTube, Flickr, MySpace, and Blogger--things where the content is user-generated. This ilk of websites, while being "new" in some sense, has never struck me as a particular innovation in and of itself (though the technology that runs them, in many cases, is). To me, Web 2.0 is the process of opening up the platforms on which the web is built, so that the software that powers applications that have heretofore only existed in the browser can be used everywhere--the desktop, your mobile devices, your television, and so on.

When I explain what a web service is at parties, I use the following example. If you, a person, want to know some piece of information about any book out there, you'd go to your browser and surf to Amazon.com and look it up. There you'll find the cover image, all the bibliographic information, some reviews, etc. Now say you're a computer program and you want that same information about a book (because, e.g., you're a library program and you want to be able to display information about a book when somebody scans the barcode). You, alas, do not know how to use a web browser. You could be programmed to read web pages, but web pages come in all sorts of formats and change all the time, and computer programs have to be taught to read each one individually, because they're just that stupid sometimes. So web services are like web pages for software applications--sets of functions that they can call in order to get information that will be returned in a format that they, the software applications, will understand. In the example I just gave, the web service would have a function that, when you send it a book's ISBN, returns a formatted set of information about that book, with its title, author, number of pages, and so on, in a format that the software has been programmed to understand.
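For the programmers in the audience, the party explanation reduces to a very small sketch. The endpoint URL, the response fields, and the helper names below are all invented for illustration--this is not Amazon ECS's actual request or response format:

```python
import json

# Sketch of the web-service idea: instead of scraping a human-oriented web
# page, the program asks a service and gets back an answer in a fixed,
# machine-readable format. Everything here is hypothetical.
def lookup_by_isbn(isbn, fetch):
    """Call the (hypothetical) book service; fetch(url) returns raw JSON."""
    raw = fetch("https://example.com/bookservice?isbn=" + isbn)
    return json.loads(raw)

# Stand-in for the network call, so the sketch runs on its own:
def fake_fetch(url):
    return json.dumps({"title": "Moby-Dick", "author": "Herman Melville",
                       "pages": 635})

book = lookup_by_isbn("9780142437247", fetch=fake_fetch)
```

The fixed format is the whole point: the program reads book["title"] directly, and nobody has to teach it to scrape a web page that might be redesigned next Tuesday.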

Amazon has a free web service called ECS that does exactly what I've just described, as well as many other things. You can sign up for it (and encounter some of my handiwork) here. Then you too could write software applications that are web-integrated and feature-rich and blah blah blah Web 2.0 blah. You are, in fact, using web services all the time--most of the widgets in OSX Dashboard, for instance, make web service calls to get their information. iTunes, when you pop in a CD, is querying a web service to figure out what album it is so that it can populate the album information.

The nice thing about web services is that they allow you (for a definition of the word "you" that involves "you" being a "software developer") to use the software that Amazon wrote in more ways than just browsing their web site. ECS, for instance, allows you to use their shopping cart software, so that if you're running your own e-commerce website, you can use Amazon's shopping cart instead of writing your own (if written by a more business-oriented person, that sentence would have contained the word "leverage," and probably "synergy," and "core competencies," but I would have had to shoot myself in the head afterwards). The point is, Amazon sells things. They started out with books, then music, then pretty much everything that was a thing, and now they're selling (or in some cases, just giving away) the use of the software that they wrote to sell those things in the first place.

Amazon S3 and Amazon EC2 are the next phase of that: selling something that Amazon has in excess most of the year (hard drive space and server time--Amazon has to have enough hardware on hand to run the website at Christmas when we get vastly more traffic, but the rest of the time these servers are just sitting around). EC2 (or "Elastic Compute Cloud"--which, by the way, was called "Amazon Execution Service" at first, until someone finally noticed that the name was liable to give people the wrong idea about what the product did) allows you to create computer jobs (e.g. you need to render some massive 3-D images, or sift through a mass of data from SETI looking for patterns) and rent time on Amazon servers to perform them, which means you don't have to buy and maintain your own server. And like S3 (which, to review, is a virtual harddrive), it's also a web service, so you can do this from a computer program. Which in turn means you could, e.g., run a website that renders massive 3-D images or analyzes SETI data, host the actual website on your tiny slow machine in your basement, and farm all the labor and storage out to Amazon, thereby changing the world. Or at least that's the way it looked on paper.
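The rent-a-server idea, reduced to its web-service shape, is just: submit a job, get a handle back, ask for the result when it's done. Everything in this sketch is invented--the class, the method names, the in-memory "cloud"--and the real EC2 API differs in all its particulars; a real service would also queue the work and run it later on its own hardware rather than inline:

```python
# A toy stand-in for a compute-for-rent web service. Hypothetical throughout:
# submit() accepts work, poll() reports status, result() returns the answer.
class FakeComputeCloud:
    def __init__(self):
        self.jobs = {}
        self.next_id = 0

    def submit(self, work, payload):
        """Accept a job. (A real service would run this remotely, later.)"""
        job_id = self.next_id
        self.next_id += 1
        self.jobs[job_id] = {"status": "done", "result": work(payload)}
        return job_id

    def poll(self, job_id):
        return self.jobs[job_id]["status"]

    def result(self, job_id):
        return self.jobs[job_id]["result"]

cloud = FakeComputeCloud()
job = cloud.submit(lambda data: sum(data), [1, 2, 3, 4])
answer = cloud.result(job) if cloud.poll(job) == "done" else None
```

The tiny slow machine in your basement only ever runs the submit/poll/result calls; the sum (or the 3-D render, or the SETI sifting) happens on somebody else's hardware.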

Thursday, November 09, 2006

We Used To Be Friends (A Long Time Ago)

Tarn alerted me to the possibility that there might, out there in the universe, be a dashboard widget that would allow me to post to The Odds Are One from OSX Dashboard. And indeed, such a widget exists. Here I am, using it. And lo, the world is made new. Surely this is the technological advance that will foment my creative breakthrough, allowing me the freedom to blog every day, to dash off some brilliant paragraph or two just before I head off to bed. Surely the only thing that was standing in the way betwixt me and Great Art was better UI.

What with the Forces of Evil having been (temporarily) defeated yesternight, The Gadflies have, this night, gone out for a celebratory evening involving sushi and lots of alcohol with our compatriot Mita. Somewhere last evening, after the votes were cast but while the Webb/Allen election was still in doubt (it's been called for Webb as of this writing. For the leftist radicals that live in the Gadfly household, Webb is absolutely nothing to write home about, but being that he's Number 51, for tonight he's one of us. In summation: hurrah!), I conceived of an OaO post about cause, wherein I would talk about how pundits would, in the future, attribute some manner of cause to Webb's or Allen's defeat. In the event of the latter, of course, they'd talk about his "Macaca Moment," and how this brought into stark relief the fact that George Allen is a fucking lunatic. If he had won (which he didn't. I'm quoting myself here: "Hurrah!"), somehow, that factor would no longer achieve the golden label of "Cause." This in spite of the fact that it would have influenced the exact same number of votes (in review, all votes have already been cast in this post-hoc estimation of voter intent)--in that case they'd be talking about how Webb's overt sexism and failure to articulate a position other than one of anti-party-in-power weren't enough for him to take the crown. The thing is, we exist in a state of superposition of Webb/Allen election. It happens that Webb has 7200 more votes out of 4 million or so, and so he takes office, but as with the Presidential elections of 2000 and 2004, somebody has to take office despite the fact that nobody won. So suddenly, and for the rest of time, "Macaca" becomes a political cliche for that moment of political implosion, in spite of the fact that Allen ended the race in a virtual tie with the bass-ackwards anti-feminist ex-Republican running against him. Narrative is funny that way.

There are so many problems here that I can't even begin to describe them. One, the Blogger Dashboard widget doesn't have a scrollbar, so instead of taking the hint that I had to write shorter posts, I posted this and am editing it in a browser. Second, I just blogged about the thing I meta-didn't-blog-about. But the real problem here, in true Chambersian form (I'd have a better href for you there, but I'm too drunk right now), is that I have already blogged this post 10,000 times in 10,000 different ways, and you read it and understood it the first time, and indeed recognized the phenomenon of Odds-Are-Oneness in whatever discipline you came from already, and were already going to subconsciously apply the principle every time you read about politics and "Macaca Moments" for the rest of your life. Yet I just wrote this post anyway. So what the hell am I doing?

Tuesday, October 31, 2006

The State Machine

Dorky engineer joke of the day (from a cartoon on a co-worker's office door):
  • Person: Make me a sandwich.
  • Other Person: What? Make it yourself.
  • Person: sudo Make me a sandwich.
  • Other Person: Okay.
One more thing I wanted to call out from the post on String Theory a couple of weeks ago was another upshot of the existence of Planck's Constant: the existence of a smallest measurable unit of time. This means if you are, say, tracking the path of a photon (which I myself was doing just the other day), the best you'll ever be able to do is take a series of "snapshots" of the movement of that photon (to say nothing of the Heisenberg Uncertainty Principle, which implies that you won't be able to get an accurate position without upsetting the velocity). I previously argued that for quantities theoretically smaller than those derived from Planck's constant, there is no reality--at least not as we understand it. If I were to look at the snapshots, I could try to infer the motion that was taking place "in between" each snapshot--I could see the particle in one place in one picture and in another place in the next. But the Heisenberg Principle implies that I'd never be able to know for sure, and the 20th century interpretation of this is that there simply isn't an answer. The particle may as well have teleported from one place to the next.

(L., who has not followed a single thing I have written so far, has just keyed in to the word "teleported." She is very interested in somebody inventing teleportation).

So does this mean that we're actually living in a universe constructed like an early-80's version of Flight Simulator? Are we experiencing a reality in which a few instants of processor time are taken to render each frame, with each frame only slightly different from the last? Is it only the (seemingly trivial!) fact that these instants go by about a billion-trillion-trillion times too quickly for us to detect that makes us believe that we're experiencing reality as a continuous stream of events rather than a series of pictures?

To further abuse the computer metaphor, the universe seems an awful lot like it has a maximum processor speed--it has an absolute upper limit (the speed of light) at which information can be delivered, and it seems to have an absolute lower limit on its resolution. I don't know if this is a particularly good metaphor, though, or if it is, what it might signify.

Next: Nobody understood the Flight Simulator reference, did they?

Wednesday, October 11, 2006

Begetting Too

Men of Luggage, check your coats, sit down and stay awhile,
Loosen your ties. Enough rope to hang yourselves
Is waiting on the nightstand when you're picking up the phone.
I thought at least you would have figured out by now

That same mistake that you'll make over and over again

In your lives. You're burying the past, then just digging up the graves.
Don't blame yourself, I know you'd fight it if you could.
You're making up the rivals who are knocking down your door,
Don't hang around, they might be coming back for blood.

That same mistake that you'll make over and over again,
That same heartbreak that you'll make over and over again.
So travel light.

And you're wearing The Coat That Isn't Keeping Out The Cold,
And you carry that torch for yourself.
And I warn you the flame might just burn you at the touch,
But you don't want to hear about it all that much.
So travel light.

Men of Luggage turn around, try to retrace your steps
I do not think they will be coming back for you.
Men of Luggage it's okay, it's hard to live this life.
In your shoes I don't know what I would do,
So travel light.


Next: Context! (No, just kidding, no context.)

Friday, October 06, 2006

Seven Minutes in Heaven

(No, just kidding. It's about Physics again).

The discovery that launched the century-plus-long miasma of chaos and discovery in which Physics now finds itself was Max Planck's discovery in 1900 that energy comes only in discrete quantities, the eponymous quanta of Quantum Physics. At last count, the sachets of energy came in 6.6260693 (± 0.0000011) × 10^-34 Joule-second-sized pieces, so they're not, you know, big or anything.

The fact that energy comes in packets necessarily creates other smallest measurable units--e.g. the Planck Length, which is the smallest distance that can be measured (1.62 × 10^-35 meters), or the Planck Time, which is how long it takes a light photon to traverse that distance (5.39121 × 10^-44 seconds). You can see the rest here (no, don't really click there. You aren't actually interested). The existence of the Planck Time is of particular befuddlement to the physicists who are trying to figure out how our universe started. They have a pretty good idea of everything that happened starting from 5.39121 × 10^-44 seconds--elementary particles were formed; galaxies, stars, planets coalesced; THE GIANTS WIN THE PENNANT! THE GIANTS WIN THE PENNANT; your first kiss playing "Seven Minutes in Heaven" at Tana Barton's house in the 7th grade; they know all of that stuff. But from time 0 to 5.39121 × 10^-44 seconds, before the forces of nature and elementary particles formed, that stuff's a theoretical mystery.

For those of you whose eyes glazed over in that last paragraph, here it is in handy chart form:
  • Time: 5.39121 × 10^-44 seconds - now. Things known: Stuff.
  • Time: 0 - 5.39121 × 10^-44 seconds. Things known: Fuck all.
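(Those Planck quantities aren't independent numbers, by the way; both fall out of Planck's constant, Newton's G, and the speed of light. A quick recomputation, using the standard formulas and rounded constant values:

```python
import math

# Recompute the Planck length and Planck time from the constants they're
# built out of (rounded CODATA-style values; standard formulas).
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

planck_length = math.sqrt(hbar * G / c**3)  # ~1.62e-35 m
planck_time = math.sqrt(hbar * G / c**5)    # ~5.39e-44 s

# Sanity check: the Planck time is how long light takes to cross a
# Planck length, which is exactly how the paragraph above defines it.
light_crossing = planck_length / c
```

End of parenthetical; you may resume not clicking things.)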

What I like about String Theory (which I mentioned last time and is the point of this entry--bet you didn't see that coming) is that it agrees with the Hindus and the Buddhists that the universe is vibrating. What bugs me about it is the same thing that probably bugs most people about it--it has thus far failed not only to make any testable predictions about the nature of the universe, but any predictions it could have made so far would never be testable, for (very, very roughly) the reasons I list above. The strings of String Theory are supposed to be Planck-length one-dimensional objects. To form all the particles and forces that are so far known, they would have to have eleven or twelve or thirteen dimensions in which to symmetrically vibrate.

There could well exist a dimension or twelve too small to be observed--I'm all about all the extra dimensions, myself. What really bugs me about String Theory is that at the Planck length and smaller, there is no is. What we learned from the first part of 20th Century Physics is that if nobody observes something, its properties...aren't. Light is both a particle and wave until somebody forces it into one or the other identity. Entangled particles have no spin until you observe the one or the other, and then they both have complementary spin. You could never observe a Planck-length string--to make such an observation you'd have to energize a photon so much that it would create a tiny black hole in the location you were trying to observe. So as far as I'm concerned the universe is actually made up of myriad stoats of Planck-length size, all shrieking their stoaty cries at various frequencies and timbres, each shriek determining whether that stoat forms a neutrino, graviton, or gauge boson. It happens, by remarkable coincidence, that my Super-Stoat Theory of the universe uses the exact same equations for the cries of my tiny, tiny stoats that String Theory uses for the n-dimensional symmetric vibrations of their tiny, tiny strings. But my theory involves infinity percent more stoats than does String Theory, thus making it superior. And there endeth the lesson.

Next: Somewhat more useful lessons!

Wednesday, September 27, 2006

Where Am I?

Some things about which I didn't blog the last two months:

  • George W. found a post-hoc rationalization as to why he opposed stem cell research. He apparently thinks taxpayers shouldn't have to fund something to which they might object. Unfortunately, my computer ran into a deadlock created by intense irony as I was blogging, and I lost the entire post.

  • There was this article about a book which claims that String Theory is bunk (a belief to which I subscribe). It contains this musing:
    Modern physics is troubled by the anthropocentric character of the universe. For instance, had gravity been only a teensy bit stronger or weaker, planets and stars could not have formed. So, does the fortuitous value of gravity for planets and stars show that a higher power is manipulating physical law?
    But you read this blog (or at least you, you know, used to), so you are not troubled by such metaphysical questions. So I didn't blog about that either.

  • Mita's ode to blueberries. It is undoubtedly the finest fruit-related blogging yet.

  • An existential crisis of indeterminate nature which caused me to think about taking a year off before entering the clinical phase of my acupuncture training. I'm still mulling that one over, so I didn't blog about that.

  • Freedom From Blog continued to rock my butt.

  • The word "compromise" apparently means something entirely different than I thought it did. As I am apparently naïve about the ways of language and meaning, I didn't blog about that.

Those are just the things I can think of right now. This morning I decided that it was high time the first line of this blog stopped remarking about the fucking insanity of the Unabomber. So there we are.

Next: Back on topic!

Wednesday, July 26, 2006

Forbidden Blog Post #1

Ted Kaczynski was totally fucking insane.

This is not a marginal proposition, I admit. He was also, as it happens, totally brilliant. My brief, painful stint as a math graduate student happened to be the year that Kaczynski's manifesto was published in the New York Times; when he was subsequently captured, his c.v. was circulated around my department. Though he had suddenly resigned from his post at Berkeley in 1969, his work as a grad student and a professor before that had been, strictly mathematically speaking, rather notable.

The next time I encountered the idea that Kaczynski was a brilliant man who had just snapped was reading an article published in Wired Magazine in 2000. It contained this quote from Kaczynski's manifesto:

First let us postulate that the computer scientists succeed in developing intelligent machines that can do all things better than human beings can do them. In that case presumably all work will be done by vast, highly organized systems of machines and no human effort will be necessary. Either of two cases might occur. The machines might be permitted to make all of their own decisions without human oversight, or else human control over the machines might be retained.

If the machines are permitted to make all their own decisions, we can't make any conjectures as to the results, because it is impossible to guess how such machines might behave. We only point out that the fate of the human race would be at the mercy of the machines....On the other hand it is possible that human control over the machines may be retained....Due to improved techniques the elite will have greater control over the masses; and because human work will no longer be necessary the masses will be superfluous, a useless burden on the system. If the elite is ruthless they may simply decide to exterminate the mass of humanity....Or, if the elite consists of soft-hearted liberals, they may decide to play the role of good shepherds to the rest of the human race....Of course, life will be so purposeless that people will have to be biologically or psychologically engineered either to remove their need for the power process or make them "sublimate" their drive for power into some harmless hobby. These engineered human beings may be happy in such a society, but they will most certainly not be free. They will have been reduced to the status of domestic animals.
Bill Joy, the cofounder of Sun Microsystems, proposed in that article that man (in his current, mostly biological form) was superfluous to the future in which Post-Humans were likely to arise.

This same idea popped back into my immediate consciousness when I was writing about Technological Singularities a couple of weeks ago. Lost in the fact that Ted Kaczynski was a psychotic murderer is the fact that, while there's no such thing as consensus on the matter, a lot of people who think about the future tend to think it will look not unlike the picture Kaczynski was drawing. At some point the future Unabomber must have hit a realization of the future and what it would be like for our species that was so exquisite in its horror that it pushed him over the edge, making him decide that any action taken to prevent it was justified.

I thought of this again this week as our president, Mr. Binary Opposition himself, issued the first (and most likely only) veto of his presidency in banning government funding of stem cell research. George W. clearly has no idea why he actually opposes stem cell research--it's impossible to parse a logical argument out of his rationale--but this is a rare case where I actually feel some sympathy for the guy. He's already lost this fight--it was lost when man emerged from the muck with big brains and opposable thumbs. Man will rush forward in His evolution as a species regardless of vetoes or pipe bombs, because that is what He does. A lot of good can be done for a lot of people who are currently suffering a lot of pain via research with stem cells, and the Right Wing Conservative Christian valuation of hypothetical blastocysts over actual living, suffering people who live in the world is a bizarre, twisted, and anti-human view of the world. But at the same time, this is one more step along the path of re-engineering the selves that humans will be in the future. Sometimes it seems like it might be a good idea to stop and think about that path, because the potential of it has scared the living crap out of a lot of very smart people.

Next: More of the Forbidden!

Thursday, July 20, 2006

On Blogging

The Pew study on bloggers is out, and I have to say I find it particularly...un...illuminating. It turns out half of American bloggers...are. Men. The median age is 30. 99% of everybody writes some sort of vaguely personal narrative that only their friends read. Check. Check. And check.

Some time ago whilst I was having one of those spruce-up-the-blog moments, I decided that OaO is a philosophy blog, and went to some blog-cataloging sites and entered it as such. Every now and then, according to the site meter graphically represented at the right, somebody clicks through one of these links and arrives here (raising the question, who has a philosophical question that can only be answered by surfing through blogs?). This, along with the occasional click-through I get from folks who find me because I'm listed as an Amazon employee in the blog kept by the Amazon Web Services evangelist (which, yes, is his actual job title), is the only time anybody that I don't know (where "know" appears in pomo-indicative italics because these days it extends to people I have neither actually met nor conversed with directly. Love this modern age in which we live. No, really. Do it. DO IT!) happens by The Odds Are One.

Sometimes I think I should actually blog about, you know, something, because blogging seems to violate all rules of narrative. Or flow. Or something. Sometimes I think I'm getting there. But I never quite do.

Next: Philosophy!

Wednesday, July 12, 2006

The Forbidden Blog

I thought I'd continue the meme of writing about bat-shit insane dreams started (most recently) by Mita a few days back (her blog, This Particular Web, quietly joined the Hermeneutic Blog Circle a couple of weeks ago, so this will serve as an overdue welcome. If you haven't read her post about talking dumpsters, you haven't...read...it. Or lived. Or something). Some time early this morning I had a dream that seems blog-worthy in its out-there-ness.

Jos shows up in my dreams from time to time, in a very Six Feet Under kind of way. I had one dream a couple of months ago where I was hiking in Swansea and had become lost. I stopped to look at some kind of map that was there at the side of the road, and I heard someone from behind me say, "Lost, Pablo?" (Pablo being what he used to call me), I turned around and it was he. He walked me down the hill and, before leaving me on some kind of Swansea public beach, told me a bunch of things which at the time seemed very important, but which I couldn't of course remember when I woke up. Thus, when he showed up in my dream last night I tried very hard to pay attention to what he was saying.

So, the dream: I'm in Pacific Place, which is a shopping mall in downtown Seattle, and somebody who looks like Jos comes off the down escalator. I walk up to him and at first I'm not sure it looks quite like him; "Jos?" I say, "is that you?" The person says something to the effect of, "No, but hold on a minute," and I look at him again and now he definitely looks like Jos. We start to walk towards some store and I'm trying to pay close attention to what he's saying, but the only thing I can remember now is this snippet of conversation:

Jos: Listen, Pablo, you have to stop what you're doing. You're going to destroy humanity.
Me: What, you mean with the things I am blogging about?
Jos: Yes.

This dream ends when Jos hands me a pipe containing what he explains are the dregs of things smoked by the people on his side (the implication being that the actual substances themselves would be way too much for the living) (please hold your snickering until the end, thank you very much). I partake of his mysterious herbs and immediately wake up. End of dream.

I'll leave the commentary to you. Two things are clear, though:

a) I seem to have failed to take his advice, and
b) You are now reading The Forbidden Blog.

You have been warned.

Next: Commentary!

Friday, July 07, 2006

The Event Horizon of Time

Emery drew my attention to this Wiki article about (the) Technological Singularity--an imagined point in the (relatively near) future where the technology of man starts to advance so far, so fast, that all current prognostications of the future beyond it are utterly useless. Since we've lately been talking about paradigm shifts (a concept closely linked) and humanity's certain doom that lies just around the next corner (okay, not the next corner...but definitely the corner after that....), I thought it merited a paragraph or seven of rumination.

The Technological Singularity purports to be the point at which the first posthumans appear, at which point human progress takes off so incredibly rapidly that our current paradigms of being and progress become worthless, such that we can't "see" beyond it. In this sense it's like a black hole singularity, so dense that no information of any kind can escape (well, except for a tiny little bit of radiation from the edge of the event horizon that's able to escape by using the Heisenberg Uncertainty Principle. Thank you, Stephen Hawking)--only instead of space from whence no information can escape, it's time.

Okay, fine, we're all already cyborgs; it's the freaking future, of course no information is escaping it; we've already seen this sort of thing in Terminator 2: Judgment Day--not that these aren't interesting critiques, but you don't read OaO to get critiques you can read elsewhere (you read OaO because you went to college with me, are one of my parents, are married to me, don't own an iPod, or because you're lost. Doy). I, too, think the AI future is relatively near at hand (no, really this time), if only because in the last ten years or so we've discovered that conscious intelligence acts a lot less like a computer program (platform + set of instructions) and more like an anthill (a collection of individually "mindless" actors working in concert). It does seem, however, like the Technological Singularity is remarkably Humanity-Right-Now-Centric. If one acknowledges that the folks of the 19th century couldn't possibly imagine what was going to happen once we got computers, and the pre-writing tribes of man couldn't envision a life post-printing-press, one has gotta recognize that Technological Singularities are happening all the freaking time.
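
(A minimal sketch for the programmers in the audience--and note this is my own toy illustration, not anything from the Wiki article: Conway's Game of Life is the classic demo of the anthill idea. Every cell obeys the same two mindless local rules, and yet a five-cell "glider" walks across the grid, motion that no individual cell knows anything about.)

```python
from collections import Counter

def step(live):
    # One generation of Conway's Game of Life on an unbounded grid.
    # `live` is a set of (row, col) cells. The entire "program" of each
    # cell: be born with exactly 3 live neighbors, survive with 2 or 3.
    counts = Counter(
        (r + dr, c + dc)
        for (r, c) in live
        for dr in (-1, 0, 1)
        for dc in (-1, 0, 1)
        if (dr, dc) != (0, 0)
    )
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# Five cells arranged as a "glider"; none of them knows it's part of one.
glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
world = glider
for _ in range(4):
    world = step(world)

# After four generations the whole pattern has moved one cell down-right.
assert world == {(r + 1, c + 1) for (r, c) in glider}
```

Each cell only ever counts its eight neighbors; the "walking" is strictly a colony-level fact, which is more or less the anthill/consciousness point in fourteen lines.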

The metaphor of the black hole is another possible explanation of why the future holding certain doom for humanity seems to be such a pervasive meme. The idea that it too is an incredibly dense singularity from which no information can emerge goes a long way towards explaining why we as a population might happen to feel such visceral dread about our prospects in it.

What's also interesting in the Wiki article is the charting of the general acceleration of Man, His major advances, and His technology. The evolutionary leaps are happening fast and furious now; the things that once took years will someday take nanoseconds, at least if this graph is any indication. My favorite koan on this topic goes like this: imagine the technology of, say, the Athenians versus the Romans some five centuries later. Unless you're an historian, you probably think of them as pretty similar in their technology. The Romans were, in fact, much more advanced--they had plumbing, aqueducts, and a radically advanced military, but compared to nuclear power and several million transistors on a chip less than an inch square, there's not that much technological difference between the two civilizations in the mind of the average person of today. Now think of the person two thousand years hence (or only two hundred?) whose technology is so advanced that he or she very roughly equates the technology of the present day with the technology of 1000 A.D. Can't imagine that? Me neither.

Next: That was only five paragraphs!

Wednesday, July 05, 2006

The Network is the Computer

A little bit of semi-dismissive hype for the emerging Browser-Based OSes appears in Slate this week. The author's major objection--users won't want to store their data on remote servers because they won't trust it--I find to be half irrelevant (as we maintain here at OaO, there are two types of data in the world: encrypted and world-readable) and half mutable (trust in product x equals good experience plus time). My current objection to the OS-in-a-browser (Slate reviews YouOS, I've also checked out Goowy) is that, on the client side, it is slower than dirt and keeps hanging my browser. At least for the time being this undermines the mythical touchstone of the (also, thus far mythical) Google PC--it'll be fast because it's running on the servers at Google. If the applications, written mostly in JavaScript and connecting over a DSL or even T1 pipe, can't keep up, it doesn't really matter how fast the underlying servers are.

I expect this problem to get solved in some meaningful way, because I too think the concept of the browser OS is just too freaking yummy. Being able to work on "your" desktop anywhere you can find a browser, much like the advent of web-based email ten years ago, has a lot of appeal; the current OS-specific attempts to allow you to work on your desktop machine remotely are slow, cumbersome, and prone to rather enormous security holes. Also, worrying about the amount of hard-drive space you need and backing up your own data against a crash is an enormous pain, and off-shoring that problem to Google or YouOS seems...better. On the other hand, the Thin Client machine has supposedly been coming soon to a home near you for a long time, and we've yet to see it. This revolution, like the coming Web Service/Web 2.0 insurgency I keep heralding, still requires some technology changes. I'm not sure if it's necessarily a super-fast broadband connection in every home so much as something small that changes the perspective of the multitudes out there. Today 90% of the computer users out there think of their computer as a machine that runs Windows, and Windows in turn runs things on their computer. So what has to happen to make users think of a computer as something that runs a browser instead? And would that be a good thing or a bad one?

Next: Dirt! Is it really that slow?

Thursday, June 29, 2006

View From the Flip Side

The Botany of Desire by Michael Pollan starts off from roughly the premise that follows. Think of the flower and the bee. The bee thinks that he's just out gathering nectar from the flowers to make delicious honey (Mmm...delicious honey). He's also getting pollen on his legs while he does this, and unknowingly carries it to the next flower he pillages for nectar (with which he will make delicious honey, mmm, honey). Thus is he an agent in the reproductive cycle of the flower--in some evolutionary sense of the idea, the flower has gotten the bee to spread its genetic material for it, ensuring its survival. Now think of the corn plant, from the same perspective: here is a plant that is so incredibly successful that it has a highly advanced and mechanized animal (man) clearing entire forests and putting down nitrogen fertilizers just so the plant can grow for another generation. Sure, it's so that we can eat the plant, but as far as the corn gene is concerned, an individual corn plant isn't the point, is it? Wolves seem sharper, fiercer, more free, and more resourceful than the common domesticated dog, but in America today there are 10,000 wolves and 50 million dogs. So which animal is "smarter" about figuring out the world and how to survive and prosper in it?

In this blog I spend a lot of time arguing that you and I are the assembled sum of our genetic material, environment, and timeliness of our births; that, e.g., the idea that you have some sort of distinct "you-ness" that is constant across time and space, that you could be born in some other time and place and still be you, is fallacious. It's not that I think there are no aspects of our selves outside of genetic makeup and context; it's more that I think those aspects are shared among all of humanity, possibly all of life. This is, more or less, the evolutionary view of existence, and it's only when you (and by "you" I mean "I") read a book like The Botany Of Desire that you realize how contrary to your everyday perspective on your own existence this view is.

I once argued (in such extreme passing that you almost certainly missed it) that our own consciousness, our unique human intelligence, is nothing more than the acquired sum of billions of years of evolutionary accidents/designs like the flower "figuring out" that it can use the bee to spread its genes. In other words, it's not that the billion-year process of evolution looks like the work of an intelligent actor (thus causing people to think it must be God, for some definition of "God"); it's that what we think of as intelligence and/or consciousness is the connected sum of billions of these accidents/designs. The only difference is that we've evolved that connected sum into a brain that can perform new actions like this in hours or minutes or seconds instead of over the course of thousands or millions of years.

Go back to the flower and the bee: if you look at the world this way, your view on consciousness is essentially how much agency you give the flower in its "decision" to use the bee to transmit pollen. What you're using to read and understand this post or decide what to eat for dinner or blog about tomorrow is the result of some uncountable number of those evolutionary "decisions." Don't think about it that way for too long, though, because probably your connected sum of evolutionary happenstance will explode.

Next: There it was, your moment of zen

Monday, June 26, 2006

The Auto-Metaphor

Surfing, the actual sport that occurs on water, with a board, may or may not be one of those things one doesn't really get until one does it. It consists of long periods of time sitting out in the water waiting, punctuated by extremely short periods of time of high excitement. Then there's the whole editorializing: catch the coming wave or let it go by? Plus the language. Dude. Whatever.

I'm right on the verge of becoming a freak about surfing. I've done it twice now, and this weekend's foray to the Oregon Coast with friends didn't even involve long-boarding (what you might think of as true surfing); I borrowed one of my friend Ryan's body boards. Still, as with the last time (in Florida, when I actually did long-board), in the days afterwards I keep thinking about it. Mmm...surfing. The instant when you've caught a wave and you're propelled forward on the wake is the kind of thing that makes you want to look around and see who's watching you, as if you were ten years old. It's like a moment of Zen that lasts for ten seconds instead of the Planck Time-length instants you usually get.

The thing about surfing is that it's the perfect metaphor for what it actually is: you're literally riding a wave. All the things surfing brings to mind--patience, devotion, balance, going with the flow--all of those things seem like they're metaphors for things in life that then come back and are themselves metaphors about surfing.

A lot of things happened the last couple of weeks that I've wanted to blog about--the last book read, the last tv show watched, the most recent trip(s) taken, the state of the latest musical foray into which Mark and I have ventured, how school is going, and so on. They've all seemed connected lately, a zone where everything is related to everything else, everything a metaphor for itself. Or, you know, whatever. Dude.

Next: A metaphor for surfing!

Thursday, June 22, 2006

Writing Begets Writing

    Come in to the V.U. on a cold Fall night,
    Blending with the crowd in my trench coat and a tie.
    Everyone will dance and we will rock and roll all night long.
    The Fellows are opening for Jon and Ken,
    Helping us to reach a state of teenage Zen,
    As cool as anything that I knew; I knew I was wrong

    And you always wanting everyone you know to be okay.
    Be okay.

    Follow you at night we go across Red Square,
    It's one in the morning and there's no one else there.
    Kneeling in the fountain, I hear your whispers in the dark.
    The viewing of the sky in telephoto lens
    Gave me the illusion we could still be friends.
    Now we've shaken hands, agreed to always drift apart.

    But I loved you so, even though you would already leave me,
    When you took a stand next to
    The Man Who Used to Hunt Cougars For Bounty.


Next: Context!

Wednesday, June 21, 2006

Quick-Hitter Wednesday

  • The Gadflies have lately returned from a trip to Yosemite National Park with the Gadfly in-laws. I hope to have pictures of the purple mountains' majesty Flickr'ed soon. Once upon a time this was supposed to be a nature journal, so you'd think I could come up with something, you know, insightful about the nature of being based on this trip, but I can't think of anything. We did some great hikes, experienced the train-wreck (or rather, bus-wreck) that is the Yosemite shuttle system, and got eaten by three trillion mosquitoes. Good times.

  • Whilst I was there, I picked up The Botany Of Desire from the pile of books that's found in every rental cabin and bed and breakfast from Here to Somewhere Very Far From Here. The awesomeness of this book cannot be overstated, and soon I shall blog about it.

  • To this article I can only say: Yes. Duh.


Next: Content!

Monday, June 12, 2006

Blink

Blink is Malcolm (The Tipping Point) Gladwell's most recent book, a monograph about making snap decisions based on nothing but immediate impressions. So far, the thesis of this book seems to be: sometimes snap decisions are good. Sometimes snap decisions are bad. To be fair, Gladwell is also interested in what happens in the brain in the first two seconds it's confronted with a situation that makes it come to an immediate conclusion. To be fairer, Malcolm Gladwell is a much more interesting, much better writer than I. Anyway, some random thoughts on the subject:
  • One of the first examples in this book made me think that the whole premise was crap. It's this: in a psych experiment, students are shown video of a professor teaching a class and then asked to fill out a teacher evaluation based just on that one class. They do, and the evaluations turn out to be remarkably consistent with evaluations completed by students who actually took a class with that professor. In the next iteration, students are shown the video for ten minutes and then asked to evaluate; the results are the same. In the next iteration, students are shown two seconds of a video and asked to evaluate the teacher; the evaluations are still consistent with those filled out by students who took a class from the professor. Ergo, you can tell whether you like a professor, whether that professor is a "good" professor, based on very little observation.

    Right now, at least two of my readers are screaming at their computer monitors, because they know another pertinent fact: teacher evaluations tend to correlate more with a teacher's age, gender, and general demographics than with, say, actual effectiveness as a teacher. Here's what you can tell from watching an hour, or ten minutes, or two seconds of a video of a teacher: age, gender, and general demographics. So maybe what we really learned here is that people tend to decide how they feel about somebody within the first two seconds of seeing or meeting them and then never change their minds.

  • Second experiment. You're in a room with some furniture (tables, desks, chairs), an electrical extension cord, a pair of pliers, a yardstick, and some other detritus. There are two ropes hanging from the ceiling. The ropes are close enough and long enough that one could tie the ends together, but far enough away from each other that you can't hold on to the end of one and pull it to where the other one is. The task: tie the two ends of the ropes together. There are four ways to do this, so the researchers claim. If you want to play along at home, stop here and give it a quick ponder (Theme from "Jeopardy" plays. Thumbs are twiddled. Time elapses). Here are the first three: tie one rope to an item of furniture, move that piece of furniture towards the other rope, then grab the other rope and bring it to where the furniture is; make your arms longer, either with the yardstick or possibly a chair, so that you can reach the second rope; or tie the extension cord to one rope to make it longer, so that it will reach the other rope.

    These solutions are kind of hard to think of if you're not in the room; e.g. it's not clear just reading about it that the yardstick will be long enough to reach the second rope while you're holding the first. There's a fourth solution, which nearly no one actually doing this experiment thought of initially: tie the pair of pliers to the end of one rope making a pendulum, and start it swinging so that you can grab it while holding the other rope. The trick of the experiment was this: when, as invariably happened, the experimentees couldn't figure out the pendulum solution, a researcher would go into the room to open a window or some such thing, and in so doing would "accidentally" brush one of the ropes and make it start swinging. After this happened, the subjects would immediately figure out the fourth solution. However, when asked what made them think of it, every person had a different story ("It just came to me," or "I thought of how I used to swing from ropes in gym class"), and nobody seemed to recognize that it was the researcher bumping the rope.

    Gladwell takes this as evidence of what he calls "the locked door," wherein you don't have conscious access to the unconscious inspiration that happens in your brain. It's not that the experimentees were lying, it's that they simply didn't have access to the information that would tell them they were inspired to make a pendulum, so they had to make something up.

    I take this as evidence of my favorite OaO chaotic bugaboo, The Post-Hoc Narrative Interpretation of Otherwise Unparseable Happenstance (must come up with a shorter name for that). Every one of those subjects believed, I'm sure, that the explanation they gave was in fact the actual cause of their inspiration (that is, he or she wasn't even aware he or she made it up), because we're well practiced in generating little narrative translations for the chemical signals that go off in our brain. What's unique to me about this experiment is not that its subjects made up a story to explain a thought they couldn't otherwise explain, because that's happening all the time, every instant of every day, every thought you have (cf. my last post). What's unique about it is that there's an actual tangible source for the key inspiration--as Gladwell would say, we have a window through the locked door. In fact, I'd be willing to argue with Gladwell/you/anyone who cared that you can't actually make a case for the researcher brushing the rope being any more of a cause for the inspiration than whatever the subject him or herself came up with post-hoc. But that's probably a topic for another post.

Next: Pre Post-Hoc Narrative!

Wednesday, June 07, 2006

Enlightenment Wednesday

Tuesday nights are band practice nights (where "band practice" = "Mark and me playing extended jams to a click track, recording it on computer and attempting to meld a song out of it."), making Wednesday the day when I've got a couple of new mixes to listen to, ponder, and deconstruct while I'm sitting at work.

I don't know if it's this precise process (trying to criticize my own musical work) that made me aware of it, but somewhere along the line I realized that the workings of my brain are like a giant David Foster Wallace novel of which I am generally only dimly cognizant. It goes something like this:
  • Brain has thought
  • Thought is translated into English
  • Translation is compared to original thought, and possibly retranslated
  • Various other metaphors for original thought are constructed
  • This happens, like, 20 more times whilst at the same time, the footnoting and reanalyzing of the original thought begins
  • All of these thoughts must be translated, checked for translation accuracy, metaphorized, repeated, analyzed, footnoted, etc. etc.
  • Repeat ad infinitum
The other thing I realized is that this David Foster Wallace novel is extremely self-critical of...um...itself. In the particular arena mentioned above, the critical analysis has lately come to the conclusion that I can't play the guitar for shit. A typical exchange goes like this:
  • Thought evoked by listening to a bit of one of the rough mixes of last night's practice
  • Translation: "I screwed up in that bit."
  • Query: "How bad a screw up was it?"
  • Analysis: "Your finger fell off the string and it killed the note right in the middle of the phrase."
  • Metaphor: "Your guitar playing here mirrors your inability to produce anything of artistic value ever. Every time you venture towards something good you try to get too fancy and you kill it."
  • Protestation: "But this is just a jam session. You're improvising here. Everyone makes mistakes in this context."
  • Critique: "That's not the point and you know it. Your mistakes should be beautiful things, too. But they're not. Your playing is utterly without spirit."
  • Rejoinder: "That's only because you're listening to yourself. All you can hear is the mechanics of it. You'll never be able to hear your own music as just music."
  • Critique: "Um...no. It just sucks."
  • etc., etc.
Yes, my own internal monologue refers to me in the second person. I'm not saying it's not weird. Most of the time this goes by and I'm not really aware of it. Only when I'm paying attention to it is it that literal. Anyway, if I've read and understood the writings of the Buddhists and the Taoists, the state of enlightenment, vis-à-vis what I've got going on now, should be something like this:
  • Thought
  • Other thought
  • A third thought
  • ...
What's always bugged me about that is that, well, it doesn't seem like enough. I mean, it would be great for my state of mental well-being if I could shed all of the meta-commentary that surrounds everything that goes through my head, but it doesn't seem like it would turn me into the Buddha. Somewhere in the music of this last Tuesday night, though, I had a little moment of Zen where I realized that, in fact, that is all there is to enlightenment. This being the Tao--and as soon as you put the Tao into words, it's not the Tao any longer--I can't really explain the insight. I think it's something about shutting off the commentary track and just enjoying the movie. Maybe, if I don't think about it for a little while, I'll come up with a better koan for it.

Next: The Sound of One Hand Clapping!

Monday, June 05, 2006

Truths, Convenient and Otherwise

Sorry about the gap there. I suppose a week and a half isn't all that interminable a lag in the grand scheme of things, but the point of blogging is to keep up, post something every day, blah blah blah. After a month of non-stop school and work, though, I needed a week of doing nothing. Anyway, nobody really complained that I wasn't blogging, which is...probably...a bad...sign.

So, yeah, that Al Gore movie. Go see it. It's not as overtly political, extremist, or unbalanced as one might expect (and by "one" I mean "I"). It's pretty straightforward without being boring, and it has a handful of really breathtaking images--or rather, pairs of images (then and now pictures of glaciers of the world, and the breakup of the Larsen B ice shelf, e.g.). If you're not convinced yet, it also functions as a movie-length ad for Mac laptops. Just throwing that out there for any interested readers.

You read this blog so you know that climate and weather, being one of those systems that is sensitive to incredibly small variances in temperature, pressure, cow flatulence, cosmic rays and muons from space, solar flares, and so on, is an agglomeration about which it is impossible to make discrete inferences. You simply cannot say that storm 'X' or drought 'Y' was caused by Global Warming (actually, you read this blog so you know that it's my contention that you can't even make inferences about discrete coin flips. So, you know, whatever). This was, for me, the strength of this movie--it looks at the very long, continuous, much more inference-able view.

Over the course of the last several million years the earth has gone through a relatively predictable course of ice ages and subsequent warmings. We only started measuring atmospheric CO2 levels in the last 50 years or so; indeed, fifty (and thirty, and twenty) years ago, the popular scientific view was that we were headed back into another ice age. Then came weather balloon studies, and then they started drilling cores in the polar ice caps and doing some detective work. What they discovered was that there seemed to be a very clear historical connection between CO2 levels in the air and the mean atmospheric temperature of the earth. This is the first graph presented in the movie and it's an incredibly powerful image, showing both what carbon dioxide levels have been doing over the past 650,000 years and what they've been doing in the last fifty. Gore points at the level during the last ice age, follows it to where we are now, and says, "The spread of this graph is, in the city of Chicago, the difference between a very nice day and having a mile of ice over your head."

Where we are now is in one of the temperate periods--the difference this time is that instead of heading back down into the next ice age, as we did all the times before, both CO2 levels and temperatures are heading upwards ever faster. And where we are now, as it turns out, is also this odd little quirk of the Earth's history. As has been pointed out many places, the human species has had the brain capacity it has now for the past 50,000-odd years, but it's only in the last 5,000 or so that civilizations arose. It's Odds Are One for the history of humanity--it happened to be that big-brained Homo sapiens landed in a climate where you could stop hunting and gathering and start farming and domesticating (other people are just calling that "luck" these days, but I hear that if you invent a hermeneutic principle that describes the same thing using bigger words, people will give you a book contract). Having gotten to this point, and seemingly having a vested interest in, you know, sticking around, it seems that not only do we not want it to get markedly warmer, we kind of need to maintain the same conditions that we've grown up in. That ice age we might otherwise head into isn't going to be much fun either. If by some miracle we actually don't heat the earth up beyond its carrying capacity, our next problem will be figuring out long term climatological geostasis. Well, not your or my problem. But somebody's problem.

Next: Knee-jerk critiques of more texts!

Thursday, May 25, 2006

Poli-palooza

One last thing about warrantless wiretapping and data-mining that was referenced in the Salon War Room Blog that I linked to at the very top of my last post. It is nicely illustrated by this week's Tom The Dancing Bug (subscription, or watching a brief ad, required). This strip, by the by, is seriously the best weekly comic in the history of the world.

The following thing needs to be said. If you believe that the government is illegally spying only on The Terrorists, you believe something that is simply irrational. If you do not believe that the government is using warrantless wiretapping and data-mining programs to monitor its political enemies, you are holding to a precept that is not rational to hold. Sadly, if you do not think that every email you send is being data-mined for keywords by net-monitoring computers at the NSA, you are not thinking rationally.

At every revelation of the illegal monitoring program, the government has stated outright untruths about its scope. First they claimed they always got warrants. When it turned out they weren't getting warrants, they claimed it was only international calls. When it turned out it wasn't only international calls, they claimed it was only calls where at least one party was international. When it turned out it wasn't only calls where at least one party was international, they refused to acknowledge the program existed. If you don't think they're telling the truth right now, you are not paranoid. It is simply not rational to think otherwise.

Next: Sigh.
Tags: , ,

Tuesday, May 23, 2006

Seriously, what now?

I simply cannot read another story like this and, you know, stay sane. Here, apparently, are The Rules: you cannot criticize the government because that emboldens The Terrorists. You cannot limit executive power because that would inhibit the ability of the executive to fight The Terrorists. You cannot investigate the possible misuse of power by the executive because then The Terrorists will find out how we are fighting The Terrorists. Here's what you can do: shut the hell up and sit there while the executive does whatever he feels like doing.

I was reading the latest Dan Savage last night, in which he picks up the latest hit from the Religious Right, the War on Contraception (Second Americano's recent take is here). His thesis is basically this: "Okay, I don't agree with it, but I understand if you all didn't want to stand up for gay rights, because you're not gay. But straight rights are being trampled on as well, and I simply don't understand why everyone is taking it lying down." On this subject I want to say this: opposition to gay rights makes me insanely angry. The anti-choice movement makes me insanely angry. People who oppose contraception make me insanely angry. And it's not the fact of it that makes me insanely angry; people have a right to their religious beliefs, and if they don't believe in contraception or abortions or that people should be gay, they don't have to use it/have one/be gay. What makes me insanely angry is that these positions aren't being taken as moral ones; they are taken as political stands that must be enforced on everyone.

I'm sure my model of the world and how it should work is riddled with hypocrisies that I just don't see because, well, I'm me. Maybe the only difference between the leaders of the Religious Right and me is that they're in positions of power and I'm not. Maybe all of their fears about liberals and liberalism are correct: if I were running the country, I'd probably want to have long talks with The Terrorists about their feelings while forcing everyone to have secular gay abortions after giving each other hand jobs while the Religious Right is taking a nap on the front porch. I do, after all, have True Belief that they are wrong and I am right. A fundamental tenet of these beliefs is that I don't get to legislate what they do with their lives and in their homes, and neither do they. But maybe this tenet is more mutable than I think, and that, like States' Rights, it's something one only has when one is not the party in power.

Anyway, the theme that these two things (objection to both religious intolerance and unchecked executive power) share is that I simply don't know what to do about them any longer, other than stand up and say that they're not okay with me. If you're waiting for a groundswell of popular opinion to force some kind of change and/or accountability, I gotta tell you: so am I, but I don't see it happening. If you're waiting to see what happens in the 2006 elections, I can save you the suspense. What will happen in the 2006 elections is nothing. The polls you've been reading that say congress has approval ratings lower than the President? Irrelevant. Voters hate pretty much everybody in congress except their own representative, whom they will happily re-elect. What voters want is for people in other districts to toss their own representative out. What will happen in 2006 is that the Republicans will lose a couple of seats in both houses, and the Bush administration will crow that since they didn't lose control of any branch of government, the nation agrees with them and their policies. The media will repeat this claim. And the slide will continue.

Somebody out there tell me what to do. Don't tell me to march or write or sign petitions or donate money or call my representative, because I've done all these things and they don't work. The only thing I can think of to do is go take a shower, because every time I blog about politics, I feel dirty.

Next: Something somewhat happier!
Tags: , ,

Wednesday, May 17, 2006

Paradigm Shifts of Doom

2+2=5
for large values of 2
(on a t-shirt I saw in the halls at work this morning. There is probably nobody on earth who will think that's as funny as I do, on account of no one else is that much of a dork).
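For the record, the shirt's math actually checks out if you round only at display time--two values that each print as 2 can sum to something that prints as 5. A throwaway sketch (the numbers are mine, not the shirt's):

```python
# "2 + 2 = 5, for large values of 2": round only when displaying.
a = b = 2.4          # a "large value of 2"--displays as 2 when rounded
total = a + b        # 4.8 underneath
print(f"{round(a)} + {round(b)} = {round(total)}")  # -> 2 + 2 = 5
```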

I'm rather skeptical of the word "paradigm" myself, though this is probably due to the fact that I work in corporate America where, as Sam points out, "paradigm shift" means, "change." (Really it's only to be expected from the universe where "functionality" was coined to replace that unwieldy and opaque word, "function"). Sam/Rebecca's paradigm post (link above) does, however, segue nicely into some things I ran out of room in which to say last post.

Sam:
This is what happens when we run out of a primary energy source - we switch to another one. And we don't know what the logic of the economy, the environment or world politics will look like in that new energy paradigm - because we aren't in it.
First, an interesting thing to note is that we actually haven't run out of a primary energy source before--at least not on the global scale that's likely approaching now. I gather that in the mid-nineteenth century it was starting to get dicey with whale oil, but then petroleum showed up and blah blah paradigm shift blah. Having only read half the book at the time, I also shortchanged The Long Emergency a little bit, because the author does speculate about what the possible paradigm shift is likely to look like: economies become local again, suburbanization ends, people migrate away from deserts as it becomes impractical to pump fresh water to them, etc. I also...uh...longchanged him a little bit insofar as now I've gotten into the chapter about prions and mutating viruses and antibiotic-resistant bacteria, and have at this point concluded that I am reading a book whose thesis is, "JESUS FUCKING CHRIST! WE'RE ALL GOING TO DIE!"

The other night L. was talking to Mr. L. (her Dad, not me), who pointed out that when he was growing up they were certain they were all going to die of polio, unless of course the world was annihilated in a nuclear conflagration first. It's not that I think these things weren't, or aren't, terrible threats, or that we shouldn't take whatever steps we can to deal with them; in this case I'm more interested in what one is actually saying to/about the world when one proclaims that therefore The End is Nigh. Not so much Eschatology as, I guess, Meta-Eschatology.

I have an OaO answer, of course, which is that it's about one's narrative needing to have oneself be, you know, the end result of things--the same phenomenon that I claimed earlier causes people to reject Evolution in favor of Creationism. If things keep going on after you're gone and 99.999999...% of creation doesn't really seem to notice, that narrative you're making for yourself right now has this rather gaping plot hole. On the other hand, that answer doesn't entirely click for me--there's something really primal, it seems to me, in this belief/creeping suspicion we seem to have that we are Living at the End of Days. I suspect, like the Redness of Nature's Tooth and Claw that I mentioned last time, it is related to something that helped us survive at some point. Maybe it's from our mammalian ancestors who managed to survive the meteorite that wiped out the dinosaurs (which, I guess, really was the End of Days for them).

Next: the odds are again one!

Tuesday, May 09, 2006

Pre-Millennium Tension

Before I dive into my next intractably large post that weighs in on themes too broad to possibly sum up in a single short essay, let alone some academic article, I would just like to say this: Freedom From Blog is fucking awesome. Take this post as one example. Put it on your list of daily things to read, so that you can say you liked them before they were cool.

I'm in the process of reading The Long Emergency, James Kunstler's apparent rebuttal to those Julian Simon-inspired Long Boom missives from the age of soon-to-be-realized global prosperity. If I am already sounding skeptical of warnings of coming doom and/or clarion calls of coming utopias, it's because the various histories of the future have proved to be invariably and utterly wrong. The whole bio-/eco-/socio-/whatevero- system in which we exist is a chaotic amalgam susceptible to infinite factors small and large, and you're likely to have as much luck predicting the future by observing the flap of butterfly wings as global peak oil production.

I do, generally, find Kunstler's arguments pretty persuasive: our way of life, in America and more generally the developed world, is based on the fact that oil is cheap and readily available. We can live in the suburbs, eat fruit from California, wear nylon, raise mega-cattle on mega-farms, and buy lots of cheap plastic things because power from burning fossil fuels is extremely easy to get. The second point is that running out of oil is not the problem so much as the fact that once we reach the point where we are pumping out the maximum amount of oil that will ever be produced, we're screwed. Demand will keep increasing, but supply will never again be able to catch up (this is the so called "Peak Oil" point). Kunstler goes on to argue that none of the current alternative energies will be able to take the place of oil. He further argues that, therefore, there will be a bunch of wars and terrible conflagrations as our societies, built upon fossil fuel burning, fight to the death over the dwindling supply of it (and all of this is, of course, quite apart from global warming).
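The Peak Oil point is usually drawn as a Hubbert curve: annual production follows the derivative of a logistic, peaking when roughly half the recoverable oil has been extracted and then declining symmetrically while demand keeps climbing. Here's a toy version of that shape--every number in it is invented for illustration, so this is the form of the argument, emphatically not a forecast:

```python
import math

def hubbert(t, urr=2000.0, t_peak=2006.0, w=15.0):
    """Toy Hubbert curve: annual production as the derivative of a
    logistic cumulative-extraction curve. urr ("ultimately recoverable
    resource"), the peak year, and the width are all made-up numbers."""
    x = math.exp(-(t - t_peak) / w)
    return urr * x / (w * (1.0 + x) ** 2)

# Production rises to the peak and the decline afterward mirrors the
# rise--the problem isn't pumping the last barrel, it's the first year
# in which supply shrinks while demand doesn't.
assert hubbert(2006) > hubbert(1996) > hubbert(1976)
assert abs(hubbert(1996) - hubbert(2016)) < 1e-9  # symmetric about the peak
```

The shape is the whole point: nothing catastrophic happens at the peak itself, but every year after it the gap between what's wanted and what's available gets wider.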

It's not, then, that I don't believe in the dark nature and/or stupidity of humans to kill a whole bunch of ourselves in order to merely put off something that's going to happen anyway (and maybe I shouldn't call it stupidity--we have an animal nature and it's apparently hardwired to protect our own genes at the possible cost of all others, such that it should not be surprising when we try to kill a bunch of Them so that We can live long enough to reproduce again. Maybe that's just as far as we're capable of seeing). (Question: what has to happen so that this urge is bred out of us? Will the Red In Tooth and Claw parts of us always survive because, well, they are what made us survive in the first place? Discuss). It's more that I am highly skeptical of the argument that we are living at The End Of Days. This argument has always already been made, and it has always (already) been wrong. "Repent, the end is nigh" is oft repeated. So far, the end has yet to be nigh. I'm sure a recession is coming (GWB has made sure of that) and there's probably a depression right behind it, made more likely the longer we insist on trying to milk the petroleum lifestyle. And, if we elect our Red-in-Tooth-And-Claw nature into office again, there'll be some more wars. It'll probably suck. It probably won't be the End of The World.

Next: Less Gloom! More Doom!
Tags: ,

Posts Of The Damned

If I have a major problem writing OaO, it's that I start these entries on enormously weighty subjects and then don't know where to go with them. This, rather than any particular time pressures in my life, is the most frequent reason I don't post for long periods of time--I'm working on a particular post and I get stuck. I've got six or seven posts sitting in this unfinished, unpublished state. Today I've decided to drag them out in a sort of pastiche of things never to be blogged about, on account of how they are no longer timely. Maybe you and your reader-response can fill in the missing narratives.
  • Thought Experiments

    Emery says this:
    The eternal return thing is just strange. Clearly, the repetition of my consciousness is an impossibility, because if it happened again, it wouldn't be mine. Part of individual identity is the continuity of existence. I am me because I was me yesterday, and the day before, and back in 1985, and back in first grade, in 1975, and so on. If there was some physically identical-to-me person in three trillion years, that would be a physically identical-to-me person, not me.
    I made this same statement, although in a totally different context. Suffice to say I agree with the conclusion of the argument. I don't, however, agree with the a priori (that it's because of some sort of bodily or existential continuity). Most all of the cells that made up Emery in 1975 have died and been replaced, the osteoclasts and osteoblasts have torn down and rebuilt the matrix of his bones several times over, and that person was four years old or so and Emery is in his mid-30s (I have met Emery only once, and I didn't know Emery the person I met and Emery the blogger were the same person (and I will happily accept arguments that they still aren't) until last week). But this isn't why I reject the idea of continuity as being the key to our sense of identity. In fact I think that continuity is a complete illusion....

  • "A strange game. The only winning move is not to play."

    Seeing things on stage, for whatever reason, tends to often speak to the state of our (where, for the purposes of this sentence, "our" = "L.'s and my") existence. So it was at the end of last week, where the official and actual end of our epic real estate saga and seeing August Wilson's last play, Radio Golf at the Seattle Rep, coincided. Radio Golf is about a lot of things, but one of them is about "playing the game," so to speak. If you, e.g., believe that the process of politics in this country is in some way broken, should you run for office? Yes, you've thrown your hat into the same broken process you want to fix thereby further validating it, but at the same time, how else can you change things...?

  • Coastlines

    How would you measure a coastline? Do you walk along the waterline with a tape measure? At high or low tide? Or what if you took a piece of string and started winding it around each stone, or pebble, or each grain of sand? There is, of course, no numerical answer to this question, but there is an interesting non-answer. For some abstract definition of the word, "interesting," anyway....

  • Camp

    When I was 18, I was in a high school production of Grease....

  • (Baseball + "The Zone" * Kant) / (Steve Miller)² modulo Synchronicity = superposition(Randomness, God)

    It just seems so obvious that organisms or systems that are more "fit" would naturally survive, even if they emerge totally by chance, because...well, because they're more "fit." It's a hidden tautology. This is something I've long been meaning to blog about--maybe next post.

    So here it is, the next post. This thread has gone in all sorts of directions at this point, you can pick it up at Freedom From Blog, or Second Americano, or not at all if you choose.

    Some defender of ID, perhaps in the recent Pennsylvania court case, trying to envision an experiment which would support ID as a theory (which, of course, you can't do, but that's for the next paragraph), came up with this: observe some bacteria in a petri dish, get a number of generations going, and see if any of them evolve some sort of flagellum. His argument was that you wouldn't, because a flagellum is irreducibly complex, therefore ID is true (no word on why the Intelligent Designer wouldn't decide to intervene in the experiment and give all the bacteria flagella immediately--maybe his or her work is already done here?).

    The only problem with that example is that this experiment can actually be done, and it proves exactly the opposite, that a flagellum is very reducibly complex. This is what happens: you put, say, 100 million immobile E. coli in an agar solution and they sit there. Eventually, they run through the food that's around them and, since they can't move to a place where there's more food, they die. Or rather, 99,999,999 of them die. One of them has a mutation that makes a protein filament near its cell wall stick out a little bit, and when the cell is literally in its death throes, the filament wiggles a little bit, and it actually propels the bacterium a few millimeters to some available food, and it survives. That's the only one that lives to pass on its genes, and now the next generation has a little extra filament that helps it move. Soon that generation uses up all the food within a few millimeters, and so the only ones of that set that will survive have to be able to move a little bit farther. Repeat ad infinitum, for a billion years.

    Even among the evolution crowd the language used to describe this process is about "nature engineering a solution," or "adapting." That's not what happens--what happens is that everybody else dies. Survival of the fittest isn't about being fit at all. It's about being incredibly lucky. Given a very large number of organisms, someone might get lucky. Given a few billion years, the luck might pile up. You can tell this is true because, you know, it has already happened.

    Rebecca has nicely pointed out (Second Americano, linked above) that Intelligent Design isn't about God, religion, or faith--it's about politics. ID is, like just about every other model of God, about trying to put Him/Her in the gaps--the gap between the model and reality. Rebecca says this quite nicely:
    If there's anything that exhibits Derrida's point about nothing being outside the text, it's faith.
    What bugs the living crap out of me about ID is the fact that it tries to create gaps to fill (and, as Rebecca points out, those gaps are entirely political in nature). The guy giving the example above is not saying, "It's not understood how flagella were created, therefore it must be God," he's saying, "I can't understand how flagella were created, therefore it must be my God...."
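The flagellum story a couple of bullets up can actually be run as a toy simulation: random mutation plus "everybody else dies" is enough to ratchet up mobility over generations, with no engineering anywhere in the loop. The setup and numbers here are mine--a cartoon of the argument, not real microbiology:

```python
import random

random.seed(1)

def generation(pop, food_radius):
    """One round of the story: a bacterium survives only if its 'reach'
    (how far its filament can wiggle it) covers the distance to food.
    Survivors leave offspring whose reach carries a small random
    mutation. Nobody engineers anything--everyone who can't reach the
    food simply dies."""
    survivors = [r for r in pop if r >= food_radius]
    return [max(0.0, r + random.gauss(0, 0.05))
            for r in survivors for _ in range(5)]

pop = [random.random() * 0.1 for _ in range(10_000)]  # barely mobile
food = 0.05                                           # distance to food
for _ in range(50):
    pop = generation(pop, food)
    pop = random.sample(pop, min(len(pop), 10_000))   # the dish is finite
    food += 0.015                                     # nearby food runs out

print(f"mean reach after 50 generations: {sum(pop) / len(pop):.2f}")
```

Run it and the average reach climbs steadily, tracking the receding food--not because anything in the loop is "adapting," but because at every step the population consists entirely of the lucky.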

Next: Who freaking knows?!
Tags:

Monday, May 01, 2006

Holy. Living. Crap.

We interrupt our regularly scheduled musings on the nature of being to comment on Stephen Colbert's appearance Saturday night at the White House Correspondents' Dinner. I doubt anyone reading hasn't already seen it, but if not, it's here (if you haven't seen it, watch it. Do it now). It's probably extremely telling that our first reaction upon watching this was, "How was this allowed to happen?" He rips apart everything the administration has tried to pass off as, you know, some version of reality, and he's standing ten feet from the President of The United States while he's doing it. It's not just that Colbert lambastes the President, the press, and pretty much everyone in between, it's that he never breaks character. He's Stephen Colbert, the television persona, the entire time and it's brilliant.

The second thing that's simply incredible is the way it's being covered by the major news outlets. Here, for instance, is the A.P.'s report. Reuters' take is here. Colbert was the featured speaker and he's barely mentioned in either story. The Reuters story states that he performed to "muted laughs," giving the sense that he bombed with his audience, rather than, as is clear from watching the clip, that his audience became increasingly uncomfortable as they realized he wasn't going to let them off the hook. Ever. Notice that in the first five minutes of Colbert's monologue, CSPAN shows a couple of reaction shots of GWB. After that, they stop and we never see him again. Try and guess why.

It's late Sunday night and I don't know if anyone will pick this story up come Monday--so far the only things I've seen written about this have appeared in left-leaning blogs. If, indeed, the news cycle these days is still determined by the major news services, then this already won't be in it, and that will be that. No doubt Colbert's scathing critique of the media in general has played and will continue to play a large part in that. But on the other hand, holy crap. It would be hard to sum up the state of the nation in 20 minutes or less better than Stephen Colbert did on Saturday night.

Next: Meaning of Life, Redux
Tags: , , ,

Thursday, April 20, 2006

Ephemeral Fame is Mine, and Other Thoughts


  • Going through my regular internet stops this morning whilst eating my bowl of cereal, I came across this article on The Hardball Times, which is the blog bible for statistical analysts of baseball, and the people who love them, everywhere. The article this morning is about Win Probability, and has linked back to my post on the subject. It's not the first time that somebody I don't know has acknowledged my existence, but on the other hand, dude: The Hardball Times! They're, like, real bloggers and stuff.

  • Sam and TenaciousMcD started up a thread (here, and then here) from the end of my last post, which addresses the question "if all is randomness, whither order? Whither morals? Whither justice?" (I'm paraphrasing. A lot). TenaciousMcD's question:
    How, for example, could any structure--even one we, creatures of nature that we are, superimpose--come from shere [sic] abject randomness, or how sense out of nonsense?
    Sam's response:
    When I read [TenaciousMcD's question], when I hear it, I don't think the question has the 'hook', for lack of a better word, that [Mr. Tenacious] expects it to. Perhaps I should be, but I'm not really all that bothered about how sense emerges from nonsense. Indeed, I think that's just how sense emerges –from nonsense. We produce meaning in the world, we read it into a world without meaning, and as Paul tells us continually, we construct narratives that give that meaning a place to rest, and a place from which to emerge. Structure emerges out of randomness (see Paul's many iPod posts); order comes from chaos. I'm OK with that.
    I take Sam's position in this, though I want to radicalize it a little. Or hone it. Or something. I once likened the way order seems to emerge from chaos, at least in the evolution of life on earth, to the action of an editor--he or she had no control whatsoever over what was produced, The Editor could only look at the results and, essentially, say yea or nay (pretty much everybody else is calling this, "Natural Selection" these days). In a truly random model of evolution, it seems like this mechanism is always glossed over. I think this is the "hook" to TMcD's question that Sam is looking for. It just seems so obvious that organisms or systems that are more "fit" would naturally survive, even if they emerge totally by chance, because...well, because they're more "fit." It's a hidden tautology. This is something I've long been meaning to blog about--maybe next post.

  • I'd also like to observe, as TMcD brings up apparently non-random constructions like morals and justice, that these things are also (literally) evolving, and that there's no particular reason to think that they too aren't borne out of an endless frenzy of random attempts to optimize the interactions of communities or societies or entire species.

  • Tarn asks:
    [S]uddenly I am hearing many voices -- both the familiar and alien -- and though their stories and their histories are, in detail, not alike, something tugs from deep within the seat of my capacity and nudges at me to learn a specific lesson common to All The Variety of Living....Does anyone understand this language I'm speaking? If so, please help me identify it. Seriously. I'm not joking. Anyone? Anyone?
    L. and I frequently ask this question, in a different form, usually when we keep running into the same obstacles in life over and over again. "Clearly," says one of us to the other, "There is something we are meant to be learning from this that we are not learning." In one instance, where we eventually got an answer, we had an ongoing (about three year) bout with trying to figure out where we were going to live, Seattle versus Baltimore (versus Florida. Don't ask). The first time, after an incredibly agonizing process, we decided Baltimore. Then, a year later, the decision came up again and we decided Seattle, but only temporarily, because we were going to move to Florida. Then it came time to move to Florida, and lo! The same incredibly agonizing decision came up again. Eventually, we figured out that we wanted, or the universe wanted us, to live in Seattle. This goes back to my earlier recurring point: life is hard when it's happening to you. To Tarn I have this to say: yes, I understand your language. Give me specifics, I can probably come up with an answer for you.


Next: More Bullet Points!
Tags: