Friday, December 23, 2005

It's Because Of Narrative

The true criterion for rightness and goodness:
It's no accident that there is no American literature celebrating imperial presidential powers, celebrating a president operating in secret to expand his powers while citing national security threats, celebrating, in short, demagoguery. No great American literature or Hollywood movie has rewarded trampling on the Constitution. No celebrations of the US waterboarding detainees. What is our whole national literature about? ...It's about doing the right thing. Not only that--it's about the right thing prevailing, being rewarded. It's about justice prevailing over injustice. It's about abuses--deception, corruption, violence, racism--being ratted out. It's about those who deceive, who seek to grab power, who become corrupted by power, who go on witch hunts, who appeal always to fear as a form of political manipulation--ultimately being exposed and falling, being censured by a system that is more powerful than they are. In particular, our national literature has celebrated one thing: the individual--the ordinary public servant, the small-town lawyer, the ordinary citizen--who labors to make justice prevail (To Kill a Mockingbird, Sinclair Lewis, A Civil Action, Three Days of the Condor), and has been especially harsh on one thing: the political leader who deceives and who seeks to expand his power. The Watergate break-in is not celebrated in our national literature--it's those who exposed it. The McCarthy witch hunts aren't celebrated in our Hollywood movies--it's those who finally exposed McCarthy for what he was. No popular movies celebrating the Reagan administration's secret selling of TOW missiles to the mullahs in Iran and diverting the proceeds to the Contras. Americans have a fundamental distrust of government conducted in secret, of those leaders who would seek to expand their powers in secret, appealing always to fear. We know what would happen in the movies. The demagogue would fall. The system, we would be assured, works. Everything would retract back to normalcy.

Thursday, December 22, 2005

Non-Virtual Blogging

The proprietors of Second Americano have come for a non-virtual visit and, today, gone on their way. Soon the only traces that remain of their appearance here in Seattle shall be a few bytes in cyberspace and a few fused neural connections in our collective cerebral cortices. They managed to actually blog a couple of times during the visit, while over here at OaO (you gotta love it when your target demographic gives your product a nickname) my little political rant was sitting at the top of the page, still making me feel dirty. But enough of that. The only thing I wanted to say here at the moment was that, in spite of the constant virtual relationship built from discourse through blogging, we miss our friends, are happy when they are, however briefly, here, and sad that they live so very far away from us.

Monday, December 19, 2005

I Just Don't Get It

Every time I post on politics, I feel dirty later on, like I've gone and sacrificed reason in order to blow off steam. But then I keep doing it anyway. So, today, in addition to everything else I think is ridiculous about the current administration, I share Emery's bewilderment on the issue of unchecked NSA surveillance vis-à-vis Conservatism. As the story I've linked indicates, having been caught red-handed in the act, and apparently having learned the lesson of the Plame leak, they're loudly proclaiming that what they're doing is absolutely just and right. I don't know, as this thing gets rolling, whether legal parsing will make it so or not, but I do understand that a base tenet of Conservatism is distrust of government. The government should not be in your life. It is bottom-line dogma.

But even this isn't what I don't get. What I don't get is what's in it for them. One of my basic a priori assumptions about humanity is that no one actually thinks of themselves as an evil arch-villain. Yes, George Bush has cut taxes and overseen an unchecked growth in spending, mostly to the benefit of the extremely wealthy. There's not much question outside of Neo-conservative circles that this is an act of evil, but if what you want is that your nation doesn't tax you and doesn't pay for anything but national defense (the stated aim of the Neo-conservative movement), you might reasonably believe that it is right to bankrupt your nation's government in order to achieve this result.

Likewise, I just can't bring myself to believe that George W. Bush thinks that it would be better for this nation if he had unchecked dictatorial power. I'm sure he believes that there are things that have to be done for the security and safety of this nation, that his administration has to do what it has to do. I mean, I understand the psychology of it, I do. One of my previous jobs, in the earlier days of, was to solve problems. If something was blocking the flow of orders from the website, or the generation of shipments from the warehouse, or the shipping of those orders out the door, I was supposed to fix it. When I started the job, the company was relatively small, and I had direct administrative access to many of the servers, production databases, and software that ran the company (I had, what I used to refer to headily as "The Unlimited Power"). I literally had the power to take down the company if I wasn't careful. I was damn good at this job, and I also occasionally screwed up. A couple of times I screwed up badly, and a warehouse would stop functioning for several hours. Eventually the company started to put protocols in place, such that I couldn't just ad-hoc modify production data or software. I could still do it, but I had to get permission first. At first (and by "at first," I mean, "for the rest of the time I worked that job") I was pretty pissed off about it--they wanted me to solve problems, and these bureaucratic hoops were just getting in my way. To me they didn't seem to serve any useful purpose whatsoever, they were just blocking a person who was damn good at his job from doing it.

You know what? I was wrong. I was a lot wrong. Those protocols are there for a reason, because it is not a good idea to give people the Unlimited Power. Sometimes people with The Unlimited Power get paged in the middle of the night, log in to work, think that they're clearing an order queue that's blocked and accidentally delete everything in it instead. It's not because they mean harm, but it happens anyway.

So I don't get it. I don't get what they think they're doing. And, to quote Emery again, where is the outrage? Every aggressor in the history of the world who invaded another country said they were doing it for the safety of their own people and everyone else, and I'm sure the people who said it believed it. Every state that captured citizens and held them without trial, and that tortured them, said that those citizens were direct threats to that nation's security, and that it was absolutely necessary for the common good. Every leader who claimed unchecked executive power on his way to dictatorship claimed that he was doing it for the good of the citizenry. Every. Single. One. In. The. History. Of. The. World. I'm not saying this is where we're headed. I am saying that this is what it looks like when you are.

Friday, December 16, 2005

My Poor Brain

The post below reminded me of something else I wanted to mention: I am 32 years old, and find that it is noticeably harder for me to grok new models and concepts than it was, say, five years ago. That is to say, it takes more time and more mental effort for me to understand things that do not directly relate to things I already know about. I worked at my current job for a month or so before I actually understood the underlying philosophy to what I was doing--I actually blogged about the meeting I was sitting in when it finally hit me what we were working on.

This is a well-known phenomenon, I guess--it is generally understood, in academia for instance, that new schools of thought and modes of study come in with the new faculty, and the old ones don't leave until the faculty who studied and/or championed them retire. This is so well understood in mathematics that the Fields Medal, which is the discipline's equivalent of the Nobel Prize, cannot be won by anybody over 40--it's part of the rules. But it's only lately become clear to me that it's not just that people get set in their ways, or like to stick with modes of thinking with which they're familiar, but that the brain is physiologically becoming fixed. The neural connections have already formed, and there just aren't that many more left to fuse.

Well, I'm depressed. I'll never win that Fields Medal now.

Thursday, December 15, 2005

The Web, v 2.0

On Monday, after one of those evenings at work that lasts an unnecessarily long time, we launched the Alexa Web Service Platform (Please, hold back your gasps and applause until the end. Really, it's an honor just to be nominated). This is not to say that I had anything to do with this service or implementing what it does--we just sort of make things that allow other things to do what they do for the people out there in the world. We're sort of the BASF of the web services universe (anybody who gets that reference without clicking the link wins a prize. Anybody who gets that reference and also knows what a web service is...actually, then you'd be me. Never mind).

Alexa is a company that crawls the web and archives it. The upshot of the new web service is that you can write your own search engine (where "you" = "a software engineer with an understanding of what a web service is and how to use one") without actually having to do the actual searching. Is this good/great/revolutionary/going to change the world overnight? Some people think so. The idea that the next great Google-like product will be produced not by some large corporate amalgam, but instead by a guy in a garage is, I admit, more like the market universe that you and I know and love and wish we actually lived in.

I've sort of shied away from trying to explain the universe that I'm working in these days, but the new Alexa search platform is sort of a good example of what this is all about. For instance, go to the front page of Google. Go ahead, I'll be here when you get back. You are (or were, just then) looking at the user interface to the largest, fastest, and most versatile repository of information in the history of the world--a text box and two buttons, one of which is almost totally superfluous. To you, Google is a website. Hiding behind that web page, however, is an enormously complex suite of software, database applications, algorithms, guys who work on and improve the products--but you can't use any of that directly, you can only use it the way Google wants (or has time and resources) to present it to you.

This is what web services (and the general idea behind Internet, Version 2.0) are about. You could, if you were Google, allow computer programmers and/or programs access into your inner sanctum, charge them some money for it, and they could implement all manner of new websites or stand-alone desktop applications using Google's already implemented work. This takes Google from being a website and turns it into a platform on which you can compute (I don't know if anybody out there will even understand that sentence, but the concept is, strictly relatively speaking, pretty revolutionary).
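The "platform, not website" idea can be sketched in a few lines. Everything in this sketch is hypothetical--the endpoint, the parameters, and the response format are invented for illustration, and are not Alexa's or Google's actual interface--but the shape is the point: a program, rather than a person at a text box, formulates the query and gets structured data back.

```python
# A minimal sketch of the "platform, not website" idea: instead of a person
# typing into a search box, a program builds a request against a search web
# service and works with structured results directly.
# NOTE: the endpoint, parameters, and response shape below are invented for
# illustration -- they are not any real provider's actual API.

from urllib.parse import urlencode

SEARCH_ENDPOINT = "http://api.example-search.com/query"  # hypothetical

def build_search_request(terms, max_results=10):
    """Build the URL a program would fetch, in place of a human using a web page."""
    params = {"q": " ".join(terms), "n": max_results, "format": "json"}
    return SEARCH_ENDPOINT + "?" + urlencode(params)

# A program could now fetch this URL, parse the structured response, and build
# its own search interface on top -- the "guy in a garage" scenario.
print(build_search_request(["prime", "numbers"]))
```

The design point is simply that the interface is machine-consumable: once the query is a URL with well-defined parameters instead of a web page, anybody's software can sit on top of it.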

Anyway, this is what Alexa has done, except that their website hasn't ever registered in your consciousness the way Google's has. Somewhere, some small subset of the IT world is working away at this, slowly exposing the underbelly and guts of the internet to the world at large. It won't be an exciting revolution--in fact I doubt anyone will notice that it happens, or really understand what the difference is between that and the web you've got already.

Oh well, back to work then.

Wednesday, December 14, 2005

Love, Sex, and Death (no, just kidding. More math).

"'Math types'...have a tendency to think math is THE expression, that it gets at some deeper truth than other expressions can. I once argued for hours with my friend Mark, who swears that if a = b and b = c then a = c IS TRUE. IS TRUE, not as a property of western logic, but simply as a fact of the universe. I've read way too much Heidegger to buy that."

I find it interesting, and yet also a source of my own enormous personal smug-itude, that most people who think that math is cool think it's cool roughly for the reason that Sam states: Math is The Expression of Truth, God is a Mathematician, all of scientific knowledge is intrinsically written in Mathematical language. That kind of thing--basically that Math is different or pure or something. They believe this in spite of the fact that it's been PROVEN USING ITS OWN AXIOMS that it isn't.

If you're not interested in more philosophy of Mathematics, you can stop reading, because I've given you the punchline. Math may appear to be different than, say, language or Philosophy, and immune from the freaky things that happen when you start using language to talk about language or Philosophy to talk about Philosophy. It's not. Kurt Gödel proved that this was the case. He did it using math.

There was a movement in mathematics over the 250 years or so prior to Gödel's proof to logically formalize pretty much everything in math--that is, to formally derive it from first principles. The zenith of this effort was probably Whitehead & Russell's Principia Mathematica. This is a book which everyone claims is brilliant and groundbreaking, and which no one has actually read. It's several hundred pages long, and proves such things as, given the well-defined concepts of addition, one, and two, that 1 + 1 = 2. No, really. And it proves them only if you take a couple of things as axioms, one of which being, roughly, that meta-statements are not allowed to occur. Anyway, this movement pretty much died with Gödel. The idea that "Math is different" seems to have not died at all.

The Incompleteness Theorem is generally listed, with Relativity and Quantum Theory, as one of the most profound theoretical advances of the 20th century. But whereas you almost cannot get through a high school physics class without learning Special Relativity, and the first thing you learn in chemistry class is the Bohr atom, I have a BA in Mathematics and my classroom time with the Incompleteness Theorem was about ten minutes; it was a sidelight in the midst of learning about the rigorous formalization of the foundations of calculus, and it was presented like, "well, isn't that wacky. Anyway, back to what we were doing...." Apparently the idea that Mathematics isn't the language of truth any more than anything else is is too hard to grok, even for the Mathematicians.

Monday, December 12, 2005

Dreams = Interesting, Math = Not Interesting

My now semipenultimate entry below, the Twilight Zone dream musings, currently has ten comments (from, admittedly, only five distinct parties, one of whom is me, and half of them are actually about who has been together longer without being wed, but still — dude, ten comments!). The one above it is holding steady at none. I'm sensing my audience shifting back and forth on their feet, thinking, "Uh...math. Dude, that's...really...great." How come nobody thinks Math is as cool as I do? No, really. Come on people, I think Theory is cool. Give me some love here.
  • Yeah, so...dreams. Last night I dreamed that I walked out the back door of our house and into this enormous other house which, at first, I took to be some sort of annex to our house that I hadn't known about. Then suddenly the caterers showed up, as well as a bunch of guys in green kilts and green shirts festooned with gold trim, who were carrying in folding chairs and setting them up. Then I realized I was in some sort of hall that people rented for weddings. "So..." I thought, "is this part of our house, or what?" I'm sure Freud would have a field day with that one.

  • I have lots of dreams where I realize I'm dreaming. Lately, in them, I've taken to examining the scenery, or walls, or trees, in them, just to see how good the scan resolution of my dream-brain is (it turns out to be as good as I want it to be).

  • When I started sleeping in proximity to L., she started appearing in my dreams as a matter of course--whatever I dreamt about, she was just there.

  • A couple of songs that I'm working on now have parts (choruses, words, chord progressions) that came from dreams--I dreamt them and then they were still in my head when I woke up.

  • I've had what I assume other people are talking about when they say they've had out-of-body experiences (I am far from prepared to say that that's what's happening, though who the hell knows). It otherwise feels like dreaming, with the notable exception of the distinct feeling of being lifted up and out of my body, and a sort of white-out of my field of vision and a feeling of coming back when they're over. In one of them I was on the street outside our apartment; in another I went flying off somewhere, and ended up in front of a house on a hill. I walked up the steps and inside, and there sitting on the couch was me, 20 years hence--looked like me, only skinnier and with a relatively full beard. I started asking him (me) questions, and he (I) just looked at me and said, "Dude, you have no idea what's about to hit you." Then he took his hands and made a gesture like one might make to indicate that one's head is exploding. That dream/experience was six months ago. I'm still waiting on that particular prophesied revelation.

As I said in one of the ten (ten!) comments, the theory about RAM dumps is just my own "why" of dreams, and it's really only the why within a particular model of consciousness (though I do think that it nicely answers the question of why sleep isn't restful unless you dream). I don't know where I first ran into this idea, but it's not at all clear what the difference between "out there" and "in here" is as far as the brain is concerned. All is translation from the input from the receivers of vibration, or a particular spectrum of visible light, or chemoreceptors. More than a few smart people think that this distinction--in here versus out there--doesn't exist at all, at least not the way we think it does.

When we dream, we'd all agree that this distinction is gone entirely, though, for myself at least, it's something I still seem to enforce. Even when I know I'm dreaming I'm still dreaming of a distinct me, there's a distinct outside with distinct others in it. It can shift fluidly (I find in my dream narratives that I become different people at arbitrary points), but there's still that sense of me vs. not-me. So apparently I am clinging to some sort of distinction that doesn't really exist, at least not while I'm dreaming. Does it when I'm awake? Bring on the comments.

Friday, December 09, 2005


Mark is home sick with some manner of bronchial unpleasantness, but he fought through the fog the other night to call me on the phone and say this: "I was just thinking about your prime number thing. I was wondering if you could model it as a fractal." Since I had, myself, been thinking about prime numbers as the result of iterative functions (which is another way to say 'fractals'), it has suddenly become a good time for a brief plunge into things math-y (except secretly I will be using math as a metaphor for life, because that's just the kind of guy I am).

First of all, about my prime number thing: it's a trivially easy fact to prove that all primes greater than 3 can be expressed in the form 6n +/- 1, where n is an integer. That is, prime numbers appear on either side of multiples of six (5 and 7 around 6, 11 and 13 around 12, 17 and 19 around 18). The converse, obviously, isn't true (not all numbers of the form 6n +/- 1 are prime), nor are there any inferences to be made about, e.g., 6m + 1 being prime for some m because 6m - 1 is prime (23 is prime, but 25 isn't). I generally state my "prime thing" as, "All the numbers of the form 6n +/- 1 are prime, except for the ones that aren't." There's more to it than that--it involves graphing them in a particular way such that there's a well-defined way of drawing lines on the same graph that will run through the 6n +/- 1's that aren't prime. This may or may not sound all ground-breaking and shit, but in fact it is nothing more than a visual representation of Eratosthenes' Prime Sieve.
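For the skeptical, the 6n +/- 1 claim is easy to check by machine. Here's a little Python sketch (my own illustration, not the graphing construction from the actual "prime thing") that sieves out the primes and verifies the residues:

```python
# A brute-force check of the "prime thing": every prime p > 3 sits next to a
# multiple of six, i.e. p % 6 is either 1 or 5. This verifies, it doesn't
# prove (the proof: any other residue mod 6 is divisible by 2 or 3).

def primes_up_to(limit):
    """Sieve of Eratosthenes: return all primes <= limit."""
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False
    return [n for n in range(limit + 1) if is_prime[n]]

primes = primes_up_to(10000)
assert all(p % 6 in (1, 5) for p in primes if p > 3)

# The converse fails, as noted: 25 = 6*4 + 1 and 35 = 6*6 - 1 aren't prime.
print(primes[:8])  # [2, 3, 5, 7, 11, 13, 17, 19]
```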

Primes are interesting (um...relatively speaking) because you can predict nearly everything about them except where they actually are. The sequence that starts 2, 3, 5, 7, 11, 13... is pretty much a random sequence of numbers--they all share a particular property, but there's no mathematical way, given prime Pn, to calculate Pn+1. This problem is considered so unsolvable (not that the solution is hard, but that there's simply no solution) that the famous (again, relatively speaking--Brad Pitt is not sitting at home trying to solve this problem or nothing) unsolved problem about prime numbers isn't about trying to figure out a nice formula for whether a number is prime or not. It's the Riemann Hypothesis, and it only tries to quantify the distribution of primes, by way of the Riemann Zeta Function (Don't try to understand that last sentence, but you might want to click on the link, because the pictures are cool).
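To put a number on "you can predict nearly everything about them except where they actually are": the count of primes up to x is tracked surprisingly well by x / ln(x) (that's the Prime Number Theorem; the Riemann Hypothesis is about sharpening the error in that estimate). A rough Python check, again just my own illustration:

```python
# Individual primes look random; their overall density does not. The Prime
# Number Theorem says the number of primes <= x is approximately x / ln(x).

import math

def prime_count(x):
    """Count primes <= x by trial division -- fine for small x."""
    def is_prime(n):
        if n < 2:
            return False
        return all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(1 for n in range(2, x + 1) if is_prime(n))

for x in (100, 1000, 10000):
    actual = prime_count(x)
    estimate = x / math.log(x)
    print(x, actual, round(estimate, 1))
# The estimate tracks the actual count, even though no formula says
# where the next individual prime will land.
```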

With digital computers we got fractals and chaos mathematics--somewhere along the line between the Greeks and ourselves people started to notice that nature didn't behave geometrically. In the words of Tom Stoppard:

Thomasina: Each week I plot your equations dot for dot, xs against ys in all manner of algebraical relation, and every week they draw themselves as commonplace geometry, as if the world of forms were nothing but arcs and angles. God's truth, Septimus, if there is an equation for a curve like a bell, there must be an equation for one like a bluebell, and if a bluebell, why not a rose? Do we believe nature is written in numbers?

Septimus: We do.

Thomasina: Then why do your equations only describe the shapes of manufacture?

Septimus: I do not know.

Thomasina: Armed thus, God could only make a cabinet.

It's one of the entirely reasonable but oft-unexamined tenets of our scientific knowledge that it is all written in mathematical language. There's a certain tautology to this (science = anything you can describe mathematically, all else is philosophy, language, metaphor, etc.), but on the other hand, math has proved to be awfully prescient and adaptive to the needs of scientists and scientific theory over the years. The physicists at the beginning of the 20th century, for instance, were pleasantly surprised to find that Riemannian Geometry (the geometry of curved spaces) had existed for some sixty years when they discovered that space itself was also curved. But it's only been in the last fifty years or so that we've really had the ability to, as Thomasina puts it in Arcadia, graph more than x's and y's. You only have to look at a bluebell to see that nature isn't geometric, and look at a Romanesco cabbage to see evidence that nature does seem to be fractal--ever repeating, but always a little bit different with each iteration, on down to infinity. Fractals and chaos mathematics have fallen out of favor in the last ten years or so, because thus far they've produced pretty pictures but few useful results. I think, though, this is another one of those cases where math is just waiting for science to catch up.
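As a footnote on what "iterative functions" actually means, here is about the smallest fractal there is: the Mandelbrot iteration z -> z*z + c. Every one of those infinitely detailed pictures comes out of this one loop. This is my own minimal sketch, not anything from Mark's phone call:

```python
# The Mandelbrot set in its entirety: iterate z -> z*z + c and ask whether
# z stays bounded. The infinitely repeating, always-slightly-different
# detail (the Romanesco-cabbage quality) lives in how the answer changes
# as c moves.

def escapes(c, max_iter=100):
    """Return the iteration at which |z| exceeds 2, or None if z stays bounded."""
    z = 0j
    for i in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return i
    return None  # bounded (numerically): c is in the Mandelbrot set

print(escapes(0))      # None: z never leaves 0
print(escapes(1))      # 2: blows up almost immediately
print(escapes(-0.75))  # None: oscillates but stays bounded
```

The same two-line loop, swept over a grid of c values and colored by escape time, is all it takes to draw the famous pictures.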

Tuesday, December 06, 2005

Submitted For Your Approval

L. and I have just started watching Lost, so we're in a kind of "hidden eerieness" frame of mind at the moment.

Ignoreable offtopic digressions:
  • There should be a verb that distinguishes watching episodic TV week by week on the network or cable channel that airs it versus watching it ad hoc on DVD. And, probably, for as long as there's this distinction, between network watching and TiVo watching. We are, for the record, DVD watching.
  • How long before the Style Manual is updated to indicate that titles of works, just as they are to be italicized in regular (non-hyper) texts, must be linked when they appear in hypertexts? And what will the guidelines be for what is to be at the end of those links? Will placement in the Style Manual be something that, say, Amazon can buy for books, or IMDb can buy for movies and TV shows?
  • Discuss
  • Do not discuss anything about Lost, because we do not want any of it spoiled for us.

With that in mind, I present this: the other day I woke up with the flavor of an odd dream still lingering in my brain, in which I was married to somebody else (a real person, somebody, in real life, that I used to work for), and sort of came into the dream in medias res thinking, "Crap. This must have seemed like an okay idea at the time, I mean I like this person well enough, but I was really happy being married to L. What the hell happened?" Later I either woke up or dreamt that I woke up and discovered it was all okay. Later that day I off-the-cuff emailed L. about it and...wait for it...she had had the same dream--same situation (in her dream she had married the (real-life) son of some friends of her parents), same sense of the dream (the sort of wtf? at suddenly being married to somebody else).

We were trying to figure out what might have triggered this little synchrony--the only thing we could come up with was that we'd gotten news the night before that some very very long time couple friends of ours were getting married (unceremoniously taking the mantle from Sam and Red as Partners Who Held Out Longest Before Succumbing to the Heterosexist Matrix, Who Are Also Friends of Ours) (It's a really nice trophy, too, burnished metal mounted on an oak base with gothic lettered engraving, I really hate to have to fly to Wales and relieve them of it). But that was as good as we could do, and it wasn't an entirely satisfying explanation.

People spend about five minutes with us before remarking that we sound like the same person. I guess this happens when you start evolving in tandem with somebody else, and from my perspective (and L.'s, I assume) it's that our brains (figuratively!) run in the same channel. There'll be some sort of external stimulus, such as a third party saying something, and as if it were the setup to a joke we both know, we'll burst out with the same response. So this dream thing, if the dream was the punchline, got me to wondering what the setup was.

You don't remember words, or images, or smells, or sounds. What you hold in your brain are chemical patterns and signals, and other parts of the brain interpret those things as words and images and sounds each time you remember something. In the same way, dreams aren't made up of images or sounds, they are made up of chemical signals wandering around--my own pet theory is that your brain is doing a nightly RAM dump, that 'tiredness' is actually your available RAM filling up, and dreaming is your conscious interpretation of the brain's batch job that moves the day's input either to the hard disk or to the trash bin. The images and sounds, in any case, are just a translation.

If all of those things are true, it's not even that surprising that L. and I would occasionally have run-ins in the subliminal realm. It wouldn't be surprising that we'd interpret a particular stimulus of the day with a similar response. It sure was creepy, though.

Monday, December 05, 2005

Dan Hale

I didn't really know Dan, I met him once a couple of months ago at a party for L.'s department--he was, that particular evening, playing the role of techie spouse of an academic, just like me. A bunch of us went out to dinner afterwards, and I talked to him a little bit about things internet related. He was a pretty cool guy, I noticed that he was (literally and figuratively) soft-spoken, but otherwise he and his wife seemed young, normal, and happy.

Yesterday we attended his memorial service--he was diagnosed with esophageal cancer a couple of years ago, and died two weeks ago after a long battle with it. It was a service both nice and, you know, terrible--it's tragic, and then when it's somebody your age with a life a lot like your own, it's also pretty visceral.

One of Dan's friends who stood up to speak during the service brought a TiBook up to the microphone with him. He said, "Sorry, I hope the laptop isn't tacky, but I was working on this speech until ten minutes ago." He read from his screen a description of his relationship with Dan, which, as it turned out, was almost entirely virtual. They'd been friends for ten years and, up until last month, they'd met in person a total of three times. Somewhere in the middle of this heartfelt eulogy about a relationship literally created out of email and hypertext, I thought, "I'm seeing something right here. And I don't quite know what it is."

Twenty-four hours later, I think what I thought I was seeing and what I was actually seeing (or wasn't actually seeing, as the case may have been) are two different things. I've talked about the subject a lot lately, and I guess it might seem like I don't think that Being-In-The-World (hey, Red started it) is any different today than, say, ten years ago when these two people met, because obviously it is, especially in the Heideggerian sense (I mean, hell, it's getting pretty hard to argue that your Virtual Being isn't a necessary component of In-The-World-Ness these days). For me the whole phenomenon is kind of koan-ic (no, it's not a word. And it certainly doesn't sound alarmingly like 'colonic.' Please move along, there is nothing to see here)--what appears different about life Now versus Then is not different; what you think has not changed between Then and Now actually has (this wants explanation, but it's too much of a digression right now. So, later).

No, what I came up with here, twenty-four hours later, was this: grief is also transcendent, in the same way that joy is. It takes you out of yourself, to that place where nothing else matters, an experience which turns out not to necessarily hinge on being joyous at all. That's what I was seeing, that humanness transcends the laptop and the email and the hypertext and all the rest of it. And I might have been able to see it a little better had I not been sitting there watching Dan's friend pour out his soul and thinking, "Wow, I'm seeing something right here. I'm going to have to decide what it is and go blog about it later."