Tuesday, May 31

Physics sadness

Paul Falstad has a pretty cool physics page where you can run Java applets that demonstrate basic concepts. Surfing around his page, I found some unbelievable anecdotes about the state of physics knowledge in the world today. I am not posting this to make fun of the Ignorant Masses - after all, it's not their fault that basic scientific and physical reasoning isn't part of their worldview. Rather, I fault the community of scientists for failing to promote their ideas with the proper enthusiasm.

Gravity on the Moon
Heavy Boots

Monday, May 30

An incredibly nerdy injoke that made me laugh out loud.

Today's Dinosaur Comics are funny in and of themselves -- but there's a buried metajoke that just floored me.

Ryan North always puts title tags on the comics, to give a little injoke or commentary on the day's comic. Today's is "s/owned/pwned/g".

Yup -- it's commentary on a joke, relying for its humour on an understanding of Unix command-line tools (sed, in this case), internet-gaming lingo ('pwned' as a variant of 'owned', which means "beaten in a comprehensive way"), and the level confusion of applying that lingo to a part of the real world that's almost exactly antithetical to internet gaming (making anarchic art shows in parks). OK, so it's incredibly narrow. But I get it! And that's almost the funniest part!
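For the non-Unix-fluent: `s/owned/pwned/g` is sed's substitute command, and the trailing `g` means "global" - replace every occurrence, not just the first. A quick sketch of the same substitution in Python (the function name is mine, just to unpack the joke):

```python
import re

# sed's s/owned/pwned/g: substitute every occurrence of "owned"
# with "pwned". Python's re.sub is global by default, so no flag needed.
def apply_title_joke(text: str) -> str:
    return re.sub("owned", "pwned", text)

print(apply_title_joke("T-Rex totally owned that art show. Owned? No: owned."))
```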

The funniest part is that I care enough to post about it, too!

Friday, May 27

Stross On Anti-Humanist Science Fiction

Charlie Stross' latest blogpost covers the current round of infighting in the British Science Fiction Scene. I have to admit that as a reader-but-not-a-fanboy of British SF, it's all a bit over my head; I don't really care about the ridiculous cliques of in-crowds and out-crowds and "we're-better-authors-than-you-are" spats that the people who produce half of my entertainment come up with when they should be WRITING MORE BOOKS DAMMIT. That said, Charlie raises an interesting point near the end of the article: that most adventure sci-fi heroes would be psychopaths if removed from context, and that the ur-plotarc for most sci fi would drive Normal People insane.

OK, fine, but Oz and I were talking about the pitfalls of heroic fiction a week ago today, and (though we came to no conclusions) we agreed that heroic fiction has its place. I see the point in fiction about Real People too, but that's a lot harder to get right than the heroic kind and it fails more spectacularly when it's gotten wrong. To quote my wife: "I don't want to read about people making poor life decisions. If I want that I can read the newspaper."

Besides, sometimes I identify pretty strongly with psychopaths.

Wednesday, May 25

An Agonist for Anger, a Medicine for Melancholy

Sorry, Ray.

This rant's about coffee. I'm an addict.

I've been staying up past my bedtime a bit too much lately, and as a consequence I'm depending more and more these past few weeks on Artificial Perkiness to keep my interest in my job up and a viciously brutal headache at bay. Boiling hot joy indeed. I actually can detect the difference in my mood that starts 15 minutes after I dose myself up: I get speedier, I feel sharper, I feel smarter. And, depending on the day, I either get happy or angry. I haven't figured out what triggers one over the other, though. Just, some days I get smiley and jokey and productive, and other days I get snarly and mean. Which apparently can be funny too, but I wouldn't know: I'm too busy hoping everyone and everything around me dies a screaming, painful death. Not least this fucking job, which you can take and shove and I hate you all and the little dogs you rode in on.

Guess which one's got me today.

I don't even really like the taste of coffee all that much. I'd never drink decaf. But somehow, over the course of the past few years, I've gotten to the point where I can't not have it and stay functioning.

At least I don't smoke.

Thursday, May 19

Metareview: double whammy

The best review is a bad review.

Let me explain. As opposed to my colleague, who has lately been writing thought-provoking posts that have substance, I tend to enjoy commenting on Pop Culture. In particular, I derive pleasure from pointing out how crappy, absurd, and intellectually bankrupt it is. When I see another reviewer do the same, it fills my spite-ridden heart with hope.

I have two reviews to recommend today.

The first is for the new Nine Inch Nails album, With Teeth.

The second is the Salon Review of "The Amityville Horror." I will excerpt the part that captures the way I feel about almost all works of pop culture. If only more reviews contained such golden nuggets of truth.
I wasn't shocked by "The Amityville Horror," or outraged by it: I felt nothing but disdain. As a symbol of what some filmmakers and some studios think the public will buy, it's a horrific piece of work. How dare anyone put this piece of crap in front of me. How dare anyone put it in front of you.

Tuesday, May 17

Catch my drift? Ya get me?

Back in the day, I blogged that I'm (almost) the only person who uses the name 'fraxas' on the net. Well, it turns out I'm not the only person who has (almost) original thoughts: Atlas' expression "I'm smelling what you're stepping in" (for "catching your drift") gets relatively few Google hits.

So good job, man! you phrase-coiner you!

Saturday, May 14

Two Gamers Named 'Fingers'

Co-worker, Co-Nerd, and all-around good guy Dave and I were hanging out night before last, and he returned to me a book of poetry he'd borrowed almost a year ago. It's called Blue Wizard Is About To Die!. Lending him the book turned out better than I'd hoped: I read it before I lent it to him, and now it's been long enough that it's almost all fresh again. I remember snippets here and there, but most of it has faded so completely from my mind that I'm able to resavour every page as if I'd never seen it before.

It's an incredibly good collection of video-game poetry; I imagine it Means a whole lot less to people who don't play games, or didn't in the 80s, but whatever. It's still a different kind of insight into the mind of a gamer than any other you're likely to get. And it's written by a guy (who's younger than me! And published!) who goes by the nickname 'Fingers'.

So that's one of them.

The other one is a guy from battle.net that I played Starcraft against one time. His name wasn't actually 'Fingers'; it was something like "_.|.._^_^_..|._" , which if you squint looks like a happyface flipping a double bird. He and a friend of his were teamed up, playing against me and Atlas. He was good. He was better than me, for sure; he might even have been better than Atlas. I don't know for sure because he actually cheated, in the one game we played; he accused us of cheating in the typed chat interface the game has, thereby causing us to defend ourselves in the same chat interface. Of course, the time you spend chatting is time you're not spending actually playing the game, and as a result we got steamrolled; he used the totally unfounded accusation of cheating as a way to get a 2-minute head start on us. And it worked. And he admitted that's what he'd done.

That made me really sad. Really disappointed. Really, intensely, angry at the state of humanity that someone would sink to those kinds of depths to win at a computer game. So that's the second Fingers.

I really hope they're not the same person. Assuming we ever met, I'd hate to have to suckerpunch him in revenge after shaking his hand for writing one of my favourite books.

Friday, May 13

Some Reasons Software Sucks

Pharaohmagnetic and I were talking today about Paul De Palma's pseudomemoir The Software Wars, published in the Winter 2005 issue of American Scholar. (I'd give a link, but it's subscription only; Pharaoh receives the actual bound journal in the mail every season. Quaint, isn't it.) De Palma spends most of the article war-storying a few of his experiences in professional software development:
My company's sin went beyond working with complex, poorly understood tools. Neither the tools nor our system existed. The database manufacturer had a delivery date and no product. Their consultants were selling us a nonexistent system. To make their deadline, I am confident they hired more programmers and experimented with unproven software from still other companies with delivery dates but no products. And what of those companies? You get the idea.

His argument, paraphrased masterfully by Pharaoh, is this: software projects can be ruined by foolish business decisions in ways that tangible-goods projects cannot.

That's a good start. There's a lot of meat on that idea, a lot of tasty morsels to chew on. But I'd argue that it misses the point subtly: foolish business decisions can ruin any project. The difference is that the newness of software as a whole means that western corporate culture doesn't yet have an instinctive understanding of what constitutes a foolish business decision for a software project. Admittedly, we're doing better now than we were in the bubble days, when The New Economy (it doesn't exist) meant that old rules didn't apply (they do) and mindshare was the most important thing (it isn't).

Foolish business decisions on software projects have publicly ruined many a company. But market forces haven't finished culling the truly fat-headed, idiotic, soul-crushingly incompetent software project managers from the talent pool. Add to this the fact that if you're really lucky, the idiotic business decisions you made will only manifest after your product goes to market and its manager gets promoted, and you've got a recipe for a continuing culture of Really Bad Products in the software world.

So one of the reasons software sucks is that the people who make it suck.

Another reason is that software, since it's just a collection of bits rather than a tangible good, is pretty hard to test destructively. So its bounds of correct operation are really hard to determine without actually, y'know, releasing it into the market and waiting for it to break. Engineers can build scale models of their bridges and, within certain well-understood limits, be confident that the full-scale bridge will act the same way as the small one did. De Palma touches on this, but he fails to draw this distinction:

A few years ago, an IBM consulting group determined that of twenty-four companies surveyed, 55 percent built systems that were over budget; 68 percent built systems that were behind schedule; and 88 percent of the completed systems had to be redesigned. Try to imagine the same kind of gloomy numbers for civil engineering: three-quarters of all bridges carrying loads below specification; almost nine of ten sewage treatment plants, once completed, in need of redesign; one-third of highway projects canceled because technical problems have grown beyond the capacity of engineers to solve them. Silly? Yes. Programming has miles to go before it earns the title "software engineering."

Since a software engineering department's output is itself the model of the production system (we deliver the code you need to execute to generate the running process, not the running process itself), they don't have the luxury of modelling as a mechanism to test assumptions about the systemic qualities of their product.

Additionally, OSes and libraries suffer the same problems of obscurity and untestability. So another reason software sucks is that the tools used to make it suck, which I mentioned before. To summarize that post, software development tools don't give you intuitive ways of determining correctness; they require neocortical intelligence as opposed to reptilian-brain intelligence.

A third reason: perhaps consumer demand is not strong enough. The consuming public is already quite used to the general crappiness of software - when a system fails, sometimes they get mad, but most of the time they utter a sigh of resignation, pick up the pieces, and accept the fallibility of the product in question as par for the course. But there's a finer point here: this overall trend of Software Consumer Resignation contributes to an atmosphere within the developer community where there is strangely low pressure for true innovation. Particular trendbucking examples are the iPod and Google - no one really knew just how good a portable music player or a search engine could be before they hit the market. People were satisfied with Walkmen and AltaVista; they didn't know that vastly superior products could exist, let alone expect or demand them.

A fourth reason, which De Palma illustrates quite well, is the Military Procurement Analogy. So well, in fact, that I'll block-quote this big excerpt and let it speak for itself:
The characteristics of software often cited as leading to failure - its complexity, its utter plasticity, its free-floating nature, unhampered by tethers to the physical world - make it oddly, even paradoxically, similar to the practice of military procurement.

Late in 1986 James Fallows wrote an article analyzing the Challenger explosion for the New York Review of Books. Instead of concentrating on the well-known O-ring problem, he situated the failure of the Challenger in the context of military procurement, specifically in the military's inordinate fondness for complex systems. This fondness leads to stunning cost overruns, unanticipated complexity, and regular failures. It leads to Osprey aircraft that fall from the sky, to anti-missile missiles for which decoys are easy to construct, to FA-22 fighters that are fabulously over budget. The litany goes on. What these failures have in common with the Challenger is, Fallows argues, "military procurement disease," namely, "over-ambitious schedules, problems born of too-complex design, shortages of spare parts, a 'can-do' attitude that stifles embarrassing truths ('No problem, Mr. President, we can lick those Viet Cong'), and total collapse when one component unexpectedly fails." Explanations for this phenomenon include competition among the services; a monopoly hold by defense contractors who are building, say, aircraft or submarines; lavish defense budgets that isolate military purchases from normal market mechanisms; the nature of capital-intensive, laptop warfare where hypothetical justifications need not - usually cannot - be verified in practice; and a little-boy fascination with things that fly and explode. Much of this describes the software industry too.

Fallows breaks down military procurement into five stages:
  1. The Vegematic Promise, wherein we are offered hybrid aircraft, part helicopter, part airplane, or software that has more features than could be learned in a lifetime of diligent study. Think Microsoft Office here.
  2. The Rosy Prospect, wherein we are assured that all is going well. I call this the 90 percent syndrome. I don't think I have ever supervised a project, either as a software manager overseeing professionals or as a professor overseeing students, that was not 90 percent complete whenever I asked.
  3. The Big Technical Leap, wherein we learn that our system will take us to regions not yet visited, and we will build it using tools not yet developed. So the shuttle's solid-fuel boosters were more powerful than any previously developed boosters, and bringing it all back home, my system was to use a database we had never used before, running on a computer for which a version of that software did not yet exist.
  4. The Unpleasant Surprise, wherein we learn something unforeseen and, if we are unlucky, calamitous. Thus, the shuttle's heat-resistant tiles, all 31,000 of them, had to be installed at the unexpected rate of 2.8 days per tile, and my system gobbled so much disk space that there was scarcely any room for data.
  5. The House of Cards, wherein an unpleasant surprise, or two, or three, causes the entire system to collapse. The Germans flanked the Maginot Line, and in my case, once we learned that our reliance on a promised database package outstripped operating-system limits, the choices were: one, wait for advances in operating systems; two, admit a mistake, beg for forgiveness, and resolve to be more prudent in the future; or, three, push on until management pulls the plug.
We see the effects of big-ticket government procurement all the time; here's a boingboing link to a relevant example.

Now, with the problems identified, how do we proceed? The field of software engineering is still quite young. Bridge-building, too, went through an industrial-scale learning period - Pharaohmagnetic points out the Firth of Tay, Quebec Bridge, and Tacoma Narrows as oft-cited examples of failure that all civil engineers study in their freshman courses. Imagine a future where standards for software projects mirror those for bridges; a future where software and the tools to make it are so intelligent, it codes itself.

On second thought, let's not just imagine it. Let's make it happen.

Wednesday, May 11

Jeff Freeman on gratification in MMO design

In most MMOs, there's no easy way to determine what hoop you have to jump through to get a particular reward, beyond the simple "get more powerful by whacking foozles". So if you want to know more details, you go to an external website -- probably fan-run, probably inaccurate, definitely full of idiotic comments from people you wouldn't want to meet online, let alone in real life. Jeff thinks MMOs should be less opaque. I tend to agree; it detracts a little from the virtual-world aspect of the MMO, but frankly the virtual-world crew should take their bearded heads out of their butts more often than they do and smell the GAMES SHOULD BE FUN coffee. Also, I should continue to mix metaphors with abandon.


My current honeymoon game (we're still seeing each other almost every night! I'm level 13!) is Guild Wars, and Guild Wars gets it almost right in this regard. I know, generally speaking, where I have to go next, and what I'm going to have to do to get there, and what my rewards will be; I've only really looked for outside information twice. One of those times was because I couldn't remember where the guy I'd gotten the quest from was, and the other was because I was at work and bored and wanted my Guild Wars fix.

Don't EVER force me to be smart.

Joel put up a new article today. It's called Making Wrong Code Look Wrong. It's aimed at coders, so it might be a bit Greek if you're not a huge nerd who programs computers for a living (or for fun). Essentially, the point of the article is the same as the point of most of Joel's articles: make life easy on yourself and on your users. They'll love you for it, and you'll be more productive.

This is a lesson that very few companies have learned.

In the article, he talks about ways to make errors more apparent in your code. This is a big problem in most code -- to make a frame carpentry analogy, most coding languages don't let you check whether your walls are straight and plumb until after you've got the drywall up. And it's correspondingly hard to go back and fix things. So Joel's suggesting various ways to set up your jobsite and construct your wall sections so that if they aren't straight and plumb and level, they don't look straight and plumb and level.
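Joel's flagship example is naming unsafe (user-supplied) strings and safe (already-escaped) strings so differently that mixing them up looks wrong on the very line where it happens. Here's a minimal sketch of that convention in Python - the `us`/`s` prefixes follow the article, but the function names are my own illustration, not Joel's:

```python
import html

# Convention: anything holding an Unsafe String gets a "us" prefix;
# anything holding a Safe (escaped) string gets an "s" prefix.
# Nothing enforces this mechanically -- your eyes do the checking.

def s_from_us(us_text: str) -> str:
    """Escape an unsafe string, producing a safe one."""
    return html.escape(us_text)

def write_p(s_text: str) -> str:
    """Emit a paragraph; callers should only ever pass s-prefixed values."""
    return "<p>" + s_text + "</p>"

us_comment = "<script>alert('pwned')</script>"  # straight from the user
s_comment = s_from_us(us_comment)

# write_p(us_comment)  # <- reads wrong at a glance: a us* value in an s* slot
print(write_p(s_comment))
```

The point isn't the escaping itself; it's that the wall that's out of plumb *looks* out of plumb before the drywall goes up.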

This almost gets back to my last Wormsign entry: structuring our artificial environments so as to take advantage of our natural capabilities is the most effective way to get (and stay!) productive. Of course, now that I've written that last sentence, I realize I'm talking about ergonomics here. Not physical ergonomics, where you set up your workstation so as to emulate the kind of motions and postures your body evolved for, but rather mental ergonomics, where you set up your workflows so as to emulate the kinds of thought your body(/brain/mind) evolved for. Let that almost-magically-competent pattern-recognizer in your head focus on the things you want it to. Exploit that almost-pathological inability to concentrate on one thing by letting yourself do more than one thing at once. And avoid the kinds of work environments that work against our bodies' and brains' needs: avoid context-switches, avoid waiting, avoid flashing lights, avoid sharp corners, avoid running with scissors.

Ultimately, I think it comes down to the fact that people have way, way more sensory bandwidth than they have memory bandwidth. It's hard to remember two things at once, let alone 4. But I can look at, and integrate information from, at least 7 different windows on my desktop at the same time. So if I put a window in the background for something I'll have to do later, or if I put a sticky on my desk for the doctor's appointment I have in 4 days, or if my coding environment allows me to see errors rather than have to remember to check for them, I'm a happier camper than I would be if I had to remember all that stuff. The less brainpower I have to devote to administrative tasks like checking walls for plumb and level, the more I can devote to actually Generating Shareholder Value and Improving All Our Lives.

Don't EVER force me to be smart. We'll all regret it.

Thursday, May 5

Wormsign: "I feel magnetic fields."

Todd Huffman has a sense we don't. It's an extension of touch; it's an ability to sense, via a neodymium magnet implanted into his left third fingertip, magnetic fields around him.


Now, he says he doesn't have particularly high resolving power (that is, fine details escape him) but he can tell gross characteristics of static bar magnet fields from 1-2 inches away, and detect powerful oscillating fields from 1 or 2 feet. He also says it isn't really of much utility to him -- it's just stuff he notices. But imagine! If this is the kind of thing tattoo parlours can do with a few months' planning and research, think of what a dedicated research program could do. Imagine knowing, as a consequence of having touched something, what its magnetic characteristics are in the same way you know what its hardness and temperature are. I can't even imagine the ways in which that's useful, but I'm sure it is.

One thing I can tell you it does really well right now: it integrates into the unconscious information that we currently have to think about. That's the way we're going to be able to hack our biological machinery to deal with mid-late 21st century infodensity. Funnel it through the mechanisms we already have for filtering, in parallel, enormous amounts of information. (Pop quiz: are you aware of the pressure your hand is exerting on your mouse at this second?)

This will be big.

Metareview: The Review Reviewer Strikes Again

This has got to be the best music review I've read in a long time.

I'd always thought he was content churning out the same tepid, formulaic gak he's reheated so artfully and redundantly over the course of his career with castrated, mediocrity rock wizards Matchbox Twenty.


This is hilarious. I can imagine a future where the East Asian continent empties of people, and then, 3.1 miles off North America's shoreline, a living nest of rafts and ships provides all our labour needs.

From the article on Forbes.com: (go to bugmenot for logins)

Two San Diego entrepreneurs have come up with a very literal twist on offshoring software development jobs. This pair wants to get their hands on a 600-cabin cruise ship and park it off the coast of El Segundo, Calif., just over the 3-mile border that marks international waters. They'll pack the boat with engineers who will write code day and night.

The two founders of SeaCode, David Cook and Roger Green, are confident their plan will float. All they need to do is classify their workers as "seamen," so that they're protected by international maritime laws that skirt the need for those pesky immigration visas.

Get a load of these quotes:

"We're not a slave ship," says Cook. Adds Green, "It's like the International Space Station."

"Try to get American software engineers to work at night," says Cook.

Wednesday, May 4

Imperious bloggers

No links this time, though I desperately want to tell you exactly who I'm dissing here. Anyway.

Sometimes I come across a blog that Just Sounds Wrong. I don't mean content wise (though there are plenty of Wrong-content blogs out there!) but rather in tone. Today, I was linkjumping randomly around some of the blogs I read, and happened across one that combined a number of my interests. I started reading. I started getting uncomfortable, without really knowing why. But after thinking about it for a bit, I know now: I was being talked down to. Nobody likes that.

This particular man's writing had the tone of pronouncements from the mount, complete with putdowns to the anonymous commenters on some posts. It's a shame, really; I bet he's not like that in person. Then again, STDU probably has the same problem.

On the internet, nobody knows you're a dog. But everyone thinks you're an asshole.

Metareview: Worst Review Ever

Pitchfork has always been one of my love-to-hate-it destinations. But this review is just so incredibly terrible, the utter badness of it has got to be some sort of joke I'm not getting. And that makes me feel dumb, which is the point of the whole Indie Rock Review Scene anyway, I guess.

Here's my attempt to imitate the writing in this review. Tell me how I'm doing!
Track 1 is a song that I think is a song I don't like. Hey, it sucks. And the track isn't very good and hey, isn't ben folds a guy? Hey, let's say hey again. It's not like hey, I can't contradict myself in the same sentence as this one.

Track 2: Instead of a banana, this is a song that is sitting on a desk I once ate. Propulsive, opening, swirling, gerunds never grow me bored. My attempt at a witty simile is as awkward as a an attempt at awkwardness by, yes, an awkward aardvark (hey that's a lot of a's.)

Tracks I can't count that high: Hey, do something I like and I'll say I don't like it. This song opens with a an opening and then hey, something after that (hey, isn't that hey hey hey!) What a cop-out.

If Ben Folds plays piano, then yes, I'll conclude this sentence with a random non sequitur. Yes, I changed my mind and didn't know what to do with the diaper. And when I'm done, there's a word I'll use one more time, yes, it's hey. Yes too is a word I overuse. And hey, I'm a scared little boy, I never meant any of the stuff I meant, this is a comma splice because there's no coordinating conjunction after, yes, the comma. Hey.

Interruptions, distractions, slowdowns

I'm sure you've seen the posts on Slashdot, BoingBoing, and myriad other places claiming that distraction lowers your score on an IQ test more than lack of sleep or chemical impairment does.

Of course, studies like this always end with a list of things that we should do -- turn off the music, shut the door, get a quiet working environment, and so on -- but there's one that's missing that I think's really important.

Have a workflow that doesn't have any big pauses in it.

My computers at work (both of them!) are old enough, and slow enough, that I have to wait for windows to open and tasks to complete and so on. Every time I click a link in my stupid CRM webapp that opens a new window, I have to sit there and wait for it. I wait for Firefox to open the window, then I wait for the app to populate it with 12 or 15 HTTP queries, each of which goes through my office's ridiculously small pipe. I seriously think that the router that connects us to the rest of the WAN is hamster-powered. Anyway. My point is that this takes so much time that I have time to get distracted. And then it takes me a long time to get back on track.

So eliminate those pauses! Clear the clutter out of your workflow the way you clear off your desk. Because it's slowing you down way more than the clutter on your desk is.