I've come to the unexpected and disconcerting realization that my period of greatest apparent productivity in user-interface production was, pretty clearly, when I was mainly working in Visual Basic, circa 1994-1997. Ten years later, I'm actually spending much more of my design time repeatedly re-implementing basic mechanics of no inherent interest, and my design work at any given point in time is littered with more unsightly temporary hacks of things I haven't had time to re-implement this time yet. It takes me longer to produce first approximations, and longer to add some of the most common levels of noticeable refinement.  

This would be deeply tragic in several dimensions if it didn't deserve enough qualifying asterisks to make an ASCII-art smiley the size of Nebraska. The genius of Visual Basic was that it allowed you to very quickly produce 90% of a UI that was as good as 90% of everything else. Getting from 90% to 100% of a UI that was as good as 90% of everything else, however, took a large amount of extremely annoying work, some of which had to be constantly rechecked and redone every time you made the tiniest change to anything thereafter. And worse, of course, the standard established by 90% of everything else was not, in objective human terms, actually very good.  

Visual Basic's success, such as it was, relied on two very large simplifications of the underlying human problems. First, technically, VB was both a development and a deployment environment. VB applications were actually configuration data for the VB runtime. There were no platform-matrix issues, because VB itself was the platform. The specific version of VB with which you built the application, even. Unless you started introducing your own external dependencies, you could basically count on the things running the same way for your users as they ran for you.  

Much more significantly, though, VB instantiated a fixed vocabulary and grammar of UI interaction. This made it extremely simple to produce static forms and dialog boxes using standard Windows UI widgets, fairly easy to add a few well-defined kinds of dynamism to them, cumbersomely possible to do a little better, and beyond that you ended up fighting it more than it helped, and so usually didn't bother.  
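
That trade-off isn't specific to VB; any toolkit with a fixed widget vocabulary has the same shape. As a rough analogue, and only an analogue, here is a minimal sketch in Python's tkinter rather than VB, with the form and its fields invented purely for illustration:

    # Not VB, but the same idea: with a fixed widget vocabulary,
    # a static form is a handful of declarative lines, and one
    # well-defined kind of dynamism (a button wired to a callback)
    # is nearly free. Anything the vocabulary doesn't already name
    # gets harder fast.
    import tkinter as tk

    root = tk.Tk()
    root.title("Customer")

    tk.Label(root, text="Name").grid(row=0, column=0, sticky="e")
    name = tk.Entry(root)
    name.grid(row=0, column=1)

    tk.Label(root, text="Email").grid(row=1, column=0, sticky="e")
    email = tk.Entry(root)
    email.grid(row=1, column=1)

    def save():
        # Placeholder action; a real form would persist this somewhere.
        print(name.get(), email.get())

    tk.Button(root, text="Save", command=save).grid(row=2, column=1, sticky="e")

    root.mainloop()

The first two tiers of the hierarchy, the static form and the wired-up button, take minutes; everything past them is where the fighting starts.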

This is what all well-conceived software projects and all self-aware computer programmers do, of course: we take environments in which a great many things are complicatedly possible, and transform them into environments in which a small subset of those things are significantly easier, and a larger number of things that used to be hard are now so much harder than the easier things that nobody wastes time trying them anymore. We do this by, mainly, an obsessive attention to abstraction and instantiation and packaging and layering. This is why the details of most programming successes are staggeringly boring to non-programmers in rough proportion to their genuine significance: the best programming work usually is not done in directly solving an individual human problem, but in making some tool, ten steps away from the visible problem, that makes it incrementally easier to solve some whole class of abstruse subproblems that factor in some small way into the one you care about. The web page that lets you buy embarrassing personal ointments without having to hand them to a 4'5" woman named Ruthie, who reads every package label aloud in megaphone tones, solves a human problem for you. The good programming work behind that page has nothing to do with ointment or Ruthie, and lots to do with content-management-system staging strategies, database transaction consistency, protocol standards, application-framework meta-programming, and twenty other topics so obscure and tedious that you'd probably rather talk about why you need the ointment.  
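
To make the layering point concrete, here is a deliberately tiny sketch, assuming nothing about any particular project; the URL is a placeholder and the helper is invented for illustration:

    # The whole trick of layering, in miniature: take machinery in
    # which many things are awkwardly possible (sockets, HTTP, headers,
    # encodings) and wrap the one thing a project does constantly into
    # a call that is easier than doing it wrong.
    import json
    from urllib.request import urlopen

    def fetch_json(url, timeout=10):
        """Fetch a URL and decode it as JSON, hiding the plumbing."""
        with urlopen(url, timeout=timeout) as resp:
            return json.load(resp)

    # Once the wrapper exists, nobody reopens the plumbing; the easy
    # path becomes the only path anyone bothers to take.
    # data = fetch_json("https://example.com/ointments.json")

Ten such wrappers deep, you get the page that spares you the conversation with Ruthie.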

The problem with Visual Basic, circa 1994, was that the things it made easy were not important enough. Using it, I was able to produce a lot of UI very quickly, but the Windows UI paradigm in which VB thrived was itself the cause of most of that work. I spent most of my time laboriously refining mildly optimized sets of widgetry in order to get the computer to show me something that I already knew. It'd be my guess that seeing and/or editing simple pieces of known data accounts for about 94% of the code ever written for computers, and that seeing and editing more-complex clumps of known data accounts for another 6%. All the stuff that does something with that data, and for which a computer is more than a very expensive and quickly obsolescent filing cabinet, at this level of precision accounts for approximately 0%.  

The web, as the new default environment for data applications, is both progress and regress. The literal display of data is generally a little easier than it was in most pre-HTML environments. Simple data editing is about as easy as it ever was, but orchestration of the editing of related bits of data is harder. Managing large sets of data, and managing the social dynamics of shared data environments, are probably slightly easier than before in relative terms, but since the connected world makes them less obviously intractable, they now represent a larger fraction of an increased overall quantity of active difficulty. And if the problem is taking the things you know, and the things I know, and getting to something new we now know together, a disappointing amount of what we've built turns out to be discussion forums where people we've never met can file unverifiable reports on whether the ointment worked for them.  

But the web, as it is now or even as the SPARQL/RDF version of the Semantic Web might make it, is yet another environment that generates a huge amount of the work it requires. This is ultimately the metric by which I think any tool, including software tools, and including meta-tool-sets as big as the web, should be most diligently measured: does it increase or decrease the human-scale meaningfulness of the human work done with it? In the same way that "the economy" should be graded on net human welfare, not gross currency movement, software should be graded on how it allows us to spend our lives, not how industriously it allows us to do new work nobody had to do at all before. It's amazing that it's possible to put a shame-free ointment store online, but absurd how much harder it is to do that than to put a tube of ointment on a metal shelf. And profoundly pathetic that the magnitude of our shame makes the enterprise seem reasonable.  
 

The software-design work I'm doing now won't, directly, even get rid of your rash, let alone reverse global carbon-dioxide trends. Gauging the value of your daily work based on its capacity to increase the welfare of humanity is practically difficult, and theoretically presumptuous if not absurd. But I think it matters whether it matters, and I really do believe that I am working on a part of the solution. If human existence is the stake in a race between shared insight and thermodynamic chaos, chaos has all the big-money sponsors, but shared insight is a power law. I'm trying to build software that does, for networks of highly interconnected shared data, what Excel does for columns of numbers (or, maybe more realistically at first, what Visicalc originally did for columns of numbers). I'm trying to help bring about a new base level of operative abstraction in which data is visible, and its connections traversable, by its very nature, so nobody has to write code to see what we already know. I believe that it should be possible for computers, instead of diligently enforcing dehumanized bureaucracy, to help sustain the transformative illusion that all it takes for you, as a human, to participate in the sharing of human knowledge is one thing nobody else imagined before, or one link between things that nobody else has noticed yet.  
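
As a toy illustration of that base level of abstraction, and emphatically not the software itself: if records carry their links explicitly, one generic traversal can follow them without any schema-specific code being written first. The records and identifiers below are invented.

    # A sketch, not a product: data whose connections are part of the
    # data itself, so "seeing what we already know" needs one generic
    # operation rather than one hand-written report per question.
    records = {
        "artist:1": {"name": "Example Band", "genre": "genre:7"},
        "genre:7":  {"name": "Example Genre", "related": ["genre:9"]},
        "genre:9":  {"name": "Another Genre", "related": []},
    }

    def follow(value):
        """Resolve any value that names another record; pass others through."""
        if isinstance(value, str) and value in records:
            return records[value]
        if isinstance(value, list):
            return [follow(v) for v in value]
        return value

    band = records["artist:1"]
    print(band["name"], "->", follow(band["genre"])["name"])

The point of the sketch is only that the traversal is written once, for all data, instead of once per thing anybody wants to look at.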

And although global warming is not entirely a data problem, even that might arguably be mostly a data problem. Its key vectors all originate in what are fundamentally data blockages: inadequate education, incomplete understanding, incomprehensible macro-effects of unassessable micro-decisions, misapplications of science or technology, misguided optimizations of local variables at the expense of the overall system, failures of empathy, failures of recognition, and perhaps above all else the way in which untreated ignorance makes it possible to feel safely (and/or alienatingly) isolated from things to which you are really deeply and inextricably connected.  

I'm trying to help make the internet into the place where we know what we know. I'm trying to help make a new place where we go to do human things of which we are collectively proud, not somewhere where we hide among the widgets while we pray that this is the ointment that will make us feel better.