(Promises in Anarchy)
"A law", I once wrote, "is a statement of incapacity." I was spiraling evasively into the introduction to a college essay about the systemic efficacy and desirability of using technology to enforce social policy, and although inevitably much about the paper now makes me cringe, a few of its central arguments still seem cogently chosen and usefully explicated to me. I summarized the social tradeoff between privacy and accountability fairly well, I think, and extrapolated the particular technological characteristics of that tension accurately enough to anticipate the general tenor of debate fifteen years later, although not the specific cast that cryptography would lend parts of it. I vastly overpredicted the degree of spatial virtuality in cyberspace (i.e., I had read "True Names" and Neuromancer before they were canonized and qualified), but knew we were headed for something much richer and more inclusive than BITNET, and something in which laws would have an increasingly tenuous role. "Laws replace judgement", I said, not doing my case any favors by misspelling "judgment". What I should also have said is that nearly everything replaces judgment. Machines, systems, chance, fashion, religion, discipline, habit, loyalty, love, panic, freedom. We grasp at any mechanism or excuse to avoid having to confront a decision squarely and thoroughly, and choose a specific, unambiguous answer out of nothing but our own moral abilities.
Whatever historical prescience I can hope to obliquely claim, though, the paper's politics and conclusions were disastrously naive. At the time I wanted desperately to be a rational anarchist (and what are liberal-arts colleges for if not to help young people experience their formative rational-anarchist phases), but I couldn't sustain the elitist myopia quite long enough, and in my eventual failure plunged haplessly into what I now realize was egregious technofascism. We pass laws when we feel individuals can't be trusted to act in socially desirable ways on their own recognizance, using their own judgment. My obtuse improvement on this concession was to think that where the laws didn't work, either, we could use technology to make disobedience simply impossible. This had the tiniest shred of plausibility when applied very strictly to urban traffic-control, and was a hopeless idiocy otherwise. In some ways it's taken me fifteen years of professional labor as a designer of systems to disabuse myself of the thoughtful compulsive's fantasy of finally building a machine for making other people always do the right things.
What I think I've gradually come to understand, though, at least until this too is supplanted in my evolution, is that fascism, even in its nominally apolitical technology incarnations, is an error of overspecificity, and a willful sacrifice of the whole to the parts. We recite this as a cautionary mantra now, "Mussolini just wanted to make the trains run on time", but I think we think it means that people get carried away, which is not at all the lesson. Mussolini's was an error of conception, not execution. His mistake was defining train schedules as his metric of social probity, and more particularly (and more generally), believing that society should be metered, or could. Fascism is not an egotist aberration, it is a pathology of distrust. The fascist, and the technofascist even more so, is afraid of people. Fascists build laws and machines that aspire to eliminate the people where possible, and minimize the destabilizing effects of their humanity when not. This is a self-consistent political philosophy, but one that can only self-consistently be held by robots.
For people, the sole viable answer is not to obviate the need for trust but to throw ourselves on its mercy. The great advantage of machines over laws is that the machines enforce themselves, but the great advantage of laws over machines is that laws can blink and except and forgive. I was right, a law is an admission that we can't rely on individual judgment, but the application of law is itself an exercise in judgment. If the law forgets that it exists to serve people, the legal system can remember. It might not, always, when it should, but at least it can.
The huge catch in this simplistic reduction, of course, is that trust and judgment rely on information. Machines and laws both take known input. Some of the time, the problem is not that people aren't willing to do the right things, it's that they are placed in contexts where the right things cannot be identified from the information available. Our technology increasingly obscures the relevant criteria with the irrelevant, overlaying illusory and ill-fit simplicity on genuinely and meaningfully complex problems. We replace people with numbers, and data with summaries, and paragraphs with quasi-graphs, until we're left with shiny Go/No-Go toggles, and then hope that flipping a coin will count as wisdom if we just do it with a firm enough hand. We push decision-making up the hierarchy while information drains down it. Our systems for progress and compliance will fail in the long term, not just because over-modeling embeds dysfunction and engenders systemic fragility, but more crucially because over-modeled systems untrain their most important users long before they can afford to. The executives make bad decisions based on misconsolidated numbers, and the people who know what the numbers misrepresent slowly forget how to make better ones.
Arguably the worst feature of this problem is that it is self-exacerbating, and very nearly self-initiating. If two roads cross in an unmarked intersection, in a world of unmarked intersections, drivers in all directions have no alternative but to conduct a full review of the state of the intersection before attempting to go through it. Ditto for pedestrians trying to cross a street, bicyclists, utility workers, small dogs, parachutists, etc. The system neither facilitates nor constrains human decision-making at this node. This state is technically stable, but socially vulnerable. One day somebody will suggest putting up a Stop sign.
If you argue against the Stop sign, the best possible result is that you will simply lose. More likely, you will get yourself vilified and ostracized. A Stop sign is self-evidently safer than an unmarked intersection. Think of the children. Better yet, think of the children as idiots (which will get easier the more you listen to their parents). Distrust them, and distrust drivers, and distrust yourself. The Stop sign will make people slow down.
Or it's supposed to, but it probably won't. Put up a single Stop sign, and you have begun the process of dehumanizing every person it confronts, and thus of building a system that will be measured in dehumanist units. The sign suggests a behavior. The absence of the sign in other directions now also suggests behaviors. Worse, the absence of the sign everywhere else in the world now suggests behaviors. The individual participant in the system can still opt to try to make their own decisions, but this will be confounded by the unassessable (to them) influence of systemic suggestion on the other participants. Before the Stop sign, everybody had to go slow. After the Stop sign, people in one direction will learn to stop, but the people in the other three directions will learn that they can speed up. Obviously the second Stop sign is an immediate necessity, as is teaching your children not to cross that street anymore, except that's frightening so you might want to ask the city council to put up a fence. People will drive even faster once there's a fence to protect the children, but a modern city will want programmable traffic lights, anyway, so you can optimize city-wide traffic patterns. Now people will drive faster in all those optimized directions, which will lead to some of them running the lights, which will lead to new laws and new concrete and more police and a combination of national highway system and automobile industry whose list of safety innovations is only outnumbered by the roll call of its dead.
This pattern is recapitulated over and over, in contexts larger and smaller. We scrutinize each step, but neglect origin and destination. The rational anarchist I wanted to be has no chance; rational anarchy is a group behavior, not an individual response. The rational traffic anarchist will quickly become a statistic. The Kantian I now am isn't much better off, either. The Kantian driver will be reduced to sheepish obedience. In an over-constrained world, there are no universal laws left to legislate, so the legislator is left to drive within the lines. I don't believe I am exaggerating when I say that I am the most diligently lawful driver I know. I don't even park illegally. It's not self-righteousness, it's self-defense. My best personal hope of surviving our current traffic system is to obey the laws, hope everybody else obeys them too, and spend some of the leftover moral energy trying to change the ones that seem the most specifically counter-productive. The overall system is beyond my ability to repair.
Different systems, obviously, impose different intractabilities on change and different costs on dissent. It is unlikely that a heedless eight-year-old chasing a soccer ball will suddenly run out in front of my illegal file download, so copyright adherence may seem to belong to a different conversation than the one about whether the "No U-Turn" signs on Mass Ave apply to you. In practice, I find myself behaving the same way. I hate Sony and BMG, but I like music. Copyright law has serious problems in its current mechanics, but I still basically support the premise that artists should retain some control over their art (not that there aren't other premises you could assign to copyright law), and so, industry exploitation and nuisance copy-protection and moronic DMCA nonsense notwithstanding, paying for music feels to me more like supporting the future I want than does downloading it for free.
And I am getting married. The practical arguments for legal marriage belong to the same genus as obeying traffic signs, but in this case are of very little individual consequence to me. When I decided to get married (which only sounds unilateral because it was Belle who proposed to me), it was because I believe wholeheartedly and unapologetically in the greatness of people promising to share the rest of their lives with each other, and I agree with the institution's most gibbering "defenders" that recognizing this promise with civil privileges is one of the core responsibilities of a civilization. Except that some of them think it's a reason for civilization, which is circular, or a prerogative of civilization, which is backwards. In an anarchy, what we promise each other is nobody's business, is not business at all. Marriage is a necessity of a regulated community. That is, if you constrain human freedom, you incur the obligation to see that your constraints do not come between people and their free promises to each other. Our civilization has done an incomplete job of this, in more ways than just the gender constraints we're now fighting about, but we will keep fixing the things that are broken. As a society, we will keep trying to understand how to make the legal implications of marriage match the moral ones. As a married person, I will keep trying to figure out how the faith in my promise inhabits the anima of my instincts.
But marriage is a law, not a machine, and the moral substance of Belle's and my promises to each other will not even be laws. Same-sex marriage is not going to destroy our civilization, because sex is not a civil issue. (Note that reproduction could be, but not on this planet, where we're going to need more parents and fewer children for the foreseeable future, not vice versa.) It's a painful irony that the factions we can usually rely on to campaign for reducing the imposition of government on people are in this case (and a couple other notable ones also involving sex) trying to increase it. To be fair, though, it's an even more painful irony that in most other realms it's the factions who claim compassion and humanity as their distinctions who end up trying to substitute leery paternalism for inspiring confidence. Our nominal conservatives are moral cowards and bullies at once, but our nominal liberals are self-defeating manipulators and self-perpetuating bureaucrats.
Here, then, is my injunction to myself and to all of us, in our roles, minor or major, as administrators of this civilization we are trying to share. Rational anarchy is inoperable, but it is the moral impulse in which all deviations from it are grounded. Our laws should declare nothing we can afford to leave unsaid, because laws inevitably compound, and we are better equipped to survive simple flawed systems than complex ones. Our machines should enforce nothing we can afford to leave to imagination, because the purpose of machinery is to amplify our imagination, not to predetermine it. When the rare opportunities for thinking of yourself as a legislator of universal law fall to you, perform them to the best of your judgment and trust, and the rest of the time, try to ride public transportation or a bike.
But these are the simplest cases, albeit the most common. The complex cases are the next order. We will have laws, and machines. When the rare opportunities for thinking of yourself as a designer of systems fall to you, remember that it is not the designer's judgment and trust you are trying to serve, but that of the participants in the system. It is not enough for you to do the right things, you must find a way of designing a system in which everybody is free, and able, and even invited, to do the right things themselves. You must have the courage not to make things easy for them when they need to make hard decisions, not to simplify away details they need to be confused by, not to pre-answer questions they need to ask, not to empower them when the wisdom to use that power is obscured from them. You must not replace judgment. We are not robots, and if we do not treat each other as robots, we will not be fascists, either. We must trust each other, even where disaster seems assured. How else will we learn? We must understand that the best system-design may involve building nothing, or dismantling things we've relied on. Somebody may have to argue against the Stop sign, after all. There can sometimes be better order in chaos. We must be willing to set ourselves terrifyingly free. How else will our promises define us unless they are exactly what hold us together and apart?