The new hatred of technology

It has never been easier, here in our simulation year of 2024, to hate the very forces that underlie the simulation: to hate, in other words, digital technology itself. And good for them, the haters. These now-ubiquitous tech critics no longer rely on vague, nostalgic, technophobic feeling alone. Now they have scientific papers to back them up. They have best sellers by Harari and Haidt. They have, imagine the complacency, statistics. Kids, in case you haven’t heard, are killing themselves, and the phones in their classrooms are supposedly to blame.

None of this bothers me. Well, teenage suicide obviously does; it’s horrible. But arguments that blame technology for it are not hard to debunk. What is difficult to debunk, and what does trouble me, is the one exception, in my view, to this rule: the anti-technology argument advanced by the modern philosopher.

By philosopher I don’t mean some vaunted, statistics-wielding self-help writer. I mean an overanalyzer at the deepest level, an almost absurdly systematic overanalyzer, someone who breaks a problem down into its constituent parts so that, when those parts are put back together, nothing looks quite the same. Descartes didn’t just rattle off “I think, therefore I am” in a hurry. He had to go as far into his own head as he humanly could, subtracting everything else, before he could arrive at his classic one-liner. (Plus God. People always seem to forget that Descartes, the inventor of the so-called rational mind, could not eliminate God.)

For someone trying to make a case against technology, then, a Descartes-style line of attack might go something like this: Go as far into the technology as possible, subtract everything else, break the problem down into its component parts, and where do you end up? Right there, of course: in the literal bits, the 1s and 0s of digital computation. And what do bits tell us about the world? I’m simplifying here, but to a large extent: everything is one thing or the other. A cat or a dog. Harris or Trump. Black or white. Everyone thinks in binary terms these days, because that is what is encoded and enforced by the dominant machine.
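
To make the reduction concrete, here is a small sketch of my own (not from the essay or from Evens’s book), in Python: any scrap of the world a computer can hold really does bottom out in 1s and 0s, and those 1s and 0s mean a cat, a candidate, or a number only because someone decides how to read them.

```python
# A minimal illustration (mine, not the essay's): content reduced to bits,
# and the same bits read back under two different interpretations.

word = "cat"                                   # an arbitrary piece of the world
raw = word.encode("utf-8")                     # ...reduced to bytes
bits = "".join(f"{byte:08b}" for byte in raw)  # ...reduced to literal 1s and 0s
print(bits)                                    # 011000110110000101110100

# The very same bits, interpreted two ways:
as_text = int(bits, 2).to_bytes(len(raw), "big").decode("utf-8")
as_number = int(bits, 2)
print(as_text)    # 'cat'
print(as_number)  # 6513012
```

Whether that reduction actually dictates how we think, as the critics claim, is the question the rest of the argument turns on.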

Or so goes, in a nutshell, the strongest argument against digital technology: “I binarize,” the computers teach us, “therefore I am.” Techno-literate thinkers have been venturing versions of this theory of everything for some time now; earlier this year, an English professor at Dartmouth, Aden Evens, published what is, as far as I can tell, the first properly philosophical codification of it, The Digital and Its Discontents. I had a chat with Evens. A decent person. He’s not a technophobe, he insists, but still: it’s clear he thinks the suffering of digital life is world-historical, and that it is rooted in the technology’s very foundations.

Once upon a time I might have agreed. Now, as I said: I’m worried. I’m dissatisfied. The more I think about the technophilosophy of Evens et al., the less I want to accept it. There are, I think, two reasons for my dissatisfaction. First: Since when do the base units of a thing dictate its expression at higher levels? Genes, the basic units of life, account for only a fraction of how we develop and behave. Quantum phenomena, the fundamental units of physics, have no bearing on my physical actions. (Otherwise I’d be tunneling through walls, and dead half the time when I wasn’t.) So why should binary digits forever define the limits of computation, and of our experience of it? New behaviors have a way, when complex systems interact, of mysteriously emerging. Nowhere in the individual bird can you find the flocking algorithm! Turing himself said that you cannot look at a computer’s code and know completely what it will do.
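
To make the emergence point concrete, here is a toy flocking simulation of my own devising (a standard Vicsek-style model, not anything from Evens or Turing): each bird follows a single local rule, match the average heading of its neighbors, and nothing more. No bird’s rule mentions a flock, yet a flock is what you get.

```python
# Toy Vicsek-style flocking: one local rule per bird, global order emerges.
import math
import random

random.seed(0)

N, ARENA, RADIUS, SPEED, NOISE = 60, 25.0, 5.0, 0.5, 0.1
birds = [{"x": random.uniform(0, ARENA),
          "y": random.uniform(0, ARENA),
          "theta": random.uniform(-math.pi, math.pi)} for _ in range(N)]

def alignment(flock):
    """0 = headings are random, 1 = the whole flock moves in lockstep."""
    vx = sum(math.cos(b["theta"]) for b in flock) / len(flock)
    vy = sum(math.sin(b["theta"]) for b in flock) / len(flock)
    return math.hypot(vx, vy)

for step in range(101):
    if step % 20 == 0:
        print(f"step {step:3d}  alignment = {alignment(birds):.2f}")
    new_thetas = []
    for b in birds:
        # The only rule: average the headings of birds within RADIUS
        # (wrapping around the arena edges), plus a little noise.
        sx = sy = 0.0
        for other in birds:
            dx = min(abs(b["x"] - other["x"]), ARENA - abs(b["x"] - other["x"]))
            dy = min(abs(b["y"] - other["y"]), ARENA - abs(b["y"] - other["y"]))
            if math.hypot(dx, dy) < RADIUS:
                sx += math.cos(other["theta"])
                sy += math.sin(other["theta"])
        new_thetas.append(math.atan2(sy, sx) + random.uniform(-NOISE, NOISE))
    for b, theta in zip(birds, new_thetas):
        b["theta"] = theta
        b["x"] = (b["x"] + SPEED * math.cos(theta)) % ARENA
        b["y"] = (b["y"] + SPEED * math.sin(theta)) % ARENA
```

Run it and the printed alignment climbs from near zero toward one: the flocking lives in the interaction, not in any individual rule, which is the sense in which bits at the bottom need not dictate behavior at the top.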

And second: Blaming the discontents of technology on 1s and 0s treats the digital as an endpoint, as some kind of logical conclusion to the history of human thought, as if humanity, as Evens suggests, had finally achieved the dreams of an Enlightened rationality. There is no reason to believe such a thing. For most of their history, computers were not digital. And if predictions of an analog comeback are correct, computing won’t stay purely digital for much longer. I’m not here to say whether computer scientists should or shouldn’t develop analog chips, only to say that if they do, it would be foolish to expect all the binarisms of modern existence, so thoroughly implanted in us by our digitized machines, to suddenly collapse into nuance and glorious analog complexity. We invent technology. Technology does not invent us.
