The Luddites Were Right!
"Is It OK to Be a Luddite?" asked novelist Thomas Pynchon all the way back in 1984, the year, according to George Orwell's prediction, that we'd all be living in a technologically advanced dystopian hell. At the time, the digital revolution was just taking off, the personal computer only a few years away from reality.
Pynchon asked whether the advent of the P.C. would be opposed by those who took after the Luddites of old – literary and intellectual humanists. During the Industrial Revolution, the European intelligentsia fretted over the effect things like textile machines and the steam engine would have on manual labor. That concern wound its way into the works of Lord Byron and Mary Shelley, who warned of technological progress gone awry.
Then the electronic word processor came along and shut the snooty intellectuals up. "Machines have already become so user-friendly that even the most unreconstructed of Luddites can be charmed into laying down the old sledgehammer and stroking a few keys instead," Pynchon lamented. The Luddite mindset looked defeated at the hands of a few college dropouts tinkering in their garages.
Fast-forward almost three and a half decades, and Pynchon might think differently. Two recent events have muddled our understanding of technological innovation, clouding what we thought we knew about the promise of advancement.
The first was the implosion of every techie's dream: the self-driving car. An auto-piloted Uber vehicle struck and killed a pedestrian near Phoenix, Arizona. At the time of the accident, the vehicle, a Volvo SUV, was operating in full autonomy; the driver behind the wheel was there only for emergencies. An emergency did occur, as Elaine Herzberg attempted to cross the road at night. The car's sensors didn't spot her in time. The brakes failed to kick in. Herzberg lost her life on the darkened street.
This was no human error. The Associated Press consulted two experts, who found that the SUV should have sensed Herzberg before the collision. "The victim did not come out of nowhere. She's moving on a dark road, but it's an open road, so Lidar (laser) and radar should have detected and classified her," explained law professor Bryant Walker Smith.
Researcher Sam Abuelsamid concurred, pointing out that the car's detection system "absolutely should have been able to pick her up." The word "absolutely" is apropos – we've been conditioned to expect tech to operate exactly as planned, yet we still find ourselves disappointed when the copier jams for the thousandth time.
The second incident to raise our suspicion of technology's benefits was Facebook's admission that the data firm Cambridge Analytica had "violated" its terms of usage by accessing the private data of an estimated 50 million users. Cambridge, a British consulting firm employed at one point by the Trump presidential campaign, mined individual data by paying Facebook users to take a personality quiz via a third-party app, then turned around and used the participants' contact lists as its own address book.
If the tactic sounds familiar – gathering minute details about individual voters by using Facebook's interconnectivity – it should. It was nearly the exact tactic the 2012 Obama campaign used, all to adoring fanfare. But since the Trump people did it, it's all of a sudden the equivalent of summoning Satan with a Ouija board.
Nevertheless, Zuckerberg made the rounds on cable news, apologizing over and over for the lapse in his company's oversight. "I wish we'd taken those steps earlier," Zuckerberg told a scolding CNN host about Facebook's updated standards to protect user privacy. "That ... is probably the biggest mistake that we made here."
The company's mea culpa didn't prevent its stock from tanking or big industry names like Elon Musk from completely dissociating themselves from the brand. And just like that, Zuckerberg's social media superstructure was struck on its Achilles heel: public perception.
The hubbub is all nonsense, of course. Only the most mendacious critics contend that Facebook was weaponized by dark forces. This is a center-left-driven moral panic used to excuse the communicative failings of the political class. Zuckerberg's a simple scapegoat, driven out for the sin of allowing his platform to be used by those without elite approval.
Modern-day Luddites should take heart over these developments. Uber has temporarily put the kibosh on testing autonomous vehicles. Facebook will never slough off the shame of enabling a boor like Trump to enter the White House. And now the social giant has to contend with a Federal Trade Commission investigation.
Noah Rothman of Commentary worries that the backlash against technology's failures may inhibit our appetite for innovation. In the wake of Elaine Herzberg's death, Rothman predicts, the "temptation to put the brakes on this paradigm-shifting innovation will be immense." But fear not, he assures us, as "this inevitable development will produce more winners than losers, as has virtually every other technological advancement of its kind."
Like Stalin's apocryphal omelet, death is sometimes the cost of betterment, whether it's mortal death, the death of privacy, the death of freedom, or the death of our attention spans.
There's something comically all too human about how many movies and books we produce about the dangers of technology, even as we seek to slake our unquenchable thirst to push the cyber-frontier. One viewing of Terminator 2: Judgment Day should be enough to cure us of our tech fetish, yet the technologists of Silicon Valley press forward with the development of the cybernetic endgame: self-aware artificial intelligence.
Pynchon recognized as much the year the first Terminator hit theaters. "[T]he next great challenge to watch out for will come," he wrote with a littérateur's foresight, "when the curves of research and development in artificial intelligence, molecular biology and robotics all converge."
What Pynchon described sounds eerily like the singularity, the epochal moment when A.I. becomes self-improving, awakening a new technological renaissance and changing humankind forever. Once it hits, and some researchers believe it already has, there is no going back. We'll have said our vows to our new algorithmically designed cyber-bride, locked into marriage 'til death do us part, at which point we upload our consciousness into cyberspace to "live" forever, in the loosest sense of the word.
Elaine Herzberg need not be written off as a casualty on our way to the brave new world of technological enhancement. A little of that Luddite spirit could help us distinguish between necessary technological change and self-sacrifice before the digital gods.