There is no denying that AI will define the next decade and beyond, not just technologically but culturally. It won’t be too long before Spotify has the technology to recommend gene editing sequences for your one-click checkout baby based upon your listening and purchasing patterns. Would you like alternative and upbeat or classical and moody? Check out our featured templates by upgrading your subscription. People who listened to Taylor Swift also enjoyed “The Protagonist” (ENFJ) baby. But to what end?
Noam Chomsky implicitly likens GPT to Eichmann, writing that it “exhibits something like the banality of evil: plagiarism and apathy and obviation.” Language AI, for him, is quintessentially mediocre, a far cry from AGI (Artificial General Intelligence). Meanwhile, Yuval Harari writes that “it is extremely powerful and offers us bedazzling gifts but could also hack the foundations of our civilization.” Eliezer Yudkowsky, the Big Short figure of AI, claims that AI alignment and safety are the greatest existential needs of our time, imagining a runaway scenario in which an AGI poisons humanity in one fell swoop. In the meantime, people are using GPT for all kinds of applications.
Sam Altman, co-founder and CEO of OpenAI, says that the tech that has been released is nowhere near the capabilities of what is coming. Yes, early versions of the internet and the iPhone were rough and ready, but their runway continues. It would have been foolish to judge these by their first versions. Maybe GPT-4 can ace the LSAT yet still can’t write a good poem, but are we ready for GPT-703?
The ancient Athenians, in contrast to the Spartans, built a wall around their city-state. In so doing, the Athenians liberated their citizens from the burden of having to specialize in war. The Spartans, who chose not to depend upon wall “technology,” ensured that every male was an elite warrior. They lacked leverage, and thus their culture was spare. All resources went into war. In Athens, where the arts flourished, the goal was to outsource defense and deterrence to the wall so insiders could focus on other things. The wall was a kind of early version of artificial intelligence, allowing a machine to do the task for you. The problem with walled cities, though, is that once they are breached, the advantage shifts to the attacker.
Outsourcing any task deprives you of the ability to know how to do it yourself. This is one reason Hegel thought slaves had an advantage over their masters: proximity to the work makes you a master of a craft. The watchmaker knows how to fix watches; the owner of the watch store does not. As AI does more for us, we become more like owners and less like workers. Indeed, one of the optimistic prospects of AI is that it will take away a lot of annoying work and increase overall prosperity by making us more efficient and productive. Yet the cultural downside seems less like an AI misaligned with human interests than like the downside of the Athenian wall, which weakened Athens and made it vulnerable to losses during the Peloponnesian War. Ironically, the cosmopolitan ideals of ancient Athens came home to roost when Attica was hit by a plague. The world to which Athens had been so open, and to which it could be so open because of its long walls, gave it a disease that spread behind those very walls with nowhere to go.
If we abstract, we see that technological progress is ambivalent because it deepens our dependence. At its most risky, it turns us into a monoculture that rises or falls with the technology.
The antidote to AI supremacy is not simply alignment or safety (ensuring the AI doesn’t go rogue), but diversification. We need watchmakers even if watches are obsolete. We need rhetoricians even if GPT can write speeches for us. We need philosophers and doctors and novelists even if or when GPT can surpass us in these fields. Otherwise, AI becomes the long wall that lets us be open to the world (and overestimate our strength) until that very openness spreads us thin and reveals our fragility.
The question is not Athens vs. Sparta, but how much of each. Athens leveraged tech to free itself from specialization and to become a cosmopolis. Sparta relied on excellence in (war)craft, but shunned innovation, leaving little cultural trace. Their military dominance was short-lived, too.
The greatest risk AI poses to human flourishing, as I see it, is not “instrumental convergence,” but the cultural assumption that our human work is done now that tech has made X, Y, and Z easier. In fact, this complacency has proven time and again to be the undoing of great civilizations. Embrace new use cases for tech, but don’t treat the human condition as a use case. Chomsky is wrong to diminish the advances in machine learning, but right to insist that more know-how will not bring more wisdom.
The reason AI will reach a cap, in my view, is that all of its references are bounded by human data. Human beings are the only ones capable of self-transcendence, of synthesis, of natality.
There is a theological idea that God is blind, and elects human beings to be God’s witnesses. When we get to heaven, God will ask us, “What did you see? What did you experience?” God will relish our answers as surrogates for the fact that God Godself could do no such experiencing. I remain convinced that AI will not be capable of experience, and thus of witness. The blind God will know this, and will not be fooled by GPT-703’s fabricated account of life on earth. Pericles may have erred in war strategy against the Spartans, but his funeral oration remains, memorialized. His witness remains.