AGI Will Displace Humans Like Cars Displaced Horses: Historian Niall Ferguson

Humans appear to be well on the path to creating intelligences superior to their own, but it remains to be seen how relevant they will be once such intelligences exist.

British historian and author Niall Ferguson has offered a stark perspective on the future of humanity in the age of Artificial General Intelligence (AGI). His analogy, comparing the potential displacement of humans by AGI to the displacement of horses by cars, is chilling, especially given the rapid advances in AI we have witnessed. He suggests that AGI could render humans largely redundant, leading to a decline in population and potentially even extinction.

“I think the advent of AGI will coincide with, or just precede, the decline in population,” Ferguson states. “The human race will just go the way of horses. Horses used to be the defining form of transport for most of recorded history until we came up with something that was clearly better than a horse, namely a car.”

This comparison forms the crux of his argument: “I think if AGI is achieved – general intelligence superior to most humans – then we will have good reason to go extinct, or at least to shrink in our numbers the way horses have.” He dismisses the idea that this is mere fear-mongering: “This is not… AI doomsaying. It’s just an obvious inference. If (future AI) is as good as Sam [Altman, CEO of OpenAI] says it will be, most humans will be redundant. Simple as that.”

Ferguson’s argument culminates in a reference to science fiction, invoking the hostile Trisolarans from Liu Cixin’s The Three-Body Problem: “If we create the aliens in our own midst… if we create the Trisolarans from The Three-Body Problem, we make them, then what do we expect to happen?”

Ferguson’s analogy, while provocative, raises critical questions. If AGI surpasses human intelligence in all domains, what roles will humans fill? Will we become reliant on AGI, effectively surrendering control over our own destiny? The comparison to horses, while seemingly extreme, highlights a potential power imbalance. Horses, despite their strength and historical importance, were ultimately subservient to human needs and desires. Could humanity face a similar fate, becoming irrelevant or even a burden in a world dominated by superior artificial intelligence?

Furthermore, his reference to the Trisolarans underscores a deeper concern: the potential for conflict. If we create an intelligence superior to our own, can we guarantee its benevolence? The Trisolarans, driven by the need to survive, pose an existential threat to humanity in Liu Cixin's novels. This fictional scenario forces us to confront the potential consequences of creating an intelligence that might view humanity as a competitor or even an obstacle. Ferguson's perspective, while perhaps pessimistic, serves as a crucial warning: the development of AGI holds immense promise, but it also presents unprecedented risks.
