Information wars are about to get worse, says Yuval Harari

“Let truth and falsehood grapple,” argued John Milton in Areopagitica, a pamphlet published in 1644 in defence of freedom of the press. Such freedom, he admitted, would allow incorrect or misleading works to be published, but bad ideas would spread anyway, even without printing; better, then, to let everything be published and allow rival opinions to compete on the battlefield of ideas. Good information, Milton confidently believed, would drive out bad: the “dust and cinders” of falsehood “may yet serve to polish and brighten the armoury of truth.”

Yuval Noah Harari, an Israeli historian, criticizes this position as a “naïve view” of information in a timely new book. It is a mistake, he argues, to suggest that more information is always better and more likely to lead to the truth; the Internet did not end totalitarianism, and racism cannot be eliminated by fact-checking. But he also argues against a “populist view” that objective truth does not exist and that information should be used as a weapon. (It is ironic, he notes, that the notion of truth as illusory, which has been embraced by right-wing politicians, originated with left-wing thinkers such as Marx and Foucault.)

Few historians have achieved the global fame of Harari, who has sold more than 45 million copies of his mega-histories, including “Sapiens.” His admirers include Barack Obama and Mark Zuckerberg. A techno-doomsayer, Harari has warned of the harmful effects of technology in his books and speeches, yet he captivates the very Silicon Valley bosses whose innovations he criticizes.

In “Nexus,” a narrative spanning the Stone Age to the age of artificial intelligence (AI), Harari sets out to “better understand what information is, how it helps build human networks, and how it relates to truth and power.” The lessons of history can, he suggests, serve as a guide to addressing the great information-related challenges of the present, chief among them the political impact of AI and the risks to democracy posed by disinformation. In an impressive feat of timing, a historian whose arguments operate on the scale of millennia has managed to capture the zeitgeist perfectly. With some 70 nations, representing around half the world’s population, heading to the polls this year, questions of truth and disinformation are top of mind for voters and readers.

Harari’s starting point is a novel definition of information itself. Most information, he says, represents nothing and has no essential link to truth. The defining characteristic of information is not representation but connection; it is not a way of grasping reality but a way of linking and organizing ideas and, crucially, people (it is a “social nexus”). Early information technologies, such as stories, clay tablets, or religious texts, and later newspapers and radio, are ways of orchestrating social order.

Here, Harari is building on an argument he has made in his previous books, such as “Sapiens” and “Homo Deus”: humans prevailed over other species thanks to their ability to cooperate flexibly in large numbers, and shared stories and myths allowed those interactions to extend beyond direct contact between people. Laws, gods, currencies, and nationalities are intangible things that are created through shared narratives. Those stories don’t have to be completely accurate; fiction has the advantage of being able to simplify and ignore uncomfortable or painful truths.

The opposite of myth, which is interesting but may not be accurate, is the list, which boringly attempts to capture reality and gives rise to bureaucracy. Societies need both mythology and bureaucracy to maintain order. Harari considers the creation and interpretation of sacred texts and the rise of the scientific method as contrasting approaches to the questions of trust and fallibility, and to the tension between maintaining order and pursuing truth.

He also applies this approach to politics, treating democracy and totalitarianism as “opposing types of information networks.” Beginning in the 19th century, mass media made democracy possible at the national level, but they also “opened the door to large-scale totalitarian regimes.” In a democracy, information flows are decentralized and rulers are assumed to be fallible; in totalitarianism, the opposite is true. And now digital media, in various forms, are having their own political effects. New information technologies are catalysts for major historical changes.

Dark Matter

As in his previous works, Harari’s writing is confident, wide-ranging and laced with humour. He draws on history, religion, epidemiology, mythology, literature, evolutionary biology and his own family biography, often jumping back and forth across millennia within the space of a few paragraphs. Some readers will find this stimulating; others may experience whiplash.

And many will wonder why, for a book about information that promises new insights into AI, he spends so much time on religious history, and in particular the history of the Bible. The reason is that holy books and AI are both attempts, he argues, to create an “infallible superhuman authority.” Just as decisions made in the fourth century AD about which books to include in the Bible turned out to have far-reaching consequences centuries later, the same, he worries, may be true today regarding AI: decisions made about it now will shape the future of humanity.

Harari argues that AI should really stand for “alien intelligence” and fears that AIs are potentially “new kinds of gods.” Unlike stories, lists, or newspapers, AIs can be active agents in information networks, like people. He fears that AI will exacerbate existing dangers, such as algorithmic bias, online radicalization, cyberattacks, and ubiquitous surveillance. He imagines AIs creating dangerous new myths, cults, and political movements, and devising new financial products that wreck the economy.

Some of his nightmare scenarios seem far-fetched. He imagines one autocrat becoming dependent on his AI surveillance system and another, distrusting his defence minister, handing over control of his nuclear arsenal to an AI. And some of his concerns seem quixotic: he rails against TripAdvisor, a website where tourists rate restaurants and hotels, as a terrifying “peer-to-peer surveillance system”. He has a habit of conflating all forms of computing with AI. And his definition of “information network” is so loose that it encompasses everything from large language models like ChatGPT to witch-hunting groups in early modern Europe.

But Harari’s narrative is engaging and his approach is surprisingly original. He admits that he is an outsider when it comes to writing about computer science and artificial intelligence, which gives him a different and refreshing perspective. Technology enthusiasts will find themselves reading about unexpected aspects of history, while history buffs will gain an understanding of the debate over artificial intelligence. Using storytelling to connect groups of people? That sounds familiar. Harari’s book is an embodiment of the very theory it sets forth.

© 2024, The Economist Newspaper Limited. All rights reserved. From The Economist, published under license. The original content can be found at www.economist.com
