In the Prologue to Nexus, Yuval Noah Harari posits that the rise of Artificial Intelligence (AI) will soon cause “an existential crisis” (xi) for humanity. As wise as humans are, they seem innately self-destructive. He cites myths and legends that warn humans not to “summon powers you cannot control” (xiii), but laments that these cautionary fables offer no answers beyond waiting for divine intervention. These fables, Harari suggests, misunderstand how humans gain and abuse power, which is closely tied to the human tendency to cooperate in large numbers. People typically believe that big information networks are inherently good. This view is naïve, Harari argues, because information can produce error as readily as truth.
The naïve view that “information is an essentially good thing, and the more we have of it, the better” (xvii) does not stand up to scrutiny. Inventions that have revolutionized human culture, from the printing press to the telegraph to the internet, have not eradicated prejudice or violence. Harari concedes that more information has improved the world in many cases, pointing to the dramatic decline in child mortality since the 18th century as an instance in which the naïve view of information proved correct.