In 1938, the English science fiction writer H. G. Wells envisioned the coming of a “world brain.” It was to be a universal encyclopedia of human knowledge; a codification of all history and culture into one technologically enhanced body of information accessible to anyone, anywhere. Raring to resume the “infinite march of progress” halted by the First World War,1 Wells called for the creation of a radically new kind of encyclopedia fit for a globalized, rationalized humanity. “Such an encyclopedia,” he said, “would play the role of an undogmatic Bible to a world culture.”2
Wells’s world brain may at first have appeared an outlandish utopian concept—coated with the metallic sheen of the sci-fi imagination—but it was simply a continuation of ideas about human knowledge that had become prevalent since the European Enlightenment. In the eighteenth century, Immanuel Kant had synthesized empiricism and rationalism into a universalized schema of experience that applied, in theory, to all people in all places. This was part of his mission against the perceived dogmatism of traditional institutions that hindered individuals’ use of reason. Animated by the same rationalist spirit, the secularist French Encyclopédistes sought to bind all human knowledge into one comprehensive index of world history. Nor was Wells alone in plugging these approaches to knowledge into the emerging technologies of his time: in the 1960s, Arthur C. Clarke picked up the mantle of the world brain and predicted that individuals would become connected worldwide through a device that a man would carry “in his vest pocket.” These ideas came to fruition through the innovation that defines our lives today: the internet.
The internet can be seen as the culmination of the Enlightenment project. Besides being a legacy of the Kantian aspiration for a universal and “undogmatic” space for human autonomy—allowing each of us as rational subjects to freely encounter objects of knowledge—it also echoes the egalitarianism of Thomas Hobbes and John Locke, who saw the world as composed of “self-contained, self-moving and self-oriented beings,”3 rendering us all, in an ontological sense, equal. The internet actualizes this with its uniform “users,” each with the same means of accessing a worldwide web of knowledge. As with the Age of Enlightenment, it was hoped that the internet age would bring about world peace. It was to be the digital alarm clock that finally awoke us from our “dogmatic slumbers,”4 simultaneously creating a universalized humanity and empowering individuals to discover things for themselves.
What motivated this ambition to unite all of humanity under a banner of shared knowledge? Perhaps counterintuitively, we may find its origins in traditional religious ideas. The impulse to seek knowledge, fundamentally a spiritual impulse, arises from a primordial curiosity about the world and our place within it. Teachings across different religions extol that pursuit, from the biblical proverb “the heart of him who has understanding seeks knowledge” (Proverbs 15:14) to the Prophet Muĥammad’s saying that “seeking knowledge is an obligation upon every Muslim.” The idea of making knowledge accessible to all human beings—especially knowledge that improves the conditions of human life and therefore our ability to perform good deeds and acts of worship—is harmonious with a theistic worldview in which God has created us with the potential to discover creation and the Divine.
Yet the universalizing, or perhaps totalizing, approach to knowledge that we see in Enlightenment Encyclopedism and the “world brain” redirects this spiritual impulse. These innovations arrived at a time when epistemology had already been profaned; Kant had (if gently) pulled the pursuit of truth down to a worldly plane.5 As scholar Lee Braver describes, Kant’s theory of knowledge was transcendental but not transcendent,6 attempting to account for all of human experience within a philosophical framework that legitimized only that experience which was immanent. This is a kind of epistemic flattening, we might say, evoking René Guénon’s critique of how “horizontal knowledge” (quantitative knowledge of things of this world) has replaced “vertical knowledge” (qualitative knowledge that looks beyond this world) owing to the gravitational pull of modern materialism.7 In effect, the contents of an encyclopedic world brain represent only a flat topography of human existence rather than the means of attaining transcendent truths. On this basis alone, it desanctifies the search for knowledge.
The ideology behind the world brain defies an important precept of traditional philosophies: the necessity of limits in the apprehension of knowledge. The internet represents not just a technological revolution but also an epistemological one, as it liberates subjects from their immediate physical contexts and from pedagogical institutions that had previously determined what, how, and when they were to learn. In other words, the internet helped overcome all limits. Like all other revolutions—characterized by a shift “from the reign of necessity to the reign of freedom”8—it placed the keys to all knowledge in the hands of each individual. For revolutionaries—be they Enlightenment thinkers, critical theorists, or anarchists—this removal of epistemic restrictions could only be a positive move. After all, they viewed institutions that upheld the restrictions as oppressive power structures embroiled in programs of what Michel Foucault called “knowledge production.”9
Despite being framed as a benevolent gesture toward inclusion and equality, the idea that all human beings should “dare to know” (Kant’s sapere aude) through their own reasoning ability has come with an unprecedented degree of epistemic hubris: an anthropocentric arrogance that insists that human beings, unmoored from teaching and tradition, are capable of acquiring and producing knowledge by and for themselves. We are warned against this kind of hubris in lessons such as Proverbs 1:7, which tells us that fools despise instruction. Throughout scripture, we find that the imperative to seek knowledge is tempered by awareness of the natural limits of our humanity and the need for guidance in the pursuit of wisdom. Subjective knowledge, after all, tends to be distorted by the fallibility of our senses and corrupted by our base desires. To assert that we can know things wholly independently, without instruction or guidance, is, in one interpretation of the Qur’anic verse, to transgress our limits (2:190).
The consequences of this transgression are glaringly apparent. Far from fulfilling its promise of a united humanity, the internet’s epistemological revolution has wrought hyper-individualism, severe polarization, and the utter fragmentation of knowledge. Instead of enabling us to attain universal truths, the internet has given us only a universal right to forge our own truths. With social media sophistry and the dissemination of information outside of pedagogical institutions, anyone can claim expertise on any given subject at the touch of a podcast, a video clip, or a colorful infographic. “Content creators” are now the promulgators of knowledge, albeit more preoccupied with “hot takes” than any sincere attempt to increase human understanding. The cacophonic and contrarian character of digital discourse endangers humanity as a whole and hinders our acquisition of wisdom as individuals. In despising instruction and seeking knowledge by and for ourselves, we are too often left overwhelmed, bereft of any real insight.
Philosophies that flourished before the modern era of the world brain had a different approach to knowledge: one that valued limits and recognized the dangers of transgressing them. The notion of placing certain restrictions on information was not seen as a tool of oppression, but rather served to initiate us into knowledge so we could come to understand the world through the guidance of those more learned than ourselves and through means that speak to us as human beings in the world. When knowledge is reduced to instantaneously accessible data, as is pervasive in the digital arena, we lose the human agency that determines what is Good or True, and are instead left susceptible to corrupting influences that seek to manipulate rather than to inspire real knowledge.
Traditional forms of authority—the teacher, the sage, the priest, or the philosopher king—were considered essential to the pursuit of knowledge for this very reason. After all, the proverbial concept of instruction implies the need for an instructor. But contrary to the modern and postmodern critique, the kind of authority that these archetypes embodied sought not to produce knowledge but rather to enable its discovery. As per the maieutic method of Socrates, the pedagogue should act as a midwife helping give birth to the student’s own understanding.10 Knowledge is a matter of drawing out more than putting in,11 a nuance that the likes of Foucault dismissed, being committed to the view that “truth is a thing of this world” and not something latent within the soul.12 When we open our minds to this aspect of traditional authority, we come to realize its ability to actualize human potential—and the need to revive it in a disordered digital age.
Indeed, the very etymology of the word “authority”—from the Latin augere, “to make grow”—reveals its inherent purpose: to nurture intellectual, moral, and spiritual growth, and therefore human flourishing.13 It may determine the direction of growth, just as heliotropic plants must be oriented toward the sun to receive light. Likewise, authority may regulate the educational environment in which a person learns—for example, as in Plato, prohibiting that which misdirects the soul’s inclination toward the Good—just as photosynthesis can occur only under a specific set of conditions. Authority should not suppress the variations in individuals’ thoughts or beliefs but rather preserve the one invariability that allows them to grow, naturally, into their best selves.
In practice, this model of authority determined the ordering of curricula within classical institutions. In liberal arts education, for example, students were first taught the preparatory disciplines of the trivium—grammar, logic, and rhetoric—before they learned those of the quadrivium—arithmetic, geometry, music, and astronomy. The pedagogical process placed limits on learning in order to initiate students into knowledge. Teachers embodied the notion of authority as that which encourages growth, setting boundaries to ensure that students were equipped to encounter complex concepts or, put another way, that they did not “get ahead of themselves.” Looking beyond the cynical dichotomy of master and slave, we find this relationship to be crucial; without the guidance of those more learned than us, we become intellectually adrift, susceptible to being lured away from the pursuit of truth.