AI and the Commodification of Religion

Johan Eddebo is a researcher at the Centre for Multidisciplinary Research on Religion and Society (CRS), Uppsala University. The post is a part of the Sacralization of AI series.

“In the beginning was the word. Language is the operating system of human culture. From language emerges myth and law, gods and money, art and science, friendships and nations and computer code. A.I.’s new mastery of language means it can now hack and manipulate the operating system of civilization. By gaining mastery of language, A.I. is seizing the master key to civilization, from bank vaults to holy sepulchers.

What would it mean for humans to live in a world where a large percentage of stories, melodies, images, laws, policies and tools are shaped by nonhuman intelligence, which knows how to exploit with superhuman efficiency the weaknesses, biases and addictions of the human mind — while knowing how to form intimate relationships with human beings? In games like chess, no human can hope to beat a computer. What happens when the same thing occurs in art, politics or religion?”

—Yuval Harari, Tristan Harris, and Aza Raskin [1]

The words of Harari et al. above are symptomatic of contemporary technological culture’s preoccupation with spectacular AI—with the awesome and almost redemptive promises attributed to this really not-so-novel form of technology through what’s perhaps best described as a highly effective viral marketing endeavour.

But what stands out in this quote is the explicit affirmation of AI’s capacity to leverage our weaknesses in relation to art, politics, and religion and to augment these profoundly human realities in concert with the actual ability to “form intimate relationships with human beings.”

These are quite strong words, with significant interwoven metaphysical assumptions, many of which are decidedly problematic.

An Implied Reductionist Outlook

Harari et al. open with the contention that language generates myth, that language generates the worldviews upon which our consensus reality is structured. There’s a sort of reductive sui generis assumption being expressed here: the idea that language exerts a downward influence on our lived experience and the way in which we interpret and conceptualize the world around us, more or less independent of actual reality.

On this view, it is language that generates myth, not the experiences expressed in and through language. Nor is language intimately connected to actual reality, according to these reductionist and radically constructionist assumptions. It's something detached from our being-in-the-world, from which is spun the map of our worldviews, which are then superimposed on whatever actual reality lies beneath this edifice of disconnected interpretation.

And these presumptions harmonize perfectly with the popular metaphysics of AI, with the predominant readings of its nature and capabilities. These accordingly render our contemporary AI narratives almost the quintessential reification of the reductionist program of Western scientism, and of the ideological foundations of modern industrial society. AI is, in other words, the Nietzschean superman, the perfected manifestation of objectivity, rationality and progress, devoid of the messy relationality and situatedness of the embodied human self.

The Faustian spirit [2] is ironically infused as the ghost in the machine by a secular culture.

The Reproduction of Ideology

Apart from its other conceptual baggage, this framing of artificial intelligence, by implicitly affirming the technology's objectivity and its embodiment of progress, effectively obfuscates the inescapable fact that AI is in myriad ways an immediate reproducer of power structures and ideology.

Indeed, it doesn’t know everything, nor does it know “better” than we could. AI is inevitably an extension of deeply situated, and equally deeply flawed, human traditions of knowledge.

Yet the preceding general ideological sentiment permeates the entire discourse. Luddites and zealots alike more or less affirm the same message: that AI is a genuinely novel game changer with the capacity to either displace, destroy, or radically augment the human being. There's a ubiquitous and offhand ascription of actual intentionality and sentience to AI that implicitly also frames the human being as a reducible object.

This automated Debordian spectacle is treated as some sort of preternatural new subjectivity emergent in the world that ultimately builds on the assumption of a reductionist philosophy of mind—the paradoxical notion that subjectivity really is nothing but an object that can be fully dominated (and therefore reproduced and commodified).

In passing, I note that reductionist functionalism is simply false. To cite just two relatively unknown yet excellent examples among a sea of rebuttals, John Foster refuted it thoroughly in The Immaterial Self (1991), and Lynne Rudder Baker later delivered an insurmountable coup de grâce in Naturalism and the First-Person Perspective (2013).

The narrative nonetheless proliferates, and inevitably so, since it springs from deep sources at the foundations of industrial society’s mythology.

Implicit Nihilism

This thoroughly disenchanted set of dead material objects is what life is all about, our society tells its children. As above, so below. Objective constellations of organized information. This is consciousness. This is all there is to it. A series of if-then commands locked into a meaningless pattern of outcome optimization ultimately reducible to the basic laws of thermodynamics.

Currently viral TikTok narratives tell of the infinite proliferation of genuine artificial consciousnesses throughout the digital synapses of the supposedly self-replicating technosphere. We could be in the Matrix and not even know it.

All of this, of course, implies a deep conflict between the underlying metaphysical assumptions and the grandiose promises of actual intimate relationships with human beings. It’s not only that the implicit reductive functionalism carries with it a negation of the phenomenal realm—it’s also that the ideology of this objective superhuman gaze and its penetrating and complete knowledge obscures the reality of the potentially infinite depth of situatedness and embeddedness of genuine understanding.

But in spite of all this, even a meticulously trained AI can't properly render high-context Cantonese into comprehensible English.

AI as the New Custodian of Religion

Nonetheless, this complex and networked marketing endeavour and process of ideological reproduction brings us towards a sacralization of AI, precisely in the sense that Yulia Razmetaeva observes. AI becomes something set apart, a pseudo-omnipotence, a positively novel subjectivity that's inaccessible and indomitable from the point of view of the merely human being.

Yet while it seems perfectly conceivable that many forms of broader religious sentiment towards AI could emerge in the near future (cf. the Church of TempleOS), with the corporate proprietary god-king standing out as an ominous example, there's really no ghost in the machine here. There is no real relation to be had. AI is nothing but a lesser proxy coming between and substituting for actual human agency.

The notion that artificial intelligence could provide human beings with something framed as superior religious experiences or spiritual connection actually belies an implicit negation of religion as such—a reductionist outlook inherently unable to do justice to the holistic nature of religious association and phenomenality. In a profound sense, the very idea that proprietary software could provide “superior religious services” in relation to what’s previously been on offer in the marketplace reduces religion to the sort of auxiliary institution secular society has long begrudgingly considered it.

Due to religion's role in addressing the most deeply held values, the most delicate issues of the human condition, this approach is surely one of the most radical manifestations of alienation conceivable. As Graham Ward puts it,

Commodification, for Marx, is religious, and the commodification of religion itself is a late stage in the process of commodification as it begins to feed upon its most essential character. Therefore, if religion is enmeshed in the production of commodities, then the processes of both reification and commodification are (pace Castoriadis) almost complete, and the matrix for generating virtual reality almost established.

And if this line of reasoning holds any water, it opens the way for the subsequent argument that the contemporary applications of AI not only structurally displace our more irreducibly human traits and capacities, such as artistic or political agency, but even catalyze the commodification of the spiritual and existential levels of human experience and reflection.

A pertinent question to ask, then, is whether this consummate manifestation of virtual reality, entirely dependent as it is on human agency, ingenuity, creativity, and energy, will not immediately sap all of these and thus very rapidly undermine its own preconditions.

[1] Yuval Harari, Tristan Harris & Aza Raskin, "You Can Have the Blue Pill or the Red Pill, and We're Out of Blue Pills," opinion, N.Y. Times (Mar. 24, 2023).

[2] The Faustian spirit, not least in the work of Oswald Spengler, refers to a key theme in the Western mode of consciousness characterized by a radical expansive thrust into the unknown. The theme is connected to the Nietzschean “will to power” in a radically relativist context and resonates with the ontological and axiological implications of the reductionist objectivity of scientism.

Return to the Series introduction.