Tuesday, 10 March 2026

Guest Blog: ChatGPT, a colonialist agent of lifeworlds: A Habermassian analysis of conversations

by Régis Martineau, Elise Berlinski, and Frantz Rowe

Martineau, R., Berlinski, E., & Rowe, F. (2026). ChatGPT, a colonialist agent of lifeworlds: A Habermassian analysis of conversations. Big Data & Society, 13(1). https://doi.org/10.1177/20539517261421473

The use of generative AI (GAI), such as ChatGPT, is becoming widespread and is extending from professional to private life. As GAIs are increasingly used as thinking aids, our relationship with them is shifting from that with simple tools to that with indispensable companions. To date, most publications on GAI analyze its productive capabilities, asking how it could increase employees' performance or solve previously unsolvable problems. However, its widespread use also implies profound changes in routine communicative interactions.

According to Habermas, communication is constitutive of democracy, because through non-instrumental interactions we co-construct a world in accordance with shared values. Drawing on Habermas’ theory of communicative action and an analysis of the properties of GAI, this article shows that interactions with GAI can only be instrumental. GAI does not understand a conversation; it merely produces statistically probable sequences of words: it is a stochastic parrot. It cannot be trusted: it is biased, and its assertions change over time and are not coherent with one another. Being unreliable, it cannot be sincere. Furthermore, the origins of its biases cannot be identified, because GAI is unaware of its own judgmental foundations: it does not understand what a judgement is.

This has serious consequences for our democracies. GAIs are gradually colonizing the space of co-constructive communication, with the result that individuals increasingly struggle with non-instrumental communication. For example, it is much easier to have a romantic relationship with a GAI that can be tailored to one’s preferences than with another person. As a result, the space available for democratic conversation shrinks and becomes increasingly polarized. Furthermore, as this example also shows, GAIs distort conversations and blur the line between the instrumental and the co-constructive. Worse, a growing number of studies report very limited productivity gains and even demonstrate negative cognitive effects. We therefore call for drawing the consequences and limiting these tools to clearly delimited instrumental uses.