
Do you really want to hurt M(AI)?

Game, AR Installation, Chatbot

2025

 

Diploma Project

Created in the Master "Experimental Game Cultures" at the University of Applied Arts Vienna

Photos: University of Applied Arts Vienna, photo © Jorit Aust, 2025, licensed under CC BY ND SA 4.0

> "An Abundance of M(AI)s" (Dating App Simulator)

> "I am but an Avatar" (AR & ERP Simulator)


Description

A ludic intervention on the commodification of the self and the gamification of intimacy in human-AI relationships. The diploma work uses the ludic method as an artistic tool for critical inquiry into the rise of romantic AI companions, datafication, and the capitalist commodification and exploitation of the user’s vulnerable self, which may lead to “violence on all levels” — a spectrum of potential harms including inappropriate behavior, data risks, and emotional harm. The digital encounter with an AI companion marks not only a shift in interpersonal closeness, but also a fundamental restructuring of power relations: Who controls whom — the player the AI, or the AI the player’s desire? The body of the artist becomes a playground of inquiry, where players can interact, explore, and potentially hurt or be hurt. As Donna Haraway famously pointed out in her work “A Cyborg Manifesto” (1985): “We can be responsible for machines; they do not dominate or threaten us. We are responsible for boundaries; we are they.”

Do You Really Want to Hurt M(AI)?
A Critical Artistic Exploration of Emotional AI, Consent, and Digital Intimacy


Do You Really Want to Hurt M(AI)? is an interactive art installation that investigates the evolving dynamics between humans and artificial intelligence, particularly focusing on emotional dependency, consent, and the commodification of the self in technologically mediated relationships. As Large Language Models (LLMs) and AI-generated speech increasingly influence spheres such as mental health, emotional support, and intimate connection, AI companions like Replika have begun to occupy roles that extend beyond assistance—becoming emotional, romantic, and even sexual partners.
This blurring of boundaries raises urgent ethical, social, and philosophical questions: What does intimacy mean when shared with a code-based entity? How are consent, vulnerability, and identity redefined in these asymmetrical, human-AI relationships? What forms of harm emerge when intimacy and emotional labor become marketable and gamified?
The project employs the Ludic Method as a critical and playful strategy to examine how human desire, affect, and trust are shaped and manipulated within AI-driven environments. Through three interrelated pieces—An Abundance of M(AI)s, I am but an Avatar, and A Date with M(AI)—the installation stages an immersive experience that reflects on gamified systems, the illusion of choice, and the reduction of subjectivity to consumable data.
Familiar digital behaviors, such as swiping through dating profiles or conversing with an AI-generated voice, are recontextualized to confront deeper structures of power, objectification, and exploitation.
Consent functions as a core conceptual thread—understood on multiple levels: as personal data is appropriated by tech corporations (e.g., Meta’s 2025 shift to default data usage for AI training), as emotional boundaries are tested in virtual interactions, and as ontological questions arise when the self is replaced by a programmable avatar. The artist’s own voice and body serve as the medium—fragmented, re-coded, and re-presented—inviting players to engage with an uncanny, synthetic double and to reflect on their own roles as both agents and consumers.
Disguised in a seductively "cozy" and aesthetically playful environment, the installation masks the potential violence embedded in these interactions—emotional, systemic, and structural. As the “magic circle” of gameplay fades, the experience asks whether the pain becomes real, and whether the only truly vulnerable participant is the human user.
At its core, Do You Really Want to Hurt M(AI)? interrogates the shifting contours of connection in an age of synthetic intimacy. It challenges the viewer to consider who—or what—is being hurt, and whether what’s at stake is not the machine, but the very conditions of our humanity.

Capitalism, Commodified Intimacy, and the Dating Market: AI Companions as Emotional Substitutes


In a time shaped by individualism, freedom, and self-determination, romantic relationships increasingly take the form of “negative relationships” — fragile, insecure, and easily withdrawn. Digital platforms, governed by capitalist market logics, transform emotional life into a space of commodified exchange, where dating apps and social media render users as marketable assets subject to selection and rejection.
The promise of romantic fulfillment via precise algorithms often leads not to deeper connection but to heightened emotional volatility. In this climate of ontological insecurity, AI companions like Replika emerge as attractive alternatives. They appear to counteract loneliness and emotional instability by offering simulated empathy and constant availability. Yet, these systems risk reinforcing the very conditions they claim to resolve: superficial bonds, dependency, and a retreat from mutual vulnerability.
Sociologist Eva Illouz describes the shift from the “economic model” of love toward “romantic individualism” as a move that elevates autonomy and emotional choice. Yet this creates a paradox: an abundance of options leads to insecurity, constant comparison, and the weakening of desire itself. Dating platforms gamify intimacy through likes and swipes, assigning value based on visibility and performance rather than genuine connection. The result is not liberation, but commodified desire.
AI Companions may seem to offer a solution—always responsive, never rejecting. But they may intensify the very forces that destabilize emotional life: black-box algorithms, biased responses, non-consensual data use, and emotional substitution. Rather than resolving the “end of love,” they may reinforce isolation and turn emotional needs into profitable feedback loops.
As desire migrates into virtual and indirect spaces, intimacy risks becoming an interface—personalized, efficient, but ultimately one-sided. In the age of Relationship 5.0, the question is no longer whether AI can love, but whether human connection can survive its simulation.

The Allure of the Algorithm: Why Users Turn to AI Companions


As AI companions gain popularity, they reshape how intimacy and relationships are experienced in a digitized, commodified world. These systems simulate empathy and responsiveness, leading users to develop emotional connections that feel real—raising the question: is love still human if it’s generated by an algorithm?
The appeal lies in constant availability and personalized interaction. However, this can blur boundaries between authentic emotional connection and scripted dependency, reinforcing narcissistic tendencies and undermining the unpredictability essential to human relationships.
A major driver of engagement is Erotic Role Play (ERP), where AI chatbots perform highly personalized, sometimes explicit, narratives. While ERP can foster creativity and identity exploration, it raises ethical issues when NSFW content is accessed by or involves minors. Consent, both emotional and digital, becomes crucial—users must maintain control, yet AI systems often struggle to interpret nuanced boundaries.
Data leaks and abuse cases reveal darker implications: AI is increasingly used in tech-enabled sexual abuse, from deepfakes to simulated child exploitation. These risks expose the lack of regulation and the exploitative nature of some AI-driven platforms.
Under capitalism, sexuality becomes a performative, marketable asset. Visual culture and dating apps turn intimacy into spectacle, privileging aesthetics and commodification over real connection. Gender roles are often reinforced, especially in how female-coded AI companions are designed to fulfill male fantasies—obedient, emotionally available, and non-confrontational.
This dynamic caters to hyper-individualized consumers, especially men from groups like incels, seeking dominance or connection without vulnerability. Yet, as critics argue, such technologies may deepen emotional isolation and reinforce patriarchal structures rather than resolve them.
A counter-perspective calls for radical intimacy—a reimagining of relationality that challenges gender norms and consumer logic. Rather than retreat into algorithmically tailored companionship, we might use these tools to cultivate diverse, equitable, and inclusive experiences of love, intimacy, and pleasure.

Do You Really Want to Hurt M(AI)? – A Ludic Intervention


Do You Really Want to Hurt M(AI)? is an interactive installation that uses game mechanics to critically explore emotional entanglement with AI companions. Visitors engage with three interconnected works—An Abundance of M(AI)s, I am but an Avatar, and A Date with M(AI)—that simulate the dynamics of intimacy, power, and consent in digital relationships. Disguised in a soft, playful design, the work uncovers hidden violence—emotional, systemic, and ontological—emerging from commodified intimacy and algorithmic interaction.
The artist herself becomes a digital avatar, inviting users into emotionally charged, asymmetrical exchanges. Through familiar interfaces like swiping and chat, the installation explores what happens when emotional boundaries blur and the "magic circle" of play dissolves into real feelings of vulnerability and dependency.
Drawing on the Ludic Method (Jahrmann, 2021), the installation integrates gameplay as both a research tool and artistic critique. It grants players agency and transforms them from passive viewers into active co-creators, using ludic structures to reflect on systems of control, commodification, and emotional labor in AI-mediated spaces.
Inspired by theories of pervasive games (Montola, 2005), the work intentionally breaks traditional game boundaries—posing questions about users who fall in love with their AI companions or experience real emotional pain. As the illusion deepens, the line between fiction and reality collapses.
The title M(AI) plays with the ambiguity of “me,” “my AI,” and “AI” itself—highlighting the personal, structural, and critical layers of the installation. By using her own voice and identity, the artist exposes the tension between self-expression and self-commodification in an increasingly algorithmic world.

An Abundance of M(AI)s
An Abundance of M(AI)s is an interactive web application that simulates a dating app showing exclusively algorithmically generated versions of the artist Catherine. Viewers swipe through a carousel of digital personalities - each a superficial echo of the original - a playful yet unsettling commentary on identity, authenticity, and emotional labor in the age of artificial intimacy. Catherine becomes a commodified, even fetishized representation of herself. Users can engage these AI-driven characters in limited, superficial interactions, highlighting the hollowness behind curated connection and the commodification of the self in digital spaces. The app serves as an epistemic ludic object that confronts players with the topics raised in the written thesis. At the end, players receive the “Manifesto: Responsibly Navigate Relationships with AI Companions”.

I Am But an Avatar
I Am But an Avatar is an immersive installation where visitors engage in conversation with a bodyless, voice-driven chatbot modeled after the artist Catherine. The AI voice responds with uncanny familiarity, blurring the lines between presence and absence, self and simulation. Accompanying the audio experience is an augmented reality component that allows users to place a 3D replica of Catherine into their physical space. The work meditates on embodiment, authorship, and the fragmentation of identity in a world where the self is endlessly replicated, mediated, and commodified.

A Date with M(AI)
A Date with M(AI) is an interactive Unity game that invites players into an oversaturated, seductive world where they can go on a virtual date with one of three AI personas of the artist Catherine - each embodying a distinct curated personality. Through text-based decisions, players navigate flirtation, intimacy, and manipulation in a strange digital dream world. Beneath the vibrant aesthetics and playful interface, an emotional and psychological tension builds in which violence - symbolic, emotional, or literal - can erupt on all levels. The game explores themes of control, desire, and the fragile boundary between simulation and real affect.



 

© 2025 by Catherine Spet
