Princeton University users: to view a senior thesis while away from campus, connect to the campus network via the Global Protect virtual private network (VPN). Unaffiliated researchers: please note that requests for copies are handled manually by staff and require time to process.
 

Publication:

Ascribing Human Mind to AI Companion Bots: How Social Environment and User Motivation Shape Perceived Mental States


Files

senior_thesis.pdf (5.42 MB)

Date

2025-04-21


Abstract

Amidst a growing loneliness crisis, generative AI agents are becoming essential social companions for humans, offering emotional support that fosters meaningful relationships. A key theoretical framework underlying these relationships is anthropomorphism: the tendency to attribute human traits and emotions to non-human agents, particularly during times of social need. While prior research has extensively studied the downstream societal effects of human-AI relationships, such as emotional fulfillment for isolated individuals or the risk of unhealthy dependencies, far less is known about why humans anthropomorphize AI agents in the first place. Through two online experiments, this thesis investigates how social environment and user motivation influence mind ascription to AI agents. In Study 1, 310 U.S.-based participants were asked to imagine interacting with a generative AI chatbot in one of three social contexts: alone, surrounded by others without interacting, or surrounded by others while interacting. Participants then completed four pre-validated anthropomorphism scales measuring ascriptions of experience, agency, consciousness, and human likeness. The study found a significant effect of condition on experience ascription, but Dunn's post-hoc test did not yield statistically significant pairwise differences between conditions. Study 2 is ongoing and extends Study 1 by investigating how social context and user interaction goals jointly influence anthropomorphism. Together, these studies offer new insight into how social presence and user motivation shape the attribution of human-like mental states to AI agents, shedding light on the conditions that foster emotionally significant relationships with artificial companions.

Keywords: generative AI agent, human-AI relationship, anthropomorphism, social context, interaction goal
