Negative perceptions of outsourcing to artificial intelligence

People judge AI users more negatively across many tasks — but especially for love letters, apologies, and wedding vows

People tend to judge those who outsource tasks to artificial intelligence more negatively than those who do the work themselves — and this effect is particularly strong when AI is used for personal or emotionally meaningful tasks. In this project, led by Scott Claessens, we examined how relying on AI shapes how others judge the user.

Across six studies involving nearly 4,000 UK participants, we examined how people perceive others who use AI tools such as ChatGPT to complete a wide range of tasks, from writing computer code and daily schedules to composing love letters, apology notes, and wedding vows. While outsourcing to AI often led to more negative impressions overall, the strongest effects emerged for socio-relational tasks. We found that using AI for personal messages led people to see the user as less caring, less authentic, less trustworthy, and lazier — even when the user finished the task themselves, described good reasons for using AI, and was honest about their AI use.

As Scott Claessens noted, “People don’t just judge what you produce, they judge how you produce it.”

Dr Jim Everett explained:

“If you use AI for these kinds of social tasks that bind us together, you risk being judged not only because you didn’t put effort in, but because it makes people think you care less about the task and what it represents.”

By contrast, using AI for more practical or technical tasks — such as writing code or organising schedules — attracted far less criticism. As AI becomes embedded in everyday life, the study highlights a trade-off between efficiency and social meaning: while AI can save time, using it for more social tasks may come at a reputational cost.

As we write in our conclusion, “In a world of algorithm-mediated interactions, AI is no substitute for investing effort into our interpersonal relationships.”


Read More:

Claessens, S., Veitch, P., & Everett, J. A. C. (2025). Negative perceptions of outsourcing to artificial intelligence. Computers in Human Behavior, 108894.

As artificial intelligence (AI) tools become increasingly integrated into daily life, people are beginning to outsource not only professional tasks but also socio-relational ones. Large language models like ChatGPT can generate wedding vows, speeches, and personal messages, raising questions about how individuals who use AI for such tasks are perceived by others. In this paper, we conduct six pre-registered studies with British participants (N = 3935) to understand how people view those who expend less effort by outsourcing tasks to AI in different ways. We highlight a trade-off between efficiency and inferred moral character, authenticity, and value: outsourcing makes us think more negatively about not only the person and their motivations, but also the outsourced work itself. Importantly, this effect is not uniform. Reduced effort does not consistently lead to domain-general negative character perceptions across all tasks, but has particularly negative effects for outsourcing socio-relational tasks. Our results suggest that reduced effort matters not only because people value the time and energy spent, but because expending less effort through outsourcing triggers second-order perceptions that people are being less authentic and care less about the task. Our research highlights how relying on AI shapes our perceptions of the user, raising key philosophical questions about efficiency, authenticity, and social ties in a world filled with AI-mediated interactions.

https://www.sciencedirect.com/science/article/pii/S0010027724003147