By Sepideh Bazazi (TCD); Jurgis Karpus (LMU); Taha Yasseri (TCD, TU Dublin), published in iScience, November 2025
Abstract:
Cooperation between humans and machines is increasingly vital as artificial intelligence (AI) becomes integrated into daily life. Research shows that people are often less willing to cooperate with AI agents than with humans and are more likely to exploit AI for personal gain. While prior studies indicate that human-like features in AI influence cooperation, the impact of AI’s assigned gender remains underexplored. This study investigates how cooperation varies with the gender labels assigned to AI partners. In a Prisoner’s Dilemma game, 402 participants interacted with partners labeled as AI or human, and as male, female, non-binary, or gender-neutral. Participants exploited female-labeled and distrusted male-labeled AI agents more than human counterparts with the same gender labels, reflecting gender biases similar to those in human-human interactions. These findings underscore the importance of accounting for gender bias in AI design, policy, and regulation.
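For readers unfamiliar with the game used in the experiment, the sketch below illustrates the payoff structure of a one-shot Prisoner's Dilemma in Python. The abstract does not report the actual stakes participants played for; the numeric payoffs here simply follow the standard textbook ordering (temptation > reward > punishment > sucker) and are purely illustrative, not the study's parameters.

```python
# Illustrative one-shot Prisoner's Dilemma.
# The payoff values (T=5, R=3, P=1, S=0) follow the canonical ordering
# T > R > P > S and are NOT the amounts used in the study described above.

from itertools import product

# (my_move, partner_move) -> (my_payoff, partner_payoff)
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # mutual cooperation (R)
    ("cooperate", "defect"):    (0, 5),  # being exploited (S vs. T)
    ("defect",    "cooperate"): (5, 0),  # exploiting the partner (T vs. S)
    ("defect",    "defect"):    (1, 1),  # mutual defection (P)
}

def play(my_move: str, partner_move: str) -> tuple[int, int]:
    """Return (my_payoff, partner_payoff) for a single round."""
    return PAYOFFS[(my_move, partner_move)]

if __name__ == "__main__":
    # Print every outcome of the game to show why defection is tempting
    # even though mutual cooperation pays more than mutual defection.
    for me, partner in product(("cooperate", "defect"), repeat=2):
        mine, theirs = play(me, partner)
        print(f"I {me}, partner {partner}s: I get {mine}, partner gets {theirs}")
```

In this structure, defecting against a cooperator yields the highest individual payoff, which is what "exploiting" a partner means in the study's terms; the finding is that participants chose this option more often when the partner was a female-labeled AI.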