Psychology
Social psychology
Androgyny
Context (archaeology)
Human intelligence
Gender bias
Developmental psychology
Masculinity
Psychoanalysis
Biology
Paleontology
Authors
Julia Spielmann,Chadly Stern
Identifier
DOI:10.1177/01461672241307276
Abstract
Do people prefer that artificial intelligence (AI) aligns with gender stereotypes when requesting help to answer a question? We found that people preferred gender stereotypicality (over counterstereotypicality and androgyny) in voice-based AI when seeking help (e.g., preferring feminine voices to answer questions in feminine domains; Studies 1a–1b). Preferences for stereotypicality were stronger when using binary zero-sum (vs. continuous non-zero-sum) assessments (Study 2). Contrary to expectations, biases were larger when judging human (vs. AI) targets (Study 3). Finally, people were more likely to request (vs. decline) assistance from gender stereotypical (vs. counterstereotypical) human targets, but this choice bias did not extend to AI targets (Study 4). Across studies, we observed stronger preferences for gender stereotypicality in feminine (vs. masculine) domains, potentially due to examining biases in a stereotypically feminine context (helping). These studies offer nuanced insights into conditions under which people use gender stereotypes to evaluate human and non-human entities.