Keywords: feeling, chatbot, perception, psychology, dialogue system, affect (linguistics), computer science, social psychology, human-computer interaction, cognitive psychology, artificial intelligence, World Wide Web, communication, neuroscience, dialogue
Authors: Marian McDonnell, David Baxter
Source: Interacting with Computers (Oxford University Press)
Date: 2019-03-01
Volume/Issue: 31(2): 116-121
Citations: 37
Abstract
Chatbots are very much an emerging technology, and there is still much to learn about how conversational user interfaces will affect the way in which humans communicate, not only with computers but also with one another. Further studies on anthropomorphic agents and the projection of human characteristics onto a system are required to develop this area. Gender stereotypes exert a profound effect on human behaviour, and applying a gender to a conversational agent brings with it the projection of user biases and preconceptions. Users draw on these feelings and perceptions about an agent to form mental models of a system, and they can be inclined to measure the success of a system by their biases and emotional connection with the agent rather than by the system's performance. Many studies have shown how gender affects human perceptions of a conversational agent; however, there is limited research on the effect of gender when applied to a chatbot system. This paper presents early results from a research study indicating that chatbot gender does have an effect on users' overall satisfaction and on gender-stereotypical perception. Subsequent studies could examine the ethical implications of these results and expand the research by increasing the sample size so that statistical significance can be established, and by recruiting a more diverse sample from various backgrounds and experiences.

RESEARCH HIGHLIGHTS
Many studies have indicated how gender affects human perceptions of a conversational agent; there is limited research, however, on the effect of gender when applied to a chatbot system.
This study presents early results indicating that chatbot gender does have an effect on users' overall satisfaction and gender-stereotypical perception.
Users are more likely to apply gender stereotypes when a chatbot operates within a gender-stereotypical subject domain, such as mechanics, and when the chatbot's gender does not conform to gender stereotypes.
This study raises ethical issues. Should we exploit this result and thereby perpetuate bias and stereotyping? Should technical-advice bots really be given a male persona? The dilemma is that a male version would elicit more trust, which risks reinforcing the very stereotype it exploits.
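As a rough illustration of what "increasing the sample size so that statistical significance can be established" involves, the sketch below runs a standard a priori power analysis. This is not the authors' method: the abstract does not report the test used or an effect size, so the choice of an independent-samples t-test comparing satisfaction scores across two chatbot-gender conditions, and the medium effect size, are assumptions made purely for illustration.

```python
# Illustrative a priori power analysis for a follow-up study (not the
# authors' reported method). Assumes satisfaction scores are compared
# between two chatbot-gender conditions with an independent-samples
# t-test; the effect size is a hypothetical "medium" Cohen's d.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.5,          # assumed medium Cohen's d (hypothetical)
    alpha=0.05,               # conventional significance level
    power=0.8,                # conventional target power
    alternative="two-sided",  # no directional hypothesis assumed
)
print(f"Participants needed per condition: {n_per_group:.0f}")  # ~64
```

Under these assumptions, roughly 64 participants per condition would be needed; a smaller true effect would push that figure considerably higher, which is why a pilot study's early results alone cannot establish significance.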