Variety (cybernetics)
Perspective (graphical)
Psychology
Cognition
Internet privacy
Avatar
Perception
Social psychology
Computer science
Human–computer interaction
Artificial intelligence
Neuroscience
Authors
Tianling Xie, Benjamin George, Iryna Pentina
Source
Journal: Developments in Marketing Science: Proceedings of the Academy of Marketing Science
Date: 2023-01-01
Volume/Issue: 35-36
Citations: 1
Identifier
DOI: 10.1007/978-3-031-24687-6_12
Abstract
A Conversational Agent (CA) is a computer program that generates dialogue to interact with human beings. It can assume a physical manifestation, such as a robot, take the form of an avatar, or be merely embedded within a website or smartphone application (app) without any human-like appearance. Both text- and voice-based CAs are increasingly accepted in daily life: smart home products such as Alexa and Google Home have achieved commercial success, and smartphone voice assistants like Siri and Cortana are used as productivity tools. However, because of the human-like conversational skills CAs manifest, they also sometimes assume a unique social role under special circumstances. While a number of studies have considered the consumption intention of CA products from a trust perspective, no research, to our knowledge, has considered the human-CA relationship as a source of emotional trust. Trust has been included in many previous CA acceptance and use studies, but very few researchers have differentiated between cognitive and emotional trust. To address this gap, we interviewed 14 existing users of the AI companion app Replika as a pilot study to understand users' perceptions of interacting with the app. Because of the newness of the phenomenon, we did not assume a theoretical framework in the pilot study. Instead, the grounded theory approach was adopted to capture a variety of perspectives, such as trust, privacy, satisfaction, and dissatisfaction. Content analysis of the interview transcripts revealed separate sources of cognitive and emotional trust. Based on the results, this study proposed a theoretical model of the effects of emotional and cognitive trust on intentions to continue using social conversational agent products. Furthermore, the study uniquely included the perception of relationship strength as an antecedent of emotional trust.
Practically, this study may appeal to marketing practitioners interested in understanding consumer behavior toward similar products, such as chatbots, voice assistants, and social robots. It may also offer policymakers insights into the mechanisms of trust building toward artificial intelligence agents and potential privacy-risk reduction measures.