Keywords
Chatbot
Duality (grammatical number)
Psychology
Human–computer interaction
Computer science
Dual role
Cognitive psychology
Cognitive science
Artificial intelligence
Chemistry
Combinatorial chemistry
Literature
Art
Authors
Yi Jiang, Xiangcheng Yang, Tianqi Zheng
Identifier
DOI: 10.1016/j.chb.2022.107485
Abstract
As one of the most popular AI applications, chatbots are creating new ways and new value for businesses to interact with their customers, and their adoption and continued use will depend on users’ trust. However, due to the non-transparency of AI-related technologies and the ambiguity of application boundaries, it is difficult to determine which aspects enhance the adoption of chatbots and how they interactively affect human trust. Based on the theory of task-technology fit, we developed a research model to investigate how two conversational cues of chatbots, human-like cues and tailored responses, influence human trust toward chatbots and to explore appropriate boundary conditions (individual characteristics and task characteristics) in interacting with chatbots. One survey and two experiments were performed to test the research model, and the results indicated that (1) perceived task-solving competence and social presence mediate the pathway from conversational cues to human trust, which was validated in the contexts of e-commerce and education; (2) the extent of users’ ambiguity tolerance moderates the effects of the two conversational cues on social presence; and (3) when performing highly creative tasks, human-like chatbots induce higher perceived task-solving competence. Our findings not only contribute to the AI trust-related literature but also provide practical implications for the development of chatbots and their assignment to individuals and tasks.