Generative grammar
Computer science
Process (computing)
Usability
Artificial intelligence
Generative model
Perspective (graphical)
Commercialization
Generative design
Data science
Human–computer interaction
Marketing
Business
Metric (unit)
Operating system
Author
Soobin Jang, Haeyoon Lee, Yujin Kim, Daeho Lee, Jungwoo Shin, Jungwoo Nam
Identifier
DOI:10.1016/j.tele.2024.102175
Abstract
With the commercialization of ChatGPT, generative artificial intelligence (AI) has been applied almost everywhere in our lives. However, even though generative AI has become an everyday technology that anyone can use, most non-expert users need to understand the process behind the results and the reasons for them, because the technology can be misused through insufficient knowledge and misunderstanding. Therefore, this study investigated users' preferences for when, what, and how generative AI should explain the generation process and the reasoning behind its results, using a conjoint method and mixed logit analysis. The results show that users are most sensitive to the timing of providing eXplainable AI (XAI) and that they want additional information only when they ask for explanations while using generative AI. These findings will help shape the XAI design of future generative AI from a user perspective and improve usability.
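As a rough illustration of the kind of mixed logit analysis the abstract refers to (not the authors' actual model, data, or attribute coding), the sketch below simulates choice probabilities for hypothetical XAI design profiles with respondent-level taste variation. The attribute names, coefficient means, and standard deviations are all assumptions made up for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical XAI design alternatives (illustrative, not from the paper):
# columns = [explain_only_on_request, explain_during_use, detail_level]
alternatives = np.array([
    [1.0, 0.0, 1.0],   # explanation only when the user asks, low detail
    [0.0, 1.0, 2.0],   # explanation pushed during use, high detail
    [0.0, 0.0, 0.0],   # no explanation (baseline)
])

# Mixed logit: taste coefficients vary across respondents.
# Assumed population means and standard deviations (purely illustrative).
beta_mean = np.array([1.2, -0.4, 0.3])
beta_sd = np.array([0.5, 0.8, 0.2])

# Draw individual-level coefficients from the assumed normal mixing distribution.
n_draws = 5000
betas = rng.normal(beta_mean, beta_sd, size=(n_draws, 3))

# Utility of each alternative for each simulated respondent draw.
utilities = betas @ alternatives.T                  # shape (n_draws, 3)

# Logit choice probabilities per draw, averaged over draws
# (the simulation step used when integrating over the taste distribution).
exp_u = np.exp(utilities - utilities.max(axis=1, keepdims=True))
probs = exp_u / exp_u.sum(axis=1, keepdims=True)
print("Average choice probabilities:", probs.mean(axis=0).round(3))
```

In an actual conjoint study the coefficients would be estimated from respondents' observed choices (e.g., by simulated maximum likelihood) rather than assumed, but the averaging over random coefficient draws shown here is the core idea that distinguishes mixed logit from a standard multinomial logit.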