Computer Science
Business
Psychology
Social Psychology
Advertising
Internet Privacy
Public Relations
Speech Recognition
Econometrics
Cognitive Psychology
Political Science
Economics
Authors
Scott Schanke,Gordon Burtch,Gautam Ray
Source
Journal: Management Science
Publisher: Institute for Operations Research and the Management Sciences
Date: 2024-11-01
Identifier
DOI: 10.1287/mnsc.2022.03316
Abstract
We consider the pairing of audio chatbot technologies with voice-based deep fakes, that is, voice clones, examining the potential of this combination to induce consumer trust. We report on a set of controlled experiments based on the investment game, evaluating how voice cloning and chatbot disclosure jointly affect participants’ trust, reflected by their willingness to play with an autonomous, AI-enabled partner. We observe evidence that voice-based agents garner significantly greater trust from subjects when imbued with a clone of the subject’s voice. Recognizing that these technologies present not only opportunities but also the potential for misuse, we further consider the moderating impact of AI disclosure, a recent regulatory proposal advocated by some policymakers. We find no evidence that AI disclosure attenuates the trust-inducing effect of voice clones. Finally, we explore underlying mechanisms and contextual moderators for the trust-inducing effects, with an eye toward informing future efforts to manage and regulate voice-cloning applications. We find that a voice clone’s effects operate, at least in part, by inducing a perception of homophily and that the effects are increasing in the clarity and quality of generated audio. Implications of these results for consumers, policymakers, and society are discussed. This paper was accepted by D. J. Wu for the Special Issue on the Human-Algorithm Connection. Funding: This work was supported by funding from the University of Wisconsin-Milwaukee Research Assistance Fund. Supplemental Material: The online appendix and data files are available at https://doi.org/10.1287/mnsc.2022.03316.
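For readers unfamiliar with the experimental paradigm, the sketch below illustrates the payoff structure of the investment (trust) game that the abstract cites as the basis for its experiments. The 10-unit endowment and tripling multiplier follow the standard Berg, Dickhaut, and McCabe design and are illustrative assumptions; the abstract does not report the paper's exact parameters.

```python
# Minimal sketch of one round of the investment (trust) game.
# Endowment and multiplier values are assumptions from the standard design,
# not parameters reported in the paper's abstract.

def investment_game(endowment: float, amount_sent: float,
                    multiplier: float, amount_returned: float) -> tuple[float, float]:
    """Return final payoffs (sender, receiver) for one round.

    The amount sent is the behavioral proxy for trust: the more the sender
    transfers, the more they stand to lose if the partner returns nothing.
    """
    assert 0 <= amount_sent <= endowment, "cannot send more than the endowment"
    pot = amount_sent * multiplier  # transfer is scaled up before the partner decides
    assert 0 <= amount_returned <= pot, "cannot return more than the scaled pot"
    sender_payoff = endowment - amount_sent + amount_returned
    receiver_payoff = pot - amount_returned
    return sender_payoff, receiver_payoff

# Illustrative round: the subject sends 6 of a 10-unit endowment (high trust),
# and the AI-enabled partner returns half of the tripled pot.
print(investment_game(endowment=10, amount_sent=6, multiplier=3, amount_returned=9))
# -> (13, 9)
```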