Presupposition
Robot
Deception
Illusion
Psychology
Computer Science
A Priori and A Posteriori
Social Psychology
Social Robot
Cognitive Psychology
Human-Computer Interaction
Artificial Intelligence
Epistemology
Mobile Robot
Philosophy
Robot Control
Source
Journal: IEEE Transactions on Affective Computing
[Institute of Electrical and Electronics Engineers]
Date: 2011-08-31
Volume/Issue: 3 (4): 388-393
Citations: 60
Identifiers
DOI:10.1109/t-affc.2011.29
Abstract
A common objection to the use and development of "emotional" robots is that they are deceptive. This intuitive response assumes 1) that these robots intend to deceive, 2) that their emotions are not real, and 3) that they pretend to be a kind of entity they are not. We use these criteria to judge whether an entity is deceptive in emotional communication (good intention, emotional authenticity, and ontological authenticity). They can also be regarded as conditions of "ideal emotional communication" that saliently operate as presuppositions in our communications with other entities. While the good intention presupposition might be a bias or illusion we really need for sustaining social life, in the future we may want to dispense with the other conditions in order to facilitate cross-entity communication. What we need instead are not "authentic" but appropriate emotional responses, appropriate to the relevant social contexts. Criteria for this cannot be given a priori but must be learned, by humans and by robots. In the future, we may learn to live with "emotional" robots, especially if our values change. However, contemporary robot designers who want their robots to receive trust from humans had better take into account current concerns about deception and create robots that do not evoke the three-fold deception response.