Opacity
Transparency (behavior)
Stakeholder
Perspective (graphical)
Computer science
Knowledge management
Assurance
Artificial intelligence
Business
Political science
Public relations
Computer security
Finance
Optics
Physics
Authors
Markus Langer,Cornelius J. König
Identifier
DOI: 10.1016/j.hrmr.2021.100881
Abstract
Artificial Intelligence and algorithmic technologies support or even automate a large variety of human resource management (HRM) activities. This affects a range of stakeholders with different, partially conflicting perspectives on the opacity and transparency of algorithm-based HRM. In this paper, we explain why opacity is a key characteristic of algorithm-based HRM, describe reasons for opaque algorithm-based HRM, and highlight the implications of opacity from the perspective of the main stakeholders involved (users, affected people, deployers, developers, and regulators). We also review strategies to reduce opacity and promote transparency of algorithm-based HRM (technical solutions, education and training, regulation and guidelines), and emphasize that opacity and transparency in algorithm-based HRM can simultaneously have beneficial and detrimental consequences that warrant taking a multi-stakeholder view when considering these consequences. We conclude with a research agenda highlighting stakeholders' interests regarding opacity, strategies to reduce opacity, and consequences of opacity and transparency in algorithm-based HRM.