Topics
Computer Science, Regret, Machine Learning, Duality, Mathematical Optimization, Diversity, Downstream, Optimization Problem, Artificial Intelligence, Algorithms, Mathematics, Art, Operations Management, Literature, Economics
Authors
Yichun Hu,Nathan Kallus,Xiaojie Mao
Source
Journal: Management Science
Publisher: Institute for Operations Research and the Management Sciences
Date: 2022-06-01
Volume/Issue: 68 (6), pp. 4236-4245
Citations: 16
Identifiers
DOI: 10.1287/mnsc.2022.4383
Abstract
Incorporating side observations in decision making can reduce uncertainty and boost performance, but it also requires that we tackle a potentially complex predictive relationship. Although one may use off-the-shelf machine learning methods to separately learn a predictive model and plug it in, a variety of recent methods instead integrate estimation and optimization by fitting the model to directly optimize downstream decision performance. Surprisingly, in the case of contextual linear optimization, we show that the naïve plug-in approach actually achieves regret convergence rates that are significantly faster than methods that directly optimize downstream decision performance. We show this by leveraging the fact that specific problem instances do not have arbitrarily bad near-dual-degeneracy. Although there are other pros and cons to consider as we discuss and illustrate numerically, our results highlight a nuanced landscape for the enterprise to integrate estimation and optimization. Our results are overall positive for practice: predictive models are easy and fast to train using existing tools; simple to interpret; and, as we show, lead to decisions that perform very well. This paper was accepted by Hamid Nazerzadeh, data science.
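As a concrete illustration of the plug-in ("estimate then optimize") approach discussed in the abstract, the minimal Python sketch below first fits an off-the-shelf regression model for the conditional cost vector and then solves the downstream linear program at a new context. This is not the authors' code: the linear model, the toy data-generating process, and the simplex feasible set are all assumptions chosen for brevity.

```python
# Illustrative sketch (not the paper's code) of the naive plug-in approach
# to contextual linear optimization: estimate E[c | x], then optimize.
import numpy as np
from sklearn.linear_model import LinearRegression
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Toy data: contexts X and observed cost vectors C (d-dimensional).
n, p, d = 500, 5, 3
W = rng.normal(size=(p, d))                      # assumed true linear effect
X = rng.normal(size=(n, p))
C = X @ W + 0.1 * rng.normal(size=(n, d))

# Step 1 (estimation): fit a predictive model for the conditional cost vector.
model = LinearRegression().fit(X, C)

# Step 2 (optimization): for a new context x0, plug the predicted costs into
# the downstream linear program  min_z  c_hat(x0)' z  s.t.  z in Z,
# where Z here is a toy polytope, the probability simplex {z >= 0, sum(z) = 1}.
x0 = rng.normal(size=(1, p))
c_hat = model.predict(x0).ravel()

res = linprog(
    c=c_hat,
    A_eq=np.ones((1, d)),
    b_eq=np.array([1.0]),
    bounds=[(0.0, None)] * d,
    method="highs",
)
z_plugin = res.x  # plug-in decision for context x0
print("predicted costs:", c_hat)
print("plug-in decision:", z_plugin)
```

In contrast, integrated (end-to-end) methods would fit the predictive model by directly minimizing downstream decision cost; the abstract's point is that, because problem instances do not exhibit arbitrarily bad near-dual-degeneracy, the simple plug-in rule sketched above already attains faster regret convergence rates.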