Artificial intelligence
Computer science
Generative grammar
Estimation
Pattern recognition (psychology)
Generative model
Mathematics
Statistics
Machine learning
Economics
Management
Authors
Minsuk Shin, Shijie Wang, Jun S. Liu
Identifier
DOI: 10.1080/10618600.2023.2292668
Abstract
To overcome the computational bottlenecks of data perturbation procedures such as the bootstrap and cross-validation, we propose the Generative Multi-purpose Sampler (GMS), which directly constructs a generator function that produces solutions of weighted M-estimators from a set of given weights and tuning parameters. The GMS is implemented by a single optimization procedure, without repeatedly evaluating the minimizers of weighted losses, and thus significantly reduces computational time. We demonstrate that the GMS framework enables statistical procedures that would be infeasible in a conventional framework, such as iterated bootstrap procedures and cross-validation for penalized likelihood. To construct a computationally efficient generator function, we also propose a novel form of neural network, the weight multiplicative multilayer perceptron, to achieve fast convergence. An R package called GMS is provided, which runs under PyTorch to implement the proposed methods and allows the user to provide a customized loss function tailored to their own models of interest. Supplementary materials for this article are available online.
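To make the core idea concrete, below is a minimal, illustrative PyTorch sketch (not the authors' GMS package or its API) of training a generator that maps bootstrap weights to the solution of the corresponding weighted loss, so that each bootstrap replicate afterwards costs only a forward pass. It uses a toy weighted least-squares loss, a plain MLP in place of the paper's weight multiplicative multilayer perceptron, and omits the tuning-parameter input; all names here are hypothetical.

```python
# Sketch: learn G_theta(w) ~= argmin_beta sum_i w_i * (y_i - x_i' beta)^2
# by minimizing the expected weighted loss over randomly drawn weights w.
import torch
import torch.nn as nn

torch.manual_seed(0)
n, p = 200, 5
X = torch.randn(n, p)
beta_true = torch.randn(p)
y = X @ beta_true + 0.1 * torch.randn(n)

# Generator: maps an n-vector of observation weights to a p-vector of coefficients.
# (The paper uses a weight multiplicative MLP; a vanilla MLP is used here for brevity.)
generator = nn.Sequential(nn.Linear(n, 64), nn.ReLU(), nn.Linear(64, p))
opt = torch.optim.Adam(generator.parameters(), lr=1e-3)

for step in range(2000):
    # Draw random bootstrap-style weights (exponential weights scaled to mean 1).
    w = torch.distributions.Exponential(1.0).sample((n,))
    w = w / w.mean()
    beta_hat = generator(w)                  # candidate weighted M-estimate for these weights
    resid = y - X @ beta_hat
    loss = (w * resid.pow(2)).mean()         # weighted least-squares loss
    opt.zero_grad()
    loss.backward()
    opt.step()

# After this single training run, any new weight vector yields an approximate
# weighted-loss minimizer via one forward pass, with no re-optimization.
w_new = torch.distributions.Exponential(1.0).sample((n,))
print(generator(w_new / w_new.mean()))
```

The design point this sketch reflects is the one stated in the abstract: the minimizers for many perturbed datasets are amortized into one optimization over the generator's parameters, rather than solved one perturbation at a time.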