Series (stratigraphy)
Asset management
Econometrics
Transformer
Time series
Computer science
Economics
Engineering
Finance
Machine learning
Electrical engineering
Geology
Paleontology
Voltage
Source
Journal: Social Science Research Network
[Social Science Electronic Publishing]
Date: 2023-01-01
Citations: 11
Abstract
Since its introduction in 2017 (Vaswani et al., 2017), the Transformer model has excelled in a wide range of tasks involving natural language processing and computer vision. We investigate the Transformer model to address an important sequence learning problem in finance: time series forecasting. The underlying idea is to use the attention mechanism and the seq2seq architecture in the Transformer model to capture long-range dependencies and interactions across assets and perform multi-step time series forecasting in finance. The first part of this article systematically reviews the Transformer model while highlighting its strengths and limitations. In particular, we focus on the attention mechanism and the seq2seq architecture, which are at the core of the Transformer model. Inspired by the concept of weak learners in ensemble learning, we identify the diversification benefit of generating a collection of low-complexity models with simple structures and fewer features. The second part is dedicated to two financial applications. First, we consider the construction of trend-following strategies. Specifically, we use the encoder part of the Transformer model to construct a binary classification model to predict the sign of an asset’s future returns. The second application is the multi-period portfolio optimization problem, particularly volatility forecasting. In addition, our paper discusses the issues and considerations when using machine learning models in finance.
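The abstract centers on the attention mechanism as the core of the Transformer model. As a minimal illustration (not the authors' implementation), the scaled dot-product self-attention used in the Transformer can be sketched in NumPy; the toy dimensions (4 time steps, model width 8) and the function name are assumptions for the example:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # pairwise similarity of time steps
    # numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# toy self-attention over a short return series: 4 time steps, width 8
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))
out, w = scaled_dot_product_attention(X, X, X)
```

Each output row is a weighted average of all input time steps, which is how the model captures long-range dependencies without recurrence; in the paper's setting the inputs would be asset features rather than random data.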