Paper
28 March 2023
Transformer and long short-term memory networks for long sequence time sequence forecasting problem
Wei Fang
Proceedings Volume 12566, Fifth International Conference on Computer Information Science and Artificial Intelligence (CISAI 2022); 125660W (2023) https://doi.org/10.1117/12.2667895
Event: Fifth International Conference on Computer Information Science and Artificial Intelligence (CISAI 2022), 2022, Chongqing, China
Abstract
The long sequence time-series forecasting (LSTF) problem attracts many organizations, since many prediction applications involve forecasting over long horizons. Under such circumstances, researchers have tried to solve these problems by employing models that have proved effective in the natural language processing field, such as long short-term memory (LSTM) networks and Transformers, and many improvements have been built on the basic recurrent neural network and the Transformer. Recently, a model called Informer, designed specifically for LSTF, was proposed and claimed to improve prediction performance on the long sequence time-series forecasting problem. However, in later experiments, more and more researchers found that Informer still cannot handle all long sequence time-series forecasting problems. This paper examines how datasets affect the performance of different models. The experiment is carried out on a Bitcoin dataset with four input features and one output. The results show that Informer (a Transformer-like model) does not always perform well, so choosing a model with a simpler architecture can sometimes yield better results.
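The experimental setup described above (a multivariate series with four input features and a single forecast target) is typically framed as sliding input windows paired with a future target value, the same framing consumed by both LSTM- and Transformer-style forecasters. The sketch below is hypothetical and not from the paper: the window length, forecast horizon, and toy data are illustrative assumptions, not the paper's actual configuration.

```python
# Hypothetical sketch (not the paper's code): framing a multivariate
# time series for long-sequence forecasting. The paper's dataset has
# four input features and one output; window length and horizon here
# are illustrative choices.

def make_windows(series, input_len, horizon, target_idx=0):
    """Slice a list of feature vectors into (input window, target) pairs.

    series: list of feature vectors, e.g. [[open, high, low, volume], ...]
    input_len: number of past steps the model sees
    horizon: how many steps ahead the single target value lies
    target_idx: which feature is predicted
    """
    pairs = []
    last_start = len(series) - input_len - horizon + 1
    for start in range(last_start):
        window = series[start:start + input_len]
        # Target is one scalar, `horizon` steps after the window ends.
        target = series[start + input_len + horizon - 1][target_idx]
        pairs.append((window, target))
    return pairs

# Toy series: 10 time steps, four features per step.
toy = [[float(t), t + 0.1, t + 0.2, t + 0.3] for t in range(10)]
pairs = make_windows(toy, input_len=4, horizon=2)
print(len(pairs))   # 5 (window, target) pairs
print(pairs[0][1])  # 5.0 — feature 0 at step 5, two steps past the first window
```

Longer `horizon` values correspond to the "long sequence" regime the paper studies, where the gap between the observed window and the target grows and simple recurrent models can become competitive with attention-based ones.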
© (2023) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Wei Fang "Transformer and long short-term memory networks for long sequence time sequence forecasting problem", Proc. SPIE 12566, Fifth International Conference on Computer Information Science and Artificial Intelligence (CISAI 2022), 125660W (28 March 2023); https://doi.org/10.1117/12.2667895
KEYWORDS
Data modeling
Education and training
Transformers
Computer programming
Networks
Data processing
Feature extraction