Reprinted from: Expertise
Modeling sequence data across varied settings is an important machine learning problem spanning many domains, including time-series forecasting, natural language text, and event streams. Sequence data in different domains usually have different characteristics: natural language text can be viewed as a sequence of discrete variables, while sensor-network signals form a multivariate sequence in a continuous vector space. To build successful neural network models in so many real-world domains, architectures and algorithms must be tailored to the nature of the data and the problem. This thesis designs novel and efficient neural network solutions for sequence modeling and its applications. The contributions fall into four parts.
https://www.cs.cmu.edu/~glai1/
The first part focuses on the correlations between variables in multivariate sequence data, such as time series from multiple sensors, and proposes new algorithms that exploit these correlation patterns to improve prediction accuracy: Depthwise Separable Graph Convolution (DSGC) (Chapter 2) [60] and Factorized Recurrent Neural Networks (FRNN) (Chapter 3) [63].
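The summary names DSGC without detailing the operation. As a rough illustration only (the function names, shapes, and weights below are hypothetical, not the thesis implementation), a depthwise separable graph convolution splits the work into a per-channel propagation over the graph of correlated variables, followed by a pointwise (1×1) mixing of channels:

```python
import numpy as np

def depthwise_separable_graph_conv(x, adj, w_depth, w_point):
    """Sketch of a depthwise separable graph convolution.

    x:        (n_nodes, n_channels) node features, e.g. one sensor per node
    adj:      (n_nodes, n_nodes) row-normalized adjacency encoding
              correlation between variables
    w_depth:  (n_channels,) per-channel filter scale (depthwise step)
    w_point:  (n_channels, n_out) channel-mixing weights (pointwise step)
    """
    # Depthwise step: propagate each channel over the graph independently.
    spatial = (adj @ x) * w_depth        # (n_nodes, n_channels)
    # Pointwise step: mix channels, analogous to a 1x1 convolution.
    return spatial @ w_point             # (n_nodes, n_out)

# Toy example: 3 sensors, 2 input channels, 2 output channels.
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 2))
adj = np.full((3, 3), 1.0 / 3.0)         # fully connected, row-normalized
out = depthwise_separable_graph_conv(x, adj, np.ones(2), np.eye(2))
print(out.shape)  # (3, 2)
```

The factorization is what makes the operation cheap: the spatial step and the channel-mixing step each touch far fewer parameters than a full joint filter would.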
The second part focuses on incorporating human prior knowledge about the temporal dependency patterns of time-series data. Specifically, we propose a novel method called the Long- and Short-term Time-series Network (LSTNet) (Chapter 4) [59], which has proven particularly effective at capturing various periodic patterns across applications.
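A key mechanism the LSTNet paper uses for periodicity is a recurrent-skip connection: the hidden state at time t is updated from the state one full period earlier, so a daily or weekly cycle is one recurrent "step" away rather than many. A minimal sketch (the cell below is a toy tanh update, not the paper's GRU-based cell; all names are illustrative):

```python
import numpy as np

def recurrent_skip(x, period, w_in, w_skip):
    """Toy recurrent-skip cell: state at time t depends on the state
    at time t - period, shortening the path along a periodic pattern.

    x:      (T, d_in) input sequence
    w_in:   (d_in, d_h) input weights
    w_skip: (d_h, d_h) skip-recurrence weights
    """
    T, _ = x.shape
    d_h = w_in.shape[1]
    h = np.zeros((T, d_h))
    for t in range(T):
        h_prev = h[t - period] if t >= period else np.zeros(d_h)
        h[t] = np.tanh(x[t] @ w_in + h_prev @ w_skip)
    return h

# Constant input with an assumed period of 3.
x = np.ones((10, 1))
h = recurrent_skip(x, period=3, w_in=np.ones((1, 4)), w_skip=np.eye(4) * 0.5)
print(h.shape)  # (10, 4)
```

In the full model this skip component runs alongside a convolutional layer and an ordinary recurrent layer, with an autoregressive shortcut on top.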
The third part focuses on efficient Transformer algorithms for sequence-classification tasks. By identifying computational redundancy in commonly used Transformer architectures and proposing a new alternative, the Funnel-Transformer (Chapter 5) [27], we achieve a better trade-off between computation and accuracy.
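The redundancy argument in the Funnel-Transformer paper is that classification does not need a full-length hidden sequence at every layer, so the sequence can be pooled between blocks. A sketch of the pooling step, assuming simple strided mean pooling (shapes and names are illustrative):

```python
import numpy as np

def pool_sequence(h, stride=2):
    """Mean-pool a hidden-state sequence along time, shrinking its length
    by `stride`. Funnel-style models compress the sequence between blocks
    this way, so later self-attention layers run over fewer positions and
    cost correspondingly less compute.
    """
    T, d = h.shape
    T_trim = (T // stride) * stride           # drop a ragged tail, if any
    return h[:T_trim].reshape(T_trim // stride, stride, d).mean(axis=1)

h = np.arange(12, dtype=float).reshape(6, 2)  # length-6 toy sequence
pooled = pool_sequence(h)
print(pooled.shape)  # (3, 2)
```

Since self-attention cost grows quadratically with sequence length, halving the length at each block frees compute that can be reinvested in a deeper or wider model at the same budget.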
The fourth part focuses on modeling and predicting temporal relationships between events, where the main challenge is learning efficiently from sparsely labeled data. We address this challenge by combining advanced data augmentation, semi-supervised learning, and human priors (Chapter 6), substantially improving the state of the art on this task.