Abstract:
|
The recurrent neural network (RNN) has become a powerful machine learning tool for analyzing sequence data. The long short-term memory (LSTM) and the gated recurrent unit (GRU) are two popular RNN models for sequence classification. Recently, developments such as the attention mechanism have been shown to improve the performance of LSTM and GRU. Although many RNN variants exist, systematic comparisons of their performance remain scarce. In this study, we conduct a comprehensive simulation study, in the context of sequence classification, to evaluate the performance of LSTM and GRU with and without attention.
|