Saturday, February 12, 2022

Deep Recurrent Neural Nets with Long Short-Term Memory

Again: LSTM may be emerging as a very big deal in recurrent NN modeling. I blogged on it before (e.g., here), but I still don't understand it deeply. Does anyone?

Maybe it's just a device for avoiding the vanishing-gradient problem (not that that isn't important); maybe it's more.
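For concreteness, here is a minimal sketch of a single LSTM cell in NumPy (the names and shapes are mine, not from any particular paper). The thing to stare at is the cell-state update, c = f * c_prev + i * g: it is additive, so the gradient flowing back through the state is scaled by the forget gate rather than repeatedly squashed through a nonlinearity, which is exactly the vanishing-gradient fix.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell(x_t, h_prev, c_prev, W, U, b):
    """One LSTM step. W: (4h, d) input weights, U: (4h, h) recurrent
    weights, b: (4h,) biases, stacked as [input, forget, cell, output]."""
    z = W @ x_t + U @ h_prev + b
    h = h_prev.shape[0]
    i = sigmoid(z[0:h])        # input gate: how much new info to write
    f = sigmoid(z[h:2*h])      # forget gate: how much old state to keep
    g = np.tanh(z[2*h:3*h])    # candidate cell update
    o = sigmoid(z[3*h:4*h])    # output gate: how much state to expose
    c = f * c_prev + i * g     # additive state update -- the key trick
    h_new = o * np.tanh(c)
    return h_new, c
```

In the backprop-through-time recursion, the state-to-state Jacobian is just diag(f), so as long as the forget gate sits near one the gradient survives over long horizons; a vanilla RNN instead multiplies by the recurrent weight matrix times a tanh derivative at every step, which is where the vanishing comes from.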

This new paper is very well done and features LSTM prominently.

By: Lars Lien Ankile and Kjartan Krange
Abstract: This paper presents an ensemble forecasting method, termed DONUT (DO Not UTilize human assumptions), that shows strong results on the M4 Competition dataset by decreasing feature- and model-selection assumptions. Our assumption reductions, consisting mainly of auto-generated features and a more diverse model pool for the ensemble, significantly outperform the statistical-feature-based ensemble method FFORMA of Montero-Manso et al. (2020). Furthermore, we investigate feature extraction with a Long Short-Term Memory network (LSTM) autoencoder and find that such features contain crucial information not captured by traditional statistical feature approaches. The ensemble weighting model uses both LSTM features and statistical features to combine the models accurately. Analysis of feature importance and interaction shows a slight superiority of LSTM features over the statistical ones alone. Clustering analysis shows that the essential LSTM features differ from most statistical features and from each other. We also find that the weighting model learns to exploit the larger solution space created by augmenting the ensemble with new models, which explains part of the accuracy gains. Lastly, we present a formal ex-post-facto analysis of optimal combination and selection for ensembles, quantifying differences through linear optimization on the M4 dataset. We also include a short proof that model combination is superior to model selection, a posteriori.
Date: 2022-01
URL: http://d.repec.org/n?u=RePEc:arx:papers:2201.00426&r=&r=for
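The LSTM-autoencoder idea in the abstract is easy to sketch. Below is a generic sequence-to-sequence autoencoder in Keras; this is my own toy reconstruction, not the authors' architecture (the window length, code size, and training details here are made up, and theirs are in the paper). The encoder compresses a series into a fixed-length code; after training on reconstruction error, that code serves as an automatically generated feature vector for the ensemble-weighting model.

```python
import numpy as np
import tensorflow as tf

T, D, CODE = 48, 1, 16   # hypothetical: window length, channels, feature size

# Encoder: series -> fixed-length code (last hidden state of the LSTM)
inp = tf.keras.Input(shape=(T, D))
code = tf.keras.layers.LSTM(CODE)(inp)

# Decoder: code -> reconstructed series
x = tf.keras.layers.RepeatVector(T)(code)
x = tf.keras.layers.LSTM(CODE, return_sequences=True)(x)
out = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(D))(x)

autoencoder = tf.keras.Model(inp, out)
encoder = tf.keras.Model(inp, code)               # features for the ensemble
autoencoder.compile(optimizer="adam", loss="mse")

# Toy usage: train on reconstruction, then extract features
series = np.random.randn(256, T, D).astype("float32")
autoencoder.fit(series, series, epochs=2, verbose=0)
features = encoder.predict(series, verbose=0)     # shape (256, CODE)
```

The point is that the code is learned from the data rather than hand-picked, which is the "DO Not UTilize human assumptions" theme. And the abstract's last claim is a clean one-liner: the best convex combination of forecasts is, ex post, weakly better than the best single model, because selecting model i is just the combination with weight vector at the i-th vertex of the simplex, so the combination optimum is taken over a superset.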
