This work reviews and summarizes a line of memory-augmented machine learning models in the context of time series analysis. The methods are presented in loose order of increasing complexity of the memory architecture. We begin with the classical neural networks for sequential data, the RNN and the LSTM. We then turn to more sophisticated memory interfaces, as described in the works on Recurrent Entity Networks (EntNet), Neural Turing Machines (NTM), and Differentiable Neural Computers (DNC). After that we analyze scalability, as addressed in the work on Sparse Access Memory (SAM). Finally, we outline several research directions for this class of memory-augmented neural networks.
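The memory interfaces of NTM and DNC share a core building block: content-based addressing, where a controller emits a key vector and the read weights are a softmax over the (sharpened) cosine similarity between that key and each memory row. A minimal NumPy sketch of this mechanism follows; the names `memory`, `key`, and `beta` (the sharpening strength) are illustrative, not taken from any particular implementation in the reviewed works.

```python
import numpy as np

def content_read(memory, key, beta):
    """Content-based read weighting as used in NTM/DNC addressing.

    memory: (N, M) array of N memory slots of width M
    key:    (M,) query vector emitted by the controller
    beta:   scalar sharpening strength (larger -> more focused read)
    """
    # Cosine similarity between the key and every memory row.
    sims = memory @ key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    )
    # Softmax over sharpened similarities gives attention weights over slots.
    w = np.exp(beta * sims)
    w /= w.sum()
    # The read vector is the weight-averaged memory content.
    return w @ memory, w

# Toy example: three orthogonal memory slots; the key is closest to slot 0.
memory = np.eye(3)
read_vec, weights = content_read(memory, key=np.array([1.0, 0.1, 0.0]), beta=10.0)
```

With a high `beta`, the weights concentrate on the best-matching slot and the read vector approaches that slot's content; with `beta` near zero, the read blurs toward the mean of all slots.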
Repository: jonasdaugalas/Time-Series-Analysis-NTM-DNC-EntNet