Topics

Neuroscience / Science / Recurrent neural network / Backpropagation through time / Long short-term memory / Perceptron / Neural networks / Computational neuroscience / Cybernetics


On the difficulty of training recurrent neural networks

Razvan Pascanu, Université de Montréal, 2920, chemin de la Tour, Montréal, Québec, Canada, H3T 1J8
Tomas Mikolov

Document Date: 2013-08-14 01:36:44



File Size: 627.52 KB


City

San Francisco / Atlanta / Québec / Montreal / New York

Company

Echo State Networks / IEEE Press / Neural Networks / GPU / Penn Treebank / Perseus Books Group / Recurrent Neural Networks

Country

United States / Canada


Facility

Brno University of Technology / University of Toronto

IndustryTerm

input-less recurrent network / recurrent networks / wireless telecommunication / echo state networks / sigmoid unit network / autonomous systems / recurrent network / hidden unit recurrent network / chaotic systems / sum-of-products / tanh unit network / generator networks / dynamical systems / dynamical systems tools / recurrent gas market model / energy

Organization

International Speech Communication Association / US Federal Reserve / Brno University of Technology / University of Toronto / Pattern Analysis and Machine Intelligence

Person

Tomas Mikolov / Morgan Kaufmann

Position

author / teacher / neurodynamical model for working memory / model for each time step

ProgrammingLanguage

Python

ProvinceOrState

Utah / New York / Georgia

PublishedMedium

IEEE Transactions on Pattern Analysis and Machine Intelligence / Machine Learning

Technology

neural network / Machine Learning / simulation / MIDI / PDF
