Title | CSE 475-Machine Learning 14 |
---|---|
Author | Shamim Akhter |
Course | Business Law |
Institution | Presidency University Bangladesh |
Pages | 21 |
File Size | 1.3 MB |
File Type | |
Total Downloads | 94 |
Total Views | 166 |
CSE 475-Machine Learning Recurrent Network
Dr. Shamim Akhter Associate Professor, CSE, EWU, Bangladesh
Recurrent NN • Invented in the 1980s, but showed their power in the 1990s • Work with sequential data (NLP) – text, newspapers, tweets, DNA sequences, audio, video, speech – e.g., Google Voice Search
• Remember their input, due to an internal memory • RNNs apply to – sequential data, where – the relationship/connection between data points matters more than spatial content
Feed Forward NN
• Information only moves in one direction – never touches a node twice
• No memory (of previous inputs) – difficult to predict the next item – only remembers its training parameters (W, B). Example: "NEURON" – by the time it processes "R", it has already lost the N, E, & U information.
Back propagation + FF
Weight Updating: Gradient Descent
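A minimal sketch of the gradient-descent weight update mentioned above, on a toy single-layer model. All names (`W`, `b`, `lr`, the linear target) are illustrative, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))          # 8 samples, 3 features
y = X @ np.array([1.0, -2.0, 0.5])   # linear target, just for this sketch

W = np.zeros(3)                      # weights
b = 0.0                              # bias
lr = 0.1                             # learning rate

for _ in range(200):
    pred = X @ W + b
    err = pred - y
    grad_W = X.T @ err / len(X)      # gradient of the squared error w.r.t. W
    grad_b = err.mean()              # gradient w.r.t. b
    W -= lr * grad_W                 # gradient-descent update: W <- W - lr * dL/dW
    b -= lr * grad_b

print(np.round(W, 2))                # W moves toward [1.0, -2.0, 0.5]
```

The same update rule is what backpropagation applies layer by layer in a feed-forward network.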
Recurrent NN • Information cycles through a loop • Takes decisions/produces output based on – the current input – the previous input (recent past)
• Short term memory
RNN • An RNN = a sequence of NN copies, trained one after another with backpropagation.
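The loop and short-term memory can be sketched as a minimal tanh RNN cell: the new hidden state mixes the current input with the previous state. All names (`W_xh`, `W_hh`, sizes) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
W_xh = rng.normal(scale=0.5, size=(4, 3))  # input -> hidden weights
W_hh = rng.normal(scale=0.5, size=(4, 4))  # hidden -> hidden weights (the "loop")
b_h = np.zeros(4)

def rnn_step(x, h):
    # Short-term memory: the new state depends on the current input x
    # AND the previous state h.
    return np.tanh(W_xh @ x + W_hh @ h + b_h)

h = np.zeros(4)                      # initial hidden state h(0)
for x in rng.normal(size=(6, 3)):    # a sequence of 6 input vectors
    h = rnn_step(x, h)               # same cell applied at every time step

print(h.shape)                       # (4,)
```

Note the same `W_hh` is reused at every step, which is what makes the unrolled network behave like a sequence of identical NN copies.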
h(t) = (W_t)^T · h(t-1) = (W_t)^T · (W_{t-1})^T · h(t-2) = … = (W_t)^T · (W_{t-1})^T · … · (W_0)^T · h(0)
• If |W| > 1 the product explodes; if |W| < 1 it vanishes • Long products also mean greater computation – Solution: LSTM
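The effect of the long weight product in the unrolled state above can be seen with a scalar sketch, h(t) = w^t · h(0): the values used here (w = 1.2 and w = 0.8, 50 steps) are illustrative.

```python
h0 = 1.0                     # initial state h(0)
for w in (1.2, 0.8):         # |w| > 1 vs |w| < 1
    h = h0
    for _ in range(50):      # 50 time steps of h <- w * h
        h *= w
    print(w, h)              # 1.2 explodes (~9100), 0.8 vanishes (~1.4e-5)
```

This is why plain RNNs struggle with long sequences, and why LSTM's gated cell state was introduced.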
Long Short-Term Memory (LSTM) NN