Vanilla RNNs ▪ Recurrent Neural Network. [Diagram: two inputs x1, x2, two hidden units whose values are stored in memory a1, a2 and fed back at the next step, and two outputs y1, y2.] All the weights are “1”, no bias. All activation functions are linear. Input sequence: [1, 1], [1, 1], [2, 2], …… Output sequence: [4, 4], [12, 12], [32, 32], …… (the hidden values 2, 2 → 6, 6 → 16, 16 are what gets stored in memory between steps). Changing the sequence order will change the output. Source of slide: http://speech.ee.ntu.edu.tw/~tlkagk/courses_ML16.html
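To check the numbers on this slide, here is a minimal NumPy sketch of the toy network, assuming each hidden unit is connected to both inputs and both memory cells with weight 1, no bias, and identity activations (the wiring that reproduces the slide's outputs):

```python
import numpy as np

# Toy vanilla RNN from the slide: 2 inputs, 2 hidden units with memory, 2 outputs.
# Every weight is 1, there is no bias, and all activations are linear (identity).
W_in  = np.ones((2, 2))   # input  -> hidden
W_rec = np.ones((2, 2))   # memory -> hidden
W_out = np.ones((2, 2))   # hidden -> output

def run(inputs):
    memory = np.zeros(2)                      # memory starts at 0
    outputs = []
    for x in inputs:
        hidden = W_in @ x + W_rec @ memory    # linear activation
        memory = hidden                       # "store": keep hidden values for the next step
        outputs.append(W_out @ hidden)
    return outputs

print(run([np.array([1., 1.]), np.array([1., 1.]), np.array([2., 2.])]))
# [4, 4], [12, 12], [32, 32]  -- the output sequence on the slide

print(run([np.array([2., 2.]), np.array([1., 1.]), np.array([1., 1.])]))
# [8, 8], [20, 20], [44, 44]  -- changing the sequence order changes the output
```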
Vanilla RNNs ▪ Example Application. [Diagram: the words of “arrive Shenyang on November 2nd” are fed in one at a time as x1, x2, x3, …; the hidden values a1, a2, a3 are stored and passed to the next step; the outputs y1, y2, y3 are the probability of “arrive” in each slot, the probability of “Shenyang” in each slot, and the probability of “on” in each slot.] The same network is used again and again. Source of slide: http://speech.ee.ntu.edu.tw/~tlkagk/courses_ML16.html
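As a hedged sketch of how the same cell is reused across the sentence for slot filling: the slot labels, one-hot word encoding, and random weights below are illustrative placeholders, not the setup from the course.

```python
import numpy as np

rng = np.random.default_rng(0)
slots = ["other", "destination", "time_of_arrival"]          # hypothetical slot set
vocab = {"arrive": 0, "Shenyang": 1, "on": 2, "November": 3, "2nd": 4}

d_in, d_h = len(vocab), 8
W_in  = rng.normal(size=(d_h, d_in))        # input  -> hidden
W_rec = rng.normal(size=(d_h, d_h))         # memory -> hidden
W_out = rng.normal(size=(len(slots), d_h))  # hidden -> slot scores

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def slot_probs(sentence):
    h = np.zeros(d_h)                          # the "store"
    for word in sentence:
        x = np.eye(d_in)[vocab[word]]          # one-hot input vector
        h = np.tanh(W_in @ x + W_rec @ h)      # the same weights at every step
        yield word, softmax(W_out @ h)         # probability of each slot for this word

for word, p in slot_probs(["arrive", "Shenyang", "on", "November", "2nd"]):
    print(word, p.round(2))
```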
Vanilla RNNs ▪ Example Application. [Diagram: two input sequences to the same network, “leave Shenyang ……” and “arrive Shenyang ……”; the outputs are the prob of “leave”/“arrive” in each slot and the prob of “Shenyang” in each slot.] The outputs for “Shenyang” are different because the values stored in the memory are different after reading “leave” versus “arrive”. Source of slide: http://speech.ee.ntu.edu.tw/~tlkagk/courses_ML16.html
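A small self-contained sketch of this point (again with made-up weights): feeding “leave Shenyang” and “arrive Shenyang” to the same cell leaves different values in memory when “Shenyang” is read, so anything computed from that memory differs too.

```python
import numpy as np

rng = np.random.default_rng(1)
vocab = {"arrive": 0, "leave": 1, "Shenyang": 2}
d_in, d_h = len(vocab), 4
W_in  = rng.normal(size=(d_h, d_in))
W_rec = rng.normal(size=(d_h, d_h))

def hidden_after(sentence):
    h = np.zeros(d_h)                      # memory
    for word in sentence:
        x = np.eye(d_in)[vocab[word]]
        h = np.tanh(W_in @ x + W_rec @ h)
    return h

print(hidden_after(["leave", "Shenyang"]))
print(hidden_after(["arrive", "Shenyang"]))
# The two hidden states differ because the first word differed, so any output
# computed from them (e.g. the slot probabilities for "Shenyang") differs as well.
```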
Some RNN Variants ▪ Deep Recurrent Neural Network. [Diagram: inputs x_t, x_t+1, x_t+2 pass through several stacked recurrent layers to produce outputs y_t, y_t+1, y_t+2; each layer’s hidden state feeds both the layer above at the same time step and the same layer at the next time step.] Source of slide: http://speech.ee.ntu.edu.tw/~tlkagk/courses_ML16.html
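A sketch of the stacked (“deep”) version, with placeholder dimensions and random weights: the hidden state of layer k at time t is fed both upward to layer k+1 at time t and forward to layer k at time t+1.

```python
import numpy as np

rng = np.random.default_rng(0)
d_x, d_h, d_y, n_layers, T = 3, 5, 2, 2, 4   # illustrative sizes

W_in  = [rng.normal(size=(d_h, d_x if k == 0 else d_h)) for k in range(n_layers)]
W_rec = [rng.normal(size=(d_h, d_h)) for _ in range(n_layers)]
W_out = rng.normal(size=(d_y, d_h))          # read the top layer out at every step

xs = rng.normal(size=(T, d_x))               # x_t, x_t+1, x_t+2, ...
h  = [np.zeros(d_h) for _ in range(n_layers)]

for t in range(T):
    inp = xs[t]
    for k in range(n_layers):
        h[k] = np.tanh(W_in[k] @ inp + W_rec[k] @ h[k])  # recurrence within layer k
        inp = h[k]                                       # feed upward to layer k+1
    print("y_%d =" % t, (W_out @ h[-1]).round(2))        # y_t, y_t+1, y_t+2, ...
```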
Some RNN Variants ▪ Different Input and Output: one to one, one to many, many to one, many to many, many to many.
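An illustrative sketch of these configurations with one shared recurrent core (sizes and weights are placeholders): many to many reads the output at every step, many to one reads only the final hidden state, and one to many keeps unrolling after a single input.

```python
import numpy as np

rng = np.random.default_rng(0)
d_x, d_h, d_y, T = 3, 5, 2, 4
W_in  = rng.normal(size=(d_h, d_x))
W_rec = rng.normal(size=(d_h, d_h))
W_out = rng.normal(size=(d_y, d_h))

def step(x, h):
    return np.tanh(W_in @ x + W_rec @ h)

xs, h, per_step = rng.normal(size=(T, d_x)), np.zeros(d_h), []
for x in xs:
    h = step(x, h)
    per_step.append(W_out @ h)

many_to_many = np.stack(per_step)   # one output per input step
many_to_one  = W_out @ h            # a single output from the final hidden state

# one to many: a single input, then keep unrolling with an empty (zero) input
h = step(xs[0], np.zeros(d_h))
one_to_many = [W_out @ h]
for _ in range(T - 1):
    h = step(np.zeros(d_x), h)
    one_to_many.append(W_out @ h)

print(many_to_many.shape, many_to_one.shape, len(one_to_many))
```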