Transient behavior (2)                                        2020-01-18 · slide 11

The learning curve:

$$
\begin{aligned}
J_{\mathrm{MSE}}(n+1) &= J_{\min} + \big(\mathbf{w}(n+1)-\mathbf{w}_{\mathrm{opt}}\big)^H \mathbf{R}_{uu}\,\big(\mathbf{w}(n+1)-\mathbf{w}_{\mathrm{opt}}\big) \\
&= J_{\min} + \big(\mathbf{w}(0)-\mathbf{w}_{\mathrm{opt}}\big)^H \big(\mathbf{I}-\mu\mathbf{R}_{uu}\big)^{n+1} \mathbf{R}_{uu} \big(\mathbf{I}-\mu\mathbf{R}_{uu}\big)^{n+1} \big(\mathbf{w}(0)-\mathbf{w}_{\mathrm{opt}}\big) \\
&= J_{\min} + \big(\mathbf{w}(0)-\mathbf{w}_{\mathrm{opt}}\big)^H \mathbf{Q}\,\big(\mathbf{I}-\mu\boldsymbol{\Lambda}\big)^{n+1} \boldsymbol{\Lambda}\,\big(\mathbf{I}-\mu\boldsymbol{\Lambda}\big)^{n+1} \mathbf{Q}^H \big(\mathbf{w}(0)-\mathbf{w}_{\mathrm{opt}}\big)
\end{aligned}
$$

- Consists of a sum of M exponentials.
- The exponentials decay with different 'time constants'. The worst (slowest-decaying) components are those for which $(1-\mu\lambda_k)^{2(n+1)} \approx 1$, i.e. those with small $\mu\lambda_k$.
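The sum-of-exponentials form of the learning curve can be checked numerically. The sketch below is illustrative (the correlation matrix, step size, and $J_{\min}$ are assumed values, not course data): it runs the steepest-descent recursion and compares the resulting learning curve against the closed-form expression in the eigenbasis.

```python
import numpy as np

# Illustrative check: the steepest-descent learning curve
#   J_MSE(n+1) = J_min + sum_k lambda_k |c_k(0)|^2 (1 - mu*lambda_k)^{2(n+1)}
# is a sum of M exponentials; modes with (1 - mu*lambda_k)^2 close to 1
# (small eigenvalues) decay slowest. All numerical values are assumed.

rng = np.random.default_rng(0)
M = 4
A = rng.standard_normal((M, M))
R_uu = A @ A.T + 0.1 * np.eye(M)        # positive-definite correlation matrix
r_ud = rng.standard_normal(M)           # cross-correlation vector
w_opt = np.linalg.solve(R_uu, r_ud)     # Wiener solution
J_min = 1.0                             # minimum MSE (arbitrary floor here)

lam, Q = np.linalg.eigh(R_uu)           # R_uu = Q diag(lam) Q^H
mu = 0.9 / lam.max()                    # step size inside the stable range
w = np.zeros(M)                         # w(0)
c0 = Q.T @ (w - w_opt)                  # c(0) = Q^H (w(0) - w_opt), real data

J_recursive, J_closed = [], []
for n in range(50):
    w = w + mu * (r_ud - R_uu @ w)      # steepest-descent update
    err = w - w_opt
    J_recursive.append(J_min + err @ R_uu @ err)
    # closed form: sum of M decaying exponentials in the eigenbasis
    J_closed.append(J_min + np.sum(lam * c0**2 * (1.0 - mu * lam) ** (2 * (n + 1))))

assert np.allclose(J_recursive, J_closed)   # both learning curves coincide
```

The two curves agree to machine precision, and the curve decays monotonically toward $J_{\min}$ as long as $0 < \mu < 2/\lambda_{\max}$.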
Learning curve                                                slide 12

$$
\mathbf{c}(0) = \mathbf{Q}^H \big(\mathbf{w}(0)-\mathbf{w}_{\mathrm{opt}}\big)
$$

$$
\mathbf{c}(n+1) = \mathbf{Q}^H \big(\mathbf{w}(n+1)-\mathbf{w}_{\mathrm{opt}}\big) = \big(\mathbf{I}-\mu\boldsymbol{\Lambda}\big)^{n+1} \mathbf{c}(0)
$$

$$
\begin{aligned}
J_{\mathrm{MSE}}(n+1) &= J_{\min} + \big(\mathbf{w}(n+1)-\mathbf{w}_{\mathrm{opt}}\big)^H \mathbf{R}_{uu}\,\big(\mathbf{w}(n+1)-\mathbf{w}_{\mathrm{opt}}\big) \\
&= J_{\min} + \mathbf{c}^H(n+1)\,\boldsymbol{\Lambda}\,\mathbf{c}(n+1)
\end{aligned}
$$
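In the rotated coordinates $\mathbf{c}(n)$ the modes decouple, so each component evolves independently as $c_k(n+1) = (1-\mu\lambda_k)\,c_k(n)$. A minimal numerical sketch (the matrix and step size are assumed toy values) verifies that the rotated recursion tracks the weight-space recursion and yields the same learning curve:

```python
import numpy as np

# Illustrative setup: in the rotated coordinates c(n) = Q^H (w(n) - w_opt)
# the steepest-descent modes decouple, c_k(n+1) = (1 - mu*lambda_k) c_k(n),
# and the learning curve reduces to J_min + sum_k lambda_k |c_k(n)|^2.

rng = np.random.default_rng(1)
M = 3
A = rng.standard_normal((M, M))
R_uu = A @ A.T + 0.1 * np.eye(M)        # positive-definite correlation matrix
lam, Q = np.linalg.eigh(R_uu)           # R_uu = Q diag(lam) Q^H
mu = 0.5 / lam.max()
J_min = 0.0                             # arbitrary floor for the example

w_opt = rng.standard_normal(M)          # Wiener solution (so r_ud = R_uu w_opt)
w = np.zeros(M)                         # w(0)
c = Q.T @ (w - w_opt)                   # c(0) = Q^H (w(0) - w_opt), real data

for n in range(20):
    w = w + mu * R_uu @ (w_opt - w)     # steepest descent: mu*(r_ud - R_uu w)
    c = (1.0 - mu * lam) * c            # decoupled update in rotated coordinates

# the rotated-coordinate learning curve matches the direct one
J_rotated = J_min + np.sum(lam * np.abs(c) ** 2)
J_direct = J_min + (w - w_opt) @ R_uu @ (w - w_opt)
assert np.isclose(J_rotated, J_direct)
```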
Reiterate                                                     slide 13

- 'Given statistics' case: assume that the process statistics (i.e. $\mathbf{R}_{uu}$ and $\mathbf{r}_{ud}$) are given.
- Here n is an iteration index of the iterative search rather than a time index.
S2. LMS algorithm                                             slide 14

'Given data' version of the steepest-descent algorithm

$$
\mathbf{w}(n+1) = \mathbf{w}(n) + \mu\,\big(\mathbf{r}_{ud} - \mathbf{R}_{uu}\,\mathbf{w}(n)\big)
$$

$$
\begin{aligned}
\hat{\mathbf{w}}(n+1) &= \hat{\mathbf{w}}(n) + \mu\,\big(\mathbf{u}(n)\,d^*(n) - \mathbf{u}(n)\,\mathbf{u}^H(n)\,\hat{\mathbf{w}}(n)\big) \\
&= \hat{\mathbf{w}}(n) + \mu\,\big(d^*(n) - \mathbf{u}^H(n)\,\hat{\mathbf{w}}(n)\big)\,\mathbf{u}(n) \\
&= \hat{\mathbf{w}}(n) + \mu\,e^*(n)\,\mathbf{u}(n)
\end{aligned}
$$
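The LMS recursion above can be sketched in a few lines. The example below uses a system-identification setting with a real-valued white input (so the conjugates drop out); the true filter, step size, and noise level are assumed for illustration, not taken from the course.

```python
import numpy as np

# Sketch of the LMS recursion w_hat(n+1) = w_hat(n) + mu * e(n) * u(n)
# with e(n) = d(n) - w_hat(n)^T u(n), for real-valued data.
# The system-identification setup below is an assumed toy example.

rng = np.random.default_rng(2)
M = 4
w_true = rng.standard_normal(M)            # unknown filter to identify
N = 5000
x = rng.standard_normal(N + M)             # white input signal

w_hat = np.zeros(M)                        # w_hat(0)
mu = 0.01                                  # small step size
for n in range(N):
    u = x[n:n + M][::-1]                   # regressor u(n) = [x(n), ..., x(n-M+1)]
    d = w_true @ u + 0.01 * rng.standard_normal()   # noisy desired signal
    e = d - w_hat @ u                      # a-priori error e(n)
    w_hat = w_hat + mu * e * u             # LMS weight update

# after enough samples, w_hat should be close to the true filter
assert np.linalg.norm(w_hat - w_true) < 0.1
```

Note that, unlike steepest descent, each update uses only the current sample pair $(\mathbf{u}(n), d(n))$; the statistics $\mathbf{R}_{uu}$ and $\mathbf{r}_{ud}$ are never formed explicitly.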
Modification                                                  slide 15

- The iteration index n is changed into the time index n.
- The steepest-descent algorithm is thereby turned into a recursive algorithm with one weight update (one iteration) per time step.
- Leave out the expectation operators: substitute an instantaneous estimate for the gradient, sometimes referred to as the stochastic gradient or the noisy gradient:

$$
\frac{\partial J_{\mathrm{MSE}}(\mathbf{w})}{\partial \mathbf{w}} = 2\,\mathbf{R}_{uu}\,\mathbf{w} - 2\,\mathbf{r}_{ud}
\quad\longrightarrow\quad
\frac{\partial\, |e(n)|^2}{\partial \mathbf{w}} = -2\,\mathbf{u}(n)\,e^*(n)
$$
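Dropping the expectation makes the gradient estimate noisy but, for the setup above, unbiased: averaging the instantaneous gradient over many samples recovers the true gradient. The check below uses an assumed toy configuration (white input so that $\mathbf{R}_{uu}=\mathbf{I}$ and $\mathbf{r}_{ud}=\mathbf{w}_{\mathrm{true}}$).

```python
import numpy as np

# Sketch (assumed toy setup): the instantaneous gradient -2 u(n) e(n),
# obtained by dropping the expectation operators, is a noisy estimate
# whose mean is the true gradient 2 R_uu w - 2 r_ud. Averaging it over
# many independent samples should approach the exact gradient.

rng = np.random.default_rng(3)
M = 3
w_true = rng.standard_normal(M)            # defines d(n) = w_true^T u(n)
w = rng.standard_normal(M)                 # fixed point where the gradient is taken

N = 500_000
U = rng.standard_normal((N, M))            # white input: R_uu = I, r_ud = w_true
d = U @ w_true                             # desired signal (noise-free here)
e = d - U @ w                              # errors e(n) at the fixed w
g_avg = (-2.0 * U * e[:, None]).mean(axis=0)   # averaged stochastic gradient

g_true = 2.0 * w - 2.0 * w_true            # 2 R_uu w - 2 r_ud with R_uu = I
assert np.linalg.norm(g_avg - g_true) < 0.05
```

This is why the LMS weights converge in the mean toward the steepest-descent trajectory even though each individual update uses a noisy gradient.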