Implementing logical functions

AND: W0 = 1.5, W1 = 1, W2 = 1
OR: W0 = 0.5, W1 = 1, W2 = 1
NOT: W0 = −0.5, W1 = −1

McCulloch and Pitts: every Boolean function can be implemented

Chapter 20, Section 5, Slide 6
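The gates above can be sketched as threshold units using the weights from the slide: the unit fires iff the weighted input sum reaches the threshold W0. This is a minimal illustration, and the helper name `threshold_unit` is an invention for the example, not standard terminology.

```python
def threshold_unit(weights, w0, inputs):
    # McCulloch-Pitts unit: output 1 iff sum(Wi * xi) >= threshold W0
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= w0 else 0

# Weights taken directly from the slide
AND = lambda x1, x2: threshold_unit([1, 1], 1.5, [x1, x2])
OR  = lambda x1, x2: threshold_unit([1, 1], 0.5, [x1, x2])
NOT = lambda x: threshold_unit([-1], -0.5, [x])

# Full truth tables
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, AND(x1, x2), OR(x1, x2))
print(NOT(0), NOT(1))  # → 1 0
```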
Network structures

Feed-forward networks:
– single-layer perceptrons
– multi-layer perceptrons
Feed-forward networks implement functions, have no internal state

Recurrent networks:
– Hopfield networks have symmetric weights (Wi,j = Wj,i); g(x) = sign(x), ai = ±1; holographic associative memory
– Boltzmann machines use stochastic activation functions, ≈ MCMC in Bayes nets
– recurrent neural nets have directed cycles with delays ⇒ have internal state (like flip-flops), can oscillate etc.