DCABES, October 2009

Basics of ANNs

Neuron operation
– Input signals
– Weights denoting connection strengths multiply the input signals
– The sum of the weighted inputs is sent through the neuron
– It passes through a non-linear activation function
– It is transferred to the output, used as input to neighbouring units or units at the next layer
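The steps above can be sketched as a minimal neuron in Python (the function name `neuron_output`, the example values, and the choice of the logistic sigmoid as the activation are illustrative assumptions, not taken from the slides):

```python
import math

def neuron_output(inputs, weights):
    """One neuron: multiply inputs by connection weights, sum them,
    and pass the sum through a non-linear activation function
    (logistic sigmoid here, as one common choice)."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1.0 / (1.0 + math.exp(-weighted_sum))  # sigmoid activation

# Example: three input signals feeding one neuron
out = neuron_output([0.5, -0.2, 0.1], [0.4, 0.3, -0.6])
```

The returned value would then serve as an input signal to units in the next layer.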
Basics of ANNs

Neuron operation – Complete view (I)
[Figure: a neuron receiving Input 1, Input 2, and Input 3 through weighted connections w_ij]
Basics of ANNs

Neuron operation – Complete view (II)
[Figure: the same neuron with its activation function producing the output]
Basics of ANNs

Activation Function
• Introduces non-linearity to a problem
• Maps the neuron output to a required interval, usually [-1, 1] or [0, 1]
• Commonly used activation functions:

| Activation function | Mathematical expression   |
| Logistic sigmoid    | f(x) = 1 / (1 + exp(-x))  |
| Hyperbolic tangent  | f(x) = tanh(x)            |
| Gaussian            | f(x) = exp(-x² / (2σ²))   |
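The three activation functions in the table can be written directly in Python (treating σ as a free parameter of the Gaussian, with 1.0 as an assumed default):

```python
import math

def logistic_sigmoid(x):
    """f(x) = 1 / (1 + exp(-x)); maps output into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def hyperbolic_tangent(x):
    """f(x) = tanh(x); maps output into (-1, 1)."""
    return math.tanh(x)

def gaussian(x, sigma=1.0):
    """f(x) = exp(-x^2 / (2*sigma^2)); peaks at 1 for x = 0."""
    return math.exp(-x * x / (2.0 * sigma * sigma))
```

Note the different output intervals: the sigmoid suits targets in [0, 1], tanh suits targets in [-1, 1].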
Basics of ANNs

Weights
• Very important for learning rate and accuracy of the network
• Too large or too small values at initialization may cause problems

| Initial weight range | Source |
|                      | Ardö et al. (1997) |
| [-0.1, 0.1]          | Paola (1994), Gopal and Woodcock (1996), Lawrence et al. (1996), Bebis et al. (1997), Paola and Schowengerdt (1997), Staufer and Fischer (1997) |
| [-0.15, 0.15]        | Vuurpijl (1998) |
| [-0.25, 0.25]        | Gallagher and Downs (1997) |
| [-0.3, 0.3]          | Rumelhart et al. (1986), Eberhart and Dobbins (1990) |
| [-0.5, 0.5]          | Sietsma and Dow (1991), Huurneman et al. (1996), Partridge and Yates (1996) |
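A common way to apply such a range is to draw each initial weight uniformly from it. A minimal sketch (the function name `init_weights` and the default range [-0.1, 0.1], one of the cited choices, are assumptions for illustration):

```python
import random

def init_weights(n_inputs, n_neurons, low=-0.1, high=0.1, seed=None):
    """Return an n_neurons x n_inputs weight matrix with each weight
    drawn uniformly from [low, high]."""
    rng = random.Random(seed)
    return [[rng.uniform(low, high) for _ in range(n_inputs)]
            for _ in range(n_neurons)]

# Example: a layer of 2 neurons, each with 3 input connections
w = init_weights(3, 2, seed=42)
```

Seeding the generator makes the initialization reproducible, which is useful when comparing training runs.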