Exercise 12: Linear Neurons and Simple Perceptrons
Date: Summer Term 2006
The last lecture was about simple one-layer perceptrons and the graphical representation of their input/output behavior.
Unit 1: What does the decision line look like in the case of a simple “and”, an “or”, and an exclusive-or (XOR) function? All three functions have two inputs.
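As a quick sanity check, the following sketch (with hypothetical weight and threshold choices) verifies that a single threshold unit realizes “and” and “or”, while a brute-force search over small integer weights finds no setting for XOR:

```python
def unit(w1, w2, theta, x1, x2):
    """Single threshold neuron: fires iff the weighted sum reaches theta."""
    return 1 if w1 * x1 + w2 * x2 >= theta else 0

inputs = [(x1, x2) for x1 in (0, 1) for x2 in (0, 1)]

# AND is realized by e.g. w1 = w2 = 1, theta = 2 (hypothetical choice).
assert all(unit(1, 1, 2, x1, x2) == (x1 & x2) for x1, x2 in inputs)

# OR is realized by e.g. w1 = w2 = 1, theta = 1.
assert all(unit(1, 1, 1, x1, x2) == (x1 | x2) for x1, x2 in inputs)

# For XOR, a brute-force search over small integer weights and thresholds
# finds no solution: the four points are not linearly separable.
found = any(
    all(unit(w1, w2, t, x1, x2) == (x1 ^ x2) for x1, x2 in inputs)
    for w1 in range(-3, 4) for w2 in range(-3, 4) for t in range(-3, 4)
)
assert not found
```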
Unit 2: Show by a symbolic argument that the simple one-layer threshold perceptron with two inputs x1 and x2 as well as a threshold Θ cannot realize the XOR function.
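One possible symbolic argument, assuming the unit outputs 1 iff w1·x1 + w2·x2 ≥ Θ, derives a contradiction from the four XOR input/output pairs:

```latex
\begin{align}
w_1\cdot 0 + w_2\cdot 0 < \Theta \;&\Rightarrow\; \Theta > 0 \tag{1}\\
w_1\cdot 1 + w_2\cdot 0 \ge \Theta \;&\Rightarrow\; w_1 \ge \Theta \tag{2}\\
w_1\cdot 0 + w_2\cdot 1 \ge \Theta \;&\Rightarrow\; w_2 \ge \Theta \tag{3}\\
w_1\cdot 1 + w_2\cdot 1 < \Theta \;&\Rightarrow\; w_1 + w_2 < \Theta \tag{4}
\end{align}
% Adding (2) and (3) gives w_1 + w_2 >= 2\Theta, but (4) demands
% w_1 + w_2 < \Theta, so 2\Theta < \Theta, i.e. \Theta < 0,
% contradicting (1). Hence no weights and threshold realize XOR.
```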
Unit 3: In the last lecture, we discussed two realizations of simple one-layer perceptrons. The first one had a threshold Θ, whereas the other one had an additional bias link with a constant input value of “1”. “Suddenly”, the second realization has an additional input, and the inequality “≥ 0” means that the angle between the input and weight vectors lies between −π/2 and π/2. Why is this, and why may that be equivalent to a freely movable decision line?
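The equivalence can be sketched numerically: absorbing the threshold as a bias weight −Θ on a constant input “1” gives an augmented dot product that is nonnegative exactly when the threshold form fires (weights and inputs below are hypothetical examples):

```python
def threshold_unit(w, theta, x):
    """Fires iff w . x >= theta."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0

def bias_unit(w_aug, x):
    """Fires iff [w, -theta] . [x, 1] >= 0.
    Since cos(angle) = dot / (|w_aug| * |x_aug|) shares the sign of the
    dot product, this holds iff the angle between the augmented weight
    and input vectors lies in [-pi/2, pi/2]."""
    x_aug = list(x) + [1.0]  # constant bias input "1"
    return 1 if sum(wi * xi for wi, xi in zip(w_aug, x_aug)) >= 0 else 0

w, theta = [0.5, -1.2], 0.3     # arbitrary example parameters
w_aug = w + [-theta]
for x in [(0, 0), (0, 1), (1, 0), (1, 1), (0.7, -2.0)]:
    assert threshold_unit(w, theta, x) == bias_unit(w_aug, x)
```

Because the bias weight is learned like any other weight, the decision line is no longer forced through a fixed offset and can move freely in the input plane.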
Hint: This is a surprisingly difficult task; stop working on this exercise if you have not succeeded within five minutes.
Unit 4: Another neural network model consists only of linear neurons, where the output is given as o_i = Σ_j w_ij · o_j. What is the output function in the case of two inputs? How does the behavior change if you add two additional linear neurons in a hidden layer, i.e., neurons between the input and output layers? What changes if you add further hidden layers?
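A small numerical sketch (with arbitrary example weights) illustrates the key point: composing linear layers stays linear, since W(Vx) = (WV)x, so any number of linear hidden layers collapses to a single equivalent weight matrix:

```python
def matvec(M, v):
    """Matrix-vector product over plain lists."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def matmul(A, B):
    """Matrix-matrix product over plain lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

V = [[1.0, 2.0], [0.5, -1.0]]   # input -> hidden (two linear hidden neurons)
W = [[3.0, 1.0]]                # hidden -> output
x = [0.2, 0.4]

two_layer = matvec(W, matvec(V, x))   # network with a hidden layer
collapsed = matvec(matmul(W, V), x)   # single layer with weights W V
assert all(abs(a - b) < 1e-12 for a, b in zip(two_layer, collapsed))
```

Adding further linear hidden layers therefore changes nothing qualitatively: the network still computes a single linear map of its inputs.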
Have fun, Hagen and Ralf.