Exercise 12: Linear Neurons and Simple Perceptrons

Date: Summer Term 2006


The last lecture was about simple one-layer perceptrons and the graphical representation of their input/output behavior.

Unit 1: What does the decision line look like in the case of a simple "and", "or", and exclusive-or (XOR) function? All three functions have two inputs.
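As a starting point, here is a minimal sketch of such a one-layer threshold perceptron, assuming the convention that the output is 1 whenever the netinput reaches the threshold Θ. The weights and thresholds chosen here are one possible solution among many, not the only one:

```python
# A one-layer threshold perceptron with two inputs:
# output = 1 if w1*x1 + w2*x2 >= theta, else 0.
def perceptron(w1, w2, theta, x1, x2):
    return 1 if w1 * x1 + w2 * x2 >= theta else 0

# AND: the decision line x1 + x2 = 1.5 separates (1,1) from the other points.
def AND(x1, x2):
    return perceptron(1, 1, 1.5, x1, x2)

# OR: the decision line x1 + x2 = 0.5 separates (0,0) from the other points.
def OR(x1, x2):
    return perceptron(1, 1, 0.5, x1, x2)

# Print the truth tables to check both decision lines.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "AND:", AND(x1, x2), "OR:", OR(x1, x2))
```

For XOR, no single line of this form separates the two output classes, which is exactly what Unit 2 asks you to show.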

Unit 2: Show by symbolic representations that the simple one-layer threshold perceptron with two inputs x1 and x2 as well as a threshold Θ cannot realize the XOR function.
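One way to set up the argument (a sketch, assuming the convention that the output is 1 iff $w_1 x_1 + w_2 x_2 \ge \Theta$) is to write down what XOR demands at each of the four input points:

```latex
\begin{align*}
(0,0) \mapsto 0 &\;\Rightarrow\; 0 < \Theta,\\
(1,0) \mapsto 1 &\;\Rightarrow\; w_1 \ge \Theta,\\
(0,1) \mapsto 1 &\;\Rightarrow\; w_2 \ge \Theta,\\
(1,1) \mapsto 0 &\;\Rightarrow\; w_1 + w_2 < \Theta.
\end{align*}
```

Adding the middle two inequalities gives $w_1 + w_2 \ge 2\Theta > \Theta$ (since $\Theta > 0$ by the first line), which contradicts the last line.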

Unit 3: In the last lecture, we discussed two realizations of simple one-layer perceptrons. The first one had a threshold Θ, whereas the other one had an additional bias link with a constant input value of "1". "Suddenly", the second realization has an additional input, and the inequality "≥ 0" means that the angle between the input and weight vectors is between −π/2 and π/2. Why is this, and why may that be the same as a freely moving decision line?

Hint: This is a surprisingly difficult task; stop this exercise if you have not succeeded within five minutes.
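One way to see the connection (a sketch of the argument, not the only one): the constant "1" input absorbs the threshold into an extended weight vector,

```latex
% Extended input and weight vectors:
\tilde{x} = (x_1, x_2, 1), \qquad \tilde{w} = (w_1, w_2, -\Theta),
% so that
\tilde{w} \cdot \tilde{x} \ge 0
\;\Longleftrightarrow\;
w_1 x_1 + w_2 x_2 \ge \Theta .
% Since \tilde{w} \cdot \tilde{x} = \|\tilde{w}\|\,\|\tilde{x}\|\cos\varphi,
% the dot product is nonnegative exactly when the angle \varphi between
% the two vectors lies in [-\pi/2, \pi/2].
```

Varying the bias weight −Θ shifts the decision line in the original (x1, x2) plane without changing its orientation, which is why the line can move freely rather than being forced through the origin.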

Unit 4: Another neural network model consists of only linear neurons, where the output is given as $o_i = \sum_j w_{ij} o_j$. What is the output function in the case of two inputs? How does the behavior change if you add two additional linear neurons in a hidden layer, i.e., neurons between the input and output layers? What changes if you add further hidden layers?
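A quick numerical sketch of the key observation, assuming arbitrary (here: random) weight matrices: composing linear layers is itself a linear map, so a hidden layer of linear neurons adds no expressive power.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((2, 2))  # input -> hidden (two linear hidden neurons)
W2 = rng.standard_normal((1, 2))  # hidden -> output
x = rng.standard_normal(2)        # an arbitrary input vector

# Output of the two-layer linear network ...
out = W2 @ (W1 @ x)
# ... equals the output of a single linear layer with weights W2 @ W1.
out_direct = (W2 @ W1) @ x

print(np.allclose(out, out_direct))
```

The same collapse happens for any number of further hidden layers: the product of the weight matrices is again just one weight matrix.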

Have fun, Hagen and Ralf.

