Exercise 10: Implementation of a Backpropagation Network

Date: Summer Term 2006

Now it's time to implement a backpropagation network. The maintenance of the units and connections is already taken care of (files nn.c/nn.h). The other parts, such as the conversion of given arguments, are handled in the same way as in exercise 5. First of all, you are strongly advised to read the API documentation, which can be found at

http://www-md.e-technik.uni-rostock.de/ma/rs/lv/nn.zip.

Unit 1: Download the prepared material and read the API documentation.

Unit 2: Implement the main functions of a simple backpropagation network in the file bp.c.

Please note that the given code material does not provide any backpropagation functionality but demonstrates how to use the network functions. You might want to start with a simple 1-1-1 network that realizes an inverter. How many training patterns do you need? This unit is meant to make sure that your backpropagation code works well, and the prt-functions are mainly for debugging purposes ;-) Save this program version for later usage.

Unit 3: Classification is a typical application area for neural networks. To this end, implement a 4-2-4 encoder that maps the four possible input patterns ‘1000’, ‘0100’, ‘0010’, and ‘0001’ onto identical outputs. You will be using this program in the next exercise.

Unit 4: Another application area is known as approximation. The network should learn to approximate the two-dimensional function f(x, y) = cos(x) + cos(y) in the range (x, y) ∈ [−π..π]. Please take into account the following two demands:

1. Your program has to generate both the training and the test patterns. Allow (by implementing an option) for generating these patterns at random as well as systematically.

2. The two parameters l pats (learning) and t pats (testing) should specify the number of patterns per dimension, i.e., generating l pats² plus t pats² patterns in total.

You will be using this program in the next exercise as well.

Hint: How to call gnuplot directly from your own program is shown in the documentation.

Have fun, Hagen and Ralf.

