Literature

Textbooks

D. Amit: Modeling Brain Functions; Cambridge University Press, Cambridge (England) 1989
As the subtitle "the world of attractor neural networks" already suggests, this book deals exclusively with recurrent networks.

J.A. Anderson, E. Rosenfeld (Eds.): Neurocomputing: Foundations of Research; MIT Press, Cambridge USA, London 1988
Not a textbook in the strict sense, but an interesting collection of important publications on the subject of neural networks from 1890 to 1987.

R. Beale, T. Jackson: Neural Computing; Adam Hilger, Bristol, England 1990
A very simple, clearly written introduction to neural networks.

Tarun Khanna: Foundations of Neural Networks; Addison-Wesley 1990

B. Müller, J. Reinhardt: Neural Networks; Springer Verlag, Berlin 1990
A textbook that, besides feedforward models, mainly treats recurrent networks with the methods of physics. A separate part contains the calculations for the stability and capacity of the Hopfield model. Very practical: an enclosed diskette for MS-DOS machines, on which the most important algorithms are programmed in C, ready to try out.
D.E. Rumelhart, J.L. McClelland: Parallel Distributed Processing, Vol. I, II, III; MIT Press, Cambridge, Massachusetts 1986
This classic textbook series contains several models (competitive learning, backpropagation, etc.) and illustrations, as well as Smolensky's Harmony Theory. A small simulator is supplied with the third volume.
Eberhard Schöneburg: Neuronale Netze; Markt & Technik Verlag, München 1990
A short, easy-to-understand introduction. Very practical: here too, an MS-DOS diskette containing a simulator.

Patrick K. Simpson: Artificial Neural Systems; Pergamon Press, Oxford 1989

There are many further books (see the references), but as collections of papers they give the layperson only a fragmentary impression of the field.
Journals

Neural Networks, Pergamon Press (bimonthly). Official journal of the INNS, the International Neural Network Society.
Biological Cybernetics, Springer Verlag (monthly). Biological and mathematical contributions.
Connection Science, Carfax Publ. Comp., Mass., USA (quarterly)
IEEE Transactions on Neural Networks (bimonthly)
International Journal of Neural Systems, World Scientific Publishing Co., Singapore (quarterly)
Network: Computation in Neural Systems, Blackwell Scientific Publications, Bristol, UK (quarterly)
Neural Computation, MIT Press, Boston (semiannually). Foundational contributions.
NeuroComputing, Elsevier Science Publ. (bimonthly)

Conferences
Almost all conferences on pattern recognition, cybernetics, artificial intelligence, VLSI, robotics, and the like include contributions on applications of neural networks. The specialized conferences deserve particular mention.

At the international (American) level, held twice a year:
IJCNN - International Joint Conference on Neural Networks
Winter: INNS-IJCNN
Summer: IEEE-IJCNN

At the international (European) level, a unified conference is currently being established. This was:
1990: INNC-90 Int. Neural Network Conference, Paris
1991: ICANN-91 Int. Conference on Artificial Neural Networks, Helsinki
1992: ICANN-92 Int. Conference on Artificial Neural Networks, Brighton, UK

The European conference Neuro-Nîmes, which has met each November since 1988 in Nîmes (southern France), puts its emphasis on (industrial) applications.
References
[AARTS89] E. Aarts, J. Korst: Simulated Annealing and Boltzmann Machines; J. Wiley & Sons, Chichester, UK 1989
[ABL87] Paul Ablay: Optimieren mit Evolutionsstrategien; Spektrum der Wissenschaft, Juli 1987
[ABU85] Y.S. Abu-Mostafa, J.-M. St. Jacques: Information Capacity of the Hopfield Model; IEEE Trans. on Inf. Theory, Vol IT-31, No. 4, pp. 461-464 (1985)
[ACK85] D. Ackley, G. Hinton, T. Sejnowski: A learning algorithm for Boltzmann machines; Cognitive Science, Vol 9, pp. 147-169 (1985); also in [ANDR88]
[ACK90] Reinhard Acker, Andreas Kurz: On the Biologically Motivated Derivation of Kohonen's Self-Organizing Feature Maps; in [ECK90], pp. 229-232
[AHA90] Stanley Ahalt, Prakoon Chen, Cheng-Tao Chou: The Neural Shell: A Neural Network Simulation Tool; IEEE Proc. Tools for Art. Intell. TAI-90, pp. 118-122
[ALB72] A. Albert: Regression and the Moore-Penrose pseudoinverse; Academic Press, New York 1972
[ALB75] J.S. Albus: A New Approach To Manipulator Control: CMAC; Transactions of the American Society of Mechanical Engineers (ASME), Series G: Journal of Dynamic Systems, Measurement and Control, Vol 97/3 (1975)
[ALM89] G. Almasi, A. Gottlieb: Highly Parallel Computing; Benjamin/Cummings Publ. Corp., Redwood City CA 1989
[AMA71] S. Amari: Characteristics of Randomly Connected Threshold-Element Networks and Network Systems; Proc. IEEE, Vol 59, No. 1, pp. 35-47 (1971)
[AMA72] S. Amari: Learning Patterns and Pattern Sequences by Self-Organizing Nets of Threshold Elements; IEEE Trans. on Comp., Vol C-21, No. 11, pp. 1197-1206 (1972)
[AMA77] S. Amari: Neural Theory of Association and Concept-Formation; Biol. Cyb., Vol 26, pp. 175-185 (1977)
[AMA77b] S. Amari: Dynamics of Pattern Formation in Lateral-Inhibition Type Neural Fields; Biol. Cyb., Vol 27, pp. 77-87 (1977)
[AMA78] S. Amari, A. Takeuchi: Mathematical theory on formation of category detecting nerve cells; Biol. Cyb., Vol 29, pp. 127-136 (1978)
[AMA79] see [TAK79]
[AMA80] S. Amari: Topographic organization of nerve fields; Bulletin of Mathematical Biology, Vol 42, pp. 339-364 (1980)
[AMA83] S. Amari: Field Theory of Self-Organizing Neural Nets; IEEE Trans. Systems, Man and Cybernetics, Vol SMC-13, No. 5, pp. 741-748 (1983)
[AMA88] S. Amari, K. Maginu: Statistical Neurodynamics of Associative Memory; Neural Networks, Vol 1, pp. 63-73 (1988)
[AMA89] S. Amari: Characteristics of Sparsely Encoded Associative Memory; Neural Networks, Vol 2, pp. 451-457 (1989)
[AMIT85] D. Amit, H. Gutfreund: Storing Infinite Numbers of Patterns in a Spin-Glass Model of Neural Networks; Phys. Rev. Lett., Vol 55, No. 14, pp. 1530-1533 (1985)
[AMIT89] D. Amit: Modeling Brain Functions; Cambridge University Press, Cambridge (England) 1989
[AND72] J. Anderson: A simple neural network generating an interactive memory; Math. Biosc., Vol 14, pp. 197-220 (1972); also in [ANDR88]
[ANDR88] J.A. Anderson, E. Rosenfeld (Eds.): Neurocomputing: Foundations of Research; MIT Press, Cambridge USA, London 1988
[AND88] Diana Z. Anderson (Ed.): Neural Information Processing Systems - Natural and Synthetic; American Institute of Physics, New York 1988
[ANG88] B. Angéniol, G. de la Croix Vaubois, J.-Y. le Texier: Self-Organizing Feature Maps and the Travelling Salesman Problem; Neural Networks, Vol 1, pp. 289-293, Pergamon Press, New York 1988
[APO87] Special Issue on Neural Networks; Applied Optics, Vol 26, No. 23 (1987)
[ARN64] B. Arnold: Elementare Topologie; Vandenhoeck & Ruprecht, Göttingen 1964
[BAK90] Gregory Baker, Jerry Gollub: Chaotic dynamics: an introduction; Cambridge University Press, 1990
[BALD89] P. Baldi, K. Hornik: Neural Networks and Principal Component Analysis; Neural Networks, Vol 2, pp. 53-58, Pergamon Press 1989
[BALL82] D.H. Ballard, Ch. Brown: Computer Vision; Prentice Hall 1982
[BAR90] Etienne Barnard, David Casasent: Shift Invariance and the Neocognitron; Neural Networks, Vol 3, pp. 403-410 (1990)
[BARN86] M.F. Barnsley, V. Ervin, D. Hardin, J. Lancaster: Solution of an inverse problem for fractals and other sets; Proc. Natl. Acad. Sci. USA, Vol 83, pp. 1975-1977 (1986)
[BARN88a] M. Barnsley, A. Sloan: A Better Way to Compress Images; Byte, pp. 215-223 (January 1988)
[BARN88b] Michael F. Barnsley: Fractals everywhere; Academic Press 1988
[BELF88] L. Belfore, B. Johnson, J. Aylor: The Design of Inherently Fault-Tolerant Systems; in: S. Tewksbury, B. Dickinson, S. Schwartz (Eds.), Concurrent Computations, Plenum Press, New York 1988
[BERT88] J.-M. Bertille, J.-C. Perez: Le modèle neuronal holographique chaos fractal; bases théoriques et applications industrielles; Proc. Neuro-Nîmes, Nîmes 1988
[BERT90] J.-M. Bertille, J.-C. Perez: Dynamical Change of Effective Degrees of Freedom in Fractal Chaos Model; Proc. INNC-90, pp. 948-951, Kluwer Academic Publ. 1990
[BET76] Albert D. Bethke: Comparison of genetic algorithms and gradient-based optimizers on parallel processors: Efficient use of processing capacity; Logic of Computers Group, Technical Report No. 197, The University of Michigan, Comp. and Com. Sc. Dep. 1976
[BICH89] M. Bichsel, P. Seitz: Minimum Class Entropy: A Maximum Information Approach to Layered Networks; Neural Networks, Vol 2, pp. 133-141 (1989)
[BLOM] R. Blomer, C. Raschewa, Rudolf Thurmayr, Roswitha Thurmayr: A locally sensitive mapping of multivariate data onto a two-dimensional plane; Medical Data Processing, Taylor & Francis Ltd., London
[BLUM54] J. Blum: Multidimensional stochastic approximation methods; Ann. Math. Stat., Vol 25, pp. 737-744 (1954)
[BOTT80] S. Bottini: An Algebraic Model of an Associative Noise-like Coding Memory; Biol. Cybernetics, Vol 36, pp. 221-228 (1980)
[BOTT88] S. Bottini: An After-Shannon Measure of the Storage Capacity of an Associative Noise-Like Coding Memory; Biol. Cybernetics, Vol 59, pp. 151-159 (1988)
[BRAI67] V. Braitenberg: Is the cerebellar cortex a biological clock in the millisecond range?; in: C.A. Fox, R.S. Snider (eds.), The cerebellum, Progress in brain research, Vol 2, Elsevier, Amsterdam 1967, pp. 334-346
[BRAI89] V. Braitenberg, A. Schütz: Cortex: hohe Ordnung oder größtmögliches Durcheinander?; Spektrum d. Wissensch., pp. 74-86 (Mai 1989)
[BRA79] R. Brause, M. Dal Cin: Catastrophic Effects in Pattern Recognition; in: W. Güttinger, H. Eickemeier (Eds.), Structural Stability in Physics, Springer Verlag, Berlin, Heidelberg 1979
[BRA88a] R. Brause: Fehlertoleranz in intelligenten Benutzerschnittstellen; Informationstechnik it 3/88, pp. 219-224, Oldenbourg Verlag 1988
[BRA88b] R. Brause: Fault Tolerance in Non-Linear Networks; Informatik Fachberichte 188, pp. 412-433, Springer Verlag 1988
[BRA89a] R. Brause: Neural Network Simulation using INES; IEEE Proc. Int. Workshop on tools for AI, Fairfax, USA 1989
[BRA89b] R. Brause: Performance and Storage Requirements of Topology-conserving Maps for Robot Manipulator Control; Interner Bericht 5/89 des Fachbereichs Informatik der J.W. Goethe Universität Frankfurt a.M., 1989; and in: Proc. INNC-90, pp. 221-224, Kluwer Academic Publ. 1990
[BRA91] R. Brause: Approximator Networks and the Principle of Optimal Information Distribution; Interner Bericht 1/91 des Fachbereichs Informatik der J.W. Goethe Universität Frankfurt a.M., 1991; and in: Proc. ICANN-91, Elsevier Science Publ., North Holland 1991
[BROD09] K. Brodmann: Vergleichende Lokalisationslehre der Großhirnrinde in ihren Prinzipien dargestellt auf Grund des Zellenbaues; J.A. Barth, Leipzig 1909
[BUH87] J. Buhmann, K. Schulten: Noise-driven Temporal Association in Neural Networks; Europhysics Letters, Vol 4 (10), pp. 1205-1209 (1987)
[CAJ55] Ramón y Cajal: Histologie du Système Nerveux II; C.S.I.C., Madrid 1955
[CARD89] H.C. Card, W.R. Moore: VLSI Devices and Circuits for Neural Networks; Int. Journ. of Neural Syst., Vol 1, No. 2, pp. 149-165 (1989)
[CAR87a] G. Carpenter, S. Grossberg: A Massively Parallel Architecture for a Self-Organizing Neural Pattern Recognition Machine; Computer Vision, Graphics, and Image Processing, Vol 37, pp. 54-115, Academic Press 1987; also in [GRO88]
[CAR87b] G. Carpenter, S. Grossberg: ART 2: Self-organization of stable category recognition codes for analog input patterns; Applied Optics, Vol 26, pp. 4919-4930
[CAR90] Gail A. Carpenter, Stephen Grossberg: ART 3: Hierarchical Search Using Chemical Transmitters in Self-organizing Pattern Recognition Architectures; Neural Networks, Vol 3, pp. 129-152, Pergamon Press 1990
[CECI88] L. Ceci, P. Lynn, P. Gardner: Efficient Distribution of Back-Propagation Models on Parallel Architectures; Report CU-CS-409-88, University of Colorado, Sept. 1988; and Proc. Int. Conf. Neural Networks, Boston, Pergamon Press 1988
[CHAP66] R. Chapman: The repetitive responses of isolated axons from the crab Carcinus maenas; Journal of Exp. Biol., Vol 45 (1966)
[CHOU88] P.A. Chou: The capacity of the Kanerva associative memory is exponential; in: [AND88]
[COOL89] A.C.C. Coolen, F.W. Kuijk: A Learning Mechanism For Invariant Pattern Recognition in Neural Networks; Neural Networks, Vol 2, pp. 495-506 (1989)
[COOP73] L.N. Cooper: A possible organization of animal memory and learning; Proc. Nobel Symp. on Collective Prop. of Physical Systems, B. Lundquist, S. Lundquist (eds.), Academic Press, New York 1973; also in [ANDR88]
[COOP85] Lynn A. Cooper, Roger N. Shepard: Rotationen in der räumlichen Vorstellung; Spektrum d. Wiss., pp. 102-109, Febr. 1985
[COTT88] R.M. Cotterill (ed.): Computer simulation in brain science; Cambridge University Press, Cambridge UK (1988)
[CRUT87] J. Crutchfield, J. Farmer, N. Packard, R. Shaw: Chaos; Spektrum d. Wissenschaft, Februar 1987
[DAU88] J. Daugman: Complete Discrete 2-D Gabor Transforms by Neural Networks for Image Analysis and Compression; IEEE Transactions on Acoustics, Speech and Signal Processing, Vol 36, No. 7, pp. 1169-1179 (1988)
[DAV60] H. Davis: Mechanism of excitation of auditory nerve impulses; in: G. Rasmussen, W. Windle (eds.), Neural Mechanisms of the Auditory and Vestibular Systems, Thomas, Springfield, Illinois, USA 1960
[DEN55] J. Denavit, R.S. Hartenberg: A Kinematic Notation for Lower-pair Mechanisms Based on Matrices; Journ. Applied Mech., Vol 77, pp. 215-221 (1955)
[DEN86] J.S. Denker (Ed.): Neural Networks for Computing; American Inst. of Physics, Conf. Proc. Vol 151 (1986)
[DEP89] Etienne Deprit: Implementing recurrent Back-Propagation on the Connection Machine; Neural Networks, Vol 2, pp. 295-314 (1989)
[DIE87] J. Diederich, C. Lischka: Spread-3. Ein Werkzeug zur Simulation konnektionistischer Modelle auf Lisp-Maschinen; KI-Rundbrief, Vol 46, pp. 75-82, Oldenbourg Verlag 1987
[DOD90] Nigel Dodd: Optimisation of Network Structure Using Genetic Techniques; Proc. Int. Neural Network Conf. INNC-90, Kluwer Acad. Publ., 1990
[DOW66] J.E. Dowling, B.B. Boycott: Organization of the primate retina: Electron microscopy; Proc. of the Royal Society of London, Vol B166, pp. 80-111 (1966)
[DUD73] R. Duda, P. Hart: Pattern Classification and Scene Analysis; John Wiley & Sons, New York 1973
[ECK90] R. Eckmiller, G. Hartmann, G. Hauske (eds.): Parallel Processing in Neural Systems and Computers; North Holland, Amsterdam 1990
[EIM85] P.D. Eimas: Sprachwahrnehmung beim Säugling; Spektrum d. Wissenschaft, März 1985
[ELL88] D. Ellison: On the Convergence of the Albus Perceptron;
[FELD80]
[FU68]
[FU87]
[FUCHS88]
[FUH87]
[FUK72]
[FUK80]
[FUK84]
[FUK86]
[FUK87]
[FUK88a]
[FUK88b]
[FUK89]
IMA, Journal ofMath. ContrI. and Inf., Vol. 5, pp.315-331 (1988) J.A. Feldman, D.H. Ballard: Computing with connections;
University of Rochester, Computer Science Department, TR72, 1980 K.S. Fu: Sequential M ethods in Pattern R ecognition;
Academic Press, New Y mk 1968
Fu, Gonzales, Lee: Robotics:Control, Sensing, Vision and Intelligence;
McGraw-Hill 1987
A. Fuchs, H. Haken: Computer Simult6ions of Pattern Recognition os a DYNJmical Process of a Syn.ergetic System; in: H.Haken (ed.), Neural and Synergetic Computers, pp.16-28,Springer Verlag Berlin Heidelberg 1988 Cory Fujiki, John Dickinson: Using the Genetic Algorithm to Generate Lisp Source Code to Solve the Prisoners Dilemma; in [GRE87], pp.236-240
K.Fukunaga: Introduction to Stalistical Pattern R ecognition;
Academic Press, New Y mk 1972
K. Fukushima: Neocognitron: A Self-Organized Neural Network Model for a Mechanism of Pt6tern Recognition Uiuffected by Shift in Position;
Biolog. Cybernetics, Vol. 36, pp. 193-202 (1980)
K. Fukushima: A HierfI'Chical Neural N etwork Model for Associalive Memory;
Biolog. Cybernetics, Vol50, pp. 105-113 (1984)
K. Fukushima: Neural N etwork Model for selective A nention in Visual Poltern Recognition; Biological Cybemetics 55, pp 5-15, (1986)
K. Fukushima: A neural network model for selective t6tention in visual pattern m:ognition and associdive recoll; Applied Opties, Vol.26 No.23, pp.49854992 (1987) K. Fukushima: A Neural Network for Visual Pattern Recognition;
IEEE Computer, pp. 65-75, March 1988
K. Fukushima: N eocognitron: A Hierarchical Neural N etwork Capable of V isual Pattern Recognition; Neural NetwOlks, VoLl, pp.119-130, (1988)
K. Fukushima: Analysis of the Process of Visual Pattern Recognition by the N eo- cognitron; Neural Netwmks, Vol. 2, pp.413-420, (1989), Pergamon Press
[GAL88] A.R. Gallant, H. White: There exists a neural network that does not make avoidable mistakes; IEEE Sec. Int. Conf. on Neural Networks, pp. 657-664, 1988
[GALL88] S.I. Gallant: Connectionist Expert Systems; Comm. ACM, Vol 31/2, pp. 152-169 (Febr. 1988)
[GIL87] C. Lee Giles, Tom Maxwell: Learning, invariance, and generalization in high-order neural networks; Applied Optics, Vol 26, No. 23, pp. 4972-4978 (1987)
[GIL88] C. Lee Giles, R.D. Griffin, Tom Maxwell: Encoding geometric invariances in high-order neural networks; in [AND88], pp. 301-309
[GLA63] R. Glauber: Time-Dependent Statistics of the Ising Model; Journal of Math. Physics, Vol 4, p. 294 (1963)
[GLU88] M. Gluck, G. Bower: Evaluating an adaptive network model of human learning; Journal of Memory and Language, Vol 27 (1988)
[GOD87] N. Goddard: The Rochester Connectionist Simulator, User Manual and Advanced Programming Manual; Dep. of Comp. Sci., Univ. of Rochester, USA, April 1987
[GOL87] David E. Goldberg, Philip Segrest: Finite Markov Chain Analysis of Genetic Algorithms; in [GRE87]
[GOL89] David Goldberg: Genetic algorithms in search, optimization and machine learning; Addison Wesley, 1989
[GOS87] U. Rueckert, I. Kreuzer, K. Goser: A VLSI concept for an adaptive associative matrix based on neural networks; IEEE Proc. EuroComp '87, pp. 31-34 (1987)
[GRA88] Hans P. Graf, Lawrence D. Jackel, Wayne E. Hubbard: VLSI Implementation of a Neural Network Model; IEEE Computer, March 1988
[GRE87] John Grefenstette (Ed.): Genetic Algorithms and their applications; Proc. Second Int. Conf. Genetic Alg., Lawrence Erlbaum Ass., 1987
[GRE90] John J. Grefenstette, Alan C. Schultz: Improving Tactical Plans with Genetic Algorithms; IEEE Proc. Tools for AI TAI-90, pp. 328-334, Dulles 1990
[GRO69] S. Grossberg: Some Networks That Can Learn, Remember, and Reproduce Any Number of Complicated Space-Time Patterns, I; Journal of Mathematics and Mechanics, Vol 19, No. 1, pp. 53-91 (1969)
[GRO72] S. Grossberg: Neural Expectation: Cerebellar and retinal analogs of cells fired by learnable or unlearned pattern classes; Kybernetik, Vol 10, pp. 49-57 (1972)
[GRO76] S. Grossberg: Adaptive pattern classification and universal recoding I + II; Biological Cybernetics, Vol 23, Springer Verlag (1976)
[GRO87] S. Grossberg: Competitive Learning: From Interaction to Adaptive Resonance; Cognitive Science, Vol 11, pp. 23-63 (1987); also in [GRO88]
[GRO88] S. Grossberg (ed.): Neural Networks and Natural Intelligence; MIT Press, Cambridge, Massachusetts 1988
[GRO88b] S. Grossberg: Nonlinear Neural Networks; Neural Networks, Vol 1, pp. 17-61 (1988)
[GRUB88] H. Grubmüller, H. Heller, K. Schulten: Eine Cray für "Jedermann"; mc 11/88, Franzis Verlag, München 1988
[GRU89] A. Grumbach: Modèles connexionnistes du diagnostic; Proc. Journées d'électronique, École Polytechnique Fédérale, Lausanne 1989
[GUEST87] Clark C. Guest, Robert TeKolste: Designs and devices for optical bidirectional associative memories; in: [APO87], pp. 5055-5060
[HAK88] Hermann Haken: Information and Self-Organization; Springer Verlag, Berlin, Heidelberg 1988
[HARP90] Steven A. Harp, Tariq Samad, Aloke Guha: Designing Application-Specific Neural Networks Using the Genetic Algorithm; in: David S. Touretzky (ed.), Advances in Neural Information Processing Systems 2, Morgan Kaufmann Publishers, 1990
[HAS89] M.H. Hassoun: Dynamic Heteroassociative Neural Memories; Neural Networks, Vol 2, pp. 275-287 (1989)
[HEBB49] D.O. Hebb: The Organization of Behavior; Wiley, New York 1949
[HEISE83] W. Heise: Informations- und Codierungstheorie; Springer Verlag 1983
[HEM87] J.L. van Hemmen: Nonlinear Neural Networks Near Saturation; Phys. Rev. A36, pp. 1959 (1987)
[HEM88a] J.L. van Hemmen, D. Grensing, A. Huber, R. Kühn: Nonlinear Neural Networks; Journal of Statist. Physics, Vol 50, pp. 231 and 259 (1988)
[HEM88b] J.L. van Hemmen, G. Keller, R. Kühn: Forgetful Memories; Europhys. Lett., Vol 5, pp. 663 (1988)
[HER88a] Andreas Herz: Representation and recognition of spatio-temporal objects within a generalized Hopfield scheme; Connectionism in Perspective, Zürich, Oct. 1988
[HER88b] A. Herz, B. Sulzer, R. Kühn, J.L. van Hemmen: The Hebb Rule: Storing Static and Dynamic Objects in an Associative Neural Network; Europhys. Letters, Vol 7, pp. 663-669 (1988)
[HER89] A. Herz, B. Sulzer, R. Kühn, J.L. van Hemmen: Hebbian Learning Reconsidered: Representation of Static and Dynamic Objects in Associative Neural Nets; Biol. Cybernetics, Vol 60, pp. 457-467 (1989)
[HEY87] A. Hey: Parallel Decomposition of Large Scale Simulations in Science and Engineering; Report SHEP 86/87-7, University of Southampton 1987
[HILL85] D. Hillis: The Connection Machine; MIT Press, Cambridge, Massachusetts, 1985
[HILL87] Daniel Hillis, Joshua Barnes: Programming a highly parallel computer; Nature, Vol 326, pp. 27-30 (1987)
[HIN81a] G. Hinton, J. Anderson (eds.): Parallel Models of Associative Memory; Lawrence Erlbaum Associates, Hillsdale 1981
[HIN81b] G. Hinton: Implementing Semantic Networks in Parallel Hardware; in [HIN81a]
[HN86] Robert Hecht-Nielsen: Performance Limits of Optical, Electro-Optical, and Electronic Neurocomputers; Optical and Hybrid Computing, SPIE Vol 634, pp. 277-306 (1986)
[HN87] Robert Hecht-Nielsen: Counterpropagation networks; IEEE Proc. Int. Conf. Neural Networks, New York 1987; also in [APO87]
[HN88] Robert Hecht-Nielsen: Applications of Counterpropagation Networks; Neural Networks, Vol 1, pp. 131-139 (1988), Pergamon Press
[HO65] Y. Ho, R.L. Kashyap: An Algorithm for linear inequalities and its application; IEEE Trans. on Electronic Computers, Vol EC-14, pp. 683-688 (1965)
[HOL75] J.H. Holland: Adaptation in Natural and Artificial Systems; University of Michigan Press, Ann Arbor, MI, 1975
[HOP82] J.J. Hopfield: Neural Networks and Physical Systems with Emergent Collective Computational Abilities; Proc. Natl. Acad. Sci. USA, Vol 79, pp. 2554-2558 (1982)
[HOP84] J.J. Hopfield: Neurons with graded response have collective computational properties like those of two-state neurons; Proc. Natl. Acad. Sci. USA, Vol 81, pp. 3088-3092 (1984)
[HOP85] J.J. Hopfield, D.W. Tank: 'Neural' Computation of Decisions in Optimization Problems; Biological Cybernetics, Vol 52, pp. 141-152 (1985)
[HOR89] H. Horner: Neural Networks with Low Levels of Activity: Ising vs. McCulloch-Pitts Neurons; Zeitschr. f. Physik, B 75, p. 133 (1989)
[HORN89] K. Hornik, M. Stinchcombe, H. White: Multilayer Feedforward Networks are Universal Approximators; Neural Networks, Vol 2, pp. 359-366, Perg. Press 1989
[HRY88] T. Hrycej: Feature Discovery by Backward Inhibition; Arbeitspapiere der GMD, Vol 329, pp. 73-79, Gesellschaft für Math. und Datenverarb., St. Augustin 1988
[KAL75] S. Kallert: Einzelzellverhalten in verschiedenen Hörbahnteilen; in: W. Keidel (Hrsg.), Physiologie des Gehörs, Thieme Verlag, Stuttgart 1975
[KAM90] Behzad and Behrooz Kamgar-Parsi: On Problem Solving with Hopfield Neural Networks; Biol. Cybernetics, Vol 62, pp. 415-423 (1990)
[KAN87] I. Kanter, H. Sompolinsky: Associative recall of memory without errors; Physical Review A, Vol 35, No. 1, p. 380 (1987)
[KAN86] P. Kanerva: Parallel Structures in human and computer memory; in: [DEN86]
[KEE87] J.D. Keeler: Basins of Attraction of Neural Network models; in: [DEN86]
[KEE88] J.D. Keeler: Capacity for patterns and sequences in Kanerva's SDM as compared to other associative memory models; in: [AND88]
[KIEF52] J. Kiefer, J. Wolfowitz: Stochastic estimation of the maximum of a regression function; Ann. Math. Stat., Vol 23, pp. 462-466 (1952)
[KIM89] H. Kimelberg, M. Norenberg: Astrocyten und Hirnfunktion; Spektrum der Wissenschaft, Juni 1989
[KIN89] I. Kindermann: Inverting Multilayer Perceptrons; Proc. DANIP Workshop on Neural Netw., GMD St. Augustin, April 1989
[KIN85] W. Kinzel: Spin Glasses as Model Systems for Neural Networks; Int. Symp. Complex Syst., Elmau 1985, Lecture Notes, Springer Series on Synergetics
[KIRK83] S. Kirkpatrick, C.D. Gelatt Jr., M.P. Vecchi: Optimization by simulated annealing; Science, Vol 220, pp. 671-680 (1983); also in [ANDR88]
[KLE86] David Kleinfeld: Sequential state generation by model neural networks; Proc. Natl. Acad. Sci. USA, Vol 83, pp. 9469-9473 (1986)
[KOH72] T. Kohonen: Correlation Matrix Memories; IEEE Transactions on Computers, Vol C-21, pp. 353-359 (1972); also in [ANDR88]
[KOH76] T. Kohonen, E. Oja: Fast Adaptive Formation of Orthogonalizing Filters and Associative Memory in Recurrent Networks of Neuron-Like Elements; Biological Cybernetics, Vol 21, pp. 85-95 (1976)
[KOH77] T. Kohonen: Associative Memory; Springer Verlag, Berlin 1977
[KOH82] T. Kohonen: Analysis of a simple self-organizing process; Biological Cybernetics, Vol 40, pp. 135-140 (1982)
[KOH84] T. Kohonen: Self-Organization and Associative Memory; Springer Verlag, Berlin, New York, Tokyo 1984
[KOH88] T. Kohonen: The "Neural" Phonetic Typewriter of Helsinki University of Technology; IEEE Computer, March 1988
[KOR89] T. Korb, A. Zell: A declarative Neural Network Description Language; Proc. Euromicro, Köln 1989, Microprogr. and Microproc., Vol 27/1-5, North-Holland
[KOS87] Bart Kosko: Adaptive bidirectional associative memories; in: [APO87]
[KOO90] P. Koopman, L. Rutten, M. van Eekelen, M. Plasmeijer: Functional Descriptions of Neural Networks; Proc. INNC-90, pp. 701-704, Kluwer Academic Publ. 1990
[KRO90] A. Krogh, J. Hertz: Hebbian Learning of Principal Components; in: [ECK90], pp. 183-186
[KÜH89a] R. Kühn, J.L. van Hemmen, U. Riedel: Complex temporal association in neural networks; J. Phys. A: Math. Gen., Vol 22, pp. 3123-3135 (1989)
[KÜH89b] R. Kühn, J.L. van Hemmen, U. Riedel: Complex temporal association in neural nets; Proc. Conference nEuro'88, L. Personnaz, G. Dreyfus (eds.), I.D.S.E.T., Paris 1989, pp. 289-298
[KÜH90] H. Kühnel, P. Tavan: The Anti-Hebb Rule derived from Information Theory; in: [ECK90], pp. 187-190
[KÜR88] K.E. Kürten, J.W. Clark: Exemplification of chaotic activity in non-linear neural networks obeying a deterministic dynamics in continuous time; in [COTT88]
[KUFF53] S.W. Kuffler: Discharge Patterns and Functional Organization of Mammalian Retina; Journal of Neurophys., Vol 16, No. 1, pp. 37-68 (1953)
[LAN88] D. Lang: Informationsverarbeitung mit künstlichen neuronalen Netzwerken; Dissertation am Fachbereich Physik der Universität Tübingen, 1988
[LAP87] A. Lapedes, R. Farber: Nonlinear Signal Processing using Neural Networks: Prediction and System Modelling; Los Alamos preprint LA-UR-87-2662 (1987)
[LAP88] A. Lapedes, R. Farber: How Neural Nets Work; Report LA-UR-88-418, Los Alamos Nat. Lab. 1988; and in [LEE88b]
[LASH50] K.S. Lashley: In search of the engram; Soc. of Exp. Biol. Symp. No. 4: Psycholog. Mech. in Anim. Behaviour, Cambridge University Press, pp. 454, 468-473, 477-480, Cambridge 1950; also in [ANDR88]
[LAW71] D.N. Lawley, A.E. Maxwell: Factor Analysis as a Statistical Method; Butterworths, London 1971
[LEE88] Y.C. Lee: Efficient Stochastic Gradient Learning Algorithm for Neural Network; in [LEE88b], pp. 27-50
[LEE88b] Y.C. Lee (Ed.): Evolution, Learning and Cognition; World Scientific, Singapore, New Jersey, London 1988
[LEV85] M. Levine: Vision in man and machine; McGraw Hill 1985
[LIN73] S. Lin, B.W. Kernighan: An effective heuristic algorithm for the traveling salesman problem; Operations Research, Vol 21, pp. 498-516 (1973)
[LIN86] R. Linsker: From Basic Network Principles to Neural Architecture; Proc. Natl. Academy of Science USA, Vol 83, pp. 7508-7512, 8390-8394, 8779-8783
[LIN88a] R. Linsker: Self-Organization in a Perceptual Network; IEEE Computer, pp. 105-117 (March 1988)
[LIN88b] R. Linsker: Towards an Organizing Principle for a Layered Perceptual Network; in [AND88]
[LIN88c] R. Linsker: Development of feature-analyzing cells and their columnar organization in a layered self-adaptive network; in [COTT88]
[LIT78] W.A. Little, G.L. Shaw: Analytic Study of the Memory Storage Capacity of a Neural Network; Math. Biosc., Vol 39, pp. 281-290 (1978); also in [SHAW88]
[LJUNG77] L. Ljung: Analysis of Recursive Stochastic Algorithms; IEEE Transactions on Automatic Control, Vol AC-22/4 (August 1977)
[LONG68] H.C. Longuet-Higgins: Holographic model of temporal recall; Nature, Vol 217, p. 104 (1968)
[LOOS88] H.G. Loos: Reflexive Associative Memories; in [AND88]
[MC43] W.S. McCulloch, W.H. Pitts: A Logical Calculus of the Ideas Immanent in Nervous Activity; Bulletin of Mathematical Biophysics, Vol 5, pp. 115-133 (1943); also in [ANDR88]
[MAAS90] Han van der Maas, Paul F. Verschure, Peter C. Molenaar: A Note on Chaotic Behavior in Simple Neural Networks; Neural Networks, Vol 3, pp. 119-122 (1990)
[MAL73] C. von der Malsburg: Self-organization of orientation sensitive cells in the striate cortex; Kybernetik, Vol 14, pp. 85-100 (1973); also in [ANDR88]
[MCE87] R.J. McEliece, E.C. Posner, E.R. Rodemich, S.S. Venkatesh: The Capacity of the Hopfield Associative Memory; IEEE Trans. on Inf. Theory, Vol IT-33, No. 4, pp. 461-482 (1987)
[MET53] N. Metropolis, A.W. Rosenbluth, M.N. Rosenbluth, A.H. Teller: Equation of State Calculations by Fast Computing Machines; The Journal of Chemical Physics, Vol 21, No. 6, pp. 1087-1092 (1953)
[MIN88] M. Minsky, S. Papert: Perceptrons; MIT Press, 1988
[MÜ90] B. Müller, J. Reinhardt: Neural Networks; Springer Verlag, Berlin 1990
[MUR90] Jacob Murre, Steven Kleynenberg: The MetaNet Network Environment for the Development of Modular Neural Networks; Proc. INNC-90, pp. 717-720, Kluwer Academic Publ. 1990
[NEW88] F. Newbury: EDGE: An Extendible Directed Graph Editor; Internal Report 8/88, Universität Karlsruhe, West Germany
[OJA82] E. Oja: A simplified Neuron Model as Principal Component Analyzer; J. Math. Biology, Vol 15, pp. 267-273 (1982)
[OJA85] E. Oja, J. Karhunen: On Stochastic Approximation of the Eigenvectors and Eigenvalues of the Expectation of a Random Matrix; Report TKK-F-A458, Helsinki Univ. of Techn., Dept. Techn. Phys. (1981); and in J. Math. Anal. Appl., Vol 106, pp. 69-84 (1985)
[OJA89] E. Oja: Neural Networks, Principal Components, and Subspaces; Int. Journ. Neural Syst., Vol 1/1, pp. 61-68, World Scientific, London 1989
[OLI87] I.M. Oliver, D.J. Smith, J.R.C. Holland: A Study of Permutation Crossover Operators on the Travelling Salesman Problem; in [GRE87], pp. 224-230
[OP88] M. Opper: Learning Times of Neural Networks: Exact Solution for a PERCEPTRON Algorithm; Phys. Rev. A38, p. 3824 (1988)
[PALM80] G. Palm: On Associative Memory; Biolog. Cybernetics, Vol 36, pp. 19-31 (1980)
[PALM84] G. Palm: Local synaptic modification can lead to organized connectivity patterns in associative memory; in: E. Frehland (Ed.), Synergetics: from microscopic to macroscopic order, Springer Verlag, Berlin, Heidelberg, New York 1984
[PAR86] G. Parisi: A memory which forgets; Journal of Physics A, Vol 19, L617 (1986)
[PERL89] M. Perlin, J.-M. Debaud: MatchBox: Fine grained Parallelism at the Match Level; IEEE TAI-89, Proc. Int. Workshop on tools for AI, USA 1989
[PER88] J.-C. Perez: De nouvelles voies vers l'intelligence artificielle; Editions Masson, 1988
[PER88b] J.-C. Perez: La mémoire holographique fractale; IBM France, Montpellier 1988
[PFAF72] E. Pfaffelhuber: Learning and Information Theory; Int. J. Neuroscience, Vol 3, pp. 83-88, Gordon and Breach Publ., 1972
[PRO90] P. Protzel: Artificial Neural Network for Real-Time Task Allocation in Fault-Tolerant, Distributed Processing Systems; in [ECK90], pp. 307-310
[RE73] Ingo Rechenberg: Evolutionsstrategie; problemata frommann-holzboog, 1973
[RIE88] U. Riedel, R. Kühn, J.L. van Hemmen: Temporal sequences and chaos in neural nets; Physical Review A, Vol. 38/2, pp. 1105-1108 (1988)
[RIT86] H. Ritter, K. Schulten: On the Stationary State of Kohonen's Self-Organizing Sensory Mapping; Biolog. Cyb., Vol. 54, pp. 99-106 (1986)
[RIT88] H. Ritter, K. Schulten: Convergence Properties of Kohonen's Topology Conserving Maps; Biological Cyb., Vol. 60, pp. 59ff (1988)
[RIT89] H. Ritter, T. Martinetz, K. Schulten: Topology-Conserving Maps for Learning Visuomotor-Coordination; Neural Networks, Vol. 2/3, pp. 159-167 (1989)
[RIT90] H. Ritter, T. Martinetz, K. Schulten: Neuronale Netze; Addison-Wesley, Bonn 1990
[ROB51] H. Robbins, S. Monro: A stochastic approximation method; Ann. Math. Stat., Vol. 22, pp. 400-407 (1951)
[ROD65] R.W. Rodieck: Quantitative Analysis of Cat Retinal Ganglion Cell Response to Visual Stimuli; Vision Research, Vol. 5, No. 11/12, pp. 583-601 (1965)
[ROS58] F. Rosenblatt: The perceptron: a probabilistic model for information storage and organization in the brain; Psychological Review, Vol. 65, pp. 386-408 (1958); auch in [ANDR88]
[ROS62] F. Rosenblatt: Principles of neurodynamics; Spartan Books, Washington DC, 1962
[RUB90] J. Rubner, K. Schulten, P. Tavan: A Self-Organizing Network for Complete Feature Extraction; in [ECK90], pp. 365-368
[RUM86] D.E. Rumelhart, J.L. McClelland: Parallel Distributed Processing; Vol. I, II, III, MIT Press, Cambridge, Massachusetts 1986
[RUM85] D.E. Rumelhart, D. Zipser: Feature discovery by competitive learning; Cognitive Science, Vol. 9, pp. 75-112 (1985)
[SAN88] Terence D. Sanger: Optimal Unsupervised Learning in a Single-Layer Linear Feedforward Neural Network; Proc. of the Int. Conf. Neural Networks, Boston 1988, Pergamon Press; und in Neural Networks, Vol. 2, pp. 459-473 (1989)
[SCH76] R.F. Schmidt, G. Thews: Einführung in die Physiologie des Menschen; Springer Verlag, Berlin 1976
[SCH90] B. Schürmann: Vorlesung "Neuronale Netze" WS 90/91; Universität Frankfurt 1990
[SCHO90] E. Schöneburg: Stock price prediction using neural networks: A project report; Neurocomputing, Vol. 2, pp. 17-27; Elsevier Science Publ. 1990
[SCHU84] H.G. Schuster: Deterministic Chaos: An Introduction; Physik-Verlag, Weinheim 1984
[SCHU90] H.G. Schuster: Private Mitteilung; Inst. f. Theor. Physik, Universität Kiel, 1990
[SEG82] Segraves, Rosenquist: The afferent and efferent callosal connections of retinotopically defined areas in cat cortex; J. Neurosci., Vol. 2, pp. 1090-1107 (1982)
[SEJ86] T.J. Sejnowski, C.R. Rosenberg: NETtalk: A parallel network that learns to read aloud; Johns Hopkins University, Electrical Engineering and Computer Science Technical Report JHU/EECS-86/01; auch in [ANDR88]
[SEJ86b] T.J. Sejnowski: High-Order Boltzmann Machines; AIP Conf. Proc., Vol. 151, p. 398 (1986)
[SHA49] C.E. Shannon, W. Weaver: The Mathematical Theory of Communication; University of Illinois Press, Urbana 1949; und C.E. Shannon, W. Weaver: Mathematische Grundlagen der Informationstheorie; Oldenbourg Verlag, München 1976
[SHAW88] G.L. Shaw, G. Palm (eds.): Brain Theory; World Scientific, Singapur 1988
[SHU88] Nan C. Shu: Visual Programming; Van Nostrand Reinhold Comp., New York 1988
[SING85] W. Singer: Hirnentwicklung und Umwelt; Spektrum der Wissenschaft, März 1985, pp. 48-61
[SM90] Wolfram Schiffmann, Klaus Mecklenburg: Genetic Generation of Backpropagation Trained Neural Networks; in [ECK90], pp. 205-208
[SMI82] D. Smith, C. Irby, R. Kimball, B. Verplank: Designing the STAR User Interface; Byte, April 1982, pp. 242-282
[SOM86] H. Sompolinsky, I. Kanter: Temporal Association in Asymmetric Neural Networks; Physical Review Letters, Vol. 57, No. 22, pp. 2861-2864 (1986)
[SPI90] Piet Spiessens, Bernard Manderick: A Genetic Algorithm For Massively Parallel Computers; in [ECK90], pp. 31-36
[STEE85] L. Steels, W. Van de Welde: Learning in second generation expert systems; in: Kowalik (ed.), Knowledge based problem solving, Prentice Hall 1985
[STEIN61] K. Steinbuch: Die Lernmatrix; Kybernetik, Vol. 1, pp. 36-45 (1961)
[TAK79] T. Takeuchi, S.-I. Amari: Formation of topographic maps and columnar microstructures in nerve fields; Biological Cybernetics, Vol. 35, pp. 63-72 (1979)
[TON60] J. Tonndorf: Dimensional Analysis of cochlear models; J. Acoust. Soc. Amer., Vol. 32, p. 293 (1960)
[TOU74] J.T. Tou, R.C. Gonzalez: Pattern Recognition Principles; Addison-Wesley Publ. Comp., 1974
[TRE89] Philip Treleaven: Neurocomputers; Int. Journ. of Neurocomputing, Vol. 1/1, pp. 4-31, Elsevier Publ. Comp. (1989)
[TRE90] M. Azema-Barac, M. Hewetson, M. Reece, J. Taylor, P. Treleaven, M. Vellasco: PYGMALION Neural Network Programming Environment; Proc. INNC-90, pp. 709-712, Kluwer Ac. Publ. 1990
[TSYP73] Tsypkin: Foundations of the Theory of Learning Systems; Academic Press, New York 1973
[UES72] Uesaka, Ozeki: Some properties of associative type memories; Journ. Inst. of El. and Comm. Eng. of Japan, Vol. 55-D, pp. 323-330 (1972)
[VITT89a] E. Vittoz: Analog VLSI Implementation of Neural Networks; Proc. Journées d'électronique, Ecole Polytechnique Fédérale, Lausanne 1989
[VITT89b] E. Vittoz, X. Arreguit: CMOS integration of Herault-Jutten cells for separation of sources; in: C. Mead, M. Ismail (eds.), Analog Implementation of Neural Systems, Kluwer Academic Publ., Norwell 1989
[WAN90] L. Wang, J. Ross: On Dynamics of Higher Order Neural Networks: Existence of Oscillations and Chaos; Proc. INNC-90, pp. 945-947 (Paris 1990)
[WECH88] Harry Wechsler, George L. Zimmermann: Invariant Object Recognition Using a Distributed Associative Memory; in [AND88], pp. 830-839
[WID60] B. Widrow, M. Hoff: Adaptive switching circuits; 1960 IRE WESCON Convention Record, New York: IRE, pp. 96-104; auch in [ANDR88]
[WID85] B. Widrow, S. Stearns: Adaptive Signal Processing; Prentice Hall, Englewood Cliffs, N.J., 1985
[WID88] B. Widrow, R. Winter: Neural Nets for Adaptive Filtering and Adaptive Pattern Recognition; IEEE Computer, Vol. 21, No. 3, pp. 25-39 (1988)
[WILL69] D. Willshaw, O. Buneman, H. Longuet-Higgins: Non-holographic associative memory; Nature, Vol. 222, pp. 960-962 (1969); auch in [ANDR88]
[WILL71] D. Willshaw: Models of distributed associative memory; Unpublished doctoral dissertation, Edinburgh University (1971)
[WILL76] D. Willshaw, C. von der Malsburg: How patterned neural connections can be set up by self-organization; Proc. Royal Soc. of London, Vol. B-194, pp. 431-445 (1976)
[WIL76] G. Willwacher: Fähigkeiten eines assoziativen Speichersystems im Vergleich zu Gehirnfunktionen; Biol. Cybernetics, Vol. 24, pp. 181-198 (1976)
[WIL88] G.V. Wilson, G.S. Pawley: On the Stability of the Travelling Salesman Problem Algorithm of Hopfield and Tank; Biol. Cybernetics, Vol. 58, pp. 63-70 (1988)
[WIN89] Jack H. Winters, Christopher Rose: Minimum Distance Automata in Parallel Networks for Optimum Classification; Neural Networks, Vol. 2, pp. 127-132 (1989)
Stichworte

A
Abklingkurve 40
Abtastung 225, 232
Adaline 86ff
Ähnlichkeit
- visuelle 67
- Korrelations- 95
- Abstands- 97
Affine Funktionen 52, 221
Aktienkurse 228
Aktionspotential 33
Aktivitätsfunktion 37
Approximation 50ff
ART-Architektur 154ff
Assoziativspeicher 90ff
- konventionelle 90ff
- Konvolutions- 105
- korrelative 92ff
- rückgekoppelte 169ff
Attraktor 187, 209ff
Aufmerksamkeitsgest. Systeme 154ff
Ausgabefunktion 38ff
- begrenzt-linear 41
- binär 40
- Fermi-Funktion 42
- Hyperbol. Tangens 42
- Kosinus 43
- probabilistisch 200, 206
- sigmoidal 42
Axon 33, 36
B
Backpropagation 50, 107, 227ff, 254, 263
BAM → Bidirektionaler Speicher
Basilarmembran 24ff
Bayes-Risiko 61
Bidirektionaler Speicher 122ff, 197
Bifurkationen 211, 216
Börsenkurse 228
Boltzmann-Maschinen 205ff
Boltzmann-Verteilung 200, 206
building blocks 251

C
Chaos 209ff
chaotisches System 209ff
Chromosomen 242
cerebellum → Kleinhirn
clamping 205ff
Clipped synapses 103, 121, 177
cochlea 23ff
Codebeispiele
1.2.1 Sigma-Neuron 43
1.3.1 Stochast. Approximation 66
2.1.1 Perceptron 83
2.1.2 Perceptron 83
2.2.1 Assoziativspeicher 94
2.3.1 Backpropagation 111
2.6.1 Nachbarschaftserh. Abb. 139
2.7.1 ART1 162
2.7.2 Winner-take-all 163
2.7.3 Aufmerksamkeit 163
2.7.4 Musterspeicherung 163
3.1.1 Hopfield-Netzalgorithmus 180
3.1.2 Energie-Algorithmus 182
3.1.3 Ho-Kashyap Verfahren 198
3.2.1 Metropolis-Algorithmus 201
3.2.2 Simulated Annealing 203
5.1.1 Mut.-Selektions Strategie 243
5.2.2 Reproduktionsplan 248
5.2.3 Selektion und Fitness 248
5.3.1 Architektur-Fitnessfunktion 253
competitive learning 127ff, 157, 160
complex cell 18
connection machine 258, 260, 264
corpus callosum 13
cortex 11, 167
- Motor- 13, 27, 29ff
- Somato-sensorischer 13, 14
Counterpropagation-Netze 142
cross-over 246, 249
D
Datenkompression 118
Dendriten 32
Differenzialgleichungen, zeitliche 39
Diskriminierungsfunktion 45
Display 267

E
Eigenfrequenz 78
Eigenfunktion 80
Eigenwert 75, 78ff, 125, 170ff
Eigenvektor 75, 78ff, 118, 130, 170
Energie 167, 179ff, 199, 216, 219
Entropie 52, 58, 207, 212
Erwartungswert 62
Eulerzahl 85
Evolutionäre Algorithmen 242ff

F
Faktorenanalyse 77
Feed-forward Netze 52ff, 107
Feed-forward Assoziativspeicher 90, 230
Feigenbaum-Zahl 211
Fermi-Funktion → Ausgabefunktion
Fitness 242, 246, 248, 251, 253
Fixpunkt 217
formales Neuron 35ff, 43
Fovea 16ff
G
Gamba-Perzeptron 51, 86
Gatter
- UND 37
- ODER 37
- XOR 46ff
Gehör 23ff
Gene 246, 250
genetische Algorithmen 245ff
genetische Drift 252
genetische Operatoren 245, 249
Genotyp 246, 254
Genpool 245
Glass-Mackey Gleichung 226
Glauber-Dynamik 181
globale Daten 255
globales Optimum → Optimum
Gradientensuche 62ff
graphische Programmierung 268ff
Großmutterneuron 18
Gütefunktion 242 → Zielfunktion
H
Hamming-Abstand 100, 189
Handlungsreisender 111ff, 183ff, 203ff, 219, 243
Hauptachsentransformation 77
Hauptkomponentenanalyse 76ff, 172
hidden units 107, 144ff, 205
höhere Synapsen 38, 47, 70, 73, 208
Ho-Kashyap Verfahren 196
homogene Markov-Kette 202
Hopfield-Modell 175ff
Hypercube-Netzwerk 258

I
Information 55ff, 79, 105, 118, 125, 141, 148, 207, 212
- maximale 57, 77, 79, 118, 141
- subjektive 207
- -verteilung 148
Invariante Mustererkennung 60ff, 102
Inversion 249
Iterierte Funktionensysteme (IFS) 220ff

K
Kanal 57
Kanalkapazität 57
Kanerva-Speicher 104, 194ff
Kantenmutation 244
Karhunen-Loeve Transf. 80, 118, 121, 123
Klassengrenze 44
Klassenprototyp 66, 95, 137, 154, 177, 196
Klassifizierung 44, 98ff, 125
Kleinhirn 13, 29, 233
Kontext 234ff, 240ff
Kontrollsignal 136, 158, 230
Konvolutionsspeicher 105
Korrelationsmatrix 75, 79
Korrelationsspeicher 92ff
Kostenfunktion → Zielfunktion
Kovarianzmatrix 79, 115, 172
Kühlschema 202, 218, 245, 252
Kurzzeitspeicher 156, 176, 191

L
Langzeitspeicher 159, 176
laterale Inhibition 119, 124, 133ff
Lernen 60
Lernrate 63, 64, 130
Lernregel 63
- Anti-Hebb 124
- Boltzmann 208
- Competitive Learning 129
- Delta 88, 109
- generalisierte Hebb 121
- Hebb 73
- inverse Kinematik 147
- Kohonen map 138
- Oja 75, 119
- Perzeptron 83ff
- Stoch. Approx. 65
- Widrow-Hoff 88
lineare Schichten 54
lineare Separierung 44, 46, 81ff, 84
Ljapunov-Funktion 168, 181
logistische Abbildung 50, 209
long term memory → Langzeitspeicher

M
mean-field Theorie 190
Metropolis-Algorithmus 199ff
Mexikanerhut-Funktion 18, 133, 138
Monte-Carlo Verfahren 199, 245
Motor-Cortex → cortex
Motorik 27ff
Multi-Layer Perzeptron 86
Multiprozessoren 187, 257, 260
Muskeln 27ff
Muskelsensoren 28
Muskelspindeln 28
Muster 44
Musterergänzung 100, 102, 178, 189
Mustererkennung 44, 60ff
Mustergenerator 267
Mutation 243, 244, 246, 250
Mutations-Selektions Strategie 242ff

N
Nachbarschaftsabhäng. Parallelarbeit 261
Nachbarschaftserhalt. Abbildung 132ff
Neocognitron 70ff
NETtalk 112
Netzwerkbeschreibung 268
Netzwerkeditor 268
Neuronen 31ff, 35
neuronale Chips 259
neuronales Netz 38
O
ON/OFF-Zellen 18ff, 157
On-Line/Off-Line Algorithmus 110, 267
optimale Abbildungen 141
optimale Informationsverteilung 148
optimale Schichten 57
Optimierung 242
Optimum, lokales u. globales 63, 116, 243
Outstar 230ff
Outstar avalanche 233

P
Parallelarbeit 98ff, 261, 263
Partitionierung 261
Perceptron 81ff
Periodenfenster 211
Phasenraum 226
Plastizitäts-Stabilitäts Dilemma 154
Positionsmutation 244
Prädikat 82
principal components 76
principal variables 76
Prototyp → Klassenprototyp
Pseudo-Inverse 198
Pyramidenzellen 31ff
Q
Quetschfunktion 42, 52

R
Reflexe 29
Reflexiver Speicher 192
Rekombination 250
Relationale Datenbank 100
Relaxation 205
Reproduktionsplan 248
Reset-Signal 136, 156, 165
Resonanz 78, 110, 173
- adaptive 155, 158
Retina 15ff, 81
Rezeptive Felder 18ff
Risikofunktion → Zielfunktion
Robotersteuerung 31, 144
Rückkopplung 53, 167ff, 234
Rundreise → Handlungsreisender

S
sampling → Abtastung
Schema 247
Sehrinde 13
Selbstorganisation 140
short-term memory → Kurzzeitspeicher
Sigma-Pi unit 47, 52
Sigma unit 37, 51
sigmoide Funktion 42, 49
simulated annealing 201ff, 218, 245
Simulation 265ff
Simulationskontrolle 266
Single-Prozessor 257
Somato-sensorischer cortex → cortex
Somatopie 29ff
spärliche Kodierung 103, 166, 194ff
Speicherkapazität 102, 177, 188, 190
Spikes 33ff
Spingläser 179
Spracherkennung 143, 174
squashing function → Quetschfunktion
Stammhirn 13
stochastische Approximation 64, 138
stochastisches Lernen 65
stochastische Mustererkennung → Mustererkennung
Stützmotorik 29
Straffunktion → Zielfunktion
subspace-Netzwerk 118ff
Synapsen 33, 36
Systemzustand → Zustand

T
Tanzsaal-Architektur 256
Tektorialmembran 24ff
topologie-erhaltende Abbildung 132ff
Transferfunktion 38
Transinformation 58

U
Übertragungsrate 57, 89

V
virtuelle Maschinen 268
Vorzimmer-Architektur 256

W
Wanderwelle 26
winner-take-all 128, 137, 157

Z
Zeitmodellierung 39, 74
Zeitfenster 226
Zeitreihen 225ff
Zeitsequenzen 225ff
Zeitverzögerung 237ff
Zielfunktion 60, 168, 176, 242
- Adaline 88
- Backpropagation 108
- Perzeptron 84
Zustand 170, 173, 176, 179
Leitfäden und Monographien der Informatik
Bolch: Leistungsbewertung von Rechensystemen mittels analytischer Warteschlangenmodelle 320 Seiten. Kart. DM 44,-
Brauer: Automatentheorie 493 Seiten. Geb. DM 62,-
Brause: Neuronale Netze 291 Seiten. Kart. DM 39,80
Dal Cin: Grundlagen der systemnahen Programmierung 221 Seiten. Kart. DM 36,-
Doberkat/Fox: Software Prototyping mit SETL 227 Seiten. Kart. DM 38,-
Ehrich/Gogolla/Lipeck: Algebraische Spezifikation abstrakter Datentypen 246 Seiten. Kart. DM 38,-
Engeler/Läuchli: Berechnungstheorie für Informatiker 120 Seiten. Kart. DM 26,-
Erhard: Parallelrechnerstrukturen X, 251 Seiten. Kart. DM 39,80
Heinemann/Weihrauch: Logik für Informatiker VIII, 239 Seiten. Kart. DM 36,-
Hentschke: Grundzüge der Digitaltechnik 247 Seiten. Kart. DM 36,-
Hotz: Einführung in die Informatik 548 Seiten. Kart. DM 52,-
Kiyek/Schwarz: Mathematik für Informatiker 1 307 Seiten. Kart. DM 39,80
Kiyek/Schwarz: Mathematik für Informatiker 2 X, 460 Seiten. Kart. DM 54,-
Klaeren: Vom Problem zum Programm 228 Seiten. Kart. DM 32,-
Kolla/Molitor/Osthof: Einführung in den VLSI-Entwurf 352 Seiten. Kart. DM 48,-
Loeckx/Mehlhorn/Wilhelm: Grundlagen der Programmiersprachen 448 Seiten. Kart. DM 48,-
Mathar/Pfeifer: Stochastik für Informatiker VIII, 359 Seiten. Kart. DM 48,-
B. G. Teubner Stuttgart
Mehlhorn: Datenstrukturen und effiziente Algorithmen Band 1: Sortieren und Suchen
2. Aufl. 317 Seiten. Geb. DM 49,80
Messerschmidt: Linguistische Datenverarbeitung mit Comskee 207 Seiten. Kart. DM 36,-
NiemannlBunke: Künstliche Intelligenz in Bild- und Sprachanalyse 256 Seiten. Kart. DM 38,-
Pflug: Stochastische Modelle in der Informatik 272 Seiten. Kart. DM 39,80
Post: Entwurf und Technologie hochintegrierter Schaltungen 247 Seiten. Kart. DM 38,-
Rammig: Systematischer Entwurf digitaler Systeme 353 Seiten. Kart. DM 46,-
Reischuk: Einführung in die Komplexitätstheorie XVII, 413 Seiten. Kart. DM 52,-
Richter: Betriebssysteme 2. Aufl. 303 Seiten. Kart. DM 39,80
Richter: Prinzipien der Künstlichen Intelligenz 359 Seiten. Kart. DM 46,-
Starke: Analyse von Petri-Netz-Modellen 253 Seiten. Kart. DM 42,-
Weck: Prinzipien und Realisierung von Betriebssystemen 3. Aufl. 306 Seiten. Kart. DM 42,-
Wegener: Effiziente Algorithmen für grundlegende Funktionen 270 Seiten. Kart. DM 39,80
Wirth: Algorithmen und Datenstrukturen Pascal-Version
3. Aufl. 320 Seiten. Kart. DM 42,-
Wirth: Algorithmen und Datenstrukturen mit Modula-2 4. Aufl. 299 Seiten. Kart. DM 42,-
Wojtkowiak: Test und Testbarkeit digitaler Schaltungen 226 Seiten. Kart. DM 36,-
Preisänderungen vorbehalten