Neural Network Frequently Asked Questions (FAQ)

The information displayed here is part of the FAQ monthly posted to comp.ai.neural-nets
Last modification: 12-01-1994

  1. What is the comp.ai.neural-nets newsgroup for ? How should it be used ?
  2. What is a neural network (NN) ?
  3. What can you do with a Neural Network and what not ?
  4. Who is concerned with Neural Networks ?
  5. What does 'backprop' mean ?
  6. How many learning methods for NNs exist ? Which ?
  7. What about Genetic Algorithms ?
  8. What about Fuzzy Logic ?
  9. Good introductory literature about Neural Networks ?
  10. Any journals and magazines about Neural Networks ?
  11. The most important conferences concerned with Neural Networks ?
  12. Neural Network Associations ?
  13. Other sources of information about NNs ?
  14. Freely available software packages for NN simulation ?
  15. Commercial software packages for NN simulation ?
  16. Neural Network hardware ?
  17. Databases for experimentation with NNs ?
  18. Acknowledgements

A1) What is the comp.ai.neural-nets newsgroup for ?

The newsgroup comp.ai.neural-nets is intended as a forum for people who want to use or explore the capabilities of Artificial Neural Networks or Neural-Network-like structures.
There should be the following types of articles in this newsgroup:
  1. Requests.
    Requests are articles of the form
    "I am looking for X"
    where X is something public like a book, an article, or a piece of software. The most important thing about such a request is to be as specific as possible!
    If multiple different answers can be expected, the person making the request should be prepared to summarize the answers he/she receives and should announce the intention to do so with a phrase like
    "Please reply by email, I'll summarize to the group"
    at the end of the posting.
    The Subject line of the posting should then be something like
    "Request: X"
  2. Questions.
    As opposed to requests, questions ask for a larger piece of information or a more or less detailed explanation of something. To avoid lots of redundant traffic it is important that the poster provides with the question all information s/he already has about the subject and states the actual question as precisely and narrowly as possible.
    The poster should be prepared to summarize the answers he/she receives and should announce the intention to do so with a phrase like
    "Please reply by email, I'll summarize to the group"
    at the end of the posting.
    The Subject line of the posting should be something like
    "Question: this-and-that"
    or have the form of a question (i.e., end with a question mark).
  3. Answers
    These are reactions to questions or requests. As a rule of thumb articles of type "answer" should be rare. Ideally, in most cases either the answer is too specific to be of general interest (and should thus be e-mailed to the poster) or a summary was announced with the question or request (and answers should thus be e-mailed to the poster).
    The subject lines of answers are automatically adjusted by the news software. Note that sometimes longer threads of discussion evolve from an answer to a question or request. In this case posters should change the subject line suitably as soon as the topic goes too far away from the one announced in the original subject line. You can still carry along the old subject in parentheses in the form
    "Subject: <...new subject...> (was: <...old subject...>)
  4. Summaries
    In all cases of requests or questions where the answers can be assumed to be of some general interest, the poster of the request or question shall summarize the answers he/she received. Such a summary should be announced in the original posting of the question or request with a phrase like
    "Please answer by email, I'll summarize"
    In such a case, people who answer a question should NOT post their answer to the newsgroup but instead mail it to the poster of the question, who collects and reviews the answers. About 5 to 20 days after the original posting, the poster should compile a summary of the answers and post it to the newsgroup.
    Some care should be invested into a summary:
    1. simple concatenation of all the answers is not enough: instead, redundancies, irrelevancies, verbosities, and errors should be filtered out (as well as possible)
    2. the answers should be separated clearly
    3. the contributors of the individual answers should be identifiable (unless they requested to remain anonymous [yes, that happens])
    4. the summary should start with the "quintessence" of the answers, as seen by the original poster
    5. A summary should, when posted, clearly be indicated to be one by giving it a Subject line starting with "SUMMARY:"
    Note that a good summary is pure gold for the rest of the newsgroup community, so summary work will be most appreciated by all of us. (Good summaries are more valuable than any moderator ! :-> )
  5. Announcements
    Some articles never need any public reaction. These are called announcements (for instance for a workshop, conference or the availability of some technical report or software system).
    Announcements should be clearly indicated to be such by giving them a subject line of the form
    "Announcement: this-and-that"
  6. Reports
    Sometimes people spontaneously want to report something to the newsgroup. This might be special experiences with some software, results of one's own experiments or conceptual work, or especially interesting information from somewhere else.
    Reports should be clearly indicated to be such by giving them a subject line of the form
    "Report: this-and-that"
  7. Discussions
    An especially valuable feature of Usenet is of course the possibility of discussing a certain topic with hundreds of potential participants. All traffic in the newsgroup that can not be subsumed under one of the above categories should belong to a discussion.
    If somebody explicitly wants to start a discussion, he/she can do so by giving the posting a subject line of the form
    "Subject: Discussion: this-and-that"
    It is quite difficult to keep a discussion from drifting into chaos, but, unfortunately, as many other newsgroups show, there seems to be no secure way to avoid this. On the other hand, comp.ai.neural-nets has not had many problems with this effect in the past, so let's just go and hope... :->

A2) What is a neural network (NN) ?

First of all, when we are talking about a neural network, we *should* really say "artificial neural network" (ANN), because that is what we mean most of the time. Biological neural networks are much more complicated in their elementary structures than the mathematical models we use for ANNs.
A vague description is as follows:
An ANN is a network of many very simple processors ("units"), each possibly having a (small amount of) local memory. The units are connected by unidirectional communication channels ("connections"), which carry numeric (as opposed to symbolic) data. The units operate only on their local data and on the inputs they receive via the connections.
The design motivation is what distinguishes neural networks from other mathematical techniques:
A neural network is a processing device, either an algorithm, or actual hardware, whose design was motivated by the design and functioning of human brains and components thereof.
Most neural networks have some sort of "training" rule whereby the weights of connections are adjusted on the basis of presented patterns. In other words, neural networks "learn" from examples, just like children learn to recognize dogs from examples of dogs, and exhibit some structural capability for generalization.
Neural networks normally have great potential for parallelism, since the computations of the components are independent of each other.
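To make the "simple processor" idea concrete, here is a minimal sketch in C of a single unit computing its activation (the function name and calling convention are our own invention for illustration, not taken from any particular simulator):

    #include <math.h>

    /* One unit: squashes the weighted sum of its inputs through a
       logistic sigmoid. The weights and the bias are the unit's
       (small) local memory. */
    double unit_activation(const double *inputs, const double *weights,
                           double bias, int n)
    {
        double sum = bias;
        int i;
        for (i = 0; i < n; i++)
            sum += weights[i] * inputs[i];
        return 1.0 / (1.0 + exp(-sum));
    }

A whole network is then nothing more than many such units feeding their outputs into each other's input lists.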

A3) What can you do with a Neural Network and what not ?

In principle, NNs can compute any computable function, i.e. they can do everything a normal digital computer can do. In particular, anything that can be represented as a mapping between vector spaces can be approximated to arbitrary precision by feedforward NNs (which is the most often used type).
In practice, NNs are especially useful for mapping problems which are tolerant of a high error rate, have lots of example data available, but to which hard and fast rules can not easily be applied. NNs are, at least today, difficult to apply successfully to problems that concern manipulation of symbols and memory.

A4) Who is concerned with Neural Networks ?

Neural Networks are interesting for quite a lot of very dissimilar people: computer scientists, engineers, cognitive scientists, neurophysiologists, physicists, biologists, and philosophers, among others, each with their own reasons for studying them.

A5) What does 'backprop' mean ?

It is an abbreviation for 'backpropagation of error', which is the most widely used learning method for neural networks today. Although it has many disadvantages, which could be summarized in the sentence
"You hardly know what you are actually doing when using backpropagation" :-)
it has had pretty much success in practical applications and is relatively easy to apply.
It is used for training layered (i.e., nodes are grouped in layers) feedforward (i.e., the arcs joining nodes are unidirectional, and there are no cycles) nets.
Back-propagation needs a teacher that knows the correct output for any input ("supervised learning") and uses gradient descent on the error (as provided by the teacher) to train the weights. The activation function is (usually) a sigmoidal (i.e., bounded above and below, but differentiable) function of a weighted sum of the node's inputs.
The use of a gradient descent algorithm to train its weights makes it slow to train; but being a feedforward algorithm, it is quite rapid during the recall phase.
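To make the training rule concrete, here is a sketch in C of the per-pattern gradient descent update for the weights of one output unit with a logistic activation function (a bare-bones illustration under our own naming conventions, not the code of any particular backprop package):

    /* Gradient descent update for the weights of one output unit.
       Assumes the logistic activation y = 1/(1+exp(-net)), whose
       derivative is y*(1-y), and squared error E = (target-y)^2/2.
       The target value is supplied by the teacher. */
    void update_output_weights(double *weights, const double *inputs,
                               double output, double target,
                               int n, double learning_rate)
    {
        double delta = (target - output) * output * (1.0 - output);
        int i;
        for (i = 0; i < n; i++)
            weights[i] += learning_rate * delta * inputs[i];
    }

For hidden units the same update is used, except that each unit's delta is computed from the deltas of the units in the layer above; this backward flow of the error terms is what gives the method its name.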
Literature:
Rumelhart, D. E. and McClelland, J. L. (1986): Parallel Distributed Processing: Explorations in the Microstructure of Cognition (volume 1, pp 318-362). The MIT Press.
(this is the classic one) or one of the dozens of other books or articles on backpropagation.

A6) How many learning methods for NNs exist ? Which ?

There are many, many learning methods for NNs by now. Nobody can know exactly how many. New ones (or at least variations of existing ones) are invented continuously. Below is a collection of some of the best known methods, not claiming to be complete.
The main categorization of these methods is the distinction between supervised and unsupervised learning. Many of these learning methods are closely connected with a certain (class of) network topology.
Now here is the list, just giving some names:
  1. UNSUPERVISED LEARNING (i.e. without a "teacher"):
    • Feedback Nets:
      • Additive Grossberg (AG)
      • Shunting Grossberg (SG)
      • Binary Adaptive Resonance Theory (ART1)
      • Analog Adaptive Resonance Theory (ART2, ART2a)
      • Discrete Hopfield (DH)
      • Continuous Hopfield (CH)
      • Discrete Bidirectional Associative Memory (BAM)
      • Temporal Associative Memory (TAM)
      • Adaptive Bidirectional Associative Memory (ABAM)
      • Kohonen Self-organizing Map (SOM)
      • Kohonen Topology-preserving Map (TPM)
    • Feedforward-only Nets:
      • Learning Matrix (LM)
      • Driver-Reinforcement Learning (DR)
      • Linear Associative Memory (LAM)
      • Optimal Linear Associative Memory (OLAM)
      • Sparse Distributed Associative Memory (SDM)
      • Fuzzy Associative Memory (FAM)
      • Counterpropagation (CPN)
  2. SUPERVISED LEARNING (i.e. with a "teacher"):
    • Feedback Nets:
      • Brain-State-in-a-Box (BSB)
      • Fuzzy Cognitive Map (FCM)
      • Boltzmann Machine (BM)
      • Mean Field Annealing (MFT)
      • Recurrent Cascade Correlation (RCC)
      • Learning Vector Quantization (LVQ)
    • Feedforward-only Nets:
      • Perceptron
      • Adaline, Madaline
      • Backpropagation (BP)
      • Cauchy Machine (CM)
      • Adaptive Heuristic Critic (AHC)
      • Time Delay Neural Network (TDNN)
      • Associative Reward Penalty (ARP)
      • Avalanche Matched Filter (AMF)
      • Backpercolation (Perc)
      • Artmap
      • Adaptive Logic Network (ALN)
      • Cascade Correlation (CasCor)

A7) What about Genetic Algorithms ?

There are a number of definitions of GA (Genetic Algorithm). A possible one is
A GA is an optimization program
that starts with some encoded procedure, (Creation of Life)
mutates it stochastically, (Get cancer or so)
and uses a selection process (Darwinism)
to prefer the mutants with high fitness
and perhaps a recombination process (Make babies)
to combine properties of (preferably) the successful mutants.
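As a toy illustration of this definition, here is a sketch in C of such an optimization loop over a population of bit strings (mutation and selection only; recombination is omitted for brevity, and the fitness function is deliberately left to the user, so this is our own miniature example rather than any standard GA library):

    #include <stdlib.h>

    #define POP 20   /* population size */
    #define LEN 32   /* genome length in bits */

    extern double fitness(const char *genome);   /* user-supplied */

    void toy_ga(char pop[POP][LEN], int generations)
    {
        int g, i, j;
        for (g = 0; g < generations; g++)
            for (i = 0; i < POP; i++) {
                char mutant[LEN];
                for (j = 0; j < LEN; j++)      /* stochastic mutation */
                    mutant[j] = (rand() % 100 == 0) ? !pop[i][j]
                                                    : pop[i][j];
                if (fitness(mutant) > fitness(pop[i]))  /* selection */
                    for (j = 0; j < LEN; j++)
                        pop[i][j] = mutant[j];
            }
    }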
There is a newsgroup that is dedicated to the field of evolutionary computation called comp.ai.genetic. It has a detailed FAQ posting which, for instance, explains the terms "Genetic Algorithm", "Evolutionary Programming", "Evolution Strategy", "Classifier System", and "Genetic Programming". That FAQ also contains lots of pointers to relevant literature, software, other sources of information, et cetera et cetera. Please see the comp.ai.genetic FAQ for further information.
That FAQ also includes a nice glossary, unfortunately only in plain text form.

A8) What about Fuzzy Logic ?

Fuzzy Logic is an area of research based on the work of L.A. Zadeh. It is a departure from classical two-valued sets and logic; it uses "soft" linguistic (e.g. large, hot, tall) system variables and a continuous range of truth values in the interval [0,1], rather than strict binary (True or False) decisions and assignments.
Fuzzy logic is used where a system is difficult to model exactly (but an inexact model is available), is controlled by a human operator or expert, or where ambiguity or vagueness is common. A typical fuzzy system consists of a rule base, membership functions, and an inference procedure.
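As a small illustration of membership functions, fuzzy sets are often described by simple shapes such as triangles; the following C sketch (with arbitrary example values of our own, not taken from any fuzzy package) turns a crisp measurement into a truth value in [0,1]:

    /* Degree to which x belongs to a fuzzy set described by a
       triangle with feet at a and c and peak at b (a < b < c). */
    double triangular(double x, double a, double b, double c)
    {
        if (x <= a || x >= c)
            return 0.0;
        return (x < b) ? (x - a) / (b - a) : (c - x) / (c - b);
    }

    /* E.g., "the room is hot" might be triangular(temp, 25, 35, 45);
       at temp = 30 this yields the truth value 0.5. */

A rule such as "IF temperature is hot THEN fan speed is high" then fires to the degree given by the membership value, and the inference procedure combines all fired rules into a crisp output.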
Most Fuzzy Logic discussion takes place in the newsgroup comp.ai.fuzzy, but there is also some work (and discussion) about combining fuzzy logic with Neural Network approaches in comp.ai.neural-nets.
There is also a FAQ for comp.ai.fuzzy, available from ftp://rtfm.mit.edu/pub/usenet-by-group/comp.ai.fuzzy/.

A9) Good introductory literature about Neural Networks ?

  1. The best:
    • Hecht-Nielsen, R. (1990). Neurocomputing. Addison Wesley.
      Comments: "A good book", "comprises a nice historical overview and a chapter about NN hardware. Well structured prose. Makes important concepts clear."
    • Hertz, J., Krogh, A., and Palmer, R. (1991). Introduction to the Theory of Neural Computation. Addison-Wesley: Redwood City, California. ISBN 0-201-50395-6 (hardbound) and 0-201-51560-1 (paperbound)
      Comments: "My first impression is that this one is by far the best book on the topic. And it's below $30 for the paperback."; "Well written, theoretical (but not overwhelming)"; It provides a good balance of model development, computational algorithms, and applications. The mathematical derivations are especially well done"; "Nice mathematical analysis on the mechanism of different learning algorithms"; "It is NOT for mathematical beginner. If you don't have a good grasp of higher level math, this book can be really tough to get through."

  2. Books for the beginner:
    • Aleksander, I. and Morton, H. (1990). An Introduction to Neural Computing. Chapman and Hall. (ISBN 0-412-37780-2).
      Comments: "This book seems to be intended for the first year of university education."
    • Beale, R. and Jackson, T. (1990). Neural Computing, an Introduction. Adam Hilger, IOP Publishing Ltd : Bristol. (ISBN 0-85274-262-2).
      Comments: "It's clearly written. Lots of hints as to how to get the adaptive models covered to work (not always well explained in the original sources). Consistent mathematical terminology. Covers perceptrons, error-backpropagation, Kohonen self-org model, Hopfield type models, ART, and associative memories."
    • Dayhoff, J. E. (1990). Neural Network Architectures: An Introduction. Van Nostrand Reinhold: New York.
      Comments: "Like Wasserman's book, Dayhoff's book is also very easy to understand".
    • McClelland, J. L. and Rumelhart, D. E. (1988). Explorations in Parallel Distributed Processing: Computational Models of Cognition and Perception (software manual). The MIT Press.
      Comments: "Written in a tutorial style, and includes 2 diskettes of NN simulation programs that can be compiled on MS-DOS or Unix (and they do too !)"; "The programs are pretty reasonable as an introduction to some of the things that NNs can do."; "There are two editions of this book. One comes with disks for the IBM PC, the other comes with disks for the Macintosh".
    • McCord Nelson, M. and Illingworth, W.T. (1990). A Practical Guide to Neural Nets. Addison-Wesley Publishing Company, Inc. (ISBN 0-201-52376-0).
      Comments: "No formulas at all( ==> no good)"; "It does not have much detailed model development (very few equations), but it does present many areas of application. It includes a chapter on current areas of research. A variety of commercial applications is discussed in chapter 1. It also includes a program diskette with a fancy graphical interface (unlike the PDP diskette)".
    • Orchard, G.A. & Phillips, W.A. (1991). Neural Computation: A Beginner's Guide. Lawrence Erlbaum Associates: London.
      Comments: "Short user-friendly introduction to the area, with a non-technical flavour. Apparently accompanies a software package, but I haven't seen that yet".
    • Wasserman, P. D. (1989). Neural Computing: Theory & Practice. Van Nostrand Reinhold: New York. (ISBN 0-442-20743-3).
      Comments: "Wasserman flatly enumerates some common architectures from an engineer's perspective ('how it works') without ever addressing the underlying fundamentals ('why it works') - important basic concepts such as clustering, principal components or gradient descent are not treated. It's also full of errors, and unhelpful diagrams drawn with what appears to be PCB board layout software from the '70s. For anyone who wants to do active research in the field I consider it quite inadequate"; "Okay, but too shallow"; "Quite easy to understand"; "The best bedtime reading for Neural Networks. I have given this book to numerous collegues who want to know NN basics, but who never plan to implement anything. An excellent book to give your manager."

  3. The classics:
    • Kohonen, T. (1984). Self-organization and Associative Memory. Springer-Verlag: New York. (2nd Edition: 1988; 3rd edition: 1989).
      Comments: "The section on Pattern mathematics is excellent."
    • Rumelhart, D. E. and McClelland, J. L. (1986). Parallel Distributed Processing: Explorations in the Microstructure of Cognition (volumes 1 & 2). The MIT Press.
      Comments: "As a computer scientist I found the two Rumelhart and McClelland books really heavy going and definitely not the sort of thing to read if you are a beginner."; "It's quite readable, and affordable (about $65 for both volumes)."; "THE Connectionist bible.".

  4. Introductory journal articles:
    • Hinton, G. E. (1989). Connectionist learning procedures. Artificial Intelligence, Vol. 40, pp. 185--234.
      Comments: "One of the better neural networks overview papers, although the distinction between network topology and learning algorithm is not always very clear. Could very well be used as an introduction to neural networks."
    • Knight, K. (1990). Connectionist Ideas and Algorithms. Communications of the ACM. November 1990. Vol.33 nr.11, pp 59-74.
      Comments: "A good article, and for most people it is easy to find a copy of this journal."
    • Kohonen, T. (1988). An Introduction to Neural Computing. Neural Networks, vol. 1, no. 1. pp. 3-16.
      Comments: "A general review".

  5. Not-quite-so-introductory literature:
    • Anderson, J. A. and Rosenfeld, E. (Eds). (1988). Neurocomputing: Foundations of Research. The MIT Press: Cambridge, MA.
      Comments: "An expensive book, but excellent for reference. It is a collection of reprints of most of the major papers in the field."
    • Anderson, J. A., Pellionisz, A. and Rosenfeld, E. (Eds). (1990). Neurocomputing 2: Directions for Research. The MIT Press: Cambridge, MA.
      Comments: "The sequel to their well-known Neurocomputing book."
    • Caudill, M. and Butler, C. (1990). Naturally Intelligent Systems. MIT Press: Cambridge, Massachusetts. (ISBN 0-262-03156-6).
      Comments: "I guess one of the best books I read"; "May not be suited for people who want to do some research in the area".
    • Khanna, T. (1990). Foundations of Neural Networks. Addison-Wesley: New York.
      Comments: "Not so bad (with a page of erroneous formulas (if I remember well), and #hidden layers isn't well described)."; "Khanna's intention in writing his book with math analysis should be commended but he made several mistakes in the math part".
    • Levine, D. S. (1990). Introduction to Neural and Cognitive Modeling. Lawrence Erlbaum: Hillsdale, N.J.
      Comments: "Highly recommended".
    • Lippmann, R. P. (April 1987). An introduction to computing with neural nets. IEEE Acoustics, Speech, and Signal Processing Magazine. vol. 4, no. 2, pp 4-22.
      Comments: "Much acclaimed as an overview of neural networks, but rather inaccurate on several points. The categorization into binary and continuous-valued input neural networks is rather arbitrary, and may be confusing for the inexperienced reader. Not all networks discussed are of equal importance."
    • Maren, A., Harston, C. and Pap, R., (1990). Handbook of Neural Computing Applications. Academic Press. ISBN: 0-12-471260-6. (451 pages).
      Comments: "They cover a broad area"; "Introductory with suggested applications implementation".
    • Pao, Y. H. (1989). Adaptive Pattern Recognition and Neural Networks. Addison-Wesley Publishing Company, Inc. (ISBN 0-201-12584-6).
      Comments: "An excellent book that ties together classical approaches to pattern recognition with Neural Nets. Most other NN books do not even mention conventional approaches."
    • Rumelhart, D. E., Hinton, G. E. and Williams, R. J. (1986). Learning representations by back-propagating errors. Nature, vol 323 (9 October), pp. 533-536.
      Comments: "Gives a very good potted explanation of backprop NN's. It gives sufficient detail to write your own NN simulation."
    • Simpson, P. K. (1990). Artificial Neural Systems: Foundations, Paradigms, Applications and Implementations. Pergamon Press: New York.
      Comments: "Contains a very useful 37 page bibliography. A large number of paradigms are presented. On the negative side the book is very shallow. Best used as a complement to other books".
    • Zeidenberg, M. (1990). Neural Networks in Artificial Intelligence. Ellis Horwood, Ltd., Chichester.
      Comments: "Gives the AI point of view".
    • Zornetzer, S. F., Davis, J. L. and Lau, C. (1990). An Introduction to Neural and Electronic Networks. Academic Press. (ISBN 0-12-781881-2).
      Comments: "Covers quite a broad range of topics (collection of articles/papers )."; "Provides a primer-like introduction and overview for a broad audience, and employs a strong interdisciplinary emphasis".

A10) Any journals and magazines about Neural Networks ?

Dedicated Neural Network Journals:
Title: Neural Networks
Publish: Pergamon Press
Address: Pergamon Journals Inc., Fairview Park, Elmsford, New York 10523, USA and Pergamon Journals Ltd. Headington Hill Hall, Oxford OX3, 0BW, England
Freq.: 6 issues/year (vol. 1 in 1988)
Cost/Yr: Free with INNS membership ($45?), Individual $65, Institution $175
ISSN #: 0893-6080
Remark: Official Journal of International Neural Network Society (INNS). Contains Original Contributions, Invited Review Articles, Letters to Editor, Invited Book Reviews, Editorials, Announcements and INNS News, Software Surveys. This is probably the most popular NN journal. (Note: Remarks supplied by Mike Plonski "plonski@aero.org")
Title: Neural Computation
Publish: MIT Press
Address: MIT Press Journals, 55 Hayward Street, Cambridge, MA 02142-9949, USA, Phone: (617) 253-2889
Freq.: Quarterly (vol. 1 in 1989)
Cost/Yr: Individual $45, Institution $90, Students $35; Add $9 Outside USA
ISSN #: 0899-7667
Remark: Combination of Reviews (10,000 words), Views (4,000 words) and Letters (2,000 words). I have found this journal to be of outstanding quality. (Note: Remarks supplied by Mike Plonski "plonski@aero.org")
Title: IEEE Transactions on Neural Networks
Publish: Institute of Electrical and Electronics Engineers (IEEE)
Address: IEEE Service Center, 445 Hoes Lane, P.O. Box 1331, Piscataway, NJ, 08855-1331 USA. Tel: (201) 981-0060
Cost/Yr: $10 for Members belonging to participating IEEE societies
Freq.: Quarterly (vol. 1 in March 1990)
Remark: Devoted to the science and technology of neural networks which disclose significant technical knowledge, exploratory developments and applications of neural networks from biology to software to hardware. Emphasis is on artificial neural networks. Specific aspects include self organizing systems, neurobiological connections, network dynamics and architecture, speech recognition, electronic and photonic implementation, robotics and controls. Includes Letters concerning new research results. (Note: Remarks are from journal announcement)
Title: International Journal of Neural Systems
Publish: World Scientific Publishing
Address: USA: World Scientific Publishing Co., 687 Hartwell Street, Teaneck, NJ 07666. Tel: (201) 837-8858; Europe: World Scientific Publishing Co. Pte. Ltd., 73 Lynton Mead, Totteridge, London N20-8DH, England. Tel: (01) 4462461; Other: World Scientific Publishing Co. Pte. Ltd., Farrer Road, P.O. Box 128, Singapore 9128. Tel: 2786188
Freq.: Quarterly (Vol. 1 in 1990?)
Cost/Yr: Individual $42, Institution $88 (plus $9-$17 for postage)
ISSN #: 0129-0657 (IJNS)
Remark: The International Journal of Neural Systems is a quarterly journal which covers information processing in natural and artificial neural systems. It publishes original contributions on all aspects of this broad subject which involves physics, biology, psychology, computer science and engineering. Contributions include research papers, reviews and short communications. The journal presents a fresh undogmatic attitude towards this multidisciplinary field with the aim to be a forum for novel ideas and improved understanding of collective and cooperative phenomena with computational capabilities. (Note: Remarks supplied by B. Lautrup (editor), "LAUTRUP%nbivax.nbi.dk@CUNYVM.CUNY.EDU" ) Review is reported to be very slow.
Title: Neural Network News
Publish: AIWeek Inc.
Address: Neural Network News, 2555 Cumberland Parkway, Suite 299, Atlanta, GA 30339 USA. Tel: (404) 434-2187
Freq.: Monthly (beginning September 1989)
Cost/Yr: USA and Canada $249, Elsewhere $299
Remark: Commercial Newsletter
Title: Network: Computation in Neural Systems
Publish: IOP Publishing Ltd
Address: Europe: IOP Publishing Ltd, Techno House, Redcliffe Way, Bristol BS1 6NX, UK; IN USA: American Institute of Physics, Subscriber Services 500 Sunnyside Blvd., Woodbury, NY 11797-2999
Freq.: Quarterly (1st issue 1990)
Cost/Yr: USA: $180, Europe: 110 pounds
Remark: Description: "a forum for integrating theoretical and experimental findings across relevant interdisciplinary boundaries." Contents: Submitted articles are reviewed by two technical referees for the paper's interdisciplinary format and accessibility. Also Viewpoints and Reviews commissioned by the editors, abstracts (with reviews) of articles published in other journals, and book reviews. Comment: While the price discourages me (my comments are based upon a free sample copy), I think that the journal succeeds very well. The highest density of interesting articles I have found in any journal. (Note: Remarks supplied by brandt kehoe "kehoe@csufres.CSUFresno.EDU")
Title: Connection Science: Journal of Neural Computing, Artificial Intelligence and Cognitive Research
Publish: Carfax Publishing
Address: Europe: Carfax Publishing Company, P.O. Box 25, Abingdon, Oxfordshire OX14 3UE, UK, E-mail: Carfax@ibmpcug.co.uk. USA: Carfax Publishing Company, 85 Ash Street, Hopkinton, MA 01748
Freq.: Quarterly (vol. 1 in 1989)
Cost/Yr: Individual $82, Institution $184, Institution (U.K.) 74 pounds
Title: International Journal of Neural Networks
Publish: Learned Information
Freq.: Quarterly (vol. 1 in 1989)
Cost/Yr: 90 pounds
ISSN #: 0954-9889
Remark: The journal contains articles, a conference report (at least the issue I have), news and a calendar. (Note: remark provided by J.R.M. Smits "anjos@sci.kun.nl")
Title: Concepts in NeuroScience
Publish: World Scientific Publishing
Address: Same Address (?) as for International Journal of Neural Systems
Freq.: Twice per year (vol. 1 in 1989)
Remark: Mainly Review Articles(?) (Note: remarks by Osamu Saito "saito@nttica.NTT.JP")
Title: International Journal of Neurocomputing
Publish: ecn Neurocomputing GmbH
Freq.: Quarterly (vol. 1 in 1989)
Remark: Commercial journal, not an academic periodical (Note: remarks by Osamu Saito "saito@nttica.NTT.JP") Review has been reported to be fast (less than 3 months).
Title: Neurocomputers
Publish: Gallifrey Publishing
Address: Gallifrey Publishing, PO Box 155, Vicksburg, Michigan, 49097, USA Tel: (616) 649-3772
Freq.: Monthly (1st issue 1987?)
ISSN #: 0893-1585
Editor: Derek F. Stubbs
Cost/Yr: $32 (USA, Canada), $48 (elsewhere)
Remark: I only have one copy so I cannot give you much detail about the contents. It is a very small one (12 pages) but it has a lot of (short) information in it about e.g. conferences, books, (new) ideas etc. I don't think it is very expensive but I'm not sure. (Note: remark provided by J.R.M. Smits "anjos@sci.kun.nl")
Title: JNNS Newsletter (Newsletter of the Japan Neural Network Society)
Publish: The Japan Neural Network Society
Freq.: Quarterly (vol. 1 in 1989)
Remark: (IN JAPANESE LANGUAGE) Official Newsletter of the Japan Neural Network Society(JNNS) (Note: remarks by Osamu Saito "saito@nttica.NTT.JP")
Title: Neural Networks Today
Remark: I found this title on a bulletin board in October of last year. It was in a message from Tim Pattison, timpatt@augean.OZ (Note: remark provided by J.R.M. Smits "anjos@sci.kun.nl")
Title: Computer Simulations in Brain Science
Title: International Journal of Neuroscience
Title: Neural Network Computation
Remark: Possibly the same as "Neural Computation"
Title: Neural Computing and Applications
Freq.: Quarterly
Publish: Springer Verlag
Cost/yr: 120 Pounds
Remark: This is the journal of the Neural Computing Applications Forum. It publishes original research and other information in the field of practical applications of neural computing.
NN Related Journals:
Title: Complex Systems
Publish: Complex Systems Publications
Address: Complex Systems Publications, Inc., P.O. Box 6149, Champaign, IL 61821-8149, USA
Freq.: 6 times per year (1st volume is 1987)
ISSN #: 0891-2513
Cost/Yr: Individual $75, Institution $225
Remark: The journal COMPLEX SYSTEMS is devoted to the rapid publication of research on the science, mathematics, and engineering of systems with simple components but complex overall behavior. Send mail to "jcs@complex.ccsr.uiuc.edu" for additional info. (Remark is from an announcement on the Net)
Note: A WWW page on complex systems is available from http://life.anu.edu.au/complex_systems/complex.html
Title: Biological Cybernetics (Kybernetik)
Publish: Springer Verlag
Freq.: Monthly (vol. 1 in 1961)
Title: Various IEEE Transactions and Magazines
Publish: IEEE
Remark: Primarily see IEEE Trans. on Systems, Man and Cybernetics. Various special issues: April 1990 IEEE Control Systems Magazine; May 1989 IEEE Trans. Circuits and Systems; July 1988 IEEE Trans. Acoust. Speech Signal Process.
Title: The Journal of Experimental and Theoretical Artificial Intelligence
Publish: Taylor & Francis, Ltd.
Address: London, New York, Philadelphia
Freq.: ? (1st issue Jan 1989)
Remark: For submission information, please contact either of the editors: Eric Dietrich, PACSS - Department of Philosophy, SUNY Binghamton, Binghamton, NY 13901, dietrich@bingvaxu.cc.binghamton.edu and Chris Fields, Box 30001/3CRL, New Mexico State University, Las Cruces, NM 88003-0001, cfields@nmsu.edu
Title: The Behavioral and Brain Sciences
Publish: Cambridge University Press
Remark:
Title: International Journal of Applied Intelligence
Publish: Kluwer Academic Publishers
Remark: first issue in 1990(?)
Title: Bulletin of Mathematical Biology
Title: Intelligence
Title: Journal of Mathematical Biology
Title: Journal of Complex System
Title: AI Expert
Publish: Miller Freeman Publishing Co., for subscription call ++415-267-7672.
Remark: Regularly includes ANN related articles, product announcements, and application reports. Listings of ANN programs are available on AI Expert affiliated BBS's.
Title: International Journal of Modern Physics C
Publish: World Scientific Publ. Co. Farrer Rd. P.O.Box 128, Singapore 9128 or: 687 Hartwell St., Teaneck, N.J. 07666 U.S.A or: 73 Lynton Mead, Totteridge, London N20 8DH, England
Freq: published quarterly
Eds: G. Fox, H. Herrmann and K. Kaneko
Title: Machine Learning
Publish: Kluwer Academic Publishers
Address: Kluwer Academic Publishers P.O. Box 358 Accord Station Hingham, MA 02018-0358 USA
Freq.: Monthly (8 issues per year; increasing to 12 in 1993)
Cost/Yr: Individual $140 (1992); Member of AAAI or CSCSI $88
Remark: Description: Machine Learning is an international forum for research on computational approaches to learning. The journal publishes articles reporting substantive research results on a wide range of learning methods applied to a variety of task domains. The ideal paper will make a theoretical contribution supported by a computer implementation. The journal has published many key papers in learning theory, reinforcement learning, and decision tree methods. Recently it has published a special issue on connectionist approaches to symbolic reasoning. The journal regularly publishes issues devoted to genetic algorithms as well.
Title: Journal of Physics A: Mathematical and General
Publish: Inst. of Physics, Bristol
Freq: 24 issues per year.
Remark: Statistical mechanics aspects of neural networks (mostly Hopfield models).
Title: Physical Review A: Atomic, Molecular and Optical Physics
Publish: The American Physical Society (Am. Inst. of Physics)
Freq: Monthly
Remark: Statistical mechanics of neural networks.
Journals loosely related to NNs:
JOURNAL OF COMPLEXITY
(Must rank alongside Wolfram's Complex Systems)
IEEE ASSP Magazine
(April 1987 had the Lippmann intro. which everyone likes to cite)
ARTIFICIAL INTELLIGENCE
(Vol 40, September 1989 had the survey paper by Hinton)
COGNITIVE SCIENCE
(the Boltzmann machine paper by Ackley et al appeared here in Vol 9, 1985)
COGNITION
(Vol 28, March 1988 contained the Fodor and Pylyshyn critique of connectionism)
COGNITIVE PSYCHOLOGY
(no comment!)
JOURNAL OF MATHEMATICAL PSYCHOLOGY
(several good book reviews)

A11) The most important conferences concerned with Neural Networks ?

  • Dedicated Neural Network Conferences:
    1. Neural Information Processing Systems (NIPS) Annually in Denver, Colorado; late November or early December
    2. International Joint Conference on Neural Networks (IJCNN) co-sponsored by INNS and IEEE
    3. Annual Conference on Neural Networks (ACNN)
    4. International Conference on Artificial Neural Networks (ICANN) Annually in Europe(?), 1992 in Brighton
    5. Major conference of European Neur. Netw. Soc. (ENNS)
  • Other Conferences
    1. International Joint Conference on Artificial Intelligence (IJCAI)
    2. Intern. Conf. on Acoustics, Speech and Signal Processing (ICASSP)
    3. Annual Conference of the Cognitive Science Society
    4. [Vision Conferences?]
  • Pointers to Conferences
    1. The journal "Neural Networks" has a long list of conferences, workshops and meetings in each issue. This is quite interdisciplinary.
    2. There is a regular posting on comp.ai.neural-nets from Paultje Bakker: "Upcoming Neural Network Conferences", which lists names, dates, locations, contacts, and deadlines.

A12) Neural Network Associations ?

  1. International Neural Network Society (INNS).
    INNS membership includes subscription to "Neural Networks", the official journal of the society.
    Membership is $55 for non-students and $45 for students per year.
    Address: INNS Membership, P.O. Box 491166, Ft. Washington, MD 20749.
  2. International Student Society for Neural Networks (ISSNNets).
    Membership is $5 per year.
    Address: ISSNNet, Inc., P.O. Box 15661, Boston, MA 02215 USA.
  3. Women In Neural Network Research and Technology (WINNERS).
    Address: WINNERS, c/o Judith Dayhoff, 11141 Georgia Ave., Suite 206, Wheaton, MD 20902. Telephone: 301-933-9000.
  4. European Neural Network Society (ENNS)
  5. Japanese Neural Network Society (JNNS)
    Address: Japanese Neural Network Society Department of Engineering, Tamagawa University, 6-1-1, Tamagawa Gakuen, Machida City, Tokyo, 194 JAPAN Phone: +81 427 28 3457, Fax: +81 427 28 3597
  6. Association des Connexionnistes en THese (ACTH)
    (the French Student Association for Neural Networks)
    Membership is 100 FF per year
    Activities: newsletter, conference (every year), list of members, ...
    Address: ACTH - Le Castelnau R2, 23 avenue de la Galline, 34170 Castelnau-le-Lez, FRANCE
    Contact: jdmuller@vnet.ibm.com
  7. Neurosciences et Sciences de l'Ingenieur (NSI)
    Biology & Computer Science
    Activity: conference (every year)
    Address: NSI - TIRF / INPG, 46 avenue Felix Viallet, 38031 Grenoble Cedex, FRANCE

A13) Other sources of information about NNs ?

  1. Neuron Digest
    Internet Mailing List. From the welcome blurb: "Neuron-Digest is a list (in digest form) dealing with all aspects of neural networks (and any type of network or neuromorphic system)"
    Moderated by Peter Marvit. To subscribe, send email to neuron-request@cattell.psych.upenn.edu. comp.ai.neural-nets readers also find the messages in that newsgroup in the form of digests.
  2. Usenet groups comp.ai.neural-nets and comp.theory.self-org-sys.
    There is a periodic posting on comp.ai.neural-nets sent by srctran@world.std.com (Gregory Aharonian) about Neural Network patents.
  3. Central Neural System Electronic Bulletin Board
    Modem: 509-627-6CNS; Sysop: Wesley R. Elsberry; P.O. Box 1187, Richland, WA 99352; welsberr@sandbox.kenn.wa.us. Available through FidoNet, RBBS-Net, and other EchoMail compatible bulletin board systems as NEURAL_NET echo.
  4. Neural ftp archive site ftp://ftp.funet.fi/pub/sci/neural
    This site administers a large collection of neural network papers and software at the Finnish University Network file archive site ftp.funet.fi in directory /pub/sci/neural. It contains all the public domain software and papers that they have been able to find. All of these files have been transferred from FTP sites in the U.S. and are mirrored about every 3 months at the fastest. Contact: neural-adm@ftp.funet.fi
  5. USENET newsgroup comp.org.issnnet
    Forum for discussion of academic/student-related issues in NNs, as well as information on ISSNNet and its activities.
  6. AI CD-ROM
    Network Cybernetics Corporation produces the "AI CD-ROM". It is an ISO-9660 format CD-ROM and contains a large assortment of software related to artificial intelligence, artificial life, virtual reality, and other topics. Programs for OS/2, MS-DOS, Macintosh, UNIX, and other operating systems are included. Research papers, tutorials, and other text files are included in ASCII, RTF, and other universal formats. The files have been collected from AI bulletin boards, Internet archive sites, University computer departments, and other government and civilian AI research organizations. Network Cybernetics Corporation intends to release annual revisions to the AI CD-ROM to keep it up to date with current developments in the field. The AI CD-ROM includes collections of files that address many specific AI/AL topics including:
    - Neural Networks: Source code and executables for many different platforms including Unix, DOS, and Macintosh. ANN development tools, example networks, sample data, and tutorials are included. A complete collection of Neuron Digest is included as well.
    The AI CD-ROM may be ordered directly by check, money order, bank draft, or credit card from:
    Network Cybernetics Corporation
    4201 Wingren Road Suite 202
    Irving, TX 75062-2763
    Tel 214/650-2002
    Fax 214/650-1929
    The cost is $129 per disc + shipping ($5/disc domestic or $10/disc foreign) [See the comp.ai FAQ for further details]
  7. World-Wide-Web (WWW)
    Via WWW (for example with the xmosaic program) you can read neural network information by opening the universal resource locator (URL) http://www.eeb.ele.tue.nl/. It contains a hypertext version of this FAQ and other NN-related information.


A14) Freely available software packages for NN simulation ?

  1. Rochester Connectionist Simulator
    A quite versatile simulator program for arbitrary types of neural nets. Comes with a backprop package and an X11/SunView interface.
    anonymous FTP from cs.rochester.edu (192.5.53.209)
    Directory : pub/simulator
    Files:
    README (8 KB)
    rcs_v4.2.justdoc.tar.Z (documentation) (1.6 MB)
    rcs_v4.2.justsrc.tar.Z (source code) (1.4 MB)
  2. UCLA-SFINX
    ftp 131.179.16.6 (retina.cs.ucla.edu)
    Name: sfinxftp
    Password: joshua (currently not working)
    Directory: pub/
    Files :
    README
    sfinx_v2.0.tar.Z
    Email info request : sfinx@retina.cs.ucla.edu
  3. NeurDS
    Request from mcclanahan%cookie.dec.com@decwrl.dec.com. A simulator for DEC systems supporting VT100 terminals.
    OR
    anonymous ftp gatekeeper.dec.com [16.1.0.2]
    Directory: pub/DEC
    File: NeurDS031.tar.Z
  4. PlaNet5.7 (also known as SunNet)
    ftp 133.15.240.3 (tutserver.tut.ac.jp)
    pub/misc/PlaNet5.7.tar.Z
    or
    ftp 128.138.240.1 (boulder.colorado.edu)
    pub/generic-sources/PlaNet5.7.tar.Z (also the old PlaNet5.6.tar.Z)
    A popular connectionist simulator with versions to run under X Windows and on non-graphics terminals, created by Yoshiro Miyata (Chukyo Univ., Japan). A 60-page User's Guide in PostScript. Send any questions to miyata@sccs.chukyo-u.ac.jp
  5. GENESIS
    GENESIS 1.4.1 (GEneral NEural SImulation System) is a general purpose simulation platform which was developed to support the simulation of neural systems ranging from complex models of single neurons to simulations of large networks made up of more abstract neuronal components. Most current GENESIS applications involve realistic simulations of biological neural systems. Although the software can also model more abstract networks, other simulators are more suitable for backpropagation and similar connectionist modeling. May be obtained via FTP from genesis.cns.caltech.edu [131.215.137.64]. Use 'telnet' to genesis.cns.caltech.edu beforehand and log in as the user "genesis" (no password required). If you answer all the questions asked of you, an 'ftp' account will automatically be created for you. You can then 'ftp' back to the machine and download the software (ca. 3 MB). Contact: genesis@cns.caltech.edu.
  6. Mactivation
    anonymous ftp from bruno.cs.colorado.edu [128.138.243.151]
    Directory: /pub/cs/misc
    File: Mactivation-3.3.sea.hqx
  7. Cascade Correlation Simulator
    A simulator based on Scott Fahlman's Cascade Correlation algorithm.
    Anonymous ftp from ftp.cs.cmu.edu [128.2.206.173]
    Directory: /afs/cs/project/connect/code
    File: cascor1a.shar (206 KB)
    There is also a version of recurrent cascade correlation in the same directory in file rcc1.c (107 KB).
  8. Quickprop
    A variation of the back-propagation algorithm developed by Scott Fahlman. A simulator is available in the same directory as the cascade correlation simulator above.
    File: nevprop116.shar (137 KB)
  9. DartNet
    DartNet is a Macintosh-based Neural Network Simulator. It makes full use of the Mac's graphical interface, and provides a number of powerful tools for building, editing, training, testing and examining networks.
    This program is available by anonymous ftp from dartvax.dartmouth.edu [129.170.16.4]
    Directory: /pub/mac
    File: dartnet.sit.hqx (124 KB)
    Copies may also be obtained through email from bharucha@dartmouth.edu. Along with a number of interface improvements and feature additions, v2.0 is an extensible simulator. That is, new network architectures and learning algorithms can be added to the system by writing small XCMD-like CODE resources called nDEF's ("Network Definitions"). A number of such architectures are included with v2.0, as well as header files for creating new nDEF's. Contact: sean@coos.dartmouth.edu (Sean P. Nolan)
  10. SNNS
    "Stuttgart Neural Network Simulator" from the University of Stuttgart, Germany.
    A luxurious simulator for many types of nets; with X11 interface: Graphical 2D and 3D topology editor/visualizer, training visualisation, etc. Currently supports backpropagation (vanilla, online, with momentum term and flat spot elimination, batch, time delay), counterpropagation, quickprop, backpercolation 1, generalized radial basis functions (RBF), RProp, ART1, ART2, ARTMAP, Cascade Correlation, Recurrent Cascade Correlation, Dynamic LVQ, Backpropagation through time (for recurrent networks), batch backpropagation through time (for recurrent networks), Quickpropagation through time (for recurrent networks), and is user-extendable.
    Available through anonymous ftp from ftp.informatik.uni-stuttgart.de [129.69.211.2]
    Directory: /pub/SNNS
    Files:
    SNNSv3.0.tar.Z (1530 KB)
    SNNSv3.1.Manual.ps.Z (1296 KB)
    SNNSv3.1.Readme (7744 Bytes)
  11. Aspirin/MIGRAINES
    Aspirin/MIGRAINES 6.0 consists of a code generator that builds neural network simulations by reading a network description (written in a language called "Aspirin") and generates a C simulation. An interface (called "MIGRAINES") is provided to export data from the neural network to visualization tools.
    The system has been ported to a large number of platforms. The goal of Aspirin is to provide a common extendible front-end language and parser for different network paradigms. The MIGRAINES interface is a terminal based interface that allows you to open Unix pipes to data in the neural network. This replaces the NeWS1.1 graphical interface in version 4.0 of the Aspirin/MIGRAINES software. The new interface is not as simple to use as the version 4.0 interface but is much more portable and flexible. The MIGRAINES interface allows users to output neural network weight and node vectors to disk or to other Unix processes. Users can display the data using either public or commercial graphics/analysis tools. Example filters are included that convert data exported through MIGRAINES to formats readable by Gnuplot 3.0, Matlab, Mathematica, and xgobi.
    The software is available from two FTP sites:
    CMU's simulator collection on pt.cs.cmu.edu [128.2.254.155]
    Directory: /afs/cs/project/connect/code
    File: am6.tar.Z
    and UCLA's cognitive science machine ftp.cognet.ucla.edu [128.97.50.19]
    Directory: /alexis
    File: am6.tar.Z
    The compressed tar file is a little less than 2 megabytes.
  12. Adaptive Logic Network kit
    Available from menaik.cs.ualberta.ca. This package differs from traditional nets in that it uses logic functions rather than floating point; for many tasks, ALNs can show many orders of magnitude gain in training and performance speed.
    Anonymous ftp from menaik.cs.ualberta.ca [129.128.4.241]
    Directory: /pub/atree
    Files:
    atree.readme: README (7 KB)
    atree2.tar.Z: UNIX source code and examples (145 KB)
    atree2.ps.Z: Postscript documentation (76 KB)
    a27exe.exe: MS-Windows 3.x executable (412 KB)
    atre27.exe: MS-Windows 3.x source code (572 KB)
  13. NeuralShell
    Available from FTP site quanta.eng.ohio-state.edu [128.146.35.1]
    Directory: /pub/NeuralShell
    File: NeuralShell.tar
  14. PDP
    The PDP simulator package is available via anonymous FTP at nic.funet.fi [128.214.6.100]
    Directory: /pub/sci/neural/sims
    File: pdp.tar.Z (0.2 MB)
    The simulator is also available with the book
    "Explorations in Parallel Distributed Processing: A Handbook of Models, Programs, and Exercises" by McClelland and Rumelhart. MIT Press, 1988.
    Comment: "This book is often referred to as PDP vol III which is a very misleading practice! The book comes with software on an IBM disk but includes a makefile for compiling on UNIX systems. The version of PDP available at nic.funet.fi seems identical to the one with the book except for a bug in bp.c which occurs when you try to run a script of PDP commands using the DO command. This can be found and fixed easily."
  15. Xerion
    Xerion is available via anonymous ftp from ftp.cs.toronto.edu [128.100.3.6]
    Directory: /pub/xerion
    Files:
    xerion-3.1.ps.Z (153 kB)
    xerion-3.1.tar.Z (1322 kB)
    plus several concrete simulators built with xerion (about 40 kB each). Xerion runs on SGI and Sun machines and uses X Windows for graphics. The software contains modules that implement Back Propagation, Recurrent Back Propagation, Boltzmann Machine, Mean Field Theory, Free Energy Manipulation, Hard and Soft Competitive Learning, and Kohonen Networks. Sample networks built for each of the modules are also included. Contact: xerion@ai.toronto.edu
  16. Neocognitron simulator
    An implementation is available through anonymous ftp at unix.hensa.ac.uk [129.12.21.7]
    Directory: /pub/uunet/pub/ai/neural
    File: neocognitron.tar.Z
    The simulator is written in C and comes with a list of references which are necessary to read to understand the specifics of the implementation. The unsupervised version is coded without (!) C-cell inhibition.
  17. Multi-Module Neural Computing Environment (MUME)
    MUME is a simulation environment for multi-module neural computing. It provides an object oriented facility for the simulation and training of multiple nets with various architectures and learning algorithms. MUME includes a library of network architectures including feedforward, simple recurrent, and continuously running recurrent neural networks. Each architecture is supported by a variety of learning algorithms. MUME can be used for large scale neural network simulations as it provides support for learning in multi-net environments. It also provides pre- and post-processing facilities. The modules are provided in a library. Several "front-ends" or clients are also available. X-Window support is provided by the editor/visualization tool Xmume. MUME can be used to include non-neural computing modules (decision trees, ...) in applications.
    Version 0.73 of MUME has been deposited for anonymous ftp on mickey.sedal.su.oz.au [129.78.24.170]
    Directory: /mume
    File:
    Contact: Marwan Jabri, SEDAL, Sydney University Electrical Engineering, NSW 2006 Australia, marwan@sedal.su.oz.au
  18. LVQ_PAK and SOM_PAK
    These are packages for Learning Vector Quantization and Self-Organizing Maps, respectively. They have been built by the LVQ/SOM Programming Team of the Helsinki University of Technology, Laboratory of Computer and Information Science, Rakentajanaukio 2 C, SF-02150 Espoo, FINLAND. There are versions for Unix and MS-DOS available from cochlea.hut.fi [130.233.168.48]
    Directory: pub/lvq_pak
    Files:
    lvq_pak-2.1.tar.Z (340 kB, Unix)
    lvq_p2r1.exe (310 kB, MS-DOS self-extract archive)
    Directory: pub/som_pak
    Files:
    som_pak-1.1.tar.Z (246 kB, Unix)
    som_p1r1.exe (215 kB, MS-DOS self-extract archive)
  19. SESAME
    (Software Environment for the Simulation of Adaptive Modular Systems) SESAME is a prototypical software implementation with the following features:
    • Object-oriented building blocks approach.
    • Contains a large set of C++ classes useful for neural nets, neurocontrol and pattern recognition. No C++ classes can be used as stand alone, though!
    • C++ classes include CartPole, nondynamic two-robot arms, Lunar Lander, Backpropagation, Feature Maps, Radial Basis Functions, TimeWindows, Fuzzy Set Coding, Potential Fields, Pandemonium, and diverse utility building blocks.
    • A kernel which is the framework for the C++ classes and allows run-time manipulation, construction, and integration of arbitrary complex and hybrid experiments.
    • Currently no graphic interface for construction, only for visualization.
    • Platform is SUN4, XWindows.
    Unfortunately, no reasonably good introduction has been written yet. We hope to have something soon. For now we provide papers (e.g. NIPS-92), a reference manual (>220 pages), source code (ca. 35,000 lines of code), and a SUN4-executable, by ftp only. Sesame and its description are available for anonymous ftp on ftp.gmd.de [129.26.8.90]
    Directory: gmd/as/sesame
    Files:
    sesame-4.5.tar.Z
    sesame-4.5-doc.ps.Z
    Questions please to sesame-request@gmd.de. There is only very limited support available; currently we can not handle many users.
  20. Nevada Backpropagation (NevProp)
    NevProp is a user-friendly backpropagation program written in C for UNIX, Macintosh, and DOS. The original version was Quickprop 1.0 by Scott Fahlman, as translated from Common Lisp into C by Terry Regier. The quickprop algorithm itself was not substantively changed, but we inserted options to force gradient descent (per-epoch or per-pattern) and added generalization & stopped training, c index, and interface enhancements.
    FEATURES: NevProp version 1.15...
    • UNLIMITED (except by machine memory) number of input PATTERNS;
    • UNLIMITED number of input, hidden, and output UNITS;
    • Arbitrary CONNECTIONS among the various layers' units;
    • Clock-time or user-specified RANDOM SEED for initial random weights;
    • Choice of regular GRADIENT DESCENT or QUICKPROP;
    • Choice of LOGISTIC or TANH activation functions;
    • Choice of PER-EPOCH or PER-PATTERN (stochastic) weight updating;
    • GENERALIZATION to a test dataset;
    • AUTOMATICALLY STOPPED TRAINING based on generalization;
    • RETENTION of best-generalizing weights and predictions;
    • Simple but useful bar GRAPH to show smoothness of generalization;
    • SAVING of results to a file while working interactively;
    • SAVING of weights file and reloading for continued training;
    • PREDICTION-only on datasets by applying an existing weights file;
    • In addition to RMS error, the concordance, or c index, is displayed. The c index shows the correctness of the RELATIVE ordering of predictions AMONG the cases; i.e., it considers all possible PAIRS of vectors. This statistic is identical to the area under the receiver operating characteristic (ROC) curve, widely used in technology assessment. (A small sketch of the computation follows this list.)
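    To illustrate how such a c index can be computed (our own sketch in C, not NevProp source code): over all pairs of cases with different target values, count the fraction in which the case with the higher target also received the higher prediction, counting tied predictions as half correct:

        /* c index: fraction of usable pairs whose predictions are
           ordered the same way as their targets. */
        double c_index(const double *pred, const double *target, int n)
        {
            int i, j;
            double concordant = 0.0, pairs = 0.0;
            for (i = 0; i < n; i++)
                for (j = i + 1; j < n; j++) {
                    if (target[i] == target[j])
                        continue;               /* not a usable pair */
                    pairs += 1.0;
                    if ((pred[i] - pred[j]) * (target[i] - target[j]) > 0.0)
                        concordant += 1.0;      /* ordered correctly */
                    else if (pred[i] == pred[j])
                        concordant += 0.5;      /* tied prediction */
                }
            return (pairs > 0.0) ? concordant / pairs : 0.0;
        }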
    The most updated version of NevProp will be made available by anonymous ftp from the University of Nevada, Reno: unssun.scs.unr.edu [134.197.10.128]
    Directory: pub/goodman/nevpropdir
    File: nevprop1.16.shar
    Limited support is available from Phil Goodman (goodman@unr.edu), University of Nevada Center for Biomedical Research.
  21. Fuzzy ARTmap
    Available through anonymous ftp from park.bu.edu [128.176.121.56]
    Directory: /pub
    File: fuzzy-artmap.tar.Z (44 kB)
    (This is just a small example program.)
  22. PYGMALION
    This is a prototype that stems from an ESPRIT project. It implements back-propagation, self organising map, and Hopfield nets. On imag.imag.fr [129.88.32.1]
    Directory: /archive/neural/pygmalion
    File: pygmalion.tar.Z (1534 kb)
  23. Basis-of-AI-backprop
    Here are some of the details of a set of back-propagation programs I have been working on. Earlier versions have been posted in comp.sources.misc, and people around the world have used them and liked them. This package is free for ordinary users but shareware for businesses and government agencies ($200/copy, but for this you get the professional version as well). I do support this package via email.
    Some of the highlights are:
    • in C for UNIX and DOS and DOS binaries
    • gradient descent, delta-bar-delta and quickprop
    • extra fast 16-bit fixed point weight version as well as a conventional floating point version
    • recurrent networks
    • numerous sample problems
    To get this version simply ftp to ftp.mcs.com where you will land in the directory /work/public/mcsnet.users. Then cd to drt and read readme.1st. The expanded professional version is $30/copy for ordinary individuals including academics and $200/copy for businesses and government agencies. Prices and contents are subject to change without notice. Some of the highlights are an improved user interface, more activation functions, networks that can be read into your own programs, dynamic node creation, weight decay, and SuperSAB.
    Contact: Don Tveter; 5228 N. Nashville Ave.; Chicago, Illinois 60656 drt@mcs.com
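    For readers unfamiliar with delta-bar-delta (Jacobs, 1988): it gives every weight its own learning rate, growing it additively while successive gradients agree in sign and shrinking it multiplicatively when the sign flips. The following minimal sketch in C shows the update rule only; it is not taken from this package's source, and the constants are illustrative.

      /* Delta-bar-delta: per-weight adaptive learning rates. */
      #include <stdio.h>

      #define KAPPA 0.01  /* additive rate increase when signs agree  */
      #define PHI   0.5   /* multiplicative decay when signs disagree */
      #define THETA 0.7   /* smoothing factor for the gradient trace  */

      typedef struct {
          double weight;
          double rate;    /* this weight's own learning rate      */
          double bar;     /* smoothed trace of previous gradients */
      } DbdWeight;

      void dbd_update(DbdWeight *w, double grad)
      {
          if (grad * w->bar > 0.0)
              w->rate += KAPPA;   /* same sign as the trace: speed up */
          else if (grad * w->bar < 0.0)
              w->rate *= PHI;     /* sign flip: slow down             */
          w->weight -= w->rate * grad;
          w->bar = (1.0 - THETA) * grad + THETA * w->bar;
      }

      int main(void)
      {
          DbdWeight w = { 0.0, 0.1, 0.0 };
          double grads[] = { 0.5, 0.4, 0.45, -0.1 };  /* made-up gradients */
          int i;
          for (i = 0; i < 4; i++) {
              dbd_update(&w, grads[i]);
              printf("step %d: weight=%.4f rate=%.4f\n", i, w.weight, w.rate);
          }
          return 0;
      }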
For some of these simulators there are user mailing lists. Get the packages and look into their documentation for further info.
If you are using a small computer (PC, Mac, etc.) you may want to have a look at the Central Neural System Electronic Bulletin Board. Modem: 509-627-6CNS; Sysop: Wesley R. Elsberry; P.O. Box 1187, Richland, WA 99352; welsberr@sandbox.kenn.wa.us. There are lots of small simulator packages in the CNS ANNSIM file set. There is an ftp mirror site for the CNS ANNSIM file set at me.uta.edu (129.107.2.20) in the /pub/neural directory. Most ANN offerings are in /pub/neural/annsim.

A15) Commercial software packages for NN simulation ?

The Number 1 issue of each volume of the journal "Neural Networks" has a list of some dozens of commercial suppliers of Neural Network products and services: Software, Hardware, Support, Programming, Design and Service.
  1. nn/xnn
    Name: nn/xnn
    Company: Neureka ANS
    Address: Klaus Hansens vei 31B, 5037 Solheimsviken, NORWAY
    Phone: +47-55544163 / +47-55201548
    Email: arnemo@eik.ii.uib.no
    Basic capabilities:
    Neural network development tool. nn is a language for the specification of neural network simulators. It produces C code and executables for the specified models and is therefore ideal for application development. xnn is a graphical front-end to nn and to the simulation code produced by nn; it can display any variables in a number of graphical formats during simulation run-time. Comes with a number of pre-implemented models, including: Backprop (several variants), Self Organizing Maps, LVQ1, LVQ2, Radial Basis Function Networks, Generalized Regression Neural Networks, Jordan nets, Elman nets, Hopfield, etc.
    Operating system: nn: UNIX or MS-DOS, xnn: UNIX/X-windows
    System requirements: 10 Mb HD, 2 Mb RAM
    Approx. price: USD 2000,-
  2. BrainMaker
    Name: BrainMaker, BrainMaker Pro
    Company: California Scientific Software
    Address: 10024 Newtown rd, Nevada City, CA, 95959 USA
    Phone,Fax: 916 478 9040, 916 478 9041
    Email: calsci!mittmann@gvgpsa.gvg.tek.com (flaky connection)
    Basic capabilities: train backprop neural nets
    Operating system: DOS, Windows, Mac
    System requirements: Uses XMS or EMS for large models (PCs only; Pro version)
    Approx. price: $195, $795
    • BrainMaker Pro 3.0 (DOS/Windows) $795
      Genetic Training add-on $250
    • BrainMaker 3.0 (DOS/Windows/Mac) $195
      Network Toolkit add-on $150
    • BrainMaker 2.5 Student version (quantity sales only, about $38 each)
    • BrainMaker Pro C30 Accelerator Board
      w/ 5Mb memory $9750
      w/32Mb memory $13,000
    • Intel iNNTS NN Development System $11,800
      Intel EMB Multi-Chip Board $9750
      Intel 80170 chip set $940
    • Introduction To Neural Networks book $30
    All software has a 30-day money-back guarantee and unlimited free technical support.
    BrainMaker package includes:
    • The book Introduction to Neural Networks
    • BrainMaker Users Guide and reference manual
      300 pages, fully indexed, with tutorials and sample Neural Networks
    • Netmaker
      Netmaker makes building and training Neural Networks easy by importing and automatically creating BrainMaker's Neural Network files. Netmaker imports Lotus, Excel, dBase, and ASCII files.
    • BrainMaker
      Full menu and dialog box interface; runs Backprop at 750,000 cps on a 33 MHz 486.
      Feature                 BrainMaker  Professional    Benefit
      (X = included; T = in the Network Toolkit add-on;
      G = in the Genetic Training add-on)

      User Interface
      Pull-down Menus, Dialog Boxes   X    X    easy to learn and use; all parameters
                                                saved in a file you can edit.
      Programmable Output Files       X    X    exports data in your format to
                                                spreadsheets, graphics packages, etc.
      Editing in BrainMaker           X    X    quickly edit data, display, network
                                                connections, and more.
      Network Progress Display             X    monitors training with a simple
                                                graphic display.
      Fact Annotation                 X    X    attaches your comments to examples
                                                for display and printing.
      Printer Support                 X    X    HP LaserJet, DeskJet, InkJet,
                                                IBM Proprinter, Epson, etc.
      NetPlotter                      T    X    see how the input correlates with
                                                your output.
      Graphics Built In                    X    shows trends, cycles, network
                                                responses, statistics, etc.;
                                                see it on screen, plotter, or printer.
      Dynamic Data Exchange                X    puts your network in other Windows
                                                programs.

      Performance
      Binary Mode                     T    X    uses binary files for greater speed.
      Batch Mode                           X    add networks to your existing
                                                programs, train while you're away.
      EMS and XMS Memory                   X    up to 8192 independent variables.
      Save Network Periodically       X    X    saves results to a file in case of
                                                power failure.
      Fastest Algorithms              X    X    750,000 connections-per-second
                                                (486/50).
      Neurons per Layer             512  32,000 more inputs: model complex data
                                                with ease.
      Number of Layers                8    8    extra hidden layers can help tackle
                                                bigger problems.

      Training
      Specify Parameters by Layer          X    fine-tunes performance inside the
                                                network.
      Recurrence Networks                  X    puts feedback in your network,
                                                automates historical input.
      Prune Connections and Neurons        X    improves accuracy by trimming away
                                                excess "fat".
      Add Hidden Neurons In Training  X    X    finds best size network quickly;
                                                fully automated with Professional.
      Custom Neuron Functions         X    X    optimizes training to suit any need.
      Testing While Training          X    X    trains for best performance on new
                                                data.
      Stop Training When...                X    lets you decide when the network has
                                                learned well.
      Heavy Weights                        X    helps networks train.
      Hypersonic Training             T    X    trains faster with this proprietary
                                                algorithm.

      Analysis, Advanced Functions
      Sensitivity Analysis                 X    shows you which inputs determined
                                                your results.
      Neuron Sensitivity                   X    shows you the total effect of one
                                                input on your results.
      Global Network Analysis              X    shows how the network reacts to
                                                your inputs overall.
      Contour Analysis                     X    shows peaks and valleys of the output
                                                when two inputs change.
      Data Correlator                      X    finds important data and optimum
                                                time delays.
      Error Statistics Report         X    X    check your network error rate during
                                                training.
      Print or Edit Weight Matrices   X    X    examine, customize network internals.
      Competitor                           X    ranks horses, teams, stocks, etc.
                                                in finish order.
      Run Time System                      X    C source code - make programs with
                                                your network for resale.
      Chip Support                    X    X    Intel, American Neurologics,
                                                Micro Devices.
      Genetic Training Option              G    trains variations of your design
                                                and shows you which was the best.

      Network Data Management Functions
      NetMaker                        X    X    spreadsheet-like data manipulation
                                                and network file creation.
      NetChecker                      X    X    checks your files for errors and
                                                inconsistencies.
      Shuffle                         X    X    mixes up the order of examples for
                                                better training.
      Binary                          T    X    converts files to binary for quicker
                                                training.
      MinMax                          X    X    finds min / max / standard deviation
                                                of data for fine-tuned results.
      Data Importation                X    X    reads data from Lotus, dBASE,
                                                Excel, ASCII, binary.
      Financial Data                       X    reads MetaStock and Computrack data.
      Data Manipulation               X    X    finds indicators, oscillators,
                                                running averages, etc.
      Cyclic Analysis                      X    checks data for periodic or cyclic
                                                behavior.
      Data Types                      X    X    uses symbolic, text, picture,
                                                and numeric data.

      Documentation & User Support
      User's Guide                    X    X    an application development guide
                                                and quick-start booklet.
      Introduction to Neural Networks X    X    324 pp, gets you up to date in this
                                                exciting field.


  3. SAS Software / Neural Net add-on
    Name: SAS Software
    Company: SAS Institute, Inc.
    Address: SAS Campus Drive, Cary, NC 27513, USA
    Phone,Fax: (919) 677-8000
    Email: saswss@unx.sas.com (Neural net inquiries only)
    Basic capabilities:
    Feedforward nets with numerous training methods and loss functions, plus statistical analogs of counterpropagation and various unsupervised architectures
    Operating system: Lots
    System requirements: Lots. On PCs, runs under Windows and OS/2.
    Approx. price: Free neural net software, but you have to license SAS/Base software and preferably the SAS/OR, SAS/ETS, and/or SAS/STAT products.
    Comments: Oriented toward data analysis and statistical applications
  4. NeuralWorks
    Name: NeuralWorks Professional II Plus (from NeuralWare)
    Company: NeuralWare Inc.
    Address: Pittsburgh, PA 15276-9910
    Phone: (412) 787-8222
    FAX: (412) 787-8220
    Distributor for Europe:
    Scientific Computers GmbH
    Franzstr. 107, 52064 Aachen
    Germany
    Tel. (49) +241-26041
    Fax. (49) +241-44983
    Email. info@scientific.de
    Basic capabilities:
    Supports over 30 different nets: backprop, ART-1, Kohonen, modular neural networks, general regression, fuzzy ARTmap, probabilistic nets, self-organizing maps, LVQ, Boltzmann, BSB, SPR, etc.
    Extendable with optional packages: ExplainNet (to eliminate extra inputs), FlashCode (compiles the net into C code for runtime use), user-defined I/O in C, pruning, SaveBest, and graphing instruments such as correlation, Hinton diagrams, RMS error graphs, etc.
    Operating system: PC, Sun, IBM RS6000, Apple Macintosh, SGI, DEC, HP. System requirements: varies; PC: 2 MB extended memory + 6 MB hard disk space. Uses a Windows-compatible (extended) memory driver.
    Approx. price: call (depends on platform)
    Comments: award-winning documentation; one of the market leaders in NN software.
  5. MATLAB Neural Network Toolbox (for use with Matlab 4.x)
    Contact: The MathWorks, Inc., 24 Prime Park Way, Natick, MA 01760. Phone: 508-653-1415, FAX: 508-653-2997, email: info@mathworks.com
    (Comment by Richard Andrew Miles Outerbridge, RAMO@UVPHYS.PHYS.UVIC.CA) Matlab is spreading like hotcakes (and the educational discounts are very impressive). The newest release of Matlab (4.0) answers the question "if you could only program in one language what would it be?". The neural network toolkit is worth getting for the manual alone. Matlab is available with lots of other toolkits (signal processing, optimization, etc.) but I don't use them much - the main package is more than enough. The nice thing about the Matlab approach is that you can easily interface the neural network stuff with anything else you are doing.

A16) Neural Network hardware ?

The Number 1 issue of each volume of the journal "Neural Networks" has a list of some dozens of suppliers of Neural Network support: Software, Hardware, Support, Programming, Design and Service.
Here is a list of companies contributed by xli@computing-maths.cardiff.ac.uk:
  1. HNC, INC.
    5501 Oberlin Drive
    San Diego
    California 92121
    (619) 546-8877
    and a second address at
    7799 Leesburg Pike, Suite 900
    Falls Church, Virginia
    22043
    (703) 847-6808
    Note: Australian Dist.: Unitronics
    Tel : (09) 4701443
    Contact: Martin Keye
    HNC markets:
    • 'Image Document Entry Processing Terminal' - it recognises handwritten documents and converts the info to ASCII.
    • 'ExploreNet 3000' - a NN demonstrator
    • 'Anza/DP Plus' - a Neural Net board with 25 MFlops or 12.5M peak interconnects per second.

  2. SAIC (Science Applications International Corporation)
    10260 Campus Point Drive
    MS 71, San Diego
    CA 92121
    (619) 546 6148
    Fax: (619) 546 6736
  3. Micro Devices
    30 Skyline Drive
    Lake Mary
    FL 32746-6201
    (407) 333-4379
    Micro Devices makes the MD1220 - 'Neural Bit Slice'. Each of the products mentioned so far has a very different usage. Although this sounds similar to Intel's product, the architectures are not.
  4. Intel Corp
    2250 Mission College Blvd
    Santa Clara, Ca 95052-8125
    Attn ETANN, Mail Stop SC9-40
    (408) 765-9235
    Intel is making an experimental chip:
    80170NW - Electrically Trainable Analog Neural Network (ETANN). It has 64 'neurons' on it - almost fully internally connected - and the chip can be put in a hierarchical architecture to do 2 billion interconnects per second.
    Support software has already been made by
    California Scientific Software
    10141 Evening Star Dr #6
    Grass Valley, CA 95945-9051
    (916) 477-7481
    Their product is called 'BrainMaker'.
  5. NeuralWare, Inc
    Penn Center West
    Bldg IV Suite 227
    Pittsburgh
    PA 15276
    They sell only software/simulators, but for many platforms.
  6. Tubb Research Limited
    7a Lavant Street
    Petersfield
    Hampshire
    GU32 2EL
    United Kingdom
    Tel: +44 730 60256
  7. Adaptive Solutions Inc
    1400 NW Compton Drive
    Suite 340
    Beaverton, OR 97006
    U. S. A.
    Tel: 503 - 690 - 1236 FAX: 503 - 690 - 1249
  8. NeuroDynamX, Inc.
    4730 Walnut St., Suite 101B
    Boulder, CO 80301
    Voice: (303) 442-3539 Fax: (303) 442-2854
    Internet: techsupport@ndx.com
    NDX sells a number of neural network hardware products:
    • NDX Neural Accelerators: a line of i860-based accelerator cards for the PC that give up to 45 million connections per second for use with the DynaMind neural network software.
    • iNNTS: Intel's 80170NX (ETANN) Neural Network Training System. NDX's president was one of the co-designers of this chip.

And here is an incomplete list of Neurocomputers (provided by jon@kongle.idt.unit.no (Jon Gunnar Solheim)): an overview of known Neural Computers with their newest known reference.
AAP-2
Takumi Watanabe, Yoshi Sugiyama, Toshio Kondo, and Yoshihiro Kitamura. Neural network simulation on a massively parallel cellular array processor: AAP-2. In International Joint Conference on Neural Networks, 1989.
ANNA
B.E. Boser, E. Sackinger, J. Bromley, Y. LeCun, and L.D. Jackel. Hardware Requirements for Neural Network Pattern Classifiers. In IEEE Micro, 12(1), pages 32-40, February 1992.
Analog Neural Computer
Paul Mueller et al. Design and performance of a prototype analog neural computer. In Neurocomputing, 4(6):311-323, 1992.
APx -- Array Processor Accelerator
F. Pazienti. Neural networks simulation with array processors. In Advanced Computer Technology, Reliable Systems and Applications; Proceedings of the 5th Annual Computer Conference, pages 547-551. IEEE Comput. Soc. Press, May 1991. ISBN: 0-8186-2141-9.
ASP -- Associative String Processor
A. Krikelis. A novel massively associative processing architecture for the implementation of artificial neural networks. In 1991 International Conference on Acoustics, Speech and Signal Processing, volume 2, pages 1057-1060. IEEE Comput. Soc. Press, May 1991.
BSP400
Jan N.H. Heemskerk, Jacob M.J. Murre, Jaap Hoekstra, Leon H.J.G. Kemna, and Patrick T.W. Hudson. The BSP400: A modular neurocomputer assembled from 400 low-cost microprocessors. In International Conference on Artificial Neural Networks. Elsevier Science, 1991.
BLAST
J.G. Elias, M.D. Fisher, and C.M. Monemi. A multiprocessor machine for large-scale neural network simulation. In IJCNN91-Seattle: International Joint Conference on Neural Networks, volume 1, pages 469-474. IEEE Comput. Soc. Press, July 1991. ISBN: 0-7883-0164-1.
CNAPS Neurocomputer
H. McCartor. Back Propagation Implementation on the Adaptive Solutions CNAPS Neurocomputer. In Advances in Neural Information Processing Systems, 3, 1991.
MA16 -- Neural Signal Processor
U. Ramacher, J. Beichter, and N. Bruls. Architecture of a general-purpose neural signal processor. In IJCNN91-Seattle: International Joint Conference on Neural Networks, volume 1, pages 443-446. IEEE Comput. Soc. Press, July 1991. ISBN: 0-7083-0164-1.
Mindshape
Jan N.H. Heemskerk, Jacob M.J. Murre, Arend Melissant, Mirko Pelgrom, and Patrick T.W. Hudson. Mindshape: a neurocomputer concept based on a fractal architecture. In International Conference on Artificial Neural Networks. Elsevier Science, 1992.
mod 2
Michael L. Mumford, David K. Andes, and Lynn R. Kern. The mod 2 neurocomputer system design. In IEEE Transactions on Neural Networks, 3(3):423-433, 1992.
NERV
R. Hauser, H. Horner, R. Maenner, and M. Makhaniok. Architectural Considerations for NERV - a General Purpose Neural Network Simulation System. In Workshop on Parallel Processing: Logic, Organization and Technology -- WOPPLOT 89, pages 183-195. Springer Verlag, March 1989. ISBN: 3-5405-5027-5.
NP -- Neural Processor
D.A. Orrey, D.J. Myers, and J.M. Vincent. A high performance digital processor for implementing large artificial neural networks. In Proceedings of the IEEE 1991 Custom Integrated Circuits Conference, pages 16.3/1-4. IEEE Comput. Soc. Press, May 1991. ISBN: 0-7883-0015-7.
RAP -- Ring Array Processor
N. Morgan, J. Beck, P. Kohn, J. Bilmes, E. Allman, and J. Beer. The ring array processor: A multiprocessing peripheral for connectionist applications. In Journal of Parallel and Distributed Computing, pages 248-259, April 1992.
RENNS -- REconfigurable Neural Networks Server
O. Landsverk, J. Greipsland, J.A. Mathisen, J.G. Solheim, and L. Utne. RENNS - a Reconfigurable Computer System for Simulating Artificial Neural Network Algorithms. In Parallel and Distributed Computing Systems, Proceedings of the ISMM 5th International Conference, pages 251-256. The International Society for Mini and Microcomputers - ISMM, October 1992. ISBN: 1-8808-4302-1.
SMART -- Sparse Matrix Adaptive and Recursive Transforms
P. Bessiere, A. Chams, A. Guerin, J. Herault, C. Jutten, and J.C. Lawson. From Hardware to Software: Designing a "Neurostation". In VLSI design of Neural Networks, pages 311-335, June 1990.
SNAP -- Scalable Neurocomputer Array Processor
E. Wojciechowski. SNAP: A parallel processor for implementing real time neural networks. In Proceedings of the IEEE 1991 National Aerospace and Electronics Conference; NAECON-91, volume 2, pages 736-742. IEEE Comput. Soc. Press, May 1991.
Toroidal Neural Network Processor
S. Jones, K. Sammut, C. Nielsen, and J. Staunstrup. Toroidal Neural Network: Architecture and Processor Granularity Issues. In VLSI design of Neural Networks, pages 229-254, June 1990.
SMART and SuperNode
P. Bessiere, A. Chams, and P. Chol. MENTAL: A virtual machine approach to artificial neural networks programming. In NERVES, ESPRIT B.R.A. project no 3049, 1991. (The report is archived on neuroprose.)