FAQ on Neural Networks. Feb.93'.
File NNFAQ.ZIP from The Programmer’s Corner in
Category Miscellaneous Language Source Code
File Name    File Size  Zip Size  Zip Type
NNFAQ.TXT    70888      26191     deflated


Contents of the NNFAQ.TXT file



From: [email protected] (Lutz Prechelt)
Date: 28 Dec 92 03:16:53 GMT
Newsgroups: comp.ai.neural-nets,news.answers
Subject: FAQ in comp.ai.neural-nets -- monthly posting

Archive-name: neural-net-faq
Last-modified: 92/11/30

(FAQ means "Frequently Asked Questions")

------------------------------------------------------------------------
Anybody who is willing to contribute any question or
information, please email me; if it is relevant,
I will incorporate it. But: Please format your contribution
appropriately so that I can just drop it in.

The monthly posting goes out on the 28th of every month.
------------------------------------------------------------------------

This is a monthly posting to the Usenet newsgroup comp.ai.neural-nets
(and news.answers, where it should be findable at ANY time).
Its purpose is to provide basic information for individuals who are
new to the field of neural networks or are just beginning to read this
group. It should help to avoid lengthy discussions of questions that
usually arise for beginners of one kind or the other.

>>>>> SO, PLEASE, SEARCH THIS POSTING FIRST IF YOU HAVE A QUESTION <<<<<
and
>>>>> DON'T POST ANSWERS TO FAQs: POINT THE ASKER TO THIS POSTING <<<<<

This posting is archived in the periodic posting archive on
"pit-manager.mit.edu" [18.172.1.27] (and on some other hosts as well).
Look in the anonymous ftp directory "/pub/usenet/news.answers",
the filename is as given in 'Archive-name:' header above.
If you do not have anonymous ftp access, you can access the archives
by mail server as well. Send an E-mail message to
[email protected] with "help" and "index" in the body on
separate lines for more information.


The monthly posting is not meant to discuss any topic exhaustively.

Disclaimer: This posting is provided 'as is'.

No warranty whatsoever is expressed or implied;
in particular, no warranty that the information contained herein
is correct or useful in any way, although both are intended.

>> To find the answer to a given question number (if present at all),
>> search for the string "-A<number>.)" (so the answer to question 12 is at "-A12.)")

And now, in the end, we begin:

============================== Questions ==============================

(the short forms and non-continuous numbering are intended)
1.) What is this newsgroup for ? How shall it be used ?
2.) What is a neural network (NN) ?
3.) What can you do with a Neural Network and what not ?
4.) Who is concerned with Neural Networks ?

6.) What does 'backprop' mean ?
7.) How many learning methods for NNs exist ? Which ?
8.) What about Genetic Algorithms ?

10.) Good introductory literature about Neural Networks ?
11.) Any journals and magazines about Neural Networks ?
12.) The most important conferences concerned with Neural Networks ?
13.) Neural Network Associations ?
14.) Other sources of information about NNs ?

15.) Freely available software packages for NN simulation ?
16.) Commercial software packages for NN simulation ?
17.) Neural Network hardware ?

19.) Databases for experimentation with NNs ?

============================== Answers ==============================

------------------------------------------------------------------------

-A1.) What is this newsgroup for ?

The newsgroup comp.ai.neural-nets is intended as a forum for people who want
to use or explore the capabilities of Neural Networks or Neural-Network-like
structures.

There should be the following types of articles in this newsgroup:

1. Requests

Requests are articles of the form
"I am looking for X"
where X is something public like a book, an article, or a piece of software.

If multiple different answers can be expected, the person making the
request should be prepared to summarize the answers he/she receives
and should announce the intention to do so with a phrase like
"Please email, I'll summarize"
at the end of the posting.

The Subject line of the posting should then be something like
"Request: X"

2. Questions

As opposed to requests, questions are concerned with something so specific
that general interest cannot readily be assumed.
If the poster thinks that the topic is of some general interest,
he/she should announce a summary (see above).

The Subject line of the posting should be something like
"Question: this-and-that"
or have the form of a question (i.e., end with a question mark)

3. Answers

These are reactions to questions or requests.
As a rule of thumb, articles of type "answer" should be rare.
Ideally, in most cases either the answer is too specific to be of general
interest (and should thus be e-mailed to the poster) or a summary
was announced with the question or request (and answers should
thus be e-mailed to the poster).

The subject lines of answers are automatically adjusted by the
news software.

4. Summaries

In all cases of requests or questions whose answers can be assumed
to be of some general interest, the poster of the request or question
shall summarize the answers he/she received.
Such a summary should be announced in the original posting of the question
or request with a phrase like
"Please answer by email, I'll summarize"

In such a case answers should NOT be posted to the newsgroup but instead
be mailed to the poster who collects and reviews them.
After about 10 to 20 days from the original posting, its poster should
make the summary of answers and post it to the net.

Some care should be invested in a summary:
a) simple concatenation of all the answers is not enough;
instead, redundancies, irrelevancies, verbosities and
errors must be filtered out (as well as possible),
b) the answers shall be separated clearly
c) the contributors of the individual answers shall be identifiable
(unless they requested to remain anonymous [yes, that happens])
d) the summary shall start with the "quintessence" of the answers,
as seen by the original poster
e) A summary should, when posted, clearly be indicated to be one
by giving it a Subject line starting with "Summary:"

Note that a good summary is pure gold for the rest of the newsgroup
community, so summary work will be most appreciated by all of us.
(Good summaries are more valuable than any moderator ! :-> )

5. Announcements

Some articles never need any public reaction.
These are called announcements (for instance for a workshop,
conference or the availability of some technical report or
software system).

Announcements should be clearly indicated to be such by giving
them a subject line of the form
"Announcement: this-and-that"

6. Reports

Sometimes people spontaneously want to report something to the
newsgroup. This might be special experiences with some software,
results of own experiments or conceptual work, or especially
interesting information from somewhere else.

Reports should be clearly indicated to be such by giving
them a subject line of the form
"Report: this-and-that"

7. Discussions

An especially valuable feature of Usenet is, of course, the possibility
of discussing a certain topic with hundreds of potential participants.
All traffic in the newsgroup that cannot be subsumed under one of
the above categories should belong to a discussion.

If somebody explicitly wants to start a discussion, he/she can do so
by giving the posting a subject line of the form
"Start discussion: this-and-that"
(People who react on this, please remove the
"Start discussion: " label from the subject line of your replies)

It is quite difficult to keep a discussion from drifting into chaos,
but, unfortunately, as many other newsgroups show, there seems
to be no sure way to avoid this.
On the other hand, comp.ai.neural-nets has not had many problems
with this effect in the past, so let's just go and hope... :->

------------------------------------------------------------------------

-A2.) What is a neural network (NN) ?

[anybody there to write something better?
buzzwords: artificial vs. natural/biological; units and
connections; value passing; inputs and outputs; storage in structure
and weights; only local information; highly parallel operation ]

First of all, when we are talking about a neural network, we should
usually say "artificial neural network" (ANN), because that is
what we mean most of the time. Biological neural networks are much
more complicated in their elementary structures than the mathematical
models we use for ANNs.

A vague description is as follows:

An ANN is a network of many very simple processors ("units"), each
possibly having a (small amount of) local memory. The units are
connected by unidirectional communication channels ("connections"),
which carry numeric (as opposed to symbolic) data. The units operate
only on their local data and on the inputs they receive via the
connections.

The design motivation is what distinguishes neural networks from other
mathematical techniques:

A neural network is a processing device, either an algorithm, or actual
hardware, whose design was motivated by the design and functioning of human
brains and components thereof.

Most neural networks have some sort of "training" rule
whereby the weights of connections are adjusted on the basis of
presented patterns.
In other words, neural networks "learn" from examples,
just like children learn to recognize dogs from examples of dogs,
and exhibit some structural capability for generalization.

Neural networks normally have great potential for parallelism, since
the computations of the components are independent of each other.
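The description of a unit above can be sketched in a few lines of Python.
This is purely illustrative (names and numbers are made up, not from this
posting): a unit operates only on its local weights and the numeric inputs
arriving via its connections.

```python
# A single artificial "unit": weighted sum of its inputs, squashed by a
# sigmoid activation. Toy sketch; all names and values are illustrative.
import math

def unit_output(inputs, weights, bias):
    """Weighted sum of the inputs, passed through a sigmoid."""
    s = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-s))

# Example: a unit with two input connections.
y = unit_output([1.0, 0.0], [2.0, -1.0], bias=-1.0)
print(y)  # output lies strictly between 0 and 1
```

A whole network is just many such units passing their outputs along
unidirectional connections, which is why the computations parallelize so well.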

------------------------------------------------------------------------

-A3.) What can you do with a Neural Network and what not ?

[preliminary]

In principle, NNs can compute any computable function, i.e. they can
do everything a normal digital computer can do.
In particular, anything that can be represented as a mapping between
vector spaces can be approximated to arbitrary precision by feedforward
NNs (which are the most often used type).

In practice, NNs are especially useful for mapping problems
which are tolerant of a high error rate, have lots of example data
available, but to which hard and fast rules cannot easily be applied.

NNs are especially bad for problems that are concerned with manipulation
of symbols and for problems that need short-term memory.

------------------------------------------------------------------------

-A4.) Who is concerned with Neural Networks ?

Neural Networks are interesting for quite a lot of very dissimilar people:

- Computer scientists want to find out about the properties of
non-symbolic information processing with neural nets and about learning
systems in general.
- Engineers of many kinds want to exploit the capabilities of
neural networks in many areas (e.g. signal processing) to solve
their application problems.
- Cognitive scientists view neural networks as a possible apparatus to
describe models of thinking and consciousness (high-level brain function).
- Neuro-physiologists use neural networks to describe and explore
medium-level brain function (e.g. memory, sensory system, motor control).
- Physicists use neural networks to model phenomena in statistical
mechanics and for a lot of other tasks.
- Biologists use Neural Networks to interpret nucleotide sequences.
- Philosophers and some other people may also be interested in
Neural Networks for various reasons.

------------------------------------------------------------------------

-A6.) What does 'backprop' mean ?

[anybody to write something similarly short,
but easier to understand for a beginner ? ]

It is an abbreviation for 'backpropagation of error', which is the
most widely used learning method for neural networks today.
Although it has many disadvantages, which could be summarized in the
sentence
"You are almost not knowing what you are actually doing
when using backpropagation" :-)
it has had considerable success in practical applications and is
relatively easy to apply.

It is for the training of layered (i.e., nodes are grouped
in layers) feedforward (i.e., the arcs joining nodes are
unidirectional, and there are no cycles) nets.

Back-propagation needs a teacher that knows the correct output for any
input ("supervised learning") and uses gradient descent on the error
(as provided by the teacher) to train the weights. The activation
function is (usually) a sigmoidal (i.e., bounded above and below, but
differentiable) function of a weighted sum of the node's inputs.

The use of a gradient descent algorithm to train the weights makes
training slow; but since the net is feedforward, it is quite rapid
during the recall phase.
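The mechanics described above (forward pass, error from the teacher, gradient
descent on the weights) can be sketched as a toy in Python. This is a minimal
illustration of a 2-2-1 layered feedforward net trained on XOR; the net size,
learning rate, and all names are made-up choices, not something prescribed by
this posting:

```python
# Toy backpropagation for a 2-2-1 layered feedforward net (supervised
# learning by gradient descent on the squared error). Illustrative only.
import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
# Each hidden unit has two input weights plus a bias (last entry).
w_hid = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_out = [random.uniform(-1, 1) for _ in range(3)]

def forward(x):
    h = [sigmoid(w[0]*x[0] + w[1]*x[1] + w[2]) for w in w_hid]
    y = sigmoid(w_out[0]*h[0] + w_out[1]*h[1] + w_out[2])
    return h, y

# The "teacher": the XOR mapping, a classic non-linearly-separable example.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def total_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

err_before = total_error()
eta = 2.0  # learning rate
for _ in range(5000):
    for x, t in data:
        h, y = forward(x)
        d_out = (y - t) * y * (1 - y)          # delta at the output unit
        d_hid = [d_out * w_out[j] * h[j] * (1 - h[j]) for j in range(2)]
        for j in range(2):                     # update output weights
            w_out[j] -= eta * d_out * h[j]
        w_out[2] -= eta * d_out                # output bias
        for j in range(2):                     # propagate error backwards
            w_hid[j][0] -= eta * d_hid[j] * x[0]
            w_hid[j][1] -= eta * d_hid[j] * x[1]
            w_hid[j][2] -= eta * d_hid[j]
err_after = total_error()
print("error before: %.3f  after: %.3f" % (err_before, err_after))
```

Note how recall (the `forward` function) is a single cheap pass, while
training loops over the data many times, which matches the slow-to-train,
fast-to-recall behaviour mentioned above.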

Literature:
Rumelhart, D. E. and McClelland, J. L. (1986):
Parallel Distributed Processing: Explorations in the
Microstructure of Cognition (volume 1, pp 318-362).
The MIT Press.
(this is the classic one) or one of the dozens of other books
or articles on backpropagation :->

------------------------------------------------------------------------

-A7.) How many learning methods for NNs exist ? Which ?

There are very many learning methods for NNs by now; nobody can know
exactly how many.
New ones (or at least variations of existing ones) are invented every
week. Below is a collection of some of the best-known methods;
it does not claim to be complete.

The main categorization of these methods is the distinction between
supervised and unsupervised learning:

- In supervised learning, there is a "teacher" who in the learning
phase "tells" the net how well it performs ("reinforcement learning")
or what the correct behavior would have been ("fully supervised learning").

- In unsupervised learning the net is autonomous: it just looks at
the data it is presented with, finds out about some of the
properties of the data set and learns to reflect these properties
in its output. What exactly these properties are, that the network
can learn to recognise, depends on the particular network model and
learning method.

Many of these learning methods are closely connected with a certain
(class of) network topology.

Now here is the list, just giving some names:

1. UNSUPERVISED LEARNING (i.e. without a "teacher"):
1). Feedback Nets:
a). Additive Grossberg (AG)
b). Shunting Grossberg (SG)
c). Binary Adaptive Resonance Theory (ART1)
d). Analog Adaptive Resonance Theory (ART2, ART2a)
e). Discrete Hopfield (DH)
f). Continuous Hopfield (CH)
g). Discrete Bidirectional Associative Memory (BAM)
h). Temporal Associative Memory (TAM)
i). Adaptive Bidirectional Associative Memory (ABAM)
j). Kohonen Self-organizing Map (SOM)
k). Kohonen Topology-preserving Map (TPM)
2). Feedforward-only Nets:
a). Learning Matrix (LM)
b). Driver-Reinforcement Learning (DR)
c). Linear Associative Memory (LAM)
d). Optimal Linear Associative Memory (OLAM)
e). Sparse Distributed Associative Memory (SDM)
f). Fuzzy Associative Memory (FAM)
g). Counterpropagation (CPN)

2. SUPERVISED LEARNING (i.e. with a "teacher"):
1). Feedback Nets:
a). Brain-State-in-a-Box (BSB)
b). Fuzzy Cognitive Map (FCM)
c). Boltzmann Machine (BM)
d). Mean Field Annealing (MFT)
e). Recurrent Cascade Correlation (RCC)
f). Learning Vector Quantization (LVQ)
2). Feedforward-only Nets:
a). Perceptron
b). Adaline, Madaline
c). Backpropagation (BP)
d). Cauchy Machine (CM)
e). Adaptive Heuristic Critic (AHC)
f). Time Delay Neural Network (TDNN)
g). Associative Reward Penalty (ARP)
h). Avalanche Matched Filter (AMF)
i). Backpercolation (Perc)
j). Artmap
k). Adaptive Logic Network (ALN)
l). Cascade Correlation (CasCor)

------------------------------------------------------------------------

-A8.) What about Genetic Algorithms ?

[preliminary]
[Who will write a better introduction?]

There are a number of definitions of GA (Genetic Algorithm).
A possible one is

A GA is an optimization program
that starts with some encoded procedure, (Creation of Life :-> )
mutates it stochastically, (Get cancer or so :-> )
and uses a selection process (Darwinism)
to prefer the mutants with high fitness
and perhaps a recombination process (Make babies :-> )
to combine properties of (preferably) the successful mutants.
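The definition above (encoding, stochastic mutation, fitness-based selection,
recombination) can be illustrated with a tiny toy in Python. Everything here
is a made-up example, not part of the FAQ: individuals are bit strings and
the fitness is simply the number of 1-bits ("OneMax"):

```python
# Minimal genetic-algorithm sketch: mutation, selection, recombination.
# Illustrative toy only; all parameters are arbitrary choices.
import random

random.seed(1)
GENES, POP = 20, 30

def fitness(ind):                 # toy fitness: count of 1-bits
    return sum(ind)

def mutate(ind, rate=0.02):       # stochastic mutation: flip bits
    return [1 - g if random.random() < rate else g for g in ind]

def crossover(a, b):              # recombination of two parents
    cut = random.randrange(1, GENES)
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
for _ in range(100):              # 100 generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]      # selection: keep the fitter half
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP - len(parents))]
    pop = parents + children

best = max(pop, key=fitness)
print(fitness(best))              # approaches the maximum of 20
```

Keeping the fitter half unchanged each generation ("elitism" in GA jargon)
is one of many possible selection schemes; real GA work varies all of these
pieces.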

Some GA discussion tends to happen in comp.ai.neural-nets.
Another loosely relevant group is comp.theory.self-org-sys.
Perhaps it is time for a comp.ai.ga, comp.theory.ga, or maybe comp.ga.
There is a GA mailing list which you can subscribe to by
sending a request to [email protected]
You can also try anonymous ftp to
ftp.aic.nrl.navy.mil
in the /pub/galist directory. There are papers and some software.

For more details see (for example):

"Genetic Algorithms in Search, Optimization and Machine Learning"
by David Goldberg (Addison-Wesley 1989, 0-201-15767-5) or

"Handbook of Genetic Algorithms"
edited by Lawrence Davis (Van Nostrand Reinhold 1991 0-442-00173-8) or

"Classifier Systems and Genetic Algorithms"
L.B. Booker, D.E. Goldberg and J.H. Holland, Techreport No. 8 (April 87),
Cognitive Science and Machine Intelligence Laboratory, University of Michigan
also reprinted in :
Artificial Intelligence, Volume 40 (1989), pages 185-234

------------------------------------------------------------------------

-A10.) Good introductory literature about Neural Networks ?

0.) The best (subjectively, of course -- please don't flame me):

Hecht-Nielsen, R. (1990). Neurocomputing. Addison Wesley.
Comments: "A good book", "comprises a nice historical overview and a chapter
about NN hardware. Well structured prose. Makes important concepts clear."

Hertz, J., Krogh, A., and Palmer, R. (1991). Introduction to the Theory of
Neural Computation. Addison-Wesley: Redwood City, California.
Comments: "My first impression is that this one is by far the best book on
the topic. And it's below $30 for the paperback."; "Well written, theoretical
(but not overwhelming)"; "It provides a good balance of model development,
computational algorithms, and applications. The mathematical derivations
are especially well done"; "Nice mathematical analysis on the mechanism of
different learning algorithms"; "It is NOT for mathematical beginner.
If you don't have a good grasp of higher level math, this book can
be really tough to get through."


1.) Books for the beginner:

Aleksander, I. and Morton, H. (1990). An Introduction to Neural Computing.
Chapman and Hall. (ISBN 0-412-37780-2).
Comments: "This book seems to be intended for the first year of university
education."

Beale, R. and Jackson, T. (1990). Neural Computing, an Introduction.
Adam Hilger, IOP Publishing Ltd : Bristol. (ISBN 0-85274-262-2).
Comments: "It's clearly written. Lots of hints as to how to get the
adaptive models covered to work (not always well explained in the
original sources). Consistent mathematical terminology. Covers
perceptrons, error-backpropagation, Kohonen self-org model, Hopfield
type models, ART, and associative memories."

Dayhoff, J. E. (1990). Neural Network Architectures: An Introduction.
Van Nostrand Reinhold: New York.
Comments: "Like Wasserman's book, Dayhoff's book is also very easy to
understand".

McClelland, J. L. and Rumelhart, D. E. (1988).
Explorations in Parallel Distributed Processing: Computational Models of
Cognition and Perception (software manual). The MIT Press.
Comments: "Written in a tutorial style, and includes 2 diskettes of NN
simulation programs that can be compiled on MS-DOS or Unix (and they do
too !)"; "The programs are pretty reasonable as an introduction to some
of the things that NNs can do."; "There are *two* editions of this book.
One comes with disks for the IBM PC, the other comes with disks for the
Macintosh".

McCord Nelson, M. and Illingworth, W.T. (1990). A Practical Guide to Neural
Nets. Addison-Wesley Publishing Company, Inc. (ISBN 0-201-52376-0).
Comments: "No formulas at all (==> no good)"; "It does not have much
detailed model development (very few equations), but it does present many
areas of application. It includes a chapter on current areas of research.
A variety of commercial applications is discussed in chapter 1. It also
includes a program diskette with a fancy graphical interface (unlike the
PDP diskette)".

Orchard, G.A. & Phillips, W.A. (1991). Neural Computation: A
Beginner's Guide. Lawrence Erlbaum Associates: London.
Comments: "Short user-friendly introduction to the area, with a
non-technical flavour. Apparently accompanies a software package, but I
haven't seen that yet".

Wasserman, P. D. (1989). Neural Computing: Theory & Practice.
Van Nostrand Reinhold: New York. (ISBN 0-442-20743-3)
Comments: "Wasserman flatly enumerates some common architectures from an
engineer's perspective ('how it works') without ever addressing the underlying
fundamentals ('why it works') - important basic concepts such as clustering,
principal components or gradient descent are not treated. It's also full of
errors, and unhelpful diagrams drawn with what appears to be PCB board layout
software from the '70s. For anyone who wants to do active research in the
field I consider it quite inadequate"; "Okay, but too shallow"; "Quite
easy to understand";
"The best bedtime reading for Neural Networks. I have given
this book to numerous colleagues who want to know NN basics, but who never
plan to implement anything. An excellent book to give your manager."


2.) The classics:

Kohonen, T. (1984). Self-organization and Associative Memory. Springer-Verlag:
New York. (2nd Edition: 1988; 3rd edition: 1989).
Comments: "The section on Pattern mathematics is excellent."

Rumelhart, D. E. and McClelland, J. L. (1986). Parallel Distributed
Processing: Explorations in the Microstructure of Cognition (volumes 1 & 2).
The MIT Press.
Comments: "As a computer scientist I found the two Rumelhart and McClelland
books really heavy going and definitely not the sort of thing to read if you
are a beginner."; "It's quite readable, and affordable (about $65 for both
volumes)."; "THE Connectionist bible.".


3.) Introductory journal articles:

Hinton, G. E. (1989). Connectionist learning procedures.
Artificial Intelligence, Vol. 40, pp. 185--234.
Comments: "One of the better neural networks overview papers, although the
distinction between network topology and learning algorithm is not always
very clear. Could very well be used as an introduction to neural networks."

Knight, K. (1990). Connectionist Ideas and Algorithms. Communications of
the ACM. November 1990. Vol. 33, no. 11, pp 59-74.
Comments: "A good article, and for most people it is easy to find a copy of
this journal."

Kohonen, T. (1988). An Introduction to Neural Computing. Neural Networks,
vol. 1, no. 1. pp. 3-16.
Comments: "A general review".


4.) Not-quite-so-introductory literature:

Anderson, J. A. and Rosenfeld, E. (Eds). (1988). Neurocomputing:
Foundations of Research. The MIT Press: Cambridge, MA.
Comments: "An expensive book, but excellent for reference. It is a
collection of reprints of most of the major papers in the field.";

Anderson, J. A., Pellionisz, A. and Rosenfeld, E. (Eds). (1990).
Neurocomputing 2: Directions for Research. The MIT Press: Cambridge, MA.
Comments: "The sequel to their well-known Neurocomputing book."

Caudill, M. and Butler, C. (1990). Naturally Intelligent Systems.
MIT Press: Cambridge, Massachusetts. (ISBN 0-262-03156-6).
Comments: "I guess one of the best books I read"; "May not be suited for
people who want to do some research in the area".

Khanna, T. (1990). Foundations of Neural Networks. Addison-Wesley: New York.
Comments: "Not so bad (with a page of erroneous formulas (if I remember
well), and #hidden layers isn't well described)."; "Khanna's intention
in writing his book with math analysis should be commended but he
made several mistakes in the math part".

Levine, D. S. (1990). Introduction to Neural and Cognitive Modeling.
Lawrence Erlbaum: Hillsdale, N.J.
Comments: "Highly recommended".

Lippmann, R. P. (April 1987). An introduction to computing with neural nets.
IEEE Acoustics, Speech, and Signal Processing Magazine. vol. 4,
no. 2, pp 4-22.
Comments: "Much acclaimed as an overview of neural networks, but rather
inaccurate on several points. The categorization into binary and continuous-
valued input neural networks is rather arbitrary, and may be confusing for
the inexperienced reader. Not all networks discussed are of equal importance."

Maren, A., Harston, C. and Pap, R., (1990). Handbook of Neural Computing
Applications. Academic Press. ISBN: 0-12-471260-6. (451 pages)
Comments: "They cover a broad area"; "Introductory with suggested
applications implementation".

Pao, Y. H. (1989). Adaptive Pattern Recognition and Neural Networks
Addison-Wesley Publishing Company, Inc. (ISBN 0-201-12584-6)
Comments: "An excellent book that ties together classical approaches
to pattern recognition with Neural Nets. Most other NN books do not
even mention conventional approaches."

Rumelhart, D. E., Hinton, G. E. and Williams, R. J. (1986). Learning
representations by back-propagating errors. Nature, vol 323 (9 October),
pp. 533-536.
Comments: "Gives a very good potted explanation of backprop NN's. It gives
sufficient detail to write your own NN simulation."

Simpson, P. K. (1990). Artificial Neural Systems: Foundations, Paradigms,
Applications and Implementations. Pergamon Press: New York.
Comments: "Contains a very useful 37 page bibliography. A large number of
paradigms are presented. On the negative side the book is very shallow.
Best used as a complement to other books".

Zeidenberg, M. (1990). Neural Networks in Artificial Intelligence.
Ellis Horwood, Ltd., Chichester.
Comments: "Gives the AI point of view".

Zornetzer, S. F., Davis, J. L. and Lau, C. (1990). An Introduction to
Neural and Electronic Networks. Academic Press. (ISBN 0-12-781881-2)
Comments: "Covers quite a broad range of topics (collection of
articles/papers )."; "Provides a primer-like introduction and overview for
a broad audience, and employs a strong interdisciplinary emphasis".

------------------------------------------------------------------------

-A11.) Any journals and magazines about Neural Networks ?


[to be added: comments on speed of reviewing and publishing,
whether they accept TeX format or ASCII by e-mail, etc.]

A. Dedicated Neural Network Journals:
=====================================

~Title: Neural Networks
Publish: Pergamon Press
Address: Pergamon Journals Inc., Fairview Park, Elmsford,
New York 10523, USA and Pergamon Journals Ltd.
Headington Hill Hall, Oxford OX3, 0BW, England
Freq.: 6 issues/year (vol. 1 in 1988)
Cost/Yr: Free with INNS membership ($45?), Individual $65, Institution $175
ISSN #: 0893-6080
Remark: Official Journal of International Neural Network Society (INNS).
Contains Original Contributions, Invited Review Articles, Letters
to Editor, Invited Book Reviews, Editorials, Announcements and INNS
News, Software Surveys. This is probably the most popular NN journal.
(Note: Remarks supplied by Mike Plonski "[email protected]")
-------
~Title: Neural Computation
Publish: MIT Press
Address: MIT Press Journals, 55 Hayward Street Cambridge,
MA 02142-9949, USA, Phone: (617) 253-2889
Freq.: Quarterly (vol. 1 in 1989)
Cost/Yr: Individual $45, Institution $90, Students $35; Add $9 Outside USA
ISSN #: 0899-7667
Remark: Combination of Reviews (10,000 words), Views (4,000 words)
and Letters (2,000 words). I have found this journal to be of
outstanding quality.
(Note: Remarks supplied by Mike Plonski "[email protected]")
-----
~Title: IEEE Transaction on Neural Networks
Publish: Institute of Electrical and Electronics Engineers (IEEE)
Address: IEEE Service Center, 445 Hoes Lane, P.O. Box 1331, Piscataway, NJ,
08855-1331 USA. Tel: (201) 981-0060
Cost/Yr: $10 for Members belonging to participating IEEE societies
Freq.: Quarterly (vol. 1 in March 1990)
Remark: Devoted to the science and technology of neural networks
which disclose significant technical knowledge, exploratory
developments and applications of neural networks from biology to
software to hardware. Emphasis is on artificial neural networks.
Specific aspects include self organizing systems, neurobiological
connections, network dynamics and architecture, speech recognition,
electronic and photonic implementation, robotics and controls.
Includes Letters concerning new research results.
(Note: Remarks are from journal announcement)
-----
~Title: Journal of Neural Network Computing,
Technology, Design, and Applications
Publish: Auerback Publishers
Address: Auerback Publishers, 210 South Street, Boston, MA 02111-9812
Tel: (800) 950-1216
Freq.: Quarterly (vol. 1 in Summer 1989)
Cost/Yr: $145 in USA
Remark: 3 to 5 in-depth articles per issue; Bookshelf Section which
provides a several page introduction to a specific topic
(e.g. feedforward networks) and a list of references for further
reading on that topic; Software Reviews. Good quality, but a little
expensive for personal subscriptions. I got my corporate library
to buy it so I wouldn't have to.
(Note: Remarks supplied by Mike Plonski "[email protected]")
-----
~Title: International Journal of Neural Systems
Publish: World Scientific Publishing
Address: USA: World Scientific Publishing Co., 687 Hartwell Street, Teaneck,
NJ 07666. Tel: (201) 837-8858; Europe: World Scientific Publishing
Co. Pte. Ltd., 73 Lynton Mead, Totteridge, London N20-8DH, England.
Tel: (01) 4462461; Other: World Scientific Publishing Co. Pte. Ltd.,
Farrer Road, P.O. Box 128, Singapore 9128. Tel: 2786188
Freq.: Quarterly (Vol. 1 in 1990?)
Cost/Yr: Individual $42, Institution $88 (plus $9-$17 for postage)
ISSN #: 0129-0657 (IJNS)
Remark: The International Journal of Neural Systems is a quarterly journal
which covers information processing in natural and artificial neural
systems. It publishes original contributions on all aspects of this
broad subject which involves physics, biology, psychology, computer
science and engineering. Contributions include research papers,
reviews and short communications. The journal presents a fresh
undogmatic attitude towards this multidisciplinary field with the
aim to be a forum for novel ideas and improved understanding of
collective and cooperative phenomena with computational capabilities.
(Note: Remarks supplied by B. Lautrup (editor),
"LAUTRUP%[email protected]" )
Review is reported to be very slow.
------
~Title: Neural Network News
Publish: AIWeek Inc.
Address: Neural Network News, 2555 Cumberland Parkway, Suite 299, Atlanta, GA
30339 USA. Tel: (404) 434-2187
Freq.: Monthly (beginning September 1989)
Cost/Yr: USA and Canada $249, Elsewhere $299
Remark: Commercial Newsletter
------
~Title: Network: Computation in Neural Systems
Publish: IOP Publishing Ltd
Address: Europe: IOP Publishing Ltd, Techno House, Redcliffe Way, Bristol
BS1 6NX, UK; IN USA: American Institute of Physics, Subscriber
Services 500 Sunnyside Blvd., Woodbury, NY 11797-2999
Freq.: Quarterly (1st issue 1990)
Cost/Yr: USA: $180, Europe: 110 pounds
Remark: Description: "a forum for integrating theoretical and experimental
findings across relevant interdisciplinary boundaries." Contents:
Submitted articles reviewed by two technical referees for the paper's
interdisciplinary format and accessibility. Also Viewpoints and
Reviews commissioned by the editors, abstracts (with reviews) of
articles published in other journals, and book reviews.
Comment: While the price discourages me (my comments are based upon
a free sample copy), I think that the journal succeeds very well. The
highest density of interesting articles I have found in any journal.
(Note: Remarks supplied by brandt kehoe "[email protected]")
------
~Title: Connection Science: Journal of Neural Computing,
Artificial Intelligence and Cognitive Research
Publish: Carfax Publishing
Address: Europe: Carfax Publishing Company, P. O. Box 25, Abingdon,
Oxfordshire OX14 3UE, UK. USA: Carfax Publishing Company,
85 Ash Street, Hopkinton, MA 01748
Freq.: Quarterly (vol. 1 in 1989)
Cost/Yr: Individual $82, Institution $184, Institution (U.K.) 74 pounds
-----
~Title: International Journal of Neural Networks
Publish: Learned Information
Freq.: Quarterly (vol. 1 in 1989)
Cost/Yr: 90 pounds
ISSN #: 0954-9889
Remark: The journal contains articles, a conference report (at least the
issue I have), news and a calendar.
(Note: remark provided by J.R.M. Smits "[email protected]")
-----
~Title: Concepts in NeuroScience
Publish: World Scientific Publishing
Address: Same Address (?) as for International Journal of Neural Systems
Freq.: Twice per year (vol. 1 in 1989)
Remark: Mainly Review Articles(?)
(Note: remarks by Osamu Saito "[email protected]")
-----
~Title: International Journal of Neurocomputing
Publish: ecn Neurocomputing GmbH
Freq.: Quarterly (vol. 1 in 1989)
Remark: Commercial journal, not an academic periodical
(Note: remarks by Osamu Saito "[email protected]")
Review has been reported to be fast (less than 3 months)
-----
~Title: Neurocomputers
Publish: Gallifrey Publishing
Address: Gallifrey Publishing, PO Box 155, Vicksburg, Michigan, 49097, USA
Tel: (616) 649-3772
Freq. Monthly (1st issue 1987?)
ISSN #: 0893-1585
Editor: Derek F. Stubbs
Cost/Yr: $32 (USA, Canada), $48 (elsewhere)
Remark: I have only one issue, so I cannot give much detail about
the contents. It is very small (12 pages) but packed with short
items of information about e.g. conferences, books,
(new) ideas etc. I don't think it is very expensive, but I'm not sure.
(Note: remark provided by J.R.M. Smits "[email protected]")
------
~Title: JNNS Newsletter (Newsletter of the Japan Neural Network Society)
Publish: The Japan Neural Network Society
Freq.: Quarterly (vol. 1 in 1989)
Remark: (IN JAPANESE LANGUAGE) Official Newsletter of the Japan Neural
Network Society(JNNS)
(Note: remarks by Osamu Saito "[email protected]")
-------
~Title: Neural Networks Today
Remark: I found this title on a bulletin board in October of last year,
in a message from Tim Pattison, [email protected]
(Note: remark provided by J.R.M. Smits "[email protected]")
-----
~Title: Computer Simulations in Brain Science
-----
~Title: International Journal of Neuroscience
-----
~Title: Neural Network Computation
Remark: Possibly the same as "Neural Computation"



B. NN Related Journals
======================

~Title: Complex Systems
Publish: Complex Systems Publications
Address: Complex Systems Publications, Inc., P.O. Box 6149, Champaign,
IL 61821-8149, USA
Freq.: 6 times per year (1st volume is 1987)
ISSN #: 0891-2513
Cost/Yr: Individual $75, Institution $225
Remark: The journal COMPLEX SYSTEMS is devoted to the rapid publication of
research on the science, mathematics, and engineering of systems
with simple components but complex overall behavior. Send mail to
"[email protected]" for additional info.
(Remark is from announcement on Net)
-----
~Title: Biological Cybernetics (Kybernetik)
Publish: Springer Verlag
Remark: Monthly (vol. 1 in 1961)
-----
~Title: Various IEEE Transactions and Magazines
Publish: IEEE
Remark: Primarily see IEEE Trans. on System, Man and Cybernetics; Various
Special Issues: April 1990 IEEE Control Systems Magazine.; May 1989
IEEE Trans. Circuits and Systems.; July 1988 IEEE Trans. Acoust.
Speech Signal Process.
-----
~Title: The Journal of Experimental and Theoretical Artificial Intelligence
Publish: Taylor & Francis, Ltd.
Address: London, New York, Philadelphia
Freq.: ? (1st issue Jan 1989)
Remark: For submission information, please contact either of the editors:
Eric Dietrich Chris Fields
PACSS - Department of Philosophy Box 30001/3CRL
SUNY Binghamton New Mexico State University
Binghamton, NY 13901 Las Cruces, NM 88003-0001
[email protected]on.edu [email protected]
-----
~Title: The Behavioral and Brain Sciences
Publish: Cambridge University Press
Remark: (Expensive as hell, I'm sure.)
This is a delightful journal that encourages discussion on a
variety of controversial topics. I have especially enjoyed reading
some papers in there by Dana Ballard and Stephen Grossberg (separate
papers, not collaborations) a few years back. They have a really neat
concept: they get a paper, then invite a number of noted scientists
in the field to praise it or trash it. They print these commentaries,
and give the author(s) a chance to make a rebuttal or concurrence.
Sometimes, as I'm sure you can imagine, things get pretty lively. I'm
reasonably sure they are still at it--I think I saw them make a call
for reviewers a few months ago. Their reviewers are called something
like Behavioral and Brain Associates, and I believe they have to be
nominated by current associates, and should be fairly well established
in the field. That's probably more than I really know about it but
maybe if you post it someone who knows more about it will correct any
errors I have made. The main thing is that I liked the articles I
read. (Note: remarks by Don Wunsch )
-----
~Title: International Journal of Applied Intelligence
Publish: Kluwer Academic Publishers
Remark: first issue in 1990(?)
-----
~Title: Bulletin of Mathematical Biology
-----
~Title: Intelligence
-----
~Title: Journal of Mathematical Biology
-----
~Title: Journal of Complex Systems
-----
~Title: AI Expert
Publish: Miller Freeman Publishing Co., for subscription call ++415-267-7672.
Remark: Regularly includes ANN related articles, product
announcements, and application reports.
Listings of ANN programs are available on AI Expert affiliated BBS's
-----
~Title: International Journal of Modern Physics C
Publish: World Scientific Publ. Co.
Farrer Rd. P.O.Box 128, Singapore 9128
or: 687 Hartwell St., Teaneck, N.J. 07666 U.S.A
or: 73 Lynton Mead, Totteridge, London N20 8DH, England
Freq: published quarterly
Eds: G. Fox, H. Herrmann and K. Kaneko
-----
~Title: Machine Learning
Publish: Kluwer Academic Publishers
Address: Kluwer Academic Publishers
P.O. Box 358
Accord Station
Hingham, MA 02018-0358 USA
Freq.: Monthly (8 issues per year; increasing to 12 in 1993)
Cost/Yr: Individual $140 (1992); Member of AAAI or CSCSI $88
Remark: Description: Machine Learning is an international forum for
research on computational approaches to learning. The journal
publishes articles reporting substantive research results on a
wide range of learning methods applied to a variety of task
domains. The ideal paper will make a theoretical contribution
supported by a computer implementation.
The journal has published many key papers in learning theory,
reinforcement learning, and decision tree methods. Recently
it has published a special issue on connectionist approaches
to symbolic reasoning. The journal regularly publishes
issues devoted to genetic algorithms as well.



C. Journals loosely related to NNs
==================================

JOURNAL OF COMPLEXITY
(Must rank alongside Wolfram's Complex Systems)

IEEE ASSP Magazine
(April 1987 had the Lippmann intro. which everyone likes to cite)

ARTIFICIAL INTELLIGENCE
(Vol 40, September 1989 had the survey paper by Hinton)

COGNITIVE SCIENCE
(the Boltzmann machine paper by Ackley et al appeared here in Vol 9, 1985)

COGNITION
(Vol 28, March 1988 contained the Fodor and Pylyshyn critique of connectionism)

COGNITIVE PSYCHOLOGY
(no comment!)

JOURNAL OF MATHEMATICAL PSYCHOLOGY
(several good book reviews)

------------------------------------------------------------------------

-A12.) The most important conferences concerned with Neural Networks ?

[preliminary]
[to be added: has taken place how often yet; most emphasized topics;
where to get proceedings ]

A. Dedicated Neural Network Conferences:
1. Neural Information Processing Systems (NIPS)
Annually in Denver, Colorado; late November or early December
2. International Joint Conference on Neural Networks (IJCNN)
co-sponsored by INNS and IEEE
3. Annual Conference on Neural Networks (ACNN)
4. International Conference on Artificial Neural Networks (ICANN)
Annually in Europe(?), 1992 in Brighton
Major conference of European Neur. Netw. Soc. (ENNS)

B. Other Conferences
1. International Joint Conference on Artificial Intelligence (IJCAI)
2. Intern. Conf. on Acoustics, Speech and Signal Processing (ICASSP)
3. Annual Conference of the Cognitive Science Society
4. [Vision Conferences?]

C. Pointers to Conferences
1. The journal "Neural Networks" has a long list of conferences,
workshops and meetings in each issue.
This is quite interdisciplinary.
2. There is a regular posting on comp.ai.neural-nets from Paultje Bakker:
"Upcoming Neural Network Conferences", which lists names, dates,
locations, contacts, and deadlines.

------------------------------------------------------------------------

-A13.) Neural Network Associations ?

[Is this data still correct ? Who will send me some update ?]

1. International Neural Network Society (INNS).
INNS membership includes subscription to "Neural Networks",
the official journal of the society.
Membership is $55 for non-students and $45 for students per year.
Address: INNS Membership, P.O. Box 491166, Ft. Washington, MD 20749.

2. International Student Society for Neural Networks (ISSNNets).
Membership is $5 per year.
Address: ISSNNet, Inc., P.O. Box 15661, Boston, MA 02215 USA

3. Women In Neural Network Research and Technology (WINNERS).
Address: WINNERS, c/o Judith Dayhoff, 11141 Georgia Ave., Suite 206,
Wheaton, MD 20902. Telephone: 301-933-9000.

4. European Neural Network Society (ENNS)

5. Japanese Neural Network Society (JNNS)
Address: Japanese Neural Network Society
Department of Engineering, Tamagawa University,
6-1-1, Tamagawa Gakuen, Machida City, Tokyo,
194 JAPAN
Phone: +81 427 28 3457,Fax: +81 427 28 3597

6. Association des Connexionnistes en THese (ACTH)
(the French Student Association for Neural Networks)
Membership is 100 FF per year
Activities : newsletter, conference (every year), list of members...
Address : ACTH - Le Castelnau R2
23 avenue de la Galline
34170 Castelnau-le-Lez
FRANCE
Contact : [email protected]

7. Neurosciences et Sciences de l'Ingenieur (NSI)
Biology & Computer Science
Activity : conference (every year)
Address : NSI - TIRF / INPG
46 avenue Felix Viallet
38031 Grenoble Cedex
FRANCE


------------------------------------------------------------------------

-A14.) Other sources of information about NNs ?

1. Neuron Digest
Internet Mailing List. From the welcome blurb:
"Neuron-Digest is a list (in digest form) dealing with all aspects
of neural networks (and any type of network or neuromorphic system)"
Moderated by Peter Marvit.
To subscribe, send email to [email protected]
comp.ai.neural-net readers also find the messages in that newsgroup
in the form of digests.

2. Usenet groups comp.ai.neural-nets (Oha ! :-> )
and comp.theory.self-org-sys
There is a periodic posting on comp.ai.neural-nets sent by
[email protected] (Gregory Aharonian) about Neural Network
patents.

3. Central Neural System Electronic Bulletin Board
Modem: 509-627-6CNS; Sysop: Wesley R. Elsberry;
P.O. Box 1187, Richland, WA 99352; [email protected]
Available through FidoNet, RBBS-Net, and other EchoMail compatible
bulletin board systems as NEURAL_NET echo.

4. Neural ftp archive site funic.funet.fi
Administers a large collection of neural network papers and
software at the Finnish University Network file archive site
funic.funet.fi in directory /pub/sci/neural

Contains all the public domain software and papers that they
have been able to find.
ALL of these files have been transferred from FTP sites in the U.S.
and are mirrored about every 3 months at the fastest.
Contact: [email protected]
[email protected](my home university address)

5. USENET newsgroup comp.org.issnnet
Forum for discussion of academic/student-related issues in NNs, as
well as information on ISSNNet (see A13) and its activities.

------------------------------------------------------------------------

-A15.) Freely available software packages for NN simulation ?


[This is a bit chaotic and needs reorganization.
A bit more information about what the various programs can do,
on which platform they run, and how big they are would also be nice.
And some important packages are still missing (?)
Who volunteers for that ?]

1. Rochester Connectionist Simulator
A quite versatile simulator program for arbitrary types of
neural nets. Comes with a backprop package and an X11/SunView
interface.
anonymous FTP from cs.rochester.edu (192.5.53.209)
directory : pub/simulator
files: README (8 KB)
(documentation:) rcs_v4.2.justdoc.tar.Z (1.6 MB)
(source code:) rcs_v4.2.justsrc.tar.Z (1.4 MB)

2. UCLA-SFINX
ftp 131.179.16.6 (retina.cs.ucla.edu)
Name: sfinxftp
Password: joshua
directory: pub/
files : README
sfinx_v2.0.tar.Z
Email info request : [email protected]

3. NeurDS
request from mcclanahan%[email protected]
A simulator for DEC systems supporting VT100 terminals.
OR
anonymous ftp gatekeeper.dec.com [16.1.0.2]
directory: pub/DEC
file: NeurDS031.tar.Z ( please check may be NeurDSO31.tar.Z )

4. PlaNet5.7 (also known as SunNet)
ftp 133.15.240.3 (tutserver.tut.ac.jp)
pub/misc/PlaNet5.7.tar.Z
or
ftp 128.138.240.1 (boulder.colorado.edu)
pub/generic-sources/PlaNet5.7.tar.Z (also the old PlaNet5.6.tar.Z)
A popular connectionist simulator with versions to
run under X Windows, and non-graphics terminals
created by Yoshiro Miyata (Chukyo Univ., Japan).
60-page User's Guide in Postscript.
Send any questions to [email protected]

5. GENESIS
anonymous ftp 131.215.135.64 ( genesis.cns.caltech.edu )
Register first via telnet genesis.cns.caltech.edu
login as: genesis

6. Mactivation
anonymous ftp from bruno.cs.colorado.edu [128.138.243.151]
directory: /pub/cs/misc
file: Mactivation-3.3.sea.hqx

7. CMU Connectionist Archive
There is a lisp backprop simulator in the
connectionist archive.
unix> ftp b.gp.cs.cmu.edu (or 128.2.242.8)
Name: ftpguest
Password: cmunix
ftp> cd connectionists/archives
ftp> get backprop.lisp
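
For readers who have never seen the algorithm such simulators implement,
here is a minimal sketch of plain back-propagation on a one-hidden-layer
net. This is illustrative only, not the archive's code; the network size,
learning rate, and the XOR task are arbitrary choices of mine.

```python
# Minimal back-propagation sketch: one hidden layer, sigmoid units,
# trained on XOR by online gradient descent. Not the archive's code.
import math, random

random.seed(0)
N_IN, N_HID = 2, 2
# w1[i][j]: input i -> hidden j; last row is the bias (input fixed at 1)
w1 = [[random.uniform(-1, 1) for _ in range(N_HID)] for _ in range(N_IN + 1)]
w2 = [random.uniform(-1, 1) for _ in range(N_HID + 1)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x):
    xb = x + [1.0]                                   # append bias input
    h = [sigmoid(sum(xb[i] * w1[i][j] for i in range(N_IN + 1)))
         for j in range(N_HID)]
    y = sigmoid(sum((h + [1.0])[j] * w2[j] for j in range(N_HID + 1)))
    return h, y

data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]

def epoch(eta=0.5):
    sse = 0.0                                        # sum of squared errors
    for x, t in data:
        h, y = forward(x)
        sse += (t - y) ** 2
        dy = (t - y) * y * (1.0 - y)                 # output delta
        # hidden deltas (computed before w2 changes), then update weights
        dh = [dy * w2[j] * h[j] * (1.0 - h[j]) for j in range(N_HID)]
        hb = h + [1.0]
        for j in range(N_HID + 1):
            w2[j] += eta * dy * hb[j]
        xb = x + [1.0]
        for i in range(N_IN + 1):
            for j in range(N_HID):
                w1[i][j] += eta * dh[j] * xb[i]
    return sse

first = epoch()
for _ in range(5000):
    last = epoch()
```

After training, the error `last` is far below the first epoch's error.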

8. Cascade Correlation Simulator
LISP and C versions of a simulator based on Scott Fahlman's
Cascade Correlation algorithm are available. Fahlman wrote the
LISP version; the C version was created by Scott Crowder.
Anonymous ftp from pt.cs.cmu.edu (or 128.2.254.155)
directory /afs/cs/project/connect/code
files cascor1.lisp (56 KB)
cascor1.c (108 KB)

9. Quickprop
A variation of the back-propagation algorithm developed by
Scott Fahlman. A LISP and C version can be obtained in the
same directory as the cascade correlation simulator above. (25 KB)
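
The core of quickprop is a one-dimensional secant step applied to each
weight independently, treating the error surface as a parabola through
the current and previous gradient. The following sketch is my own
illustration (function name, parameters, and the toy quadratic error are
mine, not Fahlman's code):

```python
# Illustrative sketch of the quickprop weight update:
#     dw(t) = S(t) / (S(t-1) - S(t)) * dw(t-1)
# where S is the error derivative for that weight.

def quickprop_step(grad, prev_grad, prev_step, eta=0.1, mu=1.75):
    """One quickprop step for a single weight.
    grad, prev_grad: error derivative now and at the previous step
    prev_step: the previous weight change
    mu: 'maximum growth factor' limiting step size"""
    if prev_step == 0.0:
        return -eta * grad            # bootstrap with plain gradient descent
    denom = prev_grad - grad
    if denom == 0.0:
        step = mu * prev_step
    else:
        step = grad / denom * prev_step
    # clip steps that grow more than mu times the previous one
    if abs(step) > mu * abs(prev_step):
        step = mu * prev_step if step * prev_step > 0 else -mu * prev_step
    return step

# Minimize the toy error E(w) = (w - 3)^2, with dE/dw = 2(w - 3)
w, prev_grad, prev_step = 0.0, 0.0, 0.0
for _ in range(20):
    grad = 2.0 * (w - 3.0)
    step = quickprop_step(grad, prev_grad, prev_step)
    w += step
    prev_grad, prev_step = grad, step
```

On a quadratic error the secant step is exact, so `w` converges to the
minimum at 3 in a handful of iterations.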

10. DartNet
DartNet is a Macintosh-based Neural Network Simulator. It makes
full use of the Mac's graphical interface, and provides a
number of powerful tools for building, editing, training,
testing and examining networks.
This program is available by anonymous ftp from
dartvax.dartmouth.edu [129.170.16.4] as
/pub/mac/dartnet.sit.hqx (124 KB)
Copies may also be obtained through email from [email protected]
Along with a number of interface improvements and feature
additions, v2.0 is an extensible simulator. That is,
new network architectures and learning algorithms can be
added to the system by writing small XCMD-like CODE
resources called nDEF's ("Network Definitions"). A number
of such architectures are included with v2.0, as well as
header files for creating new nDEF's.
Contact: [email protected] (Sean P. Nolan)

11. SNNS
"Stuttgarter Neuronale Netze Simulator" from the University
of Stuttgart, Germany.
A luxurious simulator for many types of nets; with X11 interface:
Graphical topology editor, training visualisation, etc.
ftp: ifi.informatik.uni-stuttgart.de [129.69.211.1]
directory /pub/SNNS
file SNNSv2.1.tar.Z OR SNNSv2.1.tar.Za[a-d] ( 826271 Bytes)
manual SNNSv2.1.Manual.ps.Z (1041375 Bytes)
SNNSv2.1.Readme( 7645 Bytes)

12. Aspirin/MIGRAINES
Aspirin/MIGRAINES 6.0 consists of a code generator that builds neural
network simulations by reading a network description (written in a
language called "Aspirin") and generating a C simulation. An interface
(called "MIGRAINES") is provided to export data from the neural
network to visualization tools.
The system has been ported to a large number of platforms.
The goal of Aspirin is to provide a common extendible front-end language
and parser for different network paradigms.
The MIGRAINES interface is a terminal based interface
that allows you to open Unix pipes to data in the neural
network. This replaces the NeWS1.1 graphical interface
in version 4.0 of the Aspirin/MIGRAINES software. The
new interface is not as simple to use as the version 4.0
interface but is much more portable and flexible.
The MIGRAINES interface allows users to output
neural network weight and node vectors to disk or to
other Unix processes. Users can display the data using
either public or commercial graphics/analysis tools.
Example filters are included that convert data exported through
MIGRAINES to formats readable by Gnuplot 3.0, Matlab, Mathematica,
and xgobi.
The software is available from two FTP sites:
CMU's simulator collection on "pt.cs.cmu.edu" (128.2.254.155)
in /afs/cs/project/connect/code/am6.tar.Z
and UCLA's cognitive science machine "ftp.cognet.ucla.edu" (128.97.50.19)
in alexis/am6.tar.Z
The compressed tar file is a little less than 2 megabytes.

13. Adaptive Logic Network kit
Available from menaik.cs.ualberta.ca. This package differs from
traditional nets in that it uses logic functions rather than
floating-point arithmetic; for many tasks, ALNs can show many orders
of magnitude gain in training and performance speed.
Anonymous ftp from menaik.cs.ualberta.ca [129.128.4.241]
unix source code and examples: /pub/atree2.tar.Z (145 KB)
Postscript documentation: /pub/atree2.ps.Z ( 76 KB)
MS-DOS Windows 3.0 version: /pub/atree2.zip (353 KB)
/pub/atree2zip.readme (1 KB)
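
As a rough illustration of what "logic functions rather than floating
point" means here: an ALN evaluates a tree of AND/OR gates over
(possibly complemented) boolean inputs. The toy evaluator below is
hypothetical and not the atree2 code; it only evaluates a fixed tree,
whereas a real ALN adapts the node functions during training.

```python
# Toy evaluator for a fixed AND/OR tree over boolean inputs
# (a much-simplified illustration of an adaptive logic network).

def make_leaf(index, negate=False):
    # reads input bit x[index], or its complement if negate is True
    return lambda x: x[index] != negate

def make_node(op, left, right):
    if op == "AND":
        return lambda x: left(x) and right(x)
    return lambda x: left(x) or right(x)  # "OR"

# A fixed tree computing XOR(x0, x1) = (x0 AND NOT x1) OR (NOT x0 AND x1)
tree = make_node("OR",
                 make_node("AND", make_leaf(0), make_leaf(1, negate=True)),
                 make_node("AND", make_leaf(0, negate=True), make_leaf(1)))

results = [tree([a, b]) for a in (False, True) for b in (False, True)]
```

Because every node is a boolean gate, evaluation needs no multiplies,
which is where the speed claim for ALNs comes from.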

14. NeuralShell
Available from FTP site quanta.eng.ohio-state.edu
(128.146.35.1) in directory "pub/NeuralShell", filename
"NeuralShell.tar".

15. PDP
The PDP simulator package is available via anonymous FTP at
nic.funet.fi (128.214.6.100) in /pub/sci/neural/sims/pdp.tar.Z (0.2 MB)
The simulator is also available with the book
"Explorations in Parallel Distributed Processing: A Handbook of
Models, Programs, and Exercises" by McClelland and Rumelhart.
MIT Press, 1988.
Comment: "This book is often referred to as PDP vol III which is a very
misleading practice! The book comes with software on an IBM disk but
includes a makefile for compiling on UNIX systems. The version of
PDP available at nic.funet.fi seems identical to the one with the book
except for a bug in bp.c which occurs when you try to run a script of
PDP commands using the DO command. This can be found and fixed easily."

16. Xerion
Xerion is available via anonymous ftp from
ftp.cs.toronto.edu in the directory /pub/xerion.
xerion-3.0.PS.Z (0.9 MB) and xerion-3.0.tar.Z (1.1 MB) plus
several concrete simulators built with xerion (about 0.3 MB each,
see below).
Xerion runs on SGI and Sun machines and uses X Windows for graphics.
The software contains modules that implement Back Propagation,
Recurrent Back Propagation, Boltzmann Machine, Mean Field Theory,
Free Energy Manipulation, Hard and Soft Competitive Learning, and
Kohonen Networks. Sample networks built for each of the modules are
also included.
Contact: [email protected]

17. Neocognitron simulator
An implementation is available for anonymous ftp at
[128.194.15.32] tamsun.tamu.edu as /pub/neocognitron.Z.tar
The simulator is written in C and comes with a list of references
which are necessary to read to understand the specifics of the
implementation. The unsupervised version is coded without (!)
C-cell inhibition.

18. Multi-Module Neural Computing Environment (MUME)

MUME is a simulation environment for multi-module neural computing. It
provides an object oriented facility for the simulation and training
of multiple nets with various architectures and learning algorithms.
MUME includes a library of network architectures including feedforward,
simple recurrent, and continuously running recurrent neural networks.
Each architecture is supported by a variety of learning algorithms.
MUME can be used for large scale neural network simulations as it provides
support for learning in multi-net environments. It also provides pre- and
post-processing facilities.
The modules are provided in a library. Several "front-ends" or clients are
also available.
MUME can be used to include non-neural computing modules (decision
trees, ...) in applications.
The software is the product of a number of staff and postgraduate students
at the Machine Intelligence Group at Sydney University Electrical
Engineering.
The software is written in 'C' and is being used on Sun and DEC
workstations. Efforts are underway to port it to the Fujitsu VP2200
vector processor using the VCC vectorising C compiler.
MUME is made available to research institutions on media/doc/postage cost
arrangements. Information on how to acquire it may be obtained by writing
(or email) to:
Marwan Jabri
SEDAL
Sydney University Electrical Engineering
NSW 2006 Australia
[email protected]

19. LVQ_PAK, SOM_PAK
These are packages for Learning Vector Quantization and
Self-Organizing Maps, respectively.
They have been built by the LVQ/SOM Programming Team of the
Helsinki University of Technology, Laboratory of Computer and
Information Science, Rakentajanaukio 2 C, SF-02150 Espoo, FINLAND.
There are versions for Unix and MS-DOS available from
cochlea.hut.fi (130.233.168.48) in
/pub/lvq_pak/lvq_pak-2.1.tar.Z (340 kB, Unix)
/pub/lvq_pak/lvq_p2r1.exe (310 kB, MS-DOS self-extract archive)
/pub/som_pak/som_pak-1.1.tar.Z (246 kB, Unix)
/pub/som_pak/som_p1r1.exe (215 kB, MS-DOS self-extract archive)
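
For orientation, the LVQ1 rule at the heart of such packages is simple:
move the codebook vector nearest a training sample toward the sample if
their classes agree, away from it otherwise. A minimal sketch follows
(illustrative only; the names and data are made up, this is not LVQ_PAK
code):

```python
# Minimal sketch of one LVQ1 training step.

def lvq1_step(codebook, labels, x, x_label, alpha=0.1):
    # squared Euclidean distance to each codebook vector
    dists = [sum((ci - xi) ** 2 for ci, xi in zip(c, x)) for c in codebook]
    k = dists.index(min(dists))                 # index of the winner
    sign = 1.0 if labels[k] == x_label else -1.0
    # move winner toward (or away from) the sample
    codebook[k] = [ci + sign * alpha * (xi - ci)
                   for ci, xi in zip(codebook[k], x)]
    return k

# Two 1-D codebook vectors for classes 0 and 1 (made-up data)
codebook = [[0.0], [1.0]]
labels = [0, 1]
for _ in range(50):
    lvq1_step(codebook, labels, [0.2], 0)   # class-0 samples near 0.2
    lvq1_step(codebook, labels, [0.8], 1)   # class-1 samples near 0.8
```

With this data the codebook vectors settle near 0.2 and 0.8, the class
centers; SOM adds a neighborhood function so that nearby map units are
updated together.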

For some of these simulators there are user mailing lists. Get the
packages and look into their documentation for further info.

If you are using a small computer (PC, Mac, etc.) you may want to have
a look at the Central Neural System Electronic Bulletin Board
(see Answer 14)
Modem: 509-627-6CNS; Sysop: Wesley R. Elsberry;
P.O. Box 1187, Richland, WA 99352; [email protected]
There are lots of small simulator packages in the CNS ANNSIM file set.
There is an ftp mirror site for the CNS ANNSIM file set at
me.uta.edu (129.107.2.20) in the /pub/neural directory. Most ANN
offerings are in /pub/neural/annsim.

------------------------------------------------------------------------

-A16.) Commercial software packages for NN simulation ?

[preliminary]
[who will write some short comment on each of the most
important packages ?]

The Number 1 of each volume of the journal "Neural Networks" has a list
of some dozens of commercial suppliers of Neural Network things:
Software, Hardware, Support, Programming, Design and Service.

Here is a naked list of names of Simulators running on PC (and, partly,
some other platforms, too):

1. NeuralWorks Professional 2+
2. AIM
3. BrainMaker Professional
4. Brain Cel
5. Neural Desk
6. Neural Case
7. Neuro Windows
8. Explorenet 3000

------------------------------------------------------------------------

-A17.) Neural Network hardware ?

[preliminary]
[who will write some short comment on the most important
HW-packages and chips ?]

The Number 1 of each volume of the journal "Neural Networks" has a list
of some dozens of suppliers of Neural Network support:
Software, Hardware, Support, Programming, Design and Service.

Here is a list of companies contributed by [email protected]:

1. HNC, INC.
5501 Oberlin Drive
San Diego
California 92121
(619) 546-8877

and a second address at

7799 Leesburg Pike, Suite 900
Falls Church, Virginia
22043
(703) 847-6808

Note: Australian Dist.: Unitronics
Tel : (09) 4701443
Contact: Martin Keye

HNC markets:
'Image Document Entry Processing Terminal' - it recognises
handwritten documents and converts the info to ASCII.

'ExploreNet 3000' - a NN demonstrator

'Anza/DP Plus'- a Neural Net board with 25MFlop or 12.5M peak
interconnects per second.

2. SAIC (Science Applications International Corporation)
10260 Campus Point Drive
MS 71, San Diego
CA 92121
(619) 546 6148
Fax: (619) 546 6736

3. Micro Devices
30 Skyline Drive
Lake Mary
FL 32746-6201
(407) 333-4379

MicroDevices makes MD1220 - 'Neural Bit Slice'

Each of the products mentioned so far has a very different usage.
Although this sounds similar to Intel's product, the
architectures are not.

4. Intel Corp
2250 Mission College Blvd
Santa Clara, Ca 95052-8125
Attn ETANN, Mail Stop SC9-40
(408) 765-9235

Intel is making an experimental chip:
80170NW - Electrically trainable Analog Neural Network (ETANN)

It has 64 'neurons' on it - almost fully internally connected -
and the chip can be put in a hierarchical architecture to do
2 billion interconnects per second.

Support software has already been made by

California Scientific Software
10141 Evening Star Dr #6
Grass Valley, CA 95945-9051
(916) 477-7481

Their product is called 'BrainMaker'.

5. NeuralWare, Inc
Penn Center West
Bldg IV Suite 227
Pittsburgh
PA 15276

They only sell software/simulator but for many platforms.

6. Tubb Research Limited
7a Lavant Street
Petersfield
Hampshire
GU32 2EL
United Kingdom
Tel: +44 730 60256

7. Adaptive Solutions Inc
1400 NW Compton Drive
Suite 340
Beaverton, OR 97006
U. S. A.
Tel: 503 - 690 - 1236
FAX: 503 - 690 - 1249

------------------------------------------------------------------------

-A19.) Databases for experimentation with NNs ?

[are there any more ?]

1. The nn-bench Benchmark collection
accessible via anonymous FTP on
"pt.cs.cmu.edu"
in directory
"/afs/cs/project/connect/bench"
or via the Andrew file system in the directory
"/afs/cs.cmu.edu/project/connect/bench"
In case of problems email contact is "[email protected]".
The data sets in this repository include the 'nettalk' data, the
'two spirals' problem, a vowel recognition task, and a few others.

2. UCI machine learning database
accessible via anonymous FTP on
"ics.uci.edu" [128.195.1.1]
in directory
"/pub/machine-learning-databases"

3. NIST special databases of the National Institute Of Standards
And Technology:
NIST special database 2:
Structured Forms Reference Set (SFRS)

The NIST database of structured forms contains 5,590 full page images
of simulated tax forms completed using machine print. THERE IS NO REAL
TAX DATA IN THIS DATABASE. The structured forms used in this database
are 12 different forms from the 1988, IRS 1040 Package X. These
include Forms 1040, 2106, 2441, 4562, and 6251 together with Schedules
A, B, C, D, E, F and SE. Eight of these forms contain two pages or
form faces making a total of 20 form faces represented in the
database. Each image is stored in bi-level black and white raster
format. The images in this database appear to be real forms prepared
by individuals but the images have been automatically derived and
synthesized using a computer and contain no "real" tax data. The entry
field values on the forms have been automatically generated by a
computer in order to make the data available without the danger of
distributing privileged tax information. In addition to the images
the database includes 5,590 answer files, one for each image. Each
answer file contains an ASCII representation of the data found in the
entry fields on the corresponding image. Image format documentation
and example software are also provided. The uncompressed database
totals approximately 5.9 gigabytes of data.

NIST special database 3:
Binary Images of Handwritten Segmented Characters (HWSC)

Contains 313,389 isolated character images segmented from the
2,100 full-page images distributed with "NIST Special Database 1".
223,125 digits, 44,951 upper-case, and 45,313 lower-case character
images. Each character image has been centered in a separate
128 by 128 pixel region, error rate of the segmentation and
assigned classification is less than 0.1%.
The uncompressed database totals approximately 2.75 gigabytes of
image data and includes image format documentation and example software.


NIST special database 4:
8-Bit Gray Scale Images of Fingerprint Image Groups (FIGS)

The NIST database of fingerprint images contains 2000 8-bit gray scale
fingerprint image pairs. Each image is 512 by 512 pixels with 32 rows
of white space at the bottom and classified using one of the five
following classes: A=Arch, L=Left Loop, R=Right Loop, T=Tented Arch,
W=Whorl. The database is evenly distributed over each of the five
classifications with 400 fingerprint pairs from each class. The images
are compressed using a modified JPEG lossless compression algorithm
and require approximately 636 Megabytes of storage compressed and 1.1
Gigabytes uncompressed (1.6 : 1 compression ratio). The database also
includes format documentation and example software.

More short overview:
Special Database 1 - NIST Binary Images of Printed Digits, Alphas, and Text
Special Database 2 - NIST Structured Forms Reference Set of Binary Images
Special Database 3 - NIST Binary Images of Handwritten Segmented Characters
Special Database 4 - NIST 8-bit Gray Scale Images of Fingerprint Image Groups
Special Database 6 - NIST Structured Forms Reference Set 2 of Binary Images
Special Database 7 - NIST Test Data 1: Binary Images of Handprinted Segmented Characters
Special Software 1 - NIST Scoring Package Release 1.0

Special Database 1 - $895.00
Special Database 2 - $250.00
Special Database 3 - $895.00
Special Database 4 - $250.00
Special Database 6 - $250.00
Special Database 7 - $1,000.00
Special Software 1 - $1,150.00

The system requirements for all databases are a 5.25" CD-ROM drive
with software to read ISO-9660 format.

Contact: Darrin L. Dimmick
[email protected] (301)975-4147

If you wish to order the database, please contact:
Standard Reference Data
National Institute of Standards and Technology
221/A323
Gaithersburg, MD 20899
(301)975-2208 or (301)926-0416 (FAX)

4. CEDAR CD-ROM 1: Database of Handwritten
Cities, States, ZIP Codes, Digits, and Alphabetic Characters

The Center of Excellence for Document Analysis and Recognition (CEDAR)
at the State University of New York at Buffalo announces the availability of
CEDAR CDROM 1: USPS Office of Advanced Technology
The database contains handwritten words and ZIP Codes
in high resolution grayscale (300 ppi 8-bit) as well as
binary handwritten digits and alphabetic characters (300 ppi
1-bit). This database is intended to encourage research in
off-line handwriting recognition by providing access to
handwriting samples digitized from envelopes in a working
post office.
Specifications of the database include:
+ 300 ppi 8-bit grayscale handwritten words (cities,
states, ZIP Codes)
o 5632 city words
o 4938 state words
o 9454 ZIP Codes
+ 300 ppi binary handwritten characters and digits:
o 27,837 mixed alphas and numerics segmented
from address blocks
o 21,179 digits segmented from ZIP Codes
+ every image supplied with a manually determined
truth value
+ extracted from live mail in a working U.S. Post
Office
+ word images in the test set supplied with dictionaries
of postal words that simulate partial recognition of
the corresponding ZIP Code.
+ digit images included in test set that simulate
automatic ZIP Code segmentation. Results on these
data can be projected to overall ZIP Code
recognition performance.
+ image format documentation and software included
System requirements are a 5.25" CD-ROM drive with software to read
ISO-9660 format.
For any further information, including how to order the
database, please contact:
Jonathan J. Hull, Associate Director, CEDAR, 226 Bell Hall
State University of New York at Buffalo, Buffalo, NY 14260
[email protected] (email)

------------------------------------------------------------------------



That's all folks.

========================================================================

Acknowledgements: Thanks to all the people who helped to get the stuff
above into the posting. I cannot name them all, because
I would make far too many errors then. :->

No ? Not good ? You want individual credit ?
OK, OK. I'll try to name them all. But: no guarantee....

THANKS FOR HELP TO:
(in alphabetical order of email addresses, I hope)

S.Taimi Ames
[email protected]
Kim L. Blackwell
Paul Bakker
Yijun Cai
L. Leon Campbell
David DeMers
Denni Rognvaldsson
Wesley R. Elsberry
Frank Schnorrenberg
Gary Lawrence Murphy
[email protected]
Glen Clark
[email protected]
Jean-Denis Muller
Jonathan Kamens
Luke Koops
William Mackeown
Peter Marvit
Yoshiro Miyata
Jyrki Alakuijala
[email protected]
Michael Plonski
[myself]
Richard Cornelius
Rob Cunningham
Osamu Saito
Ted Stockwell
Thomas G. Dietterich
[email protected]
Ulrich Wendl
Matthew P Wiener

Bye

Lutz

--
Lutz Prechelt (email: [email protected]) | Whenever you
Institut fuer Programmstrukturen und Datenorganisation | complicate things,
Universitaet Karlsruhe; D-7500 Karlsruhe 1; Germany | they get
(Voice: ++49/721/608-4317, FAX: ++49/721/694092) | less simple.


