Artificial Neural Networks (ANN) as
a Classification Method
Arwa Alsaiari
April 22, 2020
Classification is the task of placing objects into categories based on some of their attributes.
Neural network research initially focused on modeling the brain itself; over time, however, attention moved to performing specific tasks.
In 1943, Warren McCulloch and Walter Pitts opened the subject by writing a paper on how neurons might work. To describe how neurons in the brain might work, they modeled a simple neural network using electrical circuits.[1]
In 1949, Donald Hebb pointed out that neural pathways are strengthened each time they are used, which is similar to the way humans learn.[2]
In 1958, Frank Rosenblatt created the "Perceptron" model, a single-layer network that was the first of its kind to perform pattern recognition.[3]
In 1969, however, Marvin Minsky and Seymour Papert identified fundamental limitations of the Perceptron model [4]; these were later overcome by Paul Werbos using backpropagation.[5]
In 1982, John Hopfield presented a paper to the National Academy of Sciences. Until then, the connections between neurons had been one-way only; his approach was to create more useful machines by using bidirectional connections.[6]
The same year, Reilly and Cooper used a "hybrid network" with multiple layers, each layer applying a different problem-solving strategy.[7]
Also in 1982, a joint US-Japan conference on Cooperative/Competitive Neural Networks was held. Japan announced a new Fifth Generation effort on neural networks, which left the US worried about falling behind; the result was more funding and, with it, more research in the field.[8]
In 1997, Hochreiter and Schmidhuber proposed Long Short-Term Memory (LSTM), a recurrent neural network framework.[9]
In 1998, Yann LeCun published "Gradient-Based Learning Applied to Document Recognition".[10]
Today, ANNs are discussed everywhere, and their importance is widely recognized. They have taken their place as important mathematical and engineering tools, yet the most important advances in ANNs almost certainly lie in the future. The large number and wide variety of applications of this technology are very encouraging.
In general, an ANN is a collection of connected nodes called artificial neurons. The nodes are organized into multiple layers (input, hidden, and output). Each node in one layer is connected to all the nodes in the next layer by links. These links transmit a signal (a piece of information) from one node to others, and each link has a weight, which determines the strength of one node's influence on another.
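The structure described above can be sketched in a few lines of Python. This is a minimal illustration only, assuming NumPy; the layer sizes, the sigmoid activation, and the random weights are arbitrary choices for the sketch, not details from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # Squashes each incoming signal into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

# Three layers: input (4 features), hidden (5 nodes), output (3 classes).
# Each weight matrix holds the links from every node in one layer
# to every node in the next; each entry is one link's weight.
W1 = rng.normal(size=(4, 5))   # input -> hidden links
W2 = rng.normal(size=(5, 3))   # hidden -> output links

def forward(x):
    # A signal travels along the links, scaled by the link weights,
    # then passes through each node's activation function.
    hidden = sigmoid(x @ W1)
    output = sigmoid(hidden @ W2)
    return output

x = np.array([0.2, -0.1, 0.5, 0.3])       # one input example
scores = forward(x)                        # one score per output node
predicted_class = int(np.argmax(scores))   # classification: strongest output wins
```

Classification then amounts to reading off which output node fires most strongly; training (e.g. by backpropagation) would adjust the link weights, which this sketch leaves out.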
References:
1. McCulloch, Warren S., and Walter Pitts. "A logical calculus of the ideas immanent in nervous activity." The bulletin of mathematical biophysics 5.4 (1943): 115-133.
2. Hebb, Donald Olding. The organization of behavior: a neuropsychological theory. J. Wiley; Chapman & Hall, 1949.
3. Rosenblatt, Frank. "The perceptron: a probabilistic model for information storage and organization in the brain." Psychological review 65.6 (1958): 386.
4. Minsky, Marvin, and Seymour A. Papert. Perceptrons: An introduction to computational geometry. MIT Press, 1969.
5. Widrow, Bernard, and Michael A. Lehr. "Perceptrons, Adalines, and backpropagation." Arbib 4 (1995): 719-724.
6. Hopfield, John J. "Neural networks and physical systems with emergent collective computational abilities." Proceedings of the National Academy of Sciences 79.8 (1982): 2554-2558.
7. Cooper, Leon N., Charles Elbaum, and Douglas L. Reilly. "Self organizing general pattern class separator and identifier." U.S. Patent No. 4,326,259. 20 Apr. 1982.
8. Amari, Shun-ichi, and Michael A. Arbib. "Competition and cooperation in neural nets." Springer Lecture Notes in Biomathematics 45 (1982).
9. Hochreiter, Sepp, and Jürgen Schmidhuber. "Long short-term memory." Neural computation 9.8 (1997): 1735-1780.
10. LeCun, Yann, et al. "Gradient-based learning applied to document recognition." Proceedings of the IEEE 86.11 (1998): 2278-2324.
Thank you!
Questions?