Hebbian Learning: Fundamentals and Applications for Uniting Memory and Learning
Ebook · 111 pages · 1 hour


About this ebook

What Is Hebbian Learning


The Hebbian theory is a neuropsychological theory asserting that an improvement in synaptic efficacy results from the repeated and persistent stimulation of a postsynaptic cell by a presynaptic cell. It is an effort to explain synaptic plasticity, the process by which neurons in the brain change in response to learning. The theory was first presented in Donald Hebb's 1949 book The Organization of Behavior, and is also known as Hebb's rule, Hebb's postulate, and the cell assembly theory. Hebb expresses it as follows: Let us assume that the persistence or repetition of a reverberatory activity tends to induce lasting cellular changes that add to its stability. ... When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased.


How You Will Benefit


(I) Insights and validations about the following topics:


Chapter 1: Hebbian theory


Chapter 2: Chemical synapse


Chapter 3: Long-term potentiation


Chapter 4: Synaptic plasticity


Chapter 5: Long-term depression


Chapter 6: Spike-timing-dependent plasticity


Chapter 7: Neural circuit


Chapter 8: Metaplasticity


Chapter 9: Oja's rule


Chapter 10: BCM theory


(II) Answers to the public's top questions about Hebbian learning.


(III) Real-world examples of the use of Hebbian learning in many fields.


Who This Book Is For


Professionals, undergraduate and graduate students, enthusiasts, hobbyists, and anyone who wants to go beyond basic knowledge of Hebbian learning.


What Is Artificial Intelligence Series


The artificial intelligence book series provides comprehensive coverage in over 200 topics. Each ebook covers a specific Artificial Intelligence topic in depth, written by experts in the field. The series aims to give readers a thorough understanding of the concepts, techniques, history and applications of artificial intelligence. Topics covered include machine learning, deep learning, neural networks, computer vision, natural language processing, robotics, ethics and more. The ebooks are written for professionals, students, and anyone interested in learning about the latest developments in this rapidly advancing field.
The artificial intelligence book series provides an in-depth yet accessible exploration, from the fundamental concepts to the state-of-the-art research. With over 200 volumes, readers gain a thorough grounding in all aspects of Artificial Intelligence. The ebooks are designed to build knowledge systematically, with later volumes building on the foundations laid by earlier ones. This comprehensive series is an indispensable resource for anyone seeking to develop expertise in artificial intelligence.

Language: English
Release date: Jun 20, 2023


    Book preview

    Hebbian Learning - Fouad Sabry

    Chapter 1: Hebbian theory

    The Hebbian hypothesis is a neuropsychological theory asserting that an improvement in synaptic efficacy results from the repeated and sustained stimulation of a postsynaptic cell by a presynaptic cell. It is an effort to explain synaptic plasticity, the process by which neurons in the brain change in response to learning. It was first presented in Donald Hebb's 1949 book The Organization of Behavior, and is also known as Hebb's rule, Hebb's postulate, and the cell assembly hypothesis. Hebb explains it in the following manner:

    Let's make the assumption that the continuation or recurrence of a reverberatory activity (or trace) has a tendency to cause long-lasting cellular modifications that contribute to the stability of the activity.

    ...

    When an axon of cell A is close enough to a cell B to excite it, and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased.

    "Cells that fire together wire together" is a phrase often used to summarize this notion.

    This hypothesis attempts to explain associative learning, also known as Hebbian learning, in which simultaneous activation of several cells leads to pronounced increases in the synaptic strength connecting those cells. It also provides a biological basis for errorless learning methods used in education and memory rehabilitation. In the study of neural networks and cognitive function, it is often regarded as the neuronal basis of unsupervised learning.

    The Hebbian hypothesis investigates the potential ways in which neurons might interact with one another to form engrams.

    Hebb's theories on the form and function of cell assemblies can be understood from the following:

    The concept that any two cells or systems of cells that are regularly active at the same time would tend to become connected in such a way that activity in one aids activity in the other is an ancient one. This is the broad theory, and it has been around for a long time.

    Hebb also wrote:

    When one cell helps to fire another cell repeatedly, the axon of the first cell produces synaptic knobs (or enlarges them if they already exist) in contact with the soma of the second cell. This can only happen if the two cells are in close proximity to one another.

    Along the lines of the notion of auto-association, D. Alan Allport proposes further concepts involving cell assembly theory and its role in forming engrams. He defines auto-association as follows:

    If the inputs to a system cause the same pattern of activity to occur over and over again, the set of active elements constituting that pattern will become increasingly strongly inter-associated.

    That is, each element will tend to turn on every other element and (with negative weights) to turn off the elements that do not form part of the pattern.

    To put it another way, the pattern as a whole will become auto-associated.

    We may call a learned (auto-associated) pattern an engram.
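    The auto-association described above can be sketched numerically. The following is a minimal, hypothetical illustration (assuming ±1 activations and a sign-threshold update, conventions not specified in the text) of a pattern stored by Hebbian auto-association completing itself from a corrupted cue:

```python
import numpy as np

# Hypothetical sketch: store one +/-1 pattern by Hebbian auto-association,
# then recover it from a corrupted cue with a sign-threshold update.
pattern = np.array([1.0, -1.0, 1.0, -1.0, 1.0])

W = np.outer(pattern, pattern)   # w_ij = x_i * x_j
np.fill_diagonal(W, 0.0)         # no self-connections

cue = pattern.copy()
cue[0] = -cue[0]                 # corrupt one element of the pattern
recalled = np.sign(W @ cue)      # one synchronous update step
```

    Because the stored pattern dominates the weighted sums, the flipped element is pulled back to its stored value — one way to read Allport's claim that the pattern "turns on" its own elements.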

    Evidence for the involvement of Hebbian learning processes at synapses in the marine gastropod Aplysia californica has come from research conducted in the laboratory of Eric Kandel. Experiments on Hebbian synapse modification at the central nervous system synapses of vertebrates are much more difficult to control than experiments on the comparatively straightforward peripheral nervous system synapses studied in marine invertebrates. Much of the research on long-lasting synaptic changes between vertebrate neurons (such as long-term potentiation) involves experimental stimulation of brain cells that is not physiological. However, some of the physiologically relevant synapse-remodeling mechanisms studied in vertebrate brains do appear to be instances of Hebbian processes. One such study reviews findings suggesting that physiologically appropriate synaptic activity can produce long-lasting changes in synaptic strength, brought about by both Hebbian and non-Hebbian mechanisms.

    Viewed from the perspective of artificial neurons and artificial neural networks, Hebb's principle can be stated as a method for determining how to alter the weights between model neurons. The weight between two neurons increases when they fire together and decreases when they fire separately. Nodes that tend to be both positive or both negative at the same time acquire strong positive weights, while nodes that tend to be opposite acquire strong negative weights.

    The Hebbian learning rule may be summed up as follows (many other formulations are also possible):

    w_{ij} = x_i x_j

    where w_{ij} is the weight of the connection from neuron j to neuron i, and x_i is the input for neuron i.

    Note that this is pattern learning (weights are updated after every training example).
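    As a minimal sketch of the per-example rule above (the function name and the learning-rate factor are my own additions, not from the text):

```python
import numpy as np

# Per-example Hebbian update: after each training pattern x, the weight
# from neuron j to neuron i grows by the product of their activations.
# The learning rate lr is a common practical addition to the bare rule.
def hebbian_update(W, x, lr=1.0):
    W += lr * np.outer(x, x)     # w_ij += lr * x_i * x_j
    return W

W = np.zeros((3, 3))
x = np.array([1.0, -1.0, 1.0])
hebbian_update(W, x)
```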

    In a Hopfield network, connections w_{ij} are set to zero if i = j (no reflexive connections allowed).

    With binary neurons (activations of either 0 or 1), connections are set to 1 if the connected neurons have the same activation for a pattern.

    When several training patterns are used, the expression becomes an average over the individual examples:

    w_{ij} = \frac{1}{p} \sum_{k=1}^{p} x_i^k x_j^k = \langle x_i x_j \rangle

    where w_{ij} is the weight of the connection from neuron j to neuron i, p is the number of training patterns, x_i^k is the k-th input for neuron i, and \langle \cdot \rangle denotes the average over all training patterns.

    This is known as learning by epoch (weights are updated after all the training examples are presented), this last form applying to both discrete and continuous training sets.

    Again, in a Hopfield network, connections w_{ij} are set to zero if i = j (no reflexive connections).
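    The averaged rule with the Hopfield-style zero diagonal can be sketched as follows (a minimal illustration assuming ±1 patterns; the function name is my own):

```python
import numpy as np

# Averaged Hebbian rule: w_ij = (1/p) * sum_k x_i^k * x_j^k, with the
# diagonal zeroed so that there are no reflexive connections (w_ii = 0).
def store_patterns(patterns):
    p, _ = patterns.shape
    W = patterns.T @ patterns / p    # average of the outer products
    np.fill_diagonal(W, 0.0)         # no reflexive connections
    return W

patterns = np.array([[1.0, -1.0, 1.0, -1.0],
                     [1.0, 1.0, -1.0, -1.0]])
W = store_patterns(patterns)
```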

    The mathematical model developed by Harry Klopf is one variant of Hebbian learning. Klopf's model accounts for a variety of additional neuronal learning phenomena, such as blocking, and it reproduces a wide range of biological observations while remaining simple to implement.

    Because of the straightforward character of Hebbian learning, which is based only on the concomitance
