
Simulated Annealing: Fundamentals and Applications
Ebook · 116 pages · 1 hour


About this ebook

What Is Simulated Annealing


Simulated annealing (SA) is a probabilistic technique for approximating the global optimum of a given function. More specifically, it is a metaheuristic for approximating global optimization in a large search space for an optimization problem. For problems with many local optima, SA can find the global optimum, and it is frequently used when the search space is discrete. For problems where finding an approximate global optimum is more important than finding a precise local optimum in a fixed amount of time, simulated annealing may be preferable to exact algorithms such as gradient descent or branch and bound.


How You Will Benefit


(I) Insights and validations about the following topics:


Chapter 1: Simulated annealing


Chapter 2: Adaptive simulated annealing


Chapter 3: Automatic label placement


Chapter 4: Combinatorial optimization


Chapter 5: Dual-phase evolution


Chapter 6: Graph cuts in computer vision


Chapter 7: Molecular dynamics


Chapter 8: Multidisciplinary design optimization


Chapter 9: Particle swarm optimization


Chapter 10: Quantum annealing


(II) Answering the public's top questions about simulated annealing.


(III) Real-world examples of the use of simulated annealing in many fields.


(IV) 17 appendices explaining, briefly, 266 emerging technologies in each industry, for a 360-degree understanding of simulated annealing's technologies.


Who This Book Is For


Professionals, undergraduate and graduate students, enthusiasts, hobbyists, and anyone who wants to go beyond basic knowledge of simulated annealing.

Language: English
Release date: Jul 1, 2023


    Book preview

    Simulated Annealing - Fouad Sabry

    Chapter 1: Simulated annealing

    Simulated annealing (SA) is a probabilistic technique for approximating the global optimum of a given function. More specifically, it is a metaheuristic for approximating global optimization in a large search space for an optimization problem. It is often used when the search space is discrete (for example, the traveling salesman problem, the Boolean satisfiability problem, protein structure prediction, and job-shop scheduling). For problems where finding an approximate global optimum is more important than finding a precise local optimum in a fixed amount of time, simulated annealing may be preferable to exact algorithms such as gradient descent or branch and bound.

    The algorithm's name derives from annealing, a process used in metallurgy to alter a material's physical properties by heating it and then cooling it in a controlled way. Both properties are inherited from the material and determined by its thermodynamic free energy; heating and cooling the material affect, respectively, its temperature and its thermodynamic free energy (Gibbs energy). Although it usually achieves only an approximation of the global minimum, simulated annealing can be applied to very hard computational optimization problems that exact techniques cannot solve, and this approximation is sufficient for a great many practical problems.

    Problems addressed by SA are typically expressed as an objective function of many variables, subject to a number of constraints. In practice, a constraint can be penalized as part of the overall objective function.

    Methods conceptually analogous to that of Pincus (1970) were independently proposed on several occasions for solving the traveling salesman problem; the authors of those later methods also coined the technique's present name, simulated annealing.

    The method incorporates the notion of slow cooling, interpreted as a gradual decrease in the probability of accepting worse solutions as the solution space is explored. Accepting worse solutions allows a more extensive search for the overall optimal solution. In outline, simulated annealing works as follows: the temperature starts at a high value and progressively decreases. At each time step, the algorithm randomly selects a solution close to the current one, measures its quality, and moves to it according to the temperature-dependent probabilities of selecting better or worse solutions. During the search, the probability of accepting a better solution remains 1, while the probability of accepting a worse solution stays positive but decreases toward zero.
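The outline above can be sketched as a short program. This is a minimal illustration, not the book's own code: the function name, the geometric cooling schedule, and the parameter defaults are all assumptions chosen for clarity.

```python
import math
import random

def simulated_annealing(initial_state, energy, neighbor,
                        t0=1.0, t_min=1e-3, alpha=0.95, steps_per_temp=100):
    """Generic simulated annealing loop with geometric cooling.

    energy(s)   -> float: objective value to minimize
    neighbor(s) -> a random state close to s
    """
    s, e = initial_state, energy(initial_state)
    best_s, best_e = s, e
    t = t0
    while t > t_min:
        for _ in range(steps_per_temp):
            s_new = neighbor(s)
            e_new = energy(s_new)
            # Always accept downhill moves; accept uphill moves with
            # probability exp(-(e_new - e) / t), which shrinks as t cools.
            if e_new < e or random.random() < math.exp(-(e_new - e) / t):
                s, e = s_new, e_new
                if e < best_e:
                    best_s, best_e = s, e
        t *= alpha  # gradual cooling
    return best_s, best_e
```

For instance, minimizing `lambda x: (x - 3.0) ** 2` with `neighbor=lambda x: x + random.uniform(-1.0, 1.0)` drives the state toward x = 3.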

    The simulation can be carried out either by solving kinetic equations for density functions or by using discrete-event simulation.

    The state of some physical systems, and the function E(s) to be minimized, is analogous to the internal energy of the system in that state. The goal is to bring the system from an arbitrary initial state to a state with the minimum possible energy.

    At each step, the simulated annealing heuristic considers a neighboring state s* of the current state s and probabilistically decides between moving the system to state s* or staying in state s. These probabilities ultimately lead the system to migrate to states of lower energy. Typically this step is repeated until the system reaches a state that is good enough for the application, or until a given computation budget has been exhausted.

    Optimizing a solution involves evaluating the neighbors of a state: new states produced by altering a given state in a relatively conservative way. For instance, in the traveling salesman problem each state is typically defined as a permutation of the cities to be visited, and the neighbors of any state are the set of permutations produced by swapping any two of those cities. The well-defined way in which states are altered to produce neighboring states is called a move, and different moves give different sets of neighboring states. These moves usually result in minimal alterations of the previous state, in an attempt to progressively improve the solution by iteratively improving its parts (such as the city connections in the traveling salesman problem).
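The traveling-salesman move described above can be made concrete. The following is a minimal sketch, not code from the book; the function names and the coordinate representation are assumptions.

```python
import math
import random

def tour_length(tour, coords):
    """Total length of a closed tour over city coordinates."""
    return sum(
        math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
        for i in range(len(tour))
    )

def swap_move(tour):
    """Neighbor of a tour: swap two randomly chosen cities.

    Every call yields a small, conservative alteration of the state,
    so repeated moves explore nearby permutations of the tour.
    """
    i, j = random.sample(range(len(tour)), 2)
    new_tour = list(tour)
    new_tour[i], new_tour[j] = new_tour[j], new_tour[i]
    return new_tour
```

Here `tour_length` plays the role of the energy E(s), and `swap_move` defines the neighborhood of a state.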

    Simple heuristics like hill climbing, which move by finding better neighbor after better neighbor and stop when they reach a solution with no better neighbors, cannot guarantee that they will arrive at any of the existing best solutions; their outcome may easily be just a local optimum, while the actual best solution would be a global optimum that could be quite different. Metaheuristics use the neighbors of a solution as a way to explore the solution space, and although they prefer better neighbors, they also accept worse neighbors in order to avoid getting stuck in a local optimum; given enough time, they can find the global optimum.

    The probability of making the transition from the current state s to a candidate new state s_new is specified by an acceptance probability function P(e, e_new, T), which depends on the energies e = E(s) and e_new = E(s_new) of the two states, and on a global time-varying parameter T called the temperature.

    States with a smaller energy are better than those with a greater energy.

    The probability function P must be positive even when e_new is greater than e.

    This feature prevents the method from becoming stuck at a local minimum that is worse than the global one.

    When T tends to zero, the probability P(e, e_new, T) must tend to zero if e_new > e, and to a positive value otherwise.

    For sufficiently small values of T, the system will then increasingly favor moves that go downhill (that is, to lower energy values) and avoid those that go uphill. With T = 0 the procedure reduces to the greedy algorithm, which accepts only downhill moves.
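The acceptance rule described by these conditions can be written as a few lines of code. This sketch uses the classic Metropolis form exp(-(e_new - e) / T), which satisfies all of the properties above; the function name is an assumption for illustration.

```python
import math

def acceptance_probability(e, e_new, t):
    """Metropolis acceptance rule.

    Downhill moves (e_new < e) are always accepted.
    Uphill moves are accepted with probability exp(-(e_new - e) / t),
    which is positive for t > 0 and tends to zero as t tends to zero.
    """
    if e_new < e:
        return 1.0
    if t <= 0:
        return 0.0  # T = 0: greedy behavior, uphill moves never accepted
    return math.exp(-(e_new - e) / t)
```

Note that for a fixed t, a small uphill step gets a higher acceptance probability than a large one, matching the usual choice of P discussed below.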

    In the original description of simulated annealing, the probability P(e, e_new, T) was equal to 1 when e_new < e; that is, the procedure always moved downhill when a downhill move was available, irrespective of the temperature.

    Many descriptions and implementations of simulated annealing still take this condition to be part of the method's definition.

    However, this condition is not essential for the method to work.

    The P function is usually chosen so that the probability of accepting a move decreases as the difference e_new - e increases; that is, small uphill moves are more likely than large ones.

    However, this requirement is not strictly necessary, provided that the requirements above are met.

    Given these properties, the temperature T plays a crucial role in controlling the evolution of the state s of the system.
