
Brute Force Search: Fundamentals and Applications
Ebook · 169 pages · 2 hours


About this ebook

What Is Brute Force Search


In computer science, brute-force search, also known as exhaustive search or generate and test, is a very general problem-solving technique and algorithmic paradigm. It consists of systematically examining all possible candidates and checking whether each one satisfies the problem's statement.


How You Will Benefit


(I) Insights and validations about the following topics:


Chapter 1: Brute-Force Search


Chapter 2: Sudoku Solving Algorithms


Chapter 3: Permutation


Chapter 4: Backtracking


Chapter 5: Branch and Bound


Chapter 6: Binary Search Algorithm


Chapter 7: Meet-in-the-Middle Attack


Chapter 8: Parallel Computing


Chapter 9: Randomized Algorithm


Chapter 10: Brute-Force Attack


(II) Answers to the public's top questions about brute force search.


(III) Real-world examples of the use of brute force search in many fields.


(IV) 17 appendices that briefly explain 266 emerging technologies in each industry, for a well-rounded understanding of technologies related to brute force search.


Who This Book Is For


Professionals, undergraduate and graduate students, enthusiasts, hobbyists, and anyone who wants to go beyond a basic knowledge of brute force search.

Language: English
Release date: Jun 28, 2023


    Book preview

    Brute Force Search - Fouad Sabry

    Chapter 1: Brute-force search

    In computer science, brute-force search, also known as exhaustive search or generate and test, is a very general problem-solving technique and algorithmic paradigm that consists of systematically enumerating all possible candidates for the solution and checking whether each candidate satisfies the problem's statement.

    A brute-force method for finding the divisors of a natural number n would enumerate all integers from 1 to n and check whether each of them divides n without remainder. A brute-force approach to the eight queens puzzle would examine every possible arrangement of eight queens on the 64-square chessboard and check whether any queen can attack another. While such a search is simple to implement and will always find a solution if one exists, its cost is proportional to the number of candidate solutions, which in many practical problems grows very quickly as the size of the problem increases. Brute-force search is therefore typically used either when the problem size is limited, or when problem-specific heuristics are available that can reduce the set of candidates to a manageable size. The approach is also used when simplicity of implementation matters more than speed.

    This is the case, for example, in critical applications where any errors in the algorithm would have very serious consequences, or when using a computer to prove a mathematical theorem. Brute-force search is also useful as a baseline method for benchmarking other algorithms or metaheuristics; indeed, it can be viewed as the simplest metaheuristic. Brute-force search should not be confused with backtracking, where large sets of candidate solutions can be discarded without being explicitly enumerated (as in the textbook computer solution to the eight queens problem above). The brute-force method for finding an item in a table, namely checking every entry sequentially, is called linear search.

    In order to apply brute-force search to a specific class of problems, one must implement four procedures, first, next, valid, and output, each taking the instance data P as a parameter:

    first (P): generate a first candidate solution for P.

    next (P, c): generate the next candidate for P after the current one c.

    valid (P, c): check whether candidate c is a solution for P.

    output (P, c): use the solution c of P in a manner that is suitable for the application.

    The next procedure must also indicate when there are no more candidates for the instance P after the current one c. A convenient way to do that is to return a null candidate, some conventional data value Λ that is distinct from any real candidate. Likewise, the first procedure should return Λ if there are no candidates at all for the instance P. The brute-force method is then expressed by the following algorithm:

    c ← first(P)
    while c ≠ Λ do
        if valid(P, c) then
            output(P, c)
        c ← next(P, c)
    end while
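
    The loop above can be transliterated almost directly into Python. The following is a minimal sketch, not taken from the book: the names bruteforce and next_candidate are chosen here for illustration, and Python's None plays the role of the null candidate Λ.

    def bruteforce(P, first, next_candidate, valid, output):
        # Generic brute-force driver: enumerate all candidates for instance P
        # and hand every valid one to output. None stands in for Λ.
        c = first(P)
        while c is not None:            # c ≠ Λ
            if valid(P, c):
                output(P, c)
            c = next_candidate(P, c)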

    For example, when looking for the divisors of an integer n, the instance data P is simply the number n.

    The call first(n) should return the integer 1 if n ≥ 1, or Λ otherwise; the call next(n, c) should return c + 1 if c < n, and Λ otherwise; and valid(n, c) should return true if and only if n is divisible by c.

    (In fact, if we choose Λ to be n + 1, the tests n ≥ 1 and c < n are unnecessary.) The brute-force search algorithm above will call output for every candidate that is a solution to the given instance P.
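
    Using the driver sketched after the pseudocode above, the divisor-finding instance just described might look as follows (again only an illustrative sketch; here Λ is represented by None, so the tests n ≥ 1 and c < n are kept):

    def first(n):
        return 1 if n >= 1 else None
    def next_candidate(n, c):
        return c + 1 if c < n else None
    def valid(n, c):
        return n % c == 0               # true exactly when c divides n
    bruteforce(30, first, next_candidate, valid, lambda n, c: print(c))
    # prints the divisors of 30: 1 2 3 5 6 10 15 30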

    It is easy to adapt the algorithm so that it stops after finding the first solution, or a specified number of solutions; or after testing a specified number of candidates; or after spending a given amount of processor time.

    The main drawback of the brute-force technique is that, for many real-world problems, the number of natural candidates is prohibitively large.

    For instance, if we search for the divisors of a number as described above, the number of candidates tested will be the given number n.

    So if n has sixteen decimal digits, say, the search will require executing at least 10¹⁵ computer instructions, which will take a typical personal computer several days.

    If n is a random 64-bit natural number, which has about 19 decimal digits on average, the search will take about ten years.

    This dramatic growth in the number of candidates as the size of the data increases occurs in all sorts of problems.

    For instance, if we are looking for a particular rearrangement of 10 letters, there are 10! = 3,628,800 candidates to consider, which a typical personal computer can generate and test in less than one second.

    However, adding just one more letter, a mere ten percent increase in the size of the data, multiplies the number of candidates by eleven, an increase of one thousand percent.

    For 20 letters, the number of candidates is 20!, which is about 2.4×10¹⁸ or 2.4 quintillion, and the search will take about ten years.
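
    The growth described here is simply the factorial function; a few lines of Python (shown only to make the arithmetic concrete) reproduce the figures.

    import math
    print(math.factorial(10))                        # 3,628,800 candidates for 10 letters
    print(math.factorial(11) // math.factorial(10))  # 11: one extra letter, eleven times the work
    print(math.factorial(20))                        # 2,432,902,008,176,640,000 ≈ 2.4×10¹⁸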

    This unwelcome phenomenon is commonly called the combinatorial explosion, or the curse of dimensionality.

    One example of a case in which combinatorial complexity leads to a solvability limit is solving chess. Chess is not a solved game. In 2005, all chess game endings with six pieces or fewer were solved, showing the result of each position if played perfectly. It took ten more years to complete the tablebase with one more chess piece added, producing a finished seven-piece tablebase. Adding one more piece to a chess ending (thus making an eight-piece tablebase) is considered intractable because of the added combinatorial complexity.

    One way to speed up a brute-force algorithm is to reduce the search space, that is, the set of candidate solutions, by using heuristics specific to the problem class.

    For example, in the eight queens problem the challenge is to place eight queens on a standard chessboard so that no queen attacks any other.

    Since each queen can be placed on any of the 64 squares, in principle there are 64⁸ = 281,474,976,710,656 possibilities to consider.

    However, because the queens are all alike, and no two queens can be placed on the same square, the candidates are all the possible ways of choosing a set of 8 squares from the 64, that is, 64 choose 8 = 64!/(56!·8!) = 4,426,165,368 candidate solutions, about 1/60,000 of the previous estimate.
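
    The two counts can be checked in a couple of lines of Python (an illustrative calculation):

    import math
    print(64 ** 8)                       # 281,474,976,710,656 placements of 8 distinguishable queens
    print(math.comb(64, 8))              # 4,426,165,368 ways to choose 8 of the 64 squares
    print(64 ** 8 / math.comb(64, 8))    # ≈ 63,593: roughly the 1/60,000 reduction mentioned above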

    Further, no arrangement with two queens on the same row or on the same column can be a solution.

    Therefore, we can further restrict the set of candidates to those arrangements.

    As this example shows, a little analysis will often lead to dramatic reductions in the number of candidate solutions, and may turn an intractable problem into a trivial one.
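
    A sketch of that restricted search in Python (not the book's own code): placing exactly one queen per column and enumerating only permutations of row indices guarantees distinct rows and columns, leaving only the diagonal attacks to check.

    from itertools import permutations
    def eight_queens():
        # One queen per column; a permutation of rows rules out shared rows,
        # so only 8! = 40,320 candidates remain instead of 64^8.
        solutions = []
        for rows in permutations(range(8)):
            # Two queens share a diagonal when their row distance equals their column distance.
            if all(abs(rows[i] - rows[j]) != j - i
                   for i in range(8) for j in range(i + 1, 8)):
                solutions.append(rows)
        return solutions
    print(len(eight_queens()))          # 92 distinct solutions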

    In some cases, the analysis may reduce the candidates to the set of all valid solutions; that is, it may yield an algorithm that directly enumerates all the desired solutions (or finds one solution, as appropriate), without wasting time on tests and on the generation of invalid candidates.

    For example, a naïve brute-force approach to finding all integers between 1 and 1,000,000 that are evenly divisible by 417 would generate every integer in that range and test each one for divisibility.

    However, that problem can be solved much more efficiently by starting with 417 and repeatedly adding 417 until the number exceeds 1,000,000, which takes only 2,398 steps (≈ 1,000,000 ÷ 417) and no tests at all.
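
    The difference is easy to see in code (a small sketch for illustration):

    # Naive: generate and test every integer in the range (1,000,000 divisibility tests).
    naive = [k for k in range(1, 1_000_001) if k % 417 == 0]
    # Direct: generate only the multiples of 417 (2,398 steps, no tests at all).
    direct = list(range(417, 1_000_001, 417))
    assert naive == direct
    print(len(direct))                  # 2398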

    In applications that require only one solution rather than all solutions, the expected running time of a brute-force search often depends on the order in which the candidates are tested. As a general rule, the most promising candidates should be tested first. For example, when searching for a proper divisor of a random number n, it is better to enumerate the candidate divisors in increasing order, from 2 to n − 1, than the other way around, because the probability that n is divisible by c is 1/c.

    Moreover, the probability that a candidate is valid is often affected by the previous failed trials. Consider, for example, the problem of finding a 1 bit in a given 1000-bit string P. Here the candidate solutions are the indices 1 to 1000, and a candidate c is valid if P[c] = 1. Now suppose that the first bit of P is equally likely to be 0 or 1, but each bit thereafter is equal to the previous one with 90% probability. If the candidates are enumerated in increasing order, 1 to 1000, the number t of candidates examined before success will be about 6, on average. On the other hand, if the candidates are enumerated in the order 1, 11, 21, 31, ..., 991, 2, 12, 22, 32, and so on, the expected value of t will be only a little more than 2.

    More generally, the search space should be enumerated so that the next candidate is most likely to be valid, given that the previous trials were not. So if the valid solutions are likely to be clustered in some sense, then each new candidate should be as far as possible from the previous ones, in that same sense. The converse holds, of course, if the solutions are likely to be spread out more uniformly than expected by chance.
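
    The effect of candidate ordering can be reproduced with a small Monte Carlo sketch (the helper names and the trial count are illustrative assumptions, not from the text):

    import random
    def random_bits(n=1000, stay=0.9):
        # First bit is 0 or 1 with equal probability; each later bit repeats
        # the previous one with probability `stay`, producing clustered runs.
        bits = [random.randint(0, 1)]
        for _ in range(n - 1):
            bits.append(bits[-1] if random.random() < stay else 1 - bits[-1])
        return bits
    def tests_until_hit(bits, order):
        # Number of candidates examined up to and including the first 1 bit found.
        for t, i in enumerate(order, start=1):
            if bits[i] == 1:
                return t
        return len(order)
    sequential = list(range(1000))                                          # indices in increasing order
    interleaved = [i + j for i in range(10) for j in range(0, 1000, 10)]    # stride-10 order: 0, 10, ..., then 1, 11, ...
    trials = [random_bits() for _ in range(10_000)]
    for name, order in (("sequential", sequential), ("interleaved", interleaved)):
        avg = sum(tests_until_hit(b, order) for b in trials) / len(trials)
        print(name, round(avg, 2))      # about 6 for sequential, a little over 2 for interleaved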

    There are a variety of alternative search strategies, known as metaheuristics, which are intended to
