{{about|the problem-solving technique in computer science|similarly named methods in other disciplines|Brute force (disambiguation)}}
{{More citations needed|date=February 2008}}
In [[computer science]], '''brute-force search''' or '''exhaustive search''', also known as '''generate and test''', is a very general [[problem-solving]] technique and [[algorithmic paradigm]] that consists of [[Iteration#Computing|systematically enumerating]] all possible candidates for the solution and checking whether each candidate satisfies the problem's statement.
A brute-force algorithm that finds the divisors of a [[natural number]] ''n'' would enumerate all integers from 1 to ''n'', and check whether each of them divides ''n'' without remainder.
While a brute-force search is simple to [[Software|implement]] and will always find a solution if it exists, its cost is proportional to the number of candidate solutions{{snd}}which, in many practical problems, tends to grow very quickly as the size of the problem increases ([[combinatorial explosion]]). Therefore, brute-force search is typically used when the problem size is limited, or when there are problem-specific [[heuristic]]s that can be used to reduce the set of candidate solutions to a manageable size. The method is also used when the simplicity of implementation is more important than speed.
This is the case, for example, in critical applications where any errors in the [[algorithm]] would have very serious consequences, or when [[computer-assisted proof|using a computer to prove a mathematical theorem]].
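The divisor search described above can be sketched as a short generate-and-test loop in Python (the function name is illustrative, not part of any standard library):

```python
def divisors(n):
    """Brute-force search: test every candidate from 1 to n."""
    result = []
    for candidate in range(1, n + 1):
        # The "test" step: does this candidate satisfy the problem's statement?
        if n % candidate == 0:
            result.append(candidate)
    return result

print(divisors(24))  # → [1, 2, 3, 4, 6, 8, 12, 24]
```

The loop examines every one of the ''n'' candidates, which is exactly why the method's running time grows with the size of the search space.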
==Implementing the brute-force search==
The main disadvantage of the brute-force method is that, for many real-world problems, the number of natural candidates is prohibitively large. For instance, if we look for the divisors of a number as described above, the number of candidates tested will be the given number ''n''. So if ''n'' has sixteen decimal digits, say, the search will require executing at least 10<sup>15</sup> computer instructions, which will take several days on a typical [[personal computer|PC]]. If ''n'' is a random 64-[[binary digit|bit]] natural number, which has about 19 decimal digits on the average, the search will take about 10 years. This steep growth in the number of candidates, as the size of the data increases, occurs in all sorts of problems. For instance, if we are seeking a particular rearrangement of 10 letters, then we have 10! = 3,628,800 candidates to consider, which a typical PC can generate and test in less than one second. However, adding one more letter{{snd}}which is only a 10% increase in the data size{{snd}}will multiply the number of candidates by 11, a 1000% increase. For 20 letters, the number of candidates is 20!, which is about 2.4×10<sup>18</sup> or 2.4 [[quintillion]]; and the search will take about 10 years. This unwelcome phenomenon is commonly called the [[combinatorial explosion]], or the [[curse of dimensionality]].
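The factorial growth described above is easy to verify directly; a few lines of Python reproduce the candidate counts quoted in the text:

```python
import math

# Number of candidate rearrangements of k distinct letters is k!.
for k in (10, 11, 20):
    print(k, math.factorial(k))
# 10 →             3,628,800
# 11 →            39,916,800  (11 times as many after a one-letter increase)
# 20 → 2,432,902,008,176,640,000  (about 2.4 quintillion)
```

Each additional letter multiplies the number of candidates by the new length, so the search space outpaces any linear growth in hardware speed.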
One example of a case where combinatorial complexity leads to a solvability limit is in [[solving chess]]. Chess is not a [[solved game]]. In 2005, all chess game endings with six pieces or fewer were solved, showing the result of each position if played perfectly. It took ten more years to complete the tablebase with one more chess piece added, thus completing a 7-piece tablebase. Adding one more piece to a chess ending (thus making an 8-piece tablebase) is considered intractable due to the added combinatorial complexity.<ref>{{cite web |title=Is there a freely available online 7 piece Endgame tablebase? |url=https://chess.stackexchange.com/questions/5253/is-there-a-freely-available-online-7-piece-endgame-tablebase |website=Stack Exchange}}</ref><ref>{{cite web |title=Lomonosov Endgame Tablebases |url=http://chessok.com/?page_id=27966 |website=ChessOK |archive-url=https://web.archive.org/web/20190406123519/http://chessok.com/?page_id=27966 |archive-date=April 6, 2019}}</ref>
==Speeding up brute-force searches==
* [[Brute-force attack]]
* [[Big O notation]]
* [[Iteration#Computing|Iteration]]
{{Algorithmic paradigms}}
{{DEFAULTSORT:Brute-Force Search}}
[[Category:Search algorithms]]
[[Category:Iteration in programming]]