What is the worst sorting algorithm?

Are there any worse sorting algorithms than Bogosort (aka Monkey Sort)? [closed]


This morning my staff took me back to my university days with a discussion about sorting algorithms. We remembered our favorites like StupidSort, and one of us was sure we had seen a sorting algorithm that was O(n!). That got me looking around for the "worst" sorting algorithms I could find.

We postulated that completely random sorting would be pretty bad (i.e. randomize the items, check whether they are sorted, and if not, randomize again), and I looked around and found that apparently it is called BogoSort, Monkey Sort, or sometimes just Random Sort.
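For reference, that randomize-and-check loop is only a few lines of Python (a minimal sketch; the function names are mine, and it is only safe to run on very small lists):

```python
import random

def is_sorted(a):
    """True if the list is in non-decreasing order."""
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

def bogosort(a):
    """Shuffle until sorted: check, and if not sorted, randomize again."""
    while not is_sorted(a):
        random.shuffle(a)
    return a
```

With n items there are n! orderings, so each shuffle succeeds with probability 1/n! and the expected number of shuffles explodes factorially.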

Monkey Sort appears to have a worst-case performance of O(∞), a best-case performance of O(n), and an average performance of O(n·n!).

What is the currently officially accepted sorting algorithm with the worst average sorting performance (and therefore worse than O(n·n!))?






Answers:


On the Esoteric Algorithms page by David Morgan-Mar: Intelligent Design Sort

Introduction

Intelligent Design Sort is a sorting algorithm based on the theory of intelligent design.

Description of the algorithm

The probability that the original input list is in exactly the order it is in is 1/(n!). The likelihood of this is so small that it is clearly absurd to say it happened by accident. Therefore, it must have been put into this order on purpose by an intelligent Sorter. Hence, it can be assumed that it is already optimally sorted in some way that transcends our naive mortal understanding of "ascending order". Any attempt to change that order to fit our own preconceptions would actually make it less sorted.

Analysis

This algorithm is constant in time and sorts the list in place, requiring no additional memory at all. In fact, it doesn't even require any of that suspicious technological computer stuff. Praise the Sorter!

Feedback

Gary Rogers writes:

Making the sort constant in time denies the power of The Sorter. The Sorter exists outside of time, thus the sort is timeless. To require time to validate the sort diminishes the role of The Sorter. Thus... this particular sort is flawed, and cannot be attributed to 'The Sorter'.

Heresy!







I invented (but never implemented) MiracleSort many years ago.

Check whether the array is sorted. If not, wait a while and check again. Eventually, alpha particles flipping bits in the memory chips should result in a successful sort.

For greater reliability, copy the array to a shielded location and compare the potentially sorted array with the original.

How do you compare the potentially sorted array with the original? You simply sort each array and check whether they match. MiracleSort is the obvious algorithm to use for this step.

EDIT: Strictly speaking, this is not an algorithm, since it is not guaranteed to terminate. Does "not an algorithm" qualify as a "worse algorithm"?
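A bounded sketch of the idea in Python. The max_checks and delay parameters are my additions so the sketch actually terminates; the real MiracleSort would check forever:

```python
import time

def is_sorted(a):
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

def miracle_sort(a, max_checks=3, delay=0.01):
    """Repeatedly check the array, waiting for cosmic rays to sort it in place.

    Capped at max_checks so this sketch terminates (unlike the original joke).
    """
    for _ in range(max_checks):
        if is_sorted(a):
            return a
        time.sleep(delay)  # wait for alpha particles to flip the right bits
    raise TimeoutError("no miracle occurred")
```

On an already-sorted input it returns immediately; on anything else the miracle is, in practice, not forthcoming.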







Quantum Bogosort

A sorting algorithm that assumes the many-worlds interpretation of quantum mechanics is correct:

  1. Quantumly randomize the list, such that there is no way of knowing what order the list is in until it is observed. This will divide the universe into n! universes; in exactly one of them the list is sorted.
  2. Check whether the list is sorted. If not, destroy the universe.

At the end of the algorithm, the list is sorted in the only remaining universe. This algorithm takes O(n) time in the worst case and O(1) on average. In fact, the average number of comparisons made is 2: there is a 50% chance the universe is destroyed at the second element, a 25% chance it is destroyed at the third, and so on.







I'm surprised no one has mentioned sleep sort yet... Or did I miss it? Anyway:

In terms of performance it is terrible, especially for large values. Waiting almost 3.5 months for 2 numbers to be sorted is a bit bad.







Jingle Sort as described here.

You give each value in your list to a different child for Christmas. The children, being the terrible people they are, will compare the value of their gifts and sort themselves accordingly.


I had a professor who once suggested generating a random array, checking whether it was sorted, and then checking whether its data was the same as the array to be sorted.

Best case O(n) (first try, baby!). Worst case O(never).
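A sketch of the professor's scheme, with a try cap added (my addition) so it cannot literally run for O(never):

```python
import random
from collections import Counter

def professors_sort(data, max_tries=1_000_000):
    """Generate random arrays until one is sorted AND a permutation of the input."""
    if not data:
        return []
    lo, hi = min(data), max(data)
    for _ in range(max_tries):
        guess = [random.randint(lo, hi) for _ in data]
        sorted_ok = all(guess[i] <= guess[i + 1] for i in range(len(guess) - 1))
        # Multiset comparison: the guess must contain exactly the input's values
        if sorted_ok and Counter(guess) == Counter(data):
            return guess
    raise TimeoutError("O(never) reached the cap")
```

Even for a 2-element list the success probability per try is small, and it shrinks catastrophically with both the list length and the value range.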







The worst upper bound you can reach while keeping the algorithm meaningful in any way is O(n!).

Since checking every possible permutation of the set to be sorted takes n! steps, you cannot get any worse than that.

If you take more steps than that, the algorithm no longer serves any useful purpose. Not to mention the trivially simple "sorting algorithm" with O(∞): loop forever, never sorting anything.





Bogobogosort. Yes, that's a thing. To Bogobogosort, you Bogosort the first element. Check that this one element is sorted. Being one element, it will be. Then you add the second element and Bogosort those two until they are sorted. Then you add another element and Bogosort again. Continue adding elements and Bogosorting until you have finally done every element. This is designed never to succeed with any sizable list before the heat death of the universe.
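A sketch of that prefix-growing scheme (helper names are mine; only safe to run on very small lists):

```python
import random

def is_sorted(a):
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

def bogosort(a):
    """Shuffle until sorted."""
    while not is_sorted(a):
        random.shuffle(a)
    return a

def bogobogosort(a):
    """Bogosort ever-larger prefixes: one element, then two, and so on."""
    for k in range(2, len(a) + 1):
        a[:k] = bogosort(a[:k])
    return a
```

Each prefix must independently survive a full Bogosort, so the expected work compounds factorially at every step of the outer loop.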




There is a variant called Bogobogosort. First it Bogosorts the first 2 elements. Next it checks the first 3, Bogosorts them, and so on.

Should the list be out of order at any point, it restarts by Bogosorting the first 2 again. Normal Bogosort's average complexity is bad enough; this algorithm's average complexity is vastly larger still.

Edit: To give you an idea of how large that number is: for 20 elements, this algorithm takes an average number of years far beyond the universe's proposed heat death (if it occurs).

Whereas merge sort and bubble sort finish in fractions of a second, plain Bogosort already takes on the order of centuries, assuming a year is 365.242 days and a computer performing 250,000,000 32-bit integer operations per second.

Edit2: This algorithm is not as slow as the MiracleSort "algorithm", which, like this sort, would probably get the computer sucked into a black hole before it successfully sorts 20 elements; but if it did finish, I would estimate an astronomically large average complexity, since gravity speeds up the alpha-particle bit-flipping in the chips and there are 2^N states to pass through. However, if the list started out sorted, its complexity would only be O(N), faster than merge sort, which is N log N even in the worst case.

Edit3: This algorithm actually becomes slower than MiracleSort as the size gets very large, say 1000 elements, at which point my algorithm's estimated running time overtakes MiracleSort's.



Here are 2 sorts I came up with together with my roommate in college:

1) Check if it's sorted. 2) Maybe a miracle happened; go to 1.

and

1) Check if it's sorted; if not: 2) Put each item in a packet and route it back to yourself from a remote server. Some of the packets will return in a different order, so go to 1.




There is always Bogobogosort (Bogoception!). It runs Bogosort on larger and larger subsets of the list, and starts over from scratch if the list is ever found not to be sorted.



1. Put your items to be sorted on index cards.
2. Throw them in the air a mile from your house on a windy day. (Revised: throw them into a campfire and confirm that they are completely destroyed.)
3. Check your kitchen floor for the correct order.
4. Repeat the process if the order is not correct.

The best-case scenario is O(∞).

Edited the above based on an astute observation by KennyTM.







The "What should it be?" sort:

  1. Make a note of the system time.
  2. Sort with Quicksort (or something else that makes reasonable sense) without the very last swap.
  3. Make a note of the system time.
  4. Calculate the time required. Arbitrary-precision arithmetic is a requirement.
  5. Wait the required time.
  6. Make the final swap.

Not only can it hit any conceivable O(x) value short of infinity, the time it takes is provably correct (if you can wait that long).
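The padding trick in steps 1-6 can be sketched like this, using Python's built-in sorted as a stand-in for "quicksort minus the very last swap" (the target_seconds parameter is my stand-in for the calculated required time):

```python
import time

def timed_sort(a, target_seconds=0.2):
    """Sort, then pad the runtime so the whole call takes at least target_seconds."""
    start = time.monotonic()
    result = sorted(a)  # stand-in for "quicksort without the very last swap"
    remaining = target_seconds - (time.monotonic() - start)
    if remaining > 0:
        time.sleep(remaining)  # step 5: wait the required time
    return result  # step 6: in the original joke, the final swap happens here
```

Pick target_seconds by any formula you like and the observed running time matches any complexity class you care to claim.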


Nothing can be worse than infinity.







Bozosort is a related algorithm that checks whether the list is sorted and, if not, swaps two items at random. It has the same best- and worst-case performance, but I would intuitively expect its average case to be longer than Bogosort's. It is hard to find (or produce) any data on this algorithm's performance.
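A minimal Bozosort sketch (safe only for tiny lists; termination is only almost-sure, never guaranteed):

```python
import random

def bozosort(a):
    """While the list is unsorted, swap two randomly chosen positions."""
    while any(a[i] > a[i + 1] for i in range(len(a) - 1)):
        i, j = random.randrange(len(a)), random.randrange(len(a))
        a[i], a[j] = a[j], a[i]
    return a
```

Unlike Bogosort, each step changes at most two positions, so a nearly sorted list can be un-sorted and re-sorted many times before the loop happens to exit.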


Segments of π

Suppose π contains every possible finite digit sequence. See the math.stackexchange question on this.

  1. Determine the number of digits you need based on the size of the array.