Talk:Sorting algorithm/Archive 2

Juggle Sort

I have an idea for a sorting algorithm that works similarly to selection sort, i.e. it keeps sorting the list as it goes, but using many exchanges instead of selection, much like "juggling" the numbers. Here it is:

  • Consider an array a of n integers.
  • Start from position 1 with an iterator i.
  • Iterate with j from the opposite side until you find the first element that is smaller than a[i], then swap them.
  • Stay on position 1 and repeat the previous step until no more swaps can be done. When this occurs, move to position 2, and so on.
  • The sort ends when i reaches n-1.

C++ code:

void jugglesort(int *a, int n)
{
    int i = 0, j = n - 1, modif = 0, aux;

    do
    {
        if (modif)
            j--;           /* keep scanning left of the last swap */
        else
            j = n - 1;     /* new position: restart from the right end */

        /* find the first element from the right that is smaller than a[i] */
        while (j > i && a[j] >= a[i])
            j--;

        if (j > i)
        {
            /* found one: swap it into position i */
            aux = a[j];
            a[j] = a[i];
            a[i] = aux;
            modif = 1;
        }
        else
        {
            /* nothing smaller remains: a[i] is in its final place */
            i++;
            modif = 0;
        }
    } while (i < n - 1);
}

The complexity is the same as for selection sort; however, in the worst case it runs faster. Well, I guess at worst it can be labeled as an alternate selection sort implementation.

Some tests (worst case):

10000 elements | selection sort: 0.578 s | juggle sort: 0.468 s
30000 elements | selection sort: 4.906 s | juggle sort: 3.843 s
50000 elements | selection sort: 13.625 s | juggle sort: 10.516 s

Best cases are identical for both.

So yeah, the more unsorted the array, the better juggle sort performs compared to selection sort.

Why is this? It's pretty simple: even though the number of iterations is the same, i.e. n², in the worst case selection sort has to keep performing the operations min=a[i], min_pos=i (memorizing the minimum and its position) at every iteration, plus a swap per pass. Juggle sort, by contrast, needs very few swaps to actually get the minimum into the current position (one swap in the worst case); after that it just iterates. If you implement selection sort by maximum instead of minimum, the tests are reversed: both perform the same in the worst case, but juggle sort outperforms selection sort in the best cases, for the reasons above.
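
For comparison, here is a minimal sketch of the selection sort variant being measured against (names are arbitrary; the per-iteration bookkeeping is the point of the argument above):

void selectsort(int *a, int n)
{
    int i, j, min, min_pos, aux;
    for (i = 0; i < n - 1; i++)
    {
        min = a[i];        /* bookkeeping runs on every pass... */
        min_pos = i;
        for (j = i + 1; j < n; j++)
            if (a[j] < min)
            {
                min = a[j];       /* ...and again on every smaller element */
                min_pos = j;
            }
        aux = a[i];               /* one swap per pass */
        a[i] = a[min_pos];
        a[min_pos] = aux;
    }
}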

L337 Krew (talk) 09:01, 12 February 2008 (UTC)

Keep in mind that talk pages are for discussing improvements to the article, not for disseminating and discussing your own sorting algorithm ideas - and even if your experiments demonstrated that your algorithm outcompeted every existing one, it still couldn't be added to Wikipedia due to our Wikipedia:No original research policy. In short, publish a paper about it in a conference or journal, and then we may consider its inclusion. Don't modify the articles on selection sort or bubble sort either, because this is not a well-documented variant.
Also, if you wish to pursue this research, I'd compare your algorithm against insertion sort, which tends to be the fastest of the simple O(n²) sorts. Dcoetzee 23:14, 13 February 2008 (UTC)
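
For reference, a minimal insertion sort in the same style as the code above (an illustrative sketch, not from the article):

void insertsort(int *a, int n)
{
    int i, j, key;
    for (i = 1; i < n; i++)
    {
        key = a[i];
        /* shift the larger elements of the sorted prefix one slot right */
        for (j = i - 1; j >= 0 && a[j] > key; j--)
            a[j + 1] = a[j];
        a[j + 1] = key;   /* drop a[i] into its place */
    }
}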

Problem with the table listing

When you click the little double triangle next to the column titles in the "List of sorting algorithms" table, it sorts all the rows by that column in descending order.

However, when you click on the triangle for "average speed", you get insertion sort at the start, followed by the O(n log n) algorithms, then the O(n²) algorithms, etc.

Insertion sort is listed as O(n+d), but since "d is the number of inversions, which is O(n²)", the average speed of insertion sort is in fact O(n + n²) = O(n²), so shouldn't it go below the O(n log n) algorithms like quicksort? —Preceding unsigned comment added by 124.191.82.143 (talk) 10:30, 19 March 2008 (UTC)
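
A quick check of the average case: in a uniformly random permutation, each of the n(n-1)/2 pairs is inverted with probability 1/2, so

<math>E[d] = \binom{n}{2} \cdot \frac{1}{2} = \frac{n(n-1)}{4} = \Theta(n^2), \qquad O(n + d) = O(n^2) \text{ on average.}</math>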

I agree, that does sound right to me. --Doradus (talk) 20:55, 5 August 2008 (UTC)

Comb sort complexity

The table states that the complexity of comb sort is O(n log n) for both the average and the worst case. Can anyone provide a reference for this? One poster in a recent discussion on comp.theory opined that the worst case is O(n²). It is not obvious that either the average or the worst case is O(n log n). Testing supports the idea that the average complexity is O(n log n), but testing can say nothing about the worst case. Caviare (talk) 11:01, 12 June 2008 (UTC)

This information was removed from comb sort because a reliable source is not available for it (the only article describing comb sort as such did not do a thorough analysis). Because it is similar in form to shellsort, the idea that its worst case is better than O(n²) is credible, and the idea that it might be O(n log n) in the worst case with a carefully chosen sequence is also credible. Someone recently added this to the discussion on Talk:Comb sort: "Update: Somebody did find a killer sequence! See "Combsort: shrink factor for guaranteed O(n log n) worst case time?" in comp.theory, thread started on Thu, 10 Apr 2008. --Como (talk) 15:20, 3 June 2008 (UTC)" I don't know whether to believe them, but I suggest this discussion continue there. Dcoetzee 00:56, 13 June 2008 (UTC)
OK, I have removed this information from this article as well. Caviare (talk) 10:05, 15 June 2008 (UTC)
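
For readers following the thread: comb sort is bubble sort with a shrinking gap, and the worst-case question above turns entirely on the gap (shrink factor) sequence. A minimal sketch using the commonly cited shrink factor of about 1.3 (illustrative only, not the killer-sequence analysis):

void combsort(int *a, int n)
{
    int gap = n, swapped = 1, i, aux;
    while (gap > 1 || swapped)
    {
        gap = (gap * 10) / 13;   /* shrink the gap by roughly 1.3 */
        if (gap < 1) gap = 1;
        swapped = 0;
        for (i = 0; i + gap < n; i++)
            if (a[i] > a[i + gap])   /* bubble-sort pass at distance gap */
            {
                aux = a[i];
                a[i] = a[i + gap];
                a[i + gap] = aux;
                swapped = 1;
            }
    }
}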

Math tags

Does anyone else think the recent rash of <math> tags has actually made the page look worse while simultaneously making it harder to edit? --Doradus (talk) 20:54, 5 August 2008 (UTC)

Omega is not big O

For "bad behavior", the omega symbol is used - nowhere else in the article. —Preceding unsigned comment added by 208.127.11.83 (talk) 15:59, 7 August 2008 (UTC)

And for good reason - big O cannot meaningfully be used to express a lower bound. See Big-O notation for details. Dcoetzee 22:12, 7 August 2008 (UTC)
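
For reference, the standard definitions (as given at Big-O notation):

<math>f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 : f(n) \le c \cdot g(n) \text{ for all } n \ge n_0</math>

<math>f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : f(n) \ge c \cdot g(n) \text{ for all } n \ge n_0</math>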

Changed "binary tree sort" worst case performance from "n log n" to "n^2"...

I have changed "binary tree sort" worst case performance from "n log n" to "n^2" since it conflicted with the worst case performance listed in the full "binary tree sort" article and other descriptions of this sort I found on the internet. I don't know for certain that this is correct, but at least now it is consistent. If I've made an error, please correct both this article and the "binary tree sort" article so they will remain consistent. ---twikir

Binary tree sort is O(n log n) as long as a self-balancing binary search tree is used. I'll look at this... Dcoetzee 00:02, 30 January 2009 (UTC)
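
To illustrate the distinction, a minimal unbalanced tree sort sketch (not from either article): on already-sorted input every insert walks the entire right spine, which is exactly the Θ(n²) worst case, while a self-balancing tree keeps each insert at O(log n) for O(n log n) overall.

#include <cstdlib>

struct Node { int key; Node *left, *right; };

/* Unbalanced BST insert: on sorted input each call walks the whole
   right spine, so n inserts cost 1 + 2 + ... + n = Theta(n^2). */
static Node *bst_insert(Node *t, int key)
{
    if (!t)
    {
        t = (Node *)std::malloc(sizeof(Node));
        t->key = key;
        t->left = t->right = NULL;
        return t;
    }
    if (key < t->key)
        t->left = bst_insert(t->left, key);
    else
        t->right = bst_insert(t->right, key);
    return t;
}

/* In-order traversal writes the keys back out in sorted order. */
static int flatten(Node *t, int *a, int pos)
{
    if (!t) return pos;
    pos = flatten(t->left, a, pos);
    a[pos++] = t->key;
    return flatten(t->right, a, pos);
}

void treesort(int *a, int n)
{
    Node *root = NULL;
    for (int i = 0; i < n; i++)
        root = bst_insert(root, a[i]);
    flatten(root, a, 0);   /* nodes deliberately leaked to keep the sketch short */
}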

Please add Funnel sort (a cache-efficient sort) if you have the time and a good understanding...

Please add Funnel sort (a cache-efficient sort). ---twikir

On my list of things to do. Dcoetzee 00:02, 30 January 2009 (UTC)

Regarding revised sentence in stability section

I propose the reverting of the recent edit to the stability section. My grounds are:

  • The added link is overgeneral, and I don't think a link is really needed here.
  • The paragraph is describing a specific algorithm transformation which increases the space complexity of the sort but not the time complexity. Therefore generalizing the language to include both time and space complexity is inappropriate.

The editor may have justifications I'm not aware of though, and if so I'd like to hear them. This is a pretty minor matter all around. Dcoetzee 00:02, 30 January 2009 (UTC)
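
For context, the transformation being described is presumably the standard decorate-with-index trick: tag each element with its original position and break key ties on that position, which costs O(n) extra space but leaves the time complexity of the underlying sort unchanged. A sketch of the idea (the helper name is made up):

#include <algorithm>
#include <cstddef>
#include <utility>
#include <vector>

/* Make an unstable comparison sort stable: decorate each element with
   its original index, sort the (key, index) pairs, then strip the index.
   Extra space is O(n); the sort's time complexity is unchanged. */
void sort_stably(std::vector<int> &a)
{
    std::vector<std::pair<int, std::size_t>> decorated(a.size());
    for (std::size_t i = 0; i < a.size(); i++)
        decorated[i] = std::make_pair(a[i], i);   /* (key, original position) */

    /* pairs compare lexicographically: key first, original index as tiebreaker */
    std::sort(decorated.begin(), decorated.end());

    for (std::size_t i = 0; i < a.size(); i++)
        a[i] = decorated[i].first;
}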

O( log 1 )

Could someone please tell me what the hell O( log 1 ) time is? 71.198.5.105 (talk) 11:06, 13 March 2009 (UTC)

I think O(log 1) = O(0) is nonsense! An algorithm with memory usage 0? I would say O(1) is better! 134.109.185.33 (talk) 07:59, 27 March 2009 (UTC)

O(log n) minimum memory usage

From what I know, storing integers requires O(log n) space, where n is the integer. So shouldn't the auxiliary memory for the sorts that require this information (such as cocktail sort) be at least O(log n)? CHL (aka yse) (talk) 12:54, 29 March 2009 (UTC)

Storing an integer requires O(log n) space in the magnitude of the integer, not in the quantity of integers. --Doradus (talk) 12:56, 29 March 2009 (UTC)
My point is that, say, storing the index of the swapping position normally requires an integer. This integer can be represented with a long int (at most 2^32 items, O(1) space) or with a bignum (an unlimited number of items, O(log n) space). There's currently the implicit assumption that the number of items to be sorted is less than some fixed constant, which should be removed. CHL (aka yse) (talk) 14:41, 29 March 2009 (UTC)
Oh I see what you're saying. Yes, perhaps it should. I find all the complexity discussions on this article to be rather confusing and ill-defined (as you can see from previous discussions above) but supposedly the complexities we've listed are the ones commonly given in reference texts. I guess the proper thing to do if you disagree with one of the listed complexities is to find a reference that supports your view and then change the article, citing that reference. --Doradus (talk) 14:35, 20 April 2009 (UTC)
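
The arithmetic behind the point above: an index into n items needs <math>\lceil \log_2 n \rceil</math> bits. Under the usual word-RAM assumption (machine words of at least log n bits) that is O(1) words, but counted in bits it is Θ(log n); for example, n = 2^32 needs a 32-bit index and n = 2^64 a 64-bit one.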