David A. Bader
IEEE Fellow
AAAS Fellow
Professor
College of Computing
Georgia Tech
Atlanta, GA 30332

A New Deterministic Parallel Sorting Algorithm With an Experimental Evaluation

We introduce a new deterministic parallel sorting algorithm based on the regular sampling approach. The algorithm uses only two rounds of regular all-to-all personalized communication in a scheme that yields very good load balancing with virtually no overhead. Moreover, unlike previous variations, our algorithm efficiently handles the presence of duplicate values without the overhead of tagging each element with a unique identifier. This algorithm was implemented in Split-C and run on a variety of platforms, including the Thinking Machines CM-5, the IBM SP-2-WN, and the Cray Research T3D. We ran our code using widely different benchmarks to examine the dependence of our algorithm on the input distribution. Our experimental results illustrate the efficiency and scalability of our algorithm across different platforms. In fact, the performance compares closely to that of our random sample sort algorithm, which seems to outperform all similar algorithms known to the authors on these platforms. Together, their performance is nearly invariant over the set of input distributions, unlike previous efficient algorithms. However, unlike our randomized sorting algorithm, the performance and memory requirements of our regular sorting algorithm can be deterministically guaranteed.
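For readers unfamiliar with the regular sampling approach, the following Python sketch simulates the classic scheme (sorting by regular sampling) on p simulated processors: each block is sorted locally, regularly spaced samples are gathered and sorted to choose p-1 splitters, every block is partitioned by the splitters, and the j-th pieces are routed to processor j and sorted there. This is only an illustrative simulation under our own assumptions; it does not reproduce the paper's two-round communication scheme, its load-balancing guarantees, or its duplicate handling, and all function and variable names here are ours, not the paper's.

# Illustrative simulation of sorting by regular sampling (not the paper's exact algorithm).
import random
from bisect import bisect_right

def regular_sample_sort(data, p=4):
    # Step 1: deal the input into p blocks and sort each block locally.
    blocks = [sorted(data[i::p]) for i in range(p)]

    # Step 2: each block contributes p regularly spaced samples.
    samples = []
    for block in blocks:
        step = max(1, len(block) // p)
        samples.extend(block[::step][:p])

    # Step 3: sort the samples and choose p-1 splitters.
    samples.sort()
    splitters = [samples[(i + 1) * len(samples) // p] for i in range(p - 1)]

    # Step 4: partition every block by the splitters and route the j-th
    # piece of each block to simulated processor j (the all-to-all step).
    buckets = [[] for _ in range(p)]
    for block in blocks:
        cuts = [0] + [bisect_right(block, s) for s in splitters] + [len(block)]
        for j in range(p):
            buckets[j].extend(block[cuts[j]:cuts[j + 1]])

    # Step 5: each simulated processor sorts what it received; concatenating
    # the buckets in processor order gives the globally sorted sequence.
    return [x for bucket in buckets for x in sorted(bucket)]

if __name__ == "__main__":
    xs = [random.randint(0, 999) for _ in range(10000)]
    assert regular_sample_sort(xs, p=8) == sorted(xs)
    print("regular-sampling sort agrees with sorted()")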

Publication History

Versions of this paper appeared as:
  1. University of New Mexico AHPCC-TR-98-007
  2. University of Maryland CS-TR-3670, UMIACS-TR-96-54
  3. D.R. Helman, J. JáJá, and D.A. Bader, "A New Deterministic Parallel Sorting Algorithm With an Experimental Evaluation," ACM Journal of Experimental Algorithmics, 3(4):1-24, 1998.

Download the code (psort)

Download this report in Adobe PDF

Last updated: July 25, 2004

 



