
FOCS 1987

The Average Complexity of Deterministic and Randomized Parallel Comparison Sorting Algorithms

Conference Paper · Algorithms and Complexity · Theoretical Computer Science

Abstract

In practice, the average time of (deterministic or randomized) sorting algorithms seems more relevant than the worst-case time of deterministic algorithms. Still, the many known complexity bounds for parallel comparison sorting include no nontrivial lower bounds for the average time required to sort n elements by comparisons with p processors (via deterministic or randomized algorithms). We show that for p ≥ n this time is Θ(log n / log(1 + p/n)); it is easy to show that for p ≤ n the time is Θ(n log n / p) = Θ(log n / (p/n)). Therefore even the average-case behaviour of randomized algorithms is no more efficient than the worst-case behaviour of deterministic ones.
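As a minimal sketch of the bound stated in the abstract (function name and the choice of base-2 logarithm are my own; constant factors hidden by the Θ-notation are ignored), the two regimes agree at the breakpoint p = n, where both expressions reduce to Θ(log n):

```python
import math

def parallel_sort_time_bound(n: int, p: int) -> float:
    """Order-of-growth estimate (constants ignored) of the average time
    to sort n elements by comparisons with p processors, per the abstract:
      p <= n: Theta(n log n / p)
      p >= n: Theta(log n / log(1 + p/n))
    """
    if p <= n:
        return n * math.log2(n) / p
    return math.log2(n) / math.log2(1 + p / n)

# At p = n both branches give log2(n): n*log2(n)/n on one side,
# log2(n)/log2(2) on the other, so the bound is continuous there.
```

Note how adding processors beyond p = n still helps, but only through the slowly growing log(1 + p/n) factor in the denominator.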

Authors

Keywords

  • Sorting
  • Computer science
  • Parallel algorithms
  • Read-write memory
  • Decision trees
  • Parallel Comparison
  • Sorting Algorithm
  • Average Behavior
  • Running Time
  • Proof Of Theorem
  • Time Complexity
  • Number Of Comparisons
  • Set Of Elements
  • Break Point
  • Parallel Model
  • Proof Of Proposition
  • Average Running Time
  • Parallel Algorithm
  • Legal Situation
  • Worst-case Time
  • Binary Comparisons
  • Number Of Processors
  • Special Case Of Theorem
  • N Log N
  • Worst-case Time Complexity

Context

Venue
IEEE Symposium on Foundations of Computer Science
Archive span
1975-2025
Indexed papers
3809
Paper id
953442484201022025