
MFCS 2012

Smoothed Complexity Theory

Conference Paper · Accepted Paper · Algorithms and Complexity · Theoretical Computer Science

Abstract

Smoothed analysis is a new way of analyzing algorithms introduced by Spielman and Teng (J. ACM, 2004). Classical methods like worst-case and average-case analysis have accompanying complexity classes, such as P and Avg-P, respectively. While worst-case or average-case analysis gives us a means to talk about the running time of a particular algorithm, complexity classes allow us to talk about the inherent difficulty of problems. Smoothed analysis is a hybrid of worst-case and average-case analysis and compensates for some of their drawbacks. Despite its success for the analysis of single algorithms and problems, there is no embedding of smoothed analysis into computational complexity theory, which is necessary to classify problems according to their intrinsic difficulty. We propose a framework for smoothed complexity theory, define the relevant classes, and prove some first results.
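For readers unfamiliar with the notion the abstract builds on, the following is a sketch of the standard Spielman–Teng definition of polynomial smoothed complexity (from their 2004 J. ACM paper cited above); the paper's own complexity classes may refine this, and the symbols here (algorithm A, running time T_A, perturbation distribution N_σ) are illustrative assumptions, not the paper's notation.

```latex
% An algorithm A has polynomial smoothed complexity if its expected
% running time over sigma-perturbations of any input is polynomially
% bounded in the input size n and the inverse perturbation magnitude:
%
%   max over inputs x of size n of
%   E_{y ~ N_sigma(x)} [ T_A(y) ]  <=  poly(n, 1/sigma)
%
\[
  \max_{x \in \{0,1\}^n} \;
  \operatorname*{\mathbb{E}}_{y \sim \mathcal{N}_\sigma(x)}
  \bigl[ T_A(y) \bigr]
  \;\le\; \mathrm{poly}\!\left(n, \tfrac{1}{\sigma}\right)
\]
```

As σ → 0 the perturbation vanishes and the bound approaches worst-case analysis; for large σ the perturbed input is dominated by randomness and the bound approaches average-case analysis, which is the hybrid character the abstract refers to.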

Authors

Keywords

No keywords are indexed for this paper.

Context

Venue
International Symposium on Mathematical Foundations of Computer Science
Archive span
1973-2025
Indexed papers
3045
Paper id
587052567512379981