STOC Conference 2025 Conference Paper
Maximum Circuit Lower Bounds for Exponential-Time Arthur Merlin
- Lijie Chen 0001
- Jiatu Li
- Jingxun Liang
FOCS Conference 2024 Conference Paper
This paper revisits the study of two classical technical tools in theoretical computer science: Yao's transformation of distinguishers into next-bit predictors (FOCS 1982), and the “reconstruction paradigm” in pseudorandomness (e.g., as in Nisan and Wigderson, JCSS 1994). Recent works of Pyne, Raz, and Zhan (FOCS 2023) and Doron, Pyne, and Tell (STOC 2024) showed that both of these tools can be derandomized in the specific context of read-once branching programs (ROBPs), but left open the question of derandomizing them in more general settings. Our main contributions give appealing evidence that derandomization of the two tools is possible in general settings, show surprisingly strong consequences of such derandomization, and reveal several new settings where such derandomization is unconditionally possible for algorithms stronger than ROBPs (with useful consequences). Specifically:
• We show that derandomizing these tools is equivalent to general derandomization. Specifically, derandomizing distinguish-to-predict transformations is equivalent to prBPP = prP, and the existence of derandomized reconstruction procedures (in a more general sense that we introduce) is equivalent to prBPP = prZPP. These statements hold even when scaled down to weak circuit classes and to algorithms that run in super-polynomial time.
• Our main technical contributions are unconditional constructions of derandomized versions of Yao's transformation (or reductions of this task to other problems) for classes and algorithms beyond ROBPs. Consequently, we deduce new results: a significant relaxation of the hypotheses required to derandomize the isolation lemma for logspace algorithms and deduce that NL = UL; and proofs that derandomization necessitates targeted PRGs in catalytic logspace (unconditionally) and in logspace (conditionally).
In addition, we introduce a natural subclass of prZPP that has been implicitly studied in recent works (Korten FOCS 2021, CCC 2022): The class of problems reducible to a problem called “Lossy Code”. We provide a structural characterization for this class in terms of derandomized reconstruction procedures, and show that this characterization is robust to several natural variations. Lastly, we present alternative proofs for classical results in the theory of pseudorandomness (such as two-sided derandomization reducing to one-sided), relying on the notion of deterministically transforming distinguishers to predictors as the main technical tool.
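As background for the first tool revisited above, the classical (randomized) distinguisher-to-predictor transformation can be illustrated concretely: given a distinguisher with advantage ε against an m-bit source, a hybrid argument yields a next-bit predictor with advantage ε/m at some position, by padding a prefix with uniform bits, querying the distinguisher, and keeping or flipping a guessed bit. The toy Python sketch below (the source, distinguisher, and all parameters are illustrative choices of ours, not from the paper) checks this empirically at the last position:

```python
import random

M = 8  # output length of the toy source

def sample_pseudo():
    # Toy non-uniform source: each bit copies the previous one with
    # probability 0.9, so adjacent bits tend to agree.
    bits = [random.randint(0, 1)]
    for _ in range(M - 1):
        bits.append(bits[-1] if random.random() < 0.9 else 1 - bits[-1])
    return bits

def distinguisher(bits):
    # Accepts strings in which many adjacent bits agree.
    agree = sum(bits[i] == bits[i + 1] for i in range(M - 1))
    return agree >= (M - 1) * 0.7

def predictor(i, prefix):
    # Yao-style predictor for bit i: guess a bit b, pad the rest with
    # uniform bits, and keep b iff the distinguisher accepts.
    b = random.randint(0, 1)
    padded = prefix + [b] + [random.randint(0, 1) for _ in range(M - i - 1)]
    return b if distinguisher(padded) else 1 - b

def predictor_advantage(i, trials=30000):
    hits = 0
    for _ in range(trials):
        x = sample_pseudo()
        hits += predictor(i, x[:i]) == x[i]
    return hits / trials  # above 1/2 means nontrivial prediction

# The hybrid argument guarantees that SOME position is predictable;
# here we simply test the last position empirically.
adv = predictor_advantage(M - 1)
print(adv)
```

The predictor's only randomness is the guessed bit and the padding; derandomizing exactly this sampling step is what the paper studies.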
STOC Conference 2024 Conference Paper
FOCS Conference 2024 Conference Paper
Reverse mathematics is a program in mathematical logic that seeks to determine which axioms are necessary to prove a given theorem. In this work, we systematically explore the reverse mathematics of complexity lower bounds. We explore reversals in the setting of bounded arithmetic, with Cook's theory PV1 as the base theory, and show that several natural lower bound statements about communication complexity, error correcting codes, and Turing machines are equivalent to widely investigated combinatorial principles, such as the weak pigeonhole principle for polynomial-time functions and its variants. As a consequence, complexity lower bounds can be formally seen as fundamental mathematical axioms with far-reaching implications. The proof-theoretic equivalence between complexity lower bound statements and combinatorial principles yields several new implications for the (un)provability of lower bounds. Among other results, we derive the following consequences:
• Under a plausible cryptographic assumption, the classical Ω(n^2)-time lower bound for Palindrome on single-tape Turing machines is unprovable in Jeřábek's theory APC1. The conditional unprovability of this simple lower bound goes against the intuition, shared by some researchers, that most complexity lower bounds could be established in APC1.
• While APC1 proves one-way communication lower bounds for Set Disjointness, it does not prove one-way communication lower bounds for Equality, under a plausible cryptographic assumption.
• An amplification phenomenon connected to the (un)provability of some lower bounds, under which a quantitatively weak lower bound is provable if and only if a stronger (and often tight) n^c lower bound is provable.
• Feasibly definable randomized algorithms can be feasibly defined deterministically (i.e., APC1 is conservative over PV1) if and only if the one-way communication complexity lower bound for Set Disjointness is provable in PV1.
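As intuition for the contrast between Equality and Set Disjointness above: with shared randomness, Equality has a one-way protocol using only a few bits of communication via random fingerprints, the classical counterpoint to its deterministic one-way hardness. A textbook sketch in Python (the standard GF(2) inner-product fingerprint; all names and parameters are ours, not the paper's):

```python
import random

def gf2_fingerprint(bits, seeds):
    # k inner products over GF(2) with shared random seed vectors.
    return [sum(a & b for a, b in zip(bits, s)) % 2 for s in seeds]

def one_way_equality(x_bits, y_bits, k=32):
    # Shared randomness: k random n-bit vectors known to both parties.
    n = len(x_bits)
    seeds = [[random.randint(0, 1) for _ in range(n)] for _ in range(k)]
    # Alice's single k-bit message is her fingerprint.
    msg = gf2_fingerprint(x_bits, seeds)
    # Bob compares against his own fingerprint. If x != y, each of the
    # k message bits differs with probability 1/2 independently, so the
    # false-positive probability is 2^-k.
    return msg == gf2_fingerprint(y_bits, seeds)

x = [random.randint(0, 1) for _ in range(64)]
y = x[:]
y[0] ^= 1  # y differs from x in one position
```

Equal inputs always accept, while the unequal pair above is rejected except with probability 2^-32.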
STOC Conference 2023 Conference Paper
The range avoidance problem (denoted Avoid) asks to find a string outside the range of a given circuit C : {0,1}^n → {0,1}^m, where m > n. Although at least half of the strings of length m are correct answers, it is not clear how to deterministically find one. Recent results of Korten (FOCS'21) and Ren, Santhanam, and Wang (FOCS'22) show that efficient deterministic algorithms for Avoid would have far-reaching consequences, including strong circuit lower bounds and explicit constructions of combinatorial objects (e.g., Ramsey graphs, extractors, rigid matrices). This strongly motivates the question: does an efficient deterministic algorithm for Avoid actually exist? In this work, we prove, under the existence of subexponentially secure indistinguishability obfuscation (iO), that polynomial-time deterministic algorithms for Avoid imply NP = coNP. Combining this with Jain, Lin, and Sahai's recent breakthrough construction of iO from well-founded assumptions (STOC'21, EUROCRYPT'22), we provide the first plausible evidence that Avoid has no efficient deterministic algorithm. Moreover, we also prove the hardness of Avoid based on polynomially secure iO and a weaker variant of the Nondeterministic Exponential Time Hypothesis (NETH). Extending our techniques, we prove a surprising separation in bounded arithmetic, conditioned on similar assumptions. Assuming subexponentially secure iO and that coNP is not infinitely often in NP, we show that Avoid has no deterministic polynomial-time algorithm even when we are allowed O(1) queries to an oracle that can invert the given circuit on an arbitrarily chosen m-bit string. It follows that the dual Weak Pigeonhole Principle, the combinatorial principle underlying Avoid, is not provable in Cook's theory PV1. This gives (under plausible assumptions) the first separation between Cook's theory PV1 for polynomial-time reasoning and Jeřábek's theory APC1 for probabilistic polynomial-time reasoning.
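To make the problem concrete, the toy Python sketch below (the "circuit" is a stand-in function of our own choosing, not from the paper) contrasts the trivial exponential-time deterministic algorithm, which enumerates the at most 2^n range elements, with the one-line randomized algorithm whose derandomization is exactly what is at issue:

```python
import random
from itertools import product

N, M = 3, 5  # toy parameters with M > N

def toy_circuit(x):
    # Stand-in for a circuit C: {0,1}^N -> {0,1}^M (any fixed function
    # works for the demo); x is a bit tuple, output is a bit tuple.
    val = int("".join(map(str, x)), 2)
    out = (val * 5 + 3) % (1 << M)
    return tuple((out >> i) & 1 for i in range(M))

def avoid_brute_force():
    # Deterministic but exponential in N: enumerate the image, then
    # scan {0,1}^M for a string outside it.
    image = {toy_circuit(x) for x in product((0, 1), repeat=N)}
    for cand in product((0, 1), repeat=M):
        if cand not in image:
            return cand

def avoid_random_once():
    # A single uniform sample lies outside the range with probability
    # at least 1 - 2^(N-M) = 3/4 here, but certifying it is the rub.
    return tuple(random.randint(0, 1) for _ in range(M))

y = avoid_brute_force()
print(y)
```

The randomized sampler succeeds with high probability yet offers no efficient way to verify its answer, which is why derandomizing Avoid is the interesting regime.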
STOC Conference 2022 Conference Paper
Investigating the computational resources needed for cryptography is an essential task of both theoretical and practical interest. This paper provides answers to this question for pseudorandom functions (PRFs). We resolve the exact complexity of PRFs by proving tight upper and lower bounds for various circuit models.
* PRFs can be constructed by general circuits of size 2n + o(n), assuming the existence of polynomial-size PRFs, simplifying and improving the O(n) upper bound of Ishai, Kushilevitz, Ostrovsky, and Sahai (STOC 2008). Moreover, if PRFs exist in NC1, we can further guarantee that the depth of our construction is (1+ε) log n. We show that our construction is almost optimal by giving an unconditional 2n − O(1) lower bound.
* PRFs can be constructed by AC0[2] circuits with o(n) gates and 2n + o(n) wires, assuming the existence of polynomial-size AC0[2] PRFs. We show the optimality of our construction with a 2n + Ω(√n) wire complexity lower bound.
* PRFs can be constructed with wire complexity n^(1+O(1.61^−d)) in depth-d TC0 circuits, assuming the existence of polynomial-size TC0 PRFs. We also present an n^(1+Ω(c^−d)) wire complexity lower bound against depth-d TC0 circuits for some c > 1.61.
As a byproduct, we prove unconditional tight upper and lower bounds for "almost" universal hash functions, which are of independent interest. Following the natural proof barrier of Razborov and Rudich (J. Comput. Syst. Sci. 1997), we observe that our exact complexity results are closely related to "bootstrapping phenomena" in circuit complexity (such as hardness magnification and quantified derandomization). We introduce the black-box natural proof barrier and show that a large range of techniques for bootstrapping results cannot be combined with "black-box" lower bound proofs to obtain a breakthrough.
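As context for the hashing byproduct: the textbook example of a universal hash family (a generic construction, not the paper's near-optimal one, whose point is precisely to beat the wire cost of such generic families) is the random GF(2) linear map h_A(x) = A·x, whose collision probability for any fixed pair x ≠ y is exactly 2^-m. A small empirical Python check (all parameters are our own illustrative choices):

```python
import random

N_IN, N_OUT = 8, 4  # hash from 8 bits down to 4 bits

def random_linear_hash():
    # h_A(x) = A·x over GF(2): the classical universal hash family.
    A = [[random.randint(0, 1) for _ in range(N_IN)] for _ in range(N_OUT)]
    def h(x):
        return tuple(sum(a & b for a, b in zip(row, x)) % 2 for row in A)
    return h

def empirical_collision_rate(x, y, trials=5000):
    # For fixed x != y: Pr_A[h(x) = h(y)] = Pr_A[A(x XOR y) = 0]
    # = 2^-N_OUT, since each output bit of A(x XOR y) is uniform.
    hits = 0
    for _ in range(trials):
        h = random_linear_hash()
        hits += h(x) == h(y)
    return hits / trials

x = (1,) + (0,) * (N_IN - 1)
y = (0,) * N_IN
rate = empirical_collision_rate(x, y)
print(rate)  # expected to be near 2^-4 = 0.0625
```

"Almost" universal families, as in the paper, relax the collision bound from exactly 2^-m to slightly larger, which is what enables the smaller circuits.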