Decision problems are a special case of estimation problems, so to get a time hierarchy for estimation problems it is sufficient to prove a time hierarchy for probabilistic algorithms. To see this, consider a decision problem D ⊆ {0,1}* and an estimator Q. We can construct a probabilistic algorithm A s.t. Pr[A(x)=1] = E[Q(x)] (run Q and output 1 with probability equal to its result). We have Pr[A(x) ≠ χ_D(x)] ≤ √(E[(Q(x) − χ_D(x))²]). Conversely, a probabilistic algorithm can be trivially interpreted as an estimator. So, a probabilistic algorithm with time complexity t(k) and success probability 5/6 gives an estimator with time complexity t(k) and error 1/6. On the other hand, there is some polynomial p s.t. if there is no probabilistic algorithm with time complexity p(t(k)) and success probability 5/6, then there is no estimator with time complexity t(k) and error 1/5 (we should assume that p is sufficient to amplify error probability 1/√5 to 1/6; for reasonable computational models a linear polynomial is enough).
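For completeness, the error bound on A is a one-line Jensen argument (assuming, as usual, that Q outputs values in [0,1]):

```latex
\Pr[A(x) \neq \chi_D(x)]
  = \left|\, \mathbb{E}[Q(x)] - \chi_D(x) \,\right|
    % A(x) is Bernoulli with mean \mathbb{E}[Q(x)]
  = \left|\, \mathbb{E}[Q(x) - \chi_D(x)] \,\right|
    % \chi_D(x) is a constant for fixed x
  \leq \mathbb{E}\!\left[\, \left|Q(x) - \chi_D(x)\right| \,\right]
    % Jensen / triangle inequality
  \leq \sqrt{\mathbb{E}\!\left[(Q(x) - \chi_D(x))^2\right]}
    % Jensen again (or Cauchy--Schwarz)
```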
Now, you’re right that there is no complete proof of a probabilistic time hierarchy, however:
a. The hierarchy is known given O(1) advice.
b. The hierarchy follows given sufficient derandomization assumptions.
I agree that a probabilistic time hierarchy would imply the hierarchy for estimation problems. A time hierarchy is actually known with literally 1 bit of advice.
I believe that the time hierarchy with O(1) advice also implies a time hierarchy for estimation problems. Consider the advice machine for the hard problem running with advice 0 and the same machine running with advice 1. Each of those estimates some function. If both of these functions are quickly estimable, then there is a fast advice machine which solves the hard problem (it uses its advice bit to decide which of the fast estimators to run), contradicting the time hierarchy with advice.
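The dispatch step can be sketched in code. Everything here is hypothetical and purely illustrative: the two "fast estimators" stand in for estimators of the functions computed by the hard advice machine with advice 0 and advice 1 respectively.

```python
def dispatch_machine(x, advice_bit, fast_estimators):
    """Use the O(1) advice bit to pick which fast estimator to run,
    then round its estimate to a decision."""
    estimate = fast_estimators[advice_bit](x)  # estimate of f_{advice_bit}(x)
    return 1 if estimate > 0.5 else 0

# Toy stand-ins for the fast estimators of f_0 and f_1 (not real estimators,
# just functions with the right interface: they return values in [0, 1]).
fast_estimators = {
    0: lambda x: 0.9 if x % 2 == 0 else 0.1,  # "estimates" that x is even
    1: lambda x: 0.9 if x % 2 == 1 else 0.1,  # "estimates" that x is odd
}
```

If both estimators really were fast, `dispatch_machine` would itself be a fast advice machine for the hard problem, which is exactly the contradiction the argument needs.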
In general, estimation problems seem much nicer than decision problems when we are talking about probabilistic algorithms. For example, every algorithm effectively solves some estimation problem, whereas most probabilistic algorithms don’t solve any decision problem (this is why we can’t eliminate the advice). Relatedly, there are complete estimation problems but no natural complete decision problems for probabilistic time (and hence no natural candidates for a concrete problem to demonstrate probabilistic time hierarchy, which would make our inability to prove it significantly less interesting).
I feel like there should be some intuitive claims that don't work for BPP because it is kind of a terrible class, but which do work for estimation problems.
Good point! More precisely, an arbitrary estimator Q may have significant error w.r.t. E[Q(x)] because Var[Q(x)] doesn't have to be small, but this doesn't matter because we can always amplify Q by running it multiple times and averaging the results. So we still get a hierarchy, just slightly less tight (but this tightness is not very interesting, since it depends on the computational model anyway).
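A minimal sketch of the amplification step. The estimator here is a toy stand-in (an unbiased but very noisy estimator of the constant 1/2); the point is only that averaging n independent runs shrinks the variance by a factor of n.

```python
import random

def amplify(estimator, x, n):
    """Run the estimator n times independently and average the results.

    The average has the same expectation as a single run, but its
    variance is Var[Q(x)] / n, so by Chebyshev's inequality the
    estimate concentrates around E[Q(x)] as n grows.
    """
    return sum(estimator(x) for _ in range(n)) / n

# Toy estimator: unbiased for the constant function 1/2, but very noisy.
def noisy_estimator(x):
    return random.random()  # uniform on [0, 1]: mean 1/2, variance 1/12
```

A single run of `noisy_estimator` is nearly useless as an estimate, but `amplify(noisy_estimator, x, 10000)` lands within a few hundredths of 1/2 with overwhelming probability, which is why the variance of the original estimator costs us only a polynomial factor in the hierarchy.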