I agree that a probabilistic time hierarchy would imply a time hierarchy for estimation problems. A probabilistic time hierarchy is actually known with literally 1 bit of advice.
I believe that the time hierarchy with O(1) advice also implies a time hierarchy for estimation problems. Consider the advice machine for the hard problem running with advice 0, and the same machine running with advice 1. Each of those estimates some function. If both of these functions were quickly estimable, then there would be a fast advice machine solving the hard problem (it uses its advice bit to decide which of the two fast estimators to run), contradicting the time hierarchy with advice.
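The dispatch step of the argument above can be sketched in code. This is only an illustration of the reduction's structure, not an actual complexity-theoretic construction; `fast_estimator_0` and `fast_estimator_1` are hypothetical stand-ins for the assumed fast estimators of the two functions computed by the hard machine with advice 0 and advice 1.

```python
def advice_machine(x, advice_bit, fast_estimator_0, fast_estimator_1):
    """Hypothetical fast advice machine: the single advice bit selects
    which of the two assumed fast estimators to run on input x.

    If both estimators are fast, this whole machine is fast, which is
    what contradicts the time hierarchy with advice."""
    if advice_bit == 0:
        return fast_estimator_0(x)
    return fast_estimator_1(x)
```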
In general, estimation problems seem much nicer than decision problems when we are talking about probabilistic algorithms. For example, every algorithm effectively solves some estimation problem, whereas most probabilistic algorithms don’t solve any decision problem (this is why we can’t eliminate the advice). Relatedly, there are complete estimation problems but no natural complete decision problems for probabilistic time (and hence no natural candidates for a concrete problem to demonstrate probabilistic time hierarchy, which would make our inability to prove it significantly less interesting).
I feel like there should be some intuitive claims that don’t work for BPP because it is a kind of terrible class, but which do work for estimation problems.
Good point! More precisely, an arbitrary estimator Q may have significant error w.r.t. E[Q(x)], because Var[Q(x)] doesn't have to be small. But this doesn't matter, because we can always amplify Q by running it multiple times and averaging the results. So we still get a hierarchy, just slightly less tight (and this tightness is not very interesting anyway, since it depends on the computational model).
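The amplification step is just averaging independent runs, which cuts the variance by a factor of k (Var of the sample mean = Var[Q(x)]/k). A minimal sketch, using a toy high-variance estimator (`noisy_q` is a made-up example, not anything from the discussion above):

```python
import random

def amplify(estimator, x, k):
    """Average k independent runs of a randomized estimator.
    The mean of k i.i.d. runs has variance Var[Q(x)]/k, so the
    amplified estimator concentrates around E[Q(x)]."""
    return sum(estimator(x) for _ in range(k)) / k

# Toy estimator: unbiased for E[Q(x)] = 0.5, but with large variance
# (a single run is uniform on [0, 1], variance 1/12).
def noisy_q(x):
    return random.random()
```

A single run of `noisy_q` can be anywhere in [0, 1], but `amplify(noisy_q, x, 10000)` will be close to 0.5 with high probability, which is all the hierarchy argument needs.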