Competing With Besov Spaces
This problem belongs to competitive on-line prediction. At present there are two techniques for competing with prediction rules in Besov spaces $B^s_{p,q}$: defensive forecasting (DF) and the strong aggregating algorithm (SAA) combined with metric entropy. However, the loss bounds established for these techniques are not comparable: for some Besov spaces DF is better, and for others the SAA is better. The problem is to find a technique that dominates all other techniques. This might be achieved by improving the loss bounds for one of the known techniques or by finding a new technique.
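The setting is the standard protocol of competing with prediction rules; the following is a sketch under assumed notation ($x_t$ for signals, $\gamma_t$ for the learner's predictions, $y_t$ for outcomes, $\lambda$ for the loss function):

```latex
% Protocol (competitive on-line prediction), sketched:
%   FOR t = 1, 2, ...:
%     Reality announces a signal x_t.
%     Learner announces a prediction gamma_t.
%     Reality announces an outcome y_t.
% Learner's regret after N rounds against a prediction rule D:
R_N(D) \;:=\; \sum_{t=1}^{N} \lambda(y_t,\gamma_t)
        \;-\; \sum_{t=1}^{N} \lambda\bigl(y_t, D(x_t)\bigr).
```

The bounds discussed below control the regret uniformly over prediction rules $D$, typically over the unit ball of the benchmark class.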
The parameter $q$ in $B^s_{p,q}$ is much less important than the other two parameters, so the essence of the problem can be understood already for the Slobodetsky spaces $B^s_{p,p}$. Suppose the signal space is a bounded domain $\Omega \subseteq \mathbb{R}^n$; this will be the domain of the functions in $B^s_{p,p}$. We are competing with the prediction strategies $x \mapsto D(x)$, where $x \in \Omega$ is the signal and $D \in B^s_{p,p}(\Omega)$ (see the protocol for competing with prediction rules; $B^s_{p,p}(\Omega)$ is our benchmark class). We always assume that $s > n/p$. This condition ensures that the elements of $B^s_{p,p}(\Omega)$ are continuous functions on $\Omega$; in the case $s < n/p$ the elements of $B^s_{p,p}(\Omega)$ are not even genuine functions but equivalence classes of functions (and so $D(x)$ is undefined for $D$ in such $B^s_{p,p}(\Omega)$). Intuitively, the parameter $p$ is responsible for the rotundity of the unit ball (the unit ball is perfectly round when $p = 2$, and the spaces $B^s_{2,2}$ are Hilbert spaces), and the parameter $s$ is responsible for the smoothness of the functions in the space. In the perfectly mixable case (such as the square loss function with bounded outcomes), the known upper bound on the regret is of order:
- $N^{1-1/p}$ when the defensive forecasting technique is used, assuming $p \ge 2$. There is no dependence on $s$.
- $N^{n/(n+s)}$, proved by the metric entropy technique. There is no dependence on $p$.
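For orientation, here is a hedged sketch of how a bound of this shape arises; the entropy estimate $\log\mathcal{N}(\epsilon) \asymp \epsilon^{-n/s}$ for the unit ball of $B^s_{p,p}(\Omega)$ in the uniform norm, and the constants, are assumptions of the sketch. The SAA is run over a finite $\epsilon$-net of the benchmark class: mixability contributes $O(\log\mathcal{N}(\epsilon))$ to the regret, and the approximation error over $N$ rounds contributes $O(\epsilon N)$:

```latex
\text{Regret} \;\lesssim\; \epsilon N + \log\mathcal{N}(\epsilon)
             \;\asymp\;   \epsilon N + \epsilon^{-n/s}.
% Balancing the two terms with epsilon = N^{-s/(n+s)}:
\epsilon N \;=\; \epsilon^{-n/s} \;=\; N^{\,n/(n+s)}.
```

Note that the smoothness $s$ enters only through the entropy of the unit ball, which is why this bound carries no dependence on $p$.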
Perhaps a better bound will take both $p$ and $s$ into account.
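The incomparability of the two bounds is easy to see numerically. The following sketch compares the regret exponents $1-1/p$ (DF) and $n/(n+s)$ (SAA with metric entropy) for a few illustrative parameter choices; the specific values of $p$, $s$, $n$ are chosen for illustration only.

```python
def df_exponent(p):
    # Defensive forecasting: regret of order N^(1 - 1/p), assuming p >= 2.
    return 1 - 1 / p

def me_exponent(s, n):
    # SAA + metric entropy: regret of order N^(n / (n + s)).
    return n / (n + s)

# Illustrative parameter choices (all satisfy s > n/p).
for p, s, n in [(2, 2, 1), (2, 0.6, 1), (4, 1, 1)]:
    df, me = df_exponent(p), me_exponent(s, n)
    winner = "DF" if df < me else ("metric entropy" if me < df else "tie")
    print(f"p={p}, s={s}, n={n}: DF exponent {df:.3f}, "
          f"ME exponent {me:.3f} -> {winner} better")
```

For smooth Hilbert-type spaces (e.g. $p=2$, $s=2$, $n=1$) the metric-entropy exponent is smaller, while for rough spaces (e.g. $p=2$, $s=0.6$, $n=1$) DF wins, which is exactly the incomparability described above.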