Competing With Besov Spaces

This problem belongs to competitive on-line prediction. At present there are two techniques for competing with prediction rules in Besov spaces $B^s_{p,q}$: defensive forecasting (DF) and the strong aggregating algorithm (SAA) combined with metric entropy. However, the loss bounds established for these techniques are incomparable: for some Besov spaces DF gives the better bound, and for others the SAA does. The problem is to find a technique that dominates both. This might be achieved by improving the loss bound for one of the known techniques or by finding a new technique.
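To fix ideas, the underlying on-line protocol (the learner sees a signal, announces a prediction, the outcome is revealed, and cumulative loss is compared with the best prediction rule in a benchmark class) can be sketched as follows. This is a minimal illustration under assumptions of my own: square loss, and a small finite pool of smooth functions standing in for the benchmark class, whereas in this problem the benchmark class is a ball in a Besov space.

```python
import math

def run_protocol(signals, outcomes, learner, benchmark):
    """One pass of the on-line prediction protocol with square loss.

    At each step n the learner sees signal x_n and announces gamma_n;
    then outcome y_n is revealed, and the learner and every benchmark
    rule f suffer the loss (y_n - prediction)^2.
    """
    learner_loss = 0.0
    rule_losses = [0.0] * len(benchmark)
    for x, y in zip(signals, outcomes):
        gamma = learner(x)
        learner_loss += (y - gamma) ** 2
        for k, f in enumerate(benchmark):
            rule_losses[k] += (y - f(x)) ** 2
    # Regret: the learner's extra loss over the best rule in the class.
    return learner_loss - min(rule_losses)

# Illustrative benchmark pool: a few smooth functions on [0, 1]
# (hypothetical stand-ins for elements of a Besov ball).
benchmark = [lambda x: 0.5, lambda x: x, lambda x: math.sin(x)]
signals = [0.1 * n for n in range(10)]
outcomes = [x ** 2 for x in signals]
regret = run_protocol(signals, outcomes, lambda x: 0.5, benchmark)
```

Here the constant learner is outperformed by the rule $f(x)=x$ on these outcomes, so the regret is strictly positive; the techniques below aim to keep such regret small uniformly over the whole benchmark class.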

The parameter $q$ in $B^s_{p,q}$ is much less important than the other two parameters, so the essence of the problem can already be seen in the Slobodetsky spaces $B^s_p:=B^s_{p,p}$. Suppose the signal space is a bounded domain $X\subseteq\mathbb{R}^m$; this will be the domain of the functions in $B^s_p=B^s_p(X)$. We are competing with the prediction strategies $\gamma_n:=f(x_n)$, where $x_n$ is the signal and $f\in B^s_p$ (see the protocol for competing with prediction rules; $B^s_p$ is our benchmark class).

We always assume that $sp > m$. This condition ensures that the elements of $B^s_p$ are continuous functions on $X$; in the case $sp\le m$ the elements of $B^s_p$ are not even genuine functions but equivalence classes of functions, so $f(x_n)$ is undefined for $f$ in such a space. Intuitively, the parameter $p$ is responsible for the rotundity of the unit ball (the ball is perfectly round when $p=2$, and the spaces $B^s_2$ are Hilbert spaces), and the parameter $s$ is responsible for the smoothness of the functions in the space.

In the perfectly mixable case (such as the square loss function for bounded $y_n$), the known upper bound on the regret is of order:

Perhaps a better bound will take both $s$ and $p$ into account.
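The metric-entropy route behind the SAA bound can be illustrated in miniature: replace the Besov ball by a finite pool of $K$ prediction rules (in the real argument, an $\varepsilon$-net of the ball whose size is controlled by the metric entropy) and run the aggregating algorithm for the square loss, which is perfectly mixable with learning rate $\eta=2$ for outcomes in $[0,1]$, giving regret at most $(\ln K)/2$ against the pool. Everything in the sketch below, from the grid of constant experts to the variable names, is an illustrative assumption, not the construction from the literature.

```python
import math

def aggregating_algorithm(expert_preds, outcomes, eta=2.0):
    """Aggregating algorithm for square loss with outcomes in [0, 1].

    Guarantees cumulative loss <= min_k L_k + (ln K) / eta,
    since the square loss on [0, 1] is eta-mixable for eta <= 2.
    """
    K = len(expert_preds)
    log_w = [0.0] * K  # log-weights; a common additive constant is harmless
    total_loss = 0.0
    for t, y in enumerate(outcomes):
        preds = [expert_preds[k][t] for k in range(K)]
        # Generalized prediction g(y') = -(1/eta) ln sum_k w_k e^{-eta (y'-gamma_k)^2},
        # computed with the max trick for numerical stability.
        def g(y_prime):
            m = max(log_w)
            s = sum(math.exp(lw - m - eta * (y_prime - p) ** 2)
                    for lw, p in zip(log_w, preds))
            return -(m + math.log(s)) / eta
        # Substitution function: gamma with gamma^2 - (1-gamma)^2 = g(0) - g(1).
        gamma = min(1.0, max(0.0, (1.0 + g(0.0) - g(1.0)) / 2.0))
        total_loss += (y - gamma) ** 2
        # Exponential weight update by each expert's loss.
        for k in range(K):
            log_w[k] -= eta * (y - preds[k]) ** 2
    return total_loss

# Toy "net": K constant experts on a grid covering [0, 1].
K = 11
outcomes = [((7 * t) % 10) / 10.0 for t in range(50)]
experts = [[k / (K - 1)] * len(outcomes) for k in range(K)]
aa_loss = aggregating_algorithm(experts, outcomes)
best = min(sum((y - p) ** 2 for y, p in zip(outcomes, e)) for e in experts)
```

In the actual argument the additive term $(\ln K)/2$ is traded off against the approximation error of the $\varepsilon$-net, and the size of the net is where the parameters of the Besov space enter the bound.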
