# Aggregating Algorithm Regression

Aggregating Algorithm Regression (AAR) is an application of the Aggregating Algorithm to the online linear regression setting (see Vovk (2001)).

For the online linear regression problem with bounded signals, $\|x_t\|_\infty \le B$, and bounded outcomes, $|y_t| \le Y$, Vovk (2001) proves that for the square-loss function the cumulative loss $L_T$ of AAR satisfies

$$L_T \le \inf_\theta \left( L_T(\theta) + a\|\theta\|_2^2 \right) + nY^2 \ln\!\left(\frac{TB^2}{a} + 1\right),$$

where $L_T(\theta)$ is the cumulative loss of the linear function $x \mapsto \theta' x$ and $n$ is the dimension of the signals.

The algorithm works as follows (all vectors are column vectors):

1. Initialize $b := 0$ and the matrix $A := aI$.
2. FOR $t = 1, 2, \dots$
   1. Read the signal $x_t \in \mathbb{R}^n$.
   2. Update $A := A + x_t x_t'$.
   3. Predict $\gamma_t := b' A^{-1} x_t$.
   4. Read the outcome $y_t \in \mathbb{R}$.
   5. Update $b := b + y_t x_t$.
3. END FOR
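As a concrete sketch, the loop above can be written in Python with NumPy (the function name `aar_predictions` and the array layout are illustrative, not from the source):

```python
import numpy as np

def aar_predictions(X, y, a=1.0):
    """Run AAR online over the rows of X: X is (T, n), y is (T,),
    a > 0 is the ridge parameter.  Returns the T online predictions."""
    T, n = X.shape
    A = a * np.eye(n)          # A := aI
    b = np.zeros(n)            # b := 0
    preds = np.empty(T)
    for t in range(T):
        x = X[t]
        A += np.outer(x, x)                   # A := A + x_t x_t'  (before predicting)
        preds[t] = b @ np.linalg.solve(A, x)  # gamma_t := b' A^{-1} x_t
        b += y[t] * x                         # b := b + y_t x_t
    return preds
```

Here `np.linalg.solve` is used instead of forming $A^{-1}$ explicitly; maintaining $A^{-1}$ via a rank-one (Sherman–Morrison) update would make each round $O(n^2)$.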

This algorithm almost repeats online Ridge Regression (RR); the difference is that in RR the matrix $A$ is updated only after the prediction has been made. This means that RR simply follows the best expert on the data seen so far. Unfortunately, there are no theoretical guarantees for RR in the same form as for AAR.
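For comparison, a minimal sketch of the online RR variant, identical except for the order of the $A$ update (the function name is illustrative):

```python
import numpy as np

def rr_predictions(X, y, a=1.0):
    """Online Ridge Regression: the same recursion as AAR, except that
    A is updated only after the prediction has been made."""
    T, n = X.shape
    A = a * np.eye(n)
    b = np.zeros(n)
    preds = np.empty(T)
    for t in range(T):
        x = X[t]
        preds[t] = b @ np.linalg.solve(A, x)  # predict with A built from rounds 1..t-1
        A += np.outer(x, x)                   # only now A := A + x_t x_t'
        b += y[t] * x
    return preds
```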

AAR adds an additional term to the RR minimization problem: on step $T+1$ it predicts $\theta' x_{T+1}$, where $\theta$ achieves

$$\min_\theta \left( \sum_{t=1}^T (\theta' x_t - y_t)^2 + a\|\theta\|^2 + (\theta' x_{T+1})^2 \right),$$

with $a > 0$.
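A quick numerical check (illustrative, not from the source) confirms that the online prediction $b' A^{-1} x_{T+1}$, with $A$ already containing $x_{T+1} x_{T+1}'$, coincides with $\theta' x_{T+1}$ for the $\theta$ minimizing this objective:

```python
import numpy as np

rng = np.random.default_rng(0)
T, n, a = 20, 3, 1.0
X = rng.standard_normal((T + 1, n))   # signals x_1, ..., x_{T+1}
y = rng.standard_normal(T)            # outcomes y_1, ..., y_T

# Online AAR state after seeing x_1..x_{T+1} and y_1..y_T:
A = a * np.eye(n) + X.T @ X           # aI + sum_{t=1}^{T+1} x_t x_t'
b = X[:T].T @ y                       # sum_{t=1}^{T} y_t x_t
gamma = b @ np.linalg.solve(A, X[T])  # AAR prediction b' A^{-1} x_{T+1}

# Batch minimizer of  sum_t (theta'x_t - y_t)^2 + a||theta||^2 + (theta'x_{T+1})^2:
# setting the gradient to zero gives (aI + sum_{t=1}^{T+1} x_t x_t') theta = b.
theta = np.linalg.solve(A, b)
assert np.isclose(gamma, theta @ X[T])
```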

### Bibliography

- Vladimir Vovk. Competitive on-line statistics. *International Statistical Review* **69**, 213–248 (2001).