Weighted Majority Algorithm

The Weighted Majority Algorithm is, in the simplest case, an algorithm for predicting a binary sequence using advice from a finite number {$K$} of experts. See the protocol of Prediction with Expert Advice. Both the observation space and the prediction space are {$\{0,1\}$}. Each expert is initially assigned weight 1. When an expert makes a mistake, its weight is multiplied by a number {$\beta\in[0,1)$} (called the exponential learning rate). At each step, the master's prediction is the weighted majority of the experts' predictions. It can be shown that the algorithm guarantees the inequality

{$L_T\le\frac{\ln\frac{1}{\beta}}{\ln\frac{2}{1+\beta}}L_T(k)+\frac{1}{\ln\frac{2}{1+\beta}}\ln K$}

for all {$T$} and {$k$}, where {$L_T$} is the number of mistakes made by the master over the first {$T$} trials, and {$L_T(k)$} is the number of mistakes made by the {$k$}th expert over the first {$T$} trials.
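For example, with {$\beta=1/2$} the bound specializes to

{$L_T\le\frac{\ln 2}{\ln\frac{4}{3}}L_T(k)+\frac{1}{\ln\frac{4}{3}}\ln K\approx 2.41\,L_T(k)+3.48\ln K$}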
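To make the description above concrete, here is a minimal Python sketch of the algorithm. The function name, the argument layout, and the tie-breaking rule (predicting 1 when the weighted vote is split evenly) are illustrative assumptions rather than part of the algorithm's specification.

```python
def weighted_majority(expert_predictions, outcomes, beta=0.5):
    """Sketch of the Weighted Majority Algorithm for binary prediction.

    expert_predictions: list of T lists, each with K expert predictions in {0, 1}
    outcomes: list of T observed outcomes in {0, 1}
    beta: number in [0, 1) by which an erring expert's weight is multiplied
    Returns the number of mistakes made by the master.
    """
    K = len(expert_predictions[0])
    weights = [1.0] * K          # every expert starts with weight 1
    master_mistakes = 0

    for preds, outcome in zip(expert_predictions, outcomes):
        # weighted vote: total weight behind prediction 1 vs prediction 0
        weight_for_one = sum(w for w, p in zip(weights, preds) if p == 1)
        weight_for_zero = sum(weights) - weight_for_one
        master = 1 if weight_for_one >= weight_for_zero else 0  # ties go to 1 (an assumption)

        if master != outcome:
            master_mistakes += 1

        # multiply the weight of every expert that erred by beta
        weights = [w * beta if p != outcome else w
                   for w, p in zip(weights, preds)]

    return master_mistakes


# Toy usage: expert 0 always predicts the true outcome, expert 1 is always wrong.
outcomes = [0, 1, 1, 0, 1]
preds = [[o, 1 - o] for o in outcomes]
print(weighted_majority(preds, outcomes))  # stays within the bound above
```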

The Weighted Majority Algorithm is generalized by the Strong Aggregating Algorithm to a wide class of games of prediction.

The Weighted Majority Algorithm was introduced independently by Littlestone and Warmuth (1994) and Vovk (1992, the proof of Theorem 5). The abstract of the former paper appeared in the FOCS 1989 (30 October - 1 November) proceedings, and the manuscript of the latter paper was received by the journal in January 1989 (in fact the manuscript began its journey to the journal at the end of December 1988, but it first had to be cleared by the Soviet authorities for publication outside the Soviet Union). The name of the algorithm was proposed by Littlestone and Warmuth (1994).

Bibliography

  • Nick Littlestone and Manfred K. Warmuth. The weighted majority algorithm. Information and Computation, 108(2):212–261, 1994.
  • Vladimir G. Vovk. Universal forecasting algorithms. Information and Computation, 96(2):245–277, 1992.