Upper Bound On A Determinant

In competitive prediction it is often necessary to find upper bounds for the determinant $\det(I+\frac 1a XX')$, which, by the Sylvester identity, equals $\det(I+\frac 1a X'X)$. Let us assume that the size of $X$ is $n\times T$ and the columns of $X$ are the vectors $x_1,x_2,\ldots,x_T\in\mathbb R^n$; for the purposes of machine learning it is natural to formulate bounds in terms of properties of these vectors.
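The identity is easy to check numerically. Here is a minimal sketch using NumPy (the values of $n$, $T$ and $a$ are arbitrary choices made for the example):

```python
# Minimal numerical check of the Sylvester identity
# det(I + (1/a) X X') = det(I + (1/a) X' X).
import numpy as np

rng = np.random.default_rng(0)
n, T, a = 3, 5, 2.0
X = rng.standard_normal((n, T))  # columns x_1, ..., x_T in R^n

lhs = np.linalg.det(np.eye(n) + X @ X.T / a)  # n x n version
rhs = np.linalg.det(np.eye(T) + X.T @ X / a)  # T x T version
assert np.isclose(lhs, rhs)
```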

In this article we discuss two bounds.

Infinity Norm

A bound in terms of the infinity norms of $x_i$ was obtained in [Vovk]. Recall that the infinity norm of a vector $u=(u^1,u^2,\ldots,u^n)'$ is the maximum absolute value of its coordinates, $\|u\|_\infty=\max_{j=1,2,\ldots,n}|u^j|$.

The determinant of a positive definite matrix does not exceed the product of its diagonal elements (see, e.g., [Beckenbach and Bellman, Chapter 2, Theorem 7]). Let $B_\infty=\max_{t=1,2,\ldots,T}\|x_t\|_\infty$; in other words, $B_\infty$ is the maximum absolute value of an element of $X$. The $i$th diagonal element of $XX'$ equals $\sum_{t=1}^T(x^i_t)^2\le TB^2_\infty$, so a diagonal element of $I+\frac 1a XX'$ does not exceed $1+\frac{TB^2_\infty}a$, and the Sylvester identity gives the bound \begin{equation*} \det\left(I+\frac 1a X'X\right)\le \left(1+\frac{TB^2_\infty}a\right)^n\enspace. \end{equation*}
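This bound is straightforward to verify numerically; a minimal sketch (again with arbitrary example values of $n$, $T$ and $a$):

```python
# Sketch: check det(I + (1/a) X X') <= (1 + T * B_inf^2 / a)^n.
import numpy as np

rng = np.random.default_rng(1)
n, T, a = 3, 5, 2.0
X = rng.standard_normal((n, T))

B_inf = np.abs(X).max()  # max_t ||x_t||_inf, the largest |entry| of X
det_val = np.linalg.det(np.eye(n) + X @ X.T / a)
assert det_val <= (1 + T * B_inf**2 / a) ** n
```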

Quadratic Norm

In [Cesa-Bianchi et al] a bound in terms of the quadratic norms of $x_i$ was obtained. The quadratic (or Euclidean) norm of a vector $u=(u^1,u^2,\ldots,u^n)'$ is given by $\|u\|_2=\sqrt{(u^1)^2+(u^2)^2+\ldots+(u^n)^2}$.

The determinant of a matrix is the product of its (in general complex) eigenvalues counting multiplicities; for the symmetric positive semi-definite matrix $XX'$ the eigenvalues are real and non-negative. If $\lambda_1,\lambda_2,\ldots,\lambda_n$ are the eigenvalues of $XX'$, then the eigenvalues of $I+\frac 1a XX'$ are $1+\lambda_1/a,1+\lambda_2/a,\ldots,1+\lambda_n/a$ and $\det(I+\frac 1a XX')=\prod_{i=1}^n(1+\frac{\lambda_i}a)$.
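The eigenvalue factorisation can be checked numerically as well; a minimal sketch:

```python
# Sketch: det(I + (1/a) X X') equals the product of (1 + lambda_i / a)
# over the eigenvalues lambda_i of the symmetric matrix X X'.
import numpy as np

rng = np.random.default_rng(2)
n, T, a = 3, 5, 2.0
X = rng.standard_normal((n, T))

lam = np.linalg.eigvalsh(X @ X.T)  # real eigenvalues of X X'
det_val = np.linalg.det(np.eye(n) + X @ X.T / a)
assert np.isclose(det_val, np.prod(1 + lam / a))
```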

The sum of the eigenvalues $\lambda_1+\lambda_2+\ldots+\lambda_n$ equals the trace $\mathop{{\rm{tr}}}\nolimits(XX')$ and $\mathop{{\rm{tr}}}\nolimits(XX')=\mathop{{\rm{tr}}}\nolimits(X'X)$. Indeed, the matrices $AB$ and $BA$ (provided both products are defined) have the same non-zero eigenvalues counting multiplicities, while zero eigenvalues do not contribute to the trace. Alternatively, one can verify the equality $\mathop{{\rm{tr}}}\nolimits(AB)=\mathop{{\rm{tr}}}\nolimits(BA)$ by a straightforward calculation; see, e.g., [Axler, Proposition 10.9 (p. 219)]. The matrix $X'X$ is the Gram matrix of the vectors $x_1,x_2,\ldots,x_T$ and the elements on its diagonal are the squared quadratic norms of these vectors. Letting $B_2=\max_{t=1,2,\ldots,T}\|x_t\|_2$ we get $\mathop{{\rm{tr}}}\nolimits(XX')=\mathop{{\rm{tr}}}\nolimits(X'X)\le TB^2_2$.
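A quick numerical sanity check of the trace argument (again with arbitrary example dimensions):

```python
# Sketch: tr(X X') = tr(X' X), and both are at most T * B_2^2.
import numpy as np

rng = np.random.default_rng(3)
n, T = 3, 5
X = rng.standard_normal((n, T))

B2 = np.linalg.norm(X, axis=0).max()  # max_t ||x_t||_2 over the columns
assert np.isclose(np.trace(X @ X.T), np.trace(X.T @ X))
assert np.trace(X.T @ X) <= T * B2**2
```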

The problem has thus been reduced to obtaining an upper bound on the product of positive numbers with a known sum. The inequality of arithmetic and geometric means implies that \begin{align*} \prod_{i=1}^n\left(1+\frac 1a\lambda_i\right)&\le \left(\frac 1n\sum_{i=1}^n\left(1+\frac 1a\lambda_i\right)\right)^n\\ &= \left(1+\frac 1{an} \sum_{i=1}^n\lambda_i\right)^n\enspace. \end{align*} Combining this with the bound on the trace yields \begin{equation*} \det\left(I+\frac 1a X'X\right)\le \left(1+\frac{TB^2_2}{an}\right)^n\enspace. \end{equation*}
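As before, the final bound can be checked numerically; a minimal sketch:

```python
# Sketch: check det(I + (1/a) X X') <= (1 + T * B_2^2 / (a * n))^n.
import numpy as np

rng = np.random.default_rng(4)
n, T, a = 3, 5, 2.0
X = rng.standard_normal((n, T))

B2 = np.linalg.norm(X, axis=0).max()  # max_t ||x_t||_2
det_val = np.linalg.det(np.eye(n) + X @ X.T / a)
assert det_val <= (1 + T * B2**2 / (a * n)) ** n
```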

Comparison

Let us compare the bounds. For every vector $u\in\mathbb R^n$ we have $\|u\|^2_2\le n\|u\|^2_\infty$ (the inequality is tight: for a vector of equal coordinates $u=(c,c,\ldots,c)'$ it is an equality) and therefore for any array of vectors $x_1,x_2,\ldots,x_T$ we get \begin{equation*} \frac{B^2_2}n\le B^2_\infty\enspace. \end{equation*} Hence $1+\frac{TB^2_2}{an}\le 1+\frac{TB^2_\infty}a$, and we conclude that the quadratic norm bound is never weaker than the infinity norm bound.
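The comparison, too, admits a one-line numerical check; a minimal sketch:

```python
# Sketch: the quadratic norm bound never exceeds the infinity norm bound,
# since B_2^2 / n <= B_inf^2 for every column of X.
import numpy as np

rng = np.random.default_rng(5)
n, T, a = 3, 5, 2.0
X = rng.standard_normal((n, T))

B_inf = np.abs(X).max()
B2 = np.linalg.norm(X, axis=0).max()
assert (1 + T * B2**2 / (a * n)) ** n <= (1 + T * B_inf**2 / a) ** n
```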

References

[Vovk] V. Vovk. Competitive on-line statistics. International Statistical Review, 69(2):213-248, 2001.

[Beckenbach and Bellman] E. F. Beckenbach and R. E. Bellman. Inequalities. Springer, 1961.

[Cesa-Bianchi et al] N. Cesa-Bianchi, A. Conconi, and C. Gentile. A second-order perceptron algorithm. SIAM Journal on Computing, 34(3):640-668, 2005.

[Axler] S. Axler. Linear Algebra Done Right. Springer, 2nd edition, 1997.