Two Special Cases
In this section, we introduce two cases whose solutions can be derived analytically without much difficulty.
Suppose \(\mathbf{Q} \in \mathbb{R}^{n \times n}\) is symmetric and positive semidefinite.
Unconstrained - Least Squares
The unconstrained problem is \[\min_{\mathbf{x}} \; \frac{1}{2}\mathbf{x}^{\top}\mathbf{Q}\mathbf{x} + \mathbf{c}^{\top}\mathbf{x}.\] This is solvable because we can set the gradient to zero right away, which gives \(\mathbf{Q}\mathbf{x} + \mathbf{c} = \mathbf{0}\). Therefore, if \(\mathbf{Q}\) is invertible (e.g., positive definite), \(\mathbf{x}^{\star} = -\mathbf{Q}^{-1}\mathbf{c}\).
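As a quick numerical check, here is a minimal sketch (the matrix \(\mathbf{Q}\) and vector \(\mathbf{c}\) below are made-up illustrative values) that solves \(\mathbf{Q}\mathbf{x} = -\mathbf{c}\) with NumPy rather than forming the inverse explicitly:

```python
import numpy as np

# Made-up positive definite Q and vector c for illustration.
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])
c = np.array([1.0, 2.0])

# Stationarity condition Qx + c = 0  =>  x* = -Q^{-1} c.
x_star = np.linalg.solve(Q, -c)

# The gradient at x* should be (numerically) zero.
print(x_star, Q @ x_star + c)
```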
An example of this is the least squares problem, \[\min_{\beta} \; \frac{1}{2}\|X\beta - y\|_2^2.\] Taking the derivative with respect to \(\beta\) and setting it to zero gives \((X^{\top}X)\beta - X^{\top}y = 0\). So if \(X^{\top}X\) is invertible, we have \(\beta = (X^{\top}X)^{-1}X^{\top}y\).
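A minimal numerical sketch (with made-up data \(X\) and \(y\)) that solves the normal equations and cross-checks against NumPy's built-in least-squares solver:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up data: 50 observations, 3 features.
X = rng.standard_normal((50, 3))
y = rng.standard_normal(50)

# Normal equations: (X^T X) beta = X^T y.
beta_normal = np.linalg.solve(X.T @ X, X.T @ y)

# Cross-check with NumPy's least-squares routine.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.allclose(beta_normal, beta_lstsq))  # expect True
```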
Equality Constrained - Characteristic Portfolio
The problem has the following form: \[\min_{\mathbf{x}} \; \frac{1}{2}\mathbf{x}^{\top}\mathbf{Q}\mathbf{x} + \mathbf{c}^{\top}\mathbf{x} \quad \text{s.t.} \quad \mathbf{A}\mathbf{x} = \mathbf{b}.\]
Recall the Lagrangian function \(L(x, y) = f(x) - y^{\top}h(x)\). The Lagrange multiplier is usually written as \(\lambda\), but here we call it \(y\). In this problem, \(h(x) = \mathbf{A}\mathbf{x} - \mathbf{b}\), and the dimension of \(y\) equals the dimension of \(\mathbf{b}\).
Thus, we have \(L(\mathbf{x}, \mathbf{y}):=\frac{1}{2} \mathbf{x}^{\top} \mathbf{Q} \mathbf{x}+\mathbf{c}^{\top} \mathbf{x}+\mathbf{y}^{\top}(\mathbf{b}-\mathbf{A} \mathbf{x})\).
To solve this, we set the gradient \(\nabla L(x, y) = 0\) (all partial derivatives equal to zero) and solve for \(x, y\). The zero-gradient conditions turn out to be \[\nabla_{\mathbf{x}} L = \mathbf{Q}\mathbf{x} + \mathbf{c} - \mathbf{A}^{\top}\mathbf{y} = \mathbf{0}, \qquad \nabla_{\mathbf{y}} L = \mathbf{b} - \mathbf{A}\mathbf{x} = \mathbf{0},\] which is a linear system in \((\mathbf{x}, \mathbf{y})\).
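These two conditions stack into one symmetric linear (KKT) system. A minimal sketch, using made-up values for \(\mathbf{Q}\), \(\mathbf{c}\), \(\mathbf{A}\), and \(\mathbf{b}\):

```python
import numpy as np

# Made-up problem data for illustration.
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])
c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])   # one equality constraint
b = np.array([1.0])

n, m = Q.shape[0], A.shape[0]

# KKT system:
# [ Q  -A^T ] [x]   [-c]
# [ A   0   ] [y] = [ b]
KKT = np.block([[Q, -A.T],
                [A, np.zeros((m, m))]])
rhs = np.concatenate([-c, b])

sol = np.linalg.solve(KKT, rhs)
x_star, y_star = sol[:n], sol[n:]

print(x_star, A @ x_star - b)  # constraint residual should be ~0
```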
An example of this is the characteristic portfolio problem, \[\min_{\mathbf{x}} \; \frac{1}{2}\mathbf{x}^{\top}\mathbf{V}\mathbf{x} \quad \text{s.t.} \quad \mathbf{a}^{\top}\mathbf{x} = 1,\] where \(\mathbf{V}\) is the asset covariance matrix and \(\mathbf{a}\) is the vector of asset attributes. Notice that the constraint is a single number, so \(y\) is of dimension 1. Applying the conditions above, \(\mathbf{V}\mathbf{x} = y\,\mathbf{a}\) gives \(\mathbf{x} = y\,\mathbf{V}^{-1}\mathbf{a}\), and the constraint \(\mathbf{a}^{\top}\mathbf{x} = 1\) then forces \(y = 1/(\mathbf{a}^{\top}\mathbf{V}^{-1}\mathbf{a})\). The solution is therefore \(\mathbf{x}^{\star}=\frac{1}{\mathbf{a}^{\top} \mathbf{V}^{-1} \mathbf{a}} \mathbf{V}^{-1} \mathbf{a}\).
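A minimal sketch with a made-up covariance matrix \(\mathbf{V}\) and attribute vector \(\mathbf{a}\), verifying that the closed-form weights satisfy the constraint:

```python
import numpy as np

# Made-up covariance matrix V (symmetric positive definite) and attribute vector a.
V = np.array([[0.10, 0.02, 0.01],
              [0.02, 0.08, 0.03],
              [0.01, 0.03, 0.12]])
a = np.array([1.0, 1.0, 1.0])

# Closed-form solution: x* = V^{-1} a / (a^T V^{-1} a).
Vinv_a = np.linalg.solve(V, a)
x_star = Vinv_a / (a @ Vinv_a)

print(x_star, a @ x_star)  # the exposure a^T x* should equal 1
```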