$\renewcommand{\Re}{\operatorname{Re}}$ $\renewcommand{\Im}{\operatorname{Im}}$ $\newcommand{\erf}{\operatorname{erf}}$ $\newcommand{\dag}{\dagger}$ $\newcommand{\const}{\mathrm{const}}$ $\newcommand{\arcsinh}{\operatorname{arcsinh}}$

##[4.3. Orthogonal systems](id:sect-4.3)

------------------

> 1. [Examples](#sect-4.3.1)
> 2. [Abstract orthogonal systems: definition](#sect-4.3.2)
> 3. [Orthogonal systems: approximation](#sect-4.3.3)
> 4. [Orthogonal systems: approximation. II](#sect-4.3.4)
> 5. [Orthogonal systems: completeness](#sect-4.3.5)

###[Examples](id:sect-4.3.1)

All systems we considered in the previous Section were orthogonal, i.e.
\begin{equation} (X\_n, X\_m)=0\qquad \forall m\ne n \label{eq-4.3.1} \end{equation}
with
\begin{equation} (X,Y):=\int\_0^l X(x)\bar{Y}(x)\,dx,\qquad \\|X\\|^2:=(X,X), \label{eq-4.3.2} \end{equation}
where $\bar{Y}$ denotes the complex conjugate of $Y$.

**Exercise 1.** Prove it by direct calculation.

Instead, however, we show that this nice property (and the fact that the eigenvalues are real) is due to self-adjointness (a notion which we do not want to formulate precisely at this point). Consider $X,Y$ satisfying the Robin boundary conditions
\begin{align} &X'(0)-\alpha X(0)=0,\label{eq-4.3.3}\\\\ &X'(l)+\beta X(l)=0\label{eq-4.3.4} \end{align}
with $\alpha,\beta\in \mathbb{R}$ (so $Y$ satisfies the same conditions). Note that
\begin{multline} (X'',Y)=\int\_0^l X''(x)\bar{Y}(x)\,dx = \\\\ -\int\_0^l X'(x)\bar{Y}'(x)\,dx + X'(l)\bar{Y}(l)- X'(0)\bar{Y}(0)= \\\\ -(X',Y') -\beta X(l)\bar{Y}(l)-\alpha X(0)\bar{Y}(0).\qquad \label{eq-4.3.5} \end{multline}
Therefore, if we plug in $Y=X\ne 0$, an eigenfunction ($X''+\lambda X=0$), the left-hand expression becomes $-\lambda \\|X\\|^2$ (with $\\|X\\|^2$ obviously real and nonzero), while the right-hand expression is real as well (since $\alpha,\beta\in \mathbb{R}$); so $\lambda$ must be real: *all eigenvalues are real*.

Further, for $(X,Y'')$ we obtain the same equality, albeit with $\alpha,\beta$ replaced by $\bar{\alpha},\bar{\beta}$; therefore, since $\alpha,\beta\in \mathbb{R}$,
\begin{equation} (X'',Y)= (X,Y''). \label{eq-4.3.6} \end{equation}
But then, if $X,Y$ are eigenfunctions corresponding to *different* eigenvalues $\lambda$ and $\mu$, we get from (\ref{eq-4.3.6}) that $-\lambda(X,Y)=-\mu (X,Y)$, and hence $(X,Y)=0$ since $\lambda\ne \mu$.

**[Remark 1.](id:rem-4.3.1)** For periodic boundary conditions we cannot apply these arguments to prove that $\cos(2\pi nx/l)$ and $\sin(2\pi nx/l)$ are orthogonal, since they correspond to the same eigenvalue; we need to prove it directly.
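As a quick numerical counterpart to Exercise 1, here is a minimal sketch (not part of the original notes) that checks orthogonality for one of the systems of the previous Section, the Dirichlet eigenfunctions $X\_n(x)=\sin(n\pi x/l)$; the particular value of $l$ and the use of `numpy`/`scipy` are assumptions made only for this illustration.

```python
import numpy as np
from scipy.integrate import quad

l = 2.0   # interval length (arbitrary choice for the illustration)

def X(n, x):
    """Dirichlet eigenfunctions X_n(x) = sin(n*pi*x/l) on [0, l]."""
    return np.sin(n * np.pi * x / l)

def inner(n, m):
    """Inner product (X_n, X_m) = int_0^l X_n(x) X_m(x) dx (real case, cf. (4.3.2))."""
    value, _ = quad(lambda x: X(n, x) * X(m, x), 0.0, l)
    return value

for n in range(1, 4):
    print([round(inner(n, m), 12) for m in range(1, 4)])
# Off-diagonal entries vanish (orthogonality); diagonal entries equal l/2.
```

The Robin case would first require solving a transcendental equation for the eigenvalues, so the Dirichlet eigenfunctions are used here only for brevity.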
###[Abstract orthogonal systems: definition](id:sect-4.3.2)

Consider a *linear space* $\mathsf{H}$, real or complex. Recall from the linear algebra course the [standard definition](http://en.wikipedia.org/wiki/Vector_space#Definition):

1. $u+v=v+u\qquad \forall u,v\in \mathsf{H}$;
2. $(u+v)+w=u+(v+w)\qquad \forall u,v,w\in \mathsf{H}$;
3. $\exists 0\in \mathsf{H}: \ 0+u=u\qquad \forall u\in \mathsf{H}$;
4. $\forall u\in \mathsf{H}\ \exists (-u): u+(-u)=0$;
5. $\alpha(u+v)=\alpha u+ \alpha v \qquad \forall u,v\in \mathsf{H}\quad \forall\alpha \in \mathbb{R}$;
6. $(\alpha+\beta)u=\alpha u+ \beta u \qquad \forall u\in \mathsf{H}\quad\forall\alpha,\beta \in \mathbb{R}$;
7. $\alpha(\beta u)=(\alpha \beta)u \qquad \forall u\in \mathsf{H}\quad \forall\alpha,\beta \in \mathbb{R}$;
8. $1u=u\qquad \forall u \in \mathsf{H}$.

For a complex linear space replace $\mathbb{R}$ by $\mathbb{C}$.

Assume that an *inner product* is defined on $\mathsf{H}$:

1. $(u+v,w)=(u,w)+(v,w)\qquad \forall u,v,w\in \mathsf{H}$;
2. $(\alpha u,v)=\alpha (u,v) \qquad\forall u,v\in \mathsf{H} \quad \forall \alpha\in \mathbb{R}$;
3. $(u,v)=\overline{(v,u)} \qquad \forall u,v\in \mathsf{H}$;
4. $\\|u\\|^2:=(u,u)\ge 0 \qquad \forall u\in \mathsf{H}$ (this implies that $(u,u)$ is real if we consider complex spaces), and $\\|u\\|=0 \iff u=0$.

**Definition 1.**

1. A finite-dimensional real linear space with an inner product is called a *Euclidean* space.
2. A finite-dimensional complex linear space with an inner product is called a *Hermitian* space.
3. An infinite-dimensional linear space (real or complex) with an inner product is called a *pre-Hilbert* space.

For a Hilbert space we will need another property (completeness), which we add later.

**Definition 2.**

1. A system $\\{u\_n\\}$, $0\ne u\_n\in \mathsf{H}$ (finite or infinite), is *orthogonal* if $(u\_m,u\_n)=0$ $\forall m\ne n$;
2. An orthogonal system is *orthonormal* if $\\|u\_n\\|=1$ $\forall n$, i.e. $(u\_m,u\_n)=\delta\_{mn}$, where $\delta\_{mn}$ is the Kronecker symbol.

###[Orthogonal systems: approximation](id:sect-4.3.3)

Consider a finite orthogonal system $\\{u\_n\\}$. Let $\mathsf{K}$ be its *linear hull*: the set of all linear combinations $\sum\_n \alpha\_nu\_n$. Obviously $\mathsf{K}$ is a linear subspace of $\mathsf{H}$.

Let $v\in \mathsf{H}$; we look for the best approximation of $v$ by elements of $\mathsf{K}$, i.e. for $w\in \mathsf{K}$ such that $\\|v-w\\|$ is minimal.

**[Theorem 1.](id:thm-4.3.1)**

1. There exists a unique minimizer;
2. This minimizer is the orthogonal projection of $v$ onto $\mathsf{K}$, i.e. $w\in \mathsf{K}$ such that $(v-w)$ is orthogonal to all elements of $\mathsf{K}$;
3. Such an orthogonal projection is unique, and $w=\sum\_n \alpha\_n u\_n$ with \begin{equation} \alpha\_n= \frac{(v,u\_n)}{\|u\_n\|^2}. \label{eq-4.3.7} \end{equation}
4. $\\|v\\|^2=\\|w\\|^2+\\|v-w\\|^2$;
5. $v=w \iff \\|v\\|^2=\\|w\\|^2$.

*Proof.* (3) Obviously $(v-w)$ is orthogonal to $u\_n$ iff (\ref{eq-4.3.7}) holds. If (\ref{eq-4.3.7}) holds for all $n$, then $(v-w)$ is orthogonal to all $u\_n$ and therefore to all their linear combinations.

(4)-(5) In particular, $(v-w)$ is orthogonal to $w$, and then
\begin{equation\*} \\|v\\|^2= \\|(v-w)+w\\|^2=\\|v-w\\|^2+ 2\Re \underbracket{(v-w,w)}\_{=0}+\\|w\\|^2. \end{equation\*}

(1)-(2) Consider $w'\in \mathsf{K}$. Then $\\|v-w'\\|^2=\\|v-w\\|^2+\\|w-w'\\|^2$, because $(w-w')\in \mathsf{K}$ and therefore it is orthogonal to $(v-w)$. Hence $\\|v-w'\\|\ge \\|v-w\\|$ with equality iff $w'=w$, so the minimizer exists and is unique.

###[Orthogonal systems: approximation. II](id:sect-4.3.4)

Now let $\\{u\_n\\}\_{n=1,2,\ldots}$ be an infinite orthogonal system. Consider its finite subsystem with $n=1,2,\ldots, N$, introduce $\mathsf{K}\_N$ for it, and consider the orthogonal projection $w\_N$ of $v$ onto $\mathsf{K}\_N$. Then
\begin{equation\*} w\_N= \sum\_{n=1}^N \alpha\_n u\_n \end{equation\*}
where the $\alpha\_n$ are defined by (\ref{eq-4.3.7}). Then, according to Statement (4) of [Theorem 1](#thm-4.3.1),
\begin{equation\*} \\|v\\|^2 =\\|v-w\_N\\|^2+\\|w\_N\\|^2\ge \\|w\_N\\|^2=\sum\_{n=1}^N |\alpha\_n |^2\\|u\_n\\|^2. \end{equation\*}
Therefore the series on the right-hand side below converges, and
\begin{equation} \\|v\\|^2 \ge \sum\_{n=1}^\infty |\alpha\_n |^2\\|u\_n\\|^2 \label{eq-4.3.8} \end{equation}
holds (*Bessel's inequality*). Indeed, recall that a series with non-negative terms either converges or diverges to $\infty$.

Then $w\_N$ is a *Cauchy sequence*. Indeed, for $M\>N$
\begin{equation\*} \\|w\_N-w\_M\\|^2= \sum\_{n=N+1}^M |\alpha\_n |^2\\|u\_n\\|^2\le \varepsilon\_N \end{equation\*}
with $\varepsilon\_N\to 0$ as $N\to \infty$, because the series in (\ref{eq-4.3.8}) converges.

Now we want to conclude that $w\_N$ converges, and to do this we must assume that every Cauchy sequence in $\mathsf{H}$ converges.
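The following minimal numerical sketch (not part of the original notes) illustrates Theorem 1 and Bessel's inequality (\ref{eq-4.3.8}): it projects $v(x)=x$ onto the span of $u\_n(x)=\sin(n\pi x/l)$, $n=1,\ldots,N$, computing the coefficients (\ref{eq-4.3.7}) by quadrature, and then checks Statement (4) of Theorem 1 and the partial sums of (\ref{eq-4.3.8}). The choices of $v$, $l$, $N$ and the use of `numpy`/`scipy` are assumptions made only for this illustration.

```python
import numpy as np
from scipy.integrate import quad

l, N = 2.0, 10   # interval length and truncation order (arbitrary choices)

def u(n, x):
    """Orthogonal system u_n(x) = sin(n*pi*x/l) on [0, l]."""
    return np.sin(n * np.pi * x / l)

def v(x):
    """Element of H = L^2(0, l) to be approximated."""
    return x

def ip(f, g):
    """Real inner product (f, g) = int_0^l f(x) g(x) dx, cf. (4.3.2)."""
    return quad(lambda x: f(x) * g(x), 0.0, l, limit=200)[0]

# Coefficients alpha_n = (v, u_n) / ||u_n||^2, cf. (4.3.7)
norms2 = [ip(lambda x: u(n, x), lambda x: u(n, x)) for n in range(1, N + 1)]
alpha  = [ip(v, lambda x: u(n, x)) / norms2[n - 1] for n in range(1, N + 1)]

def w(x):
    """Orthogonal projection w_N of v onto K_N = span{u_1, ..., u_N}."""
    return sum(a * u(n, x) for n, a in enumerate(alpha, start=1))

v2   = ip(v, v)                                            # ||v||^2
w2   = sum(a**2 * s2 for a, s2 in zip(alpha, norms2))      # ||w_N||^2
err2 = ip(lambda x: v(x) - w(x), lambda x: v(x) - w(x))    # ||v - w_N||^2

print(v2, w2 + err2)   # Pythagorean identity: the two numbers agree
print(w2 <= v2)        # Bessel's inequality for the partial sum
```

Increasing $N$ drives $\\|v-w\_N\\|^2$ toward $0$ for this particular $v$, which anticipates the completeness discussion that follows.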
**Definition 3.**

1. $\mathsf{H}$ is *complete* if every Cauchy sequence converges in $\mathsf{H}$.
2. A complete pre-Hilbert space is called a *Hilbert space*.

**Remark 2.** Every pre-Hilbert space can be completed, i.e. extended to a complete space.

From now on $\mathsf{H}$ is a Hilbert space. Then we can introduce $\mathsf{K}$, the *closed linear hull* of $\\{u\_n\\}\_{n=1,2,\ldots}$, i.e. the space of all sums
\begin{equation} \sum\_{n=1}^\infty \alpha\_n u\_n \label{eq-4.3.9} \end{equation}
with $\alpha\_n$ satisfying
\begin{equation} \sum\_{n=1}^\infty |\alpha\_n |^2\\|u\_n\\|^2\<\infty. \label{eq-4.3.10} \end{equation}
(The linear hull would be the space of finite linear combinations only.)

Let $v\in \mathsf{H}$. We want to find the best approximation of $v$ by elements of $\mathsf{K}$, and we get immediately

**Theorem 2.** If $\mathsf{H}$ is a Hilbert space, then [Theorem 1](#thm-4.3.1) holds for infinite orthogonal systems as well.

###[Orthogonal systems: completeness](id:sect-4.3.5)

**Definition 4.** An orthogonal system is *complete* if the equivalent conditions below are satisfied:

1. Its closed linear hull coincides with $\mathsf{H}$.
2. If $v\in \mathsf{H}$ is orthogonal to all $u\_n$, then $v=0$.

**Remark 3.** Do not confuse completeness of spaces and completeness of orthogonal systems.

Our next goal is to establish completeness of some orthogonal systems and therefore to give a positive answer (in the corresponding frameworks) to the question at the end of the previous [Section 4.2](./S4.2.html): can we decompose any function into eigenfunctions? Alternatively: is the general solution a combination of simple solutions?

_______

[$\Leftarrow$](./S4.2.html) [$\Uparrow$](../contents.html) [$\Rightarrow$](./S4.4.html)