Eigenvalue problems, basic definitions

Let us consider a matrix \( \mathbf{A} \) of dimension \( n\times n \). The eigenvalues of \( \mathbf{A} \) are defined through the matrix equation $$ \mathbf{A}\mathbf{x}^{(\nu)} = \lambda^{(\nu)}\mathbf{x}^{(\nu)}, $$ where \( \lambda^{(\nu)} \) are the eigenvalues and \( \mathbf{x}^{(\nu)} \) the corresponding eigenvectors. Unless otherwise stated, the word eigenvector refers to the right eigenvector. The left eigenvalue problem is defined as $$ \mathbf{x}^{(\nu)T}_L\mathbf{A} = \lambda^{(\nu)}\mathbf{x}^{(\nu)T}_L, $$ where the left eigenvector \( \mathbf{x}^{(\nu)}_L \) enters as a row vector; the left and right problems share the same eigenvalues \( \lambda^{(\nu)} \). The right eigenvector problem above is equivalent to a set of \( n \) equations in the \( n \) unknown components \( x_i \) of the eigenvector.
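
As a minimal numerical sketch (assuming Python with NumPy and SciPy, which the text does not prescribe), the snippet below verifies both the right and the left eigenvalue problems for a small example matrix; the matrix \( \mathbf{A} \) and all variable names are illustrative choices only.

```python
# Right and left eigenvalue problems for a small example matrix
# (a sketch using scipy.linalg.eig; A is an arbitrary 2 x 2 example).
import numpy as np
from scipy.linalg import eig

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# eig with left=True, right=True returns the eigenvalues together with
# the left and right eigenvectors, stored column by column.
eigenvalues, vl, vr = eig(A, left=True, right=True)

for nu, lam in enumerate(eigenvalues):
    x = vr[:, nu]    # right eigenvector x^(nu)
    xL = vl[:, nu]   # left eigenvector x_L^(nu)

    # Right problem: A x = lambda x
    print(np.allclose(A @ x, lam * x))

    # Left problem: x_L^H A = lambda x_L^H
    # (SciPy's convention for the left eigenvectors)
    print(np.allclose(xL.conj() @ A, lam * xL.conj()))
```

SciPy stores the left eigenvectors as the columns of `vl`, defined so that \( \mathbf{x}^{H}_L\mathbf{A} = \lambda\,\mathbf{x}^{H}_L \); for a real symmetric matrix, as in this example, the left and right eigenvectors coincide.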