Consider two bases \((\mathbf{e}_{1},\mathbf{e}_{2})\) and \((\mathbf{\tilde{e}}_{1},\mathbf{\tilde{e}}_{2})\), where we consider the former the old basis and the latter the new basis.
Each vector of the new basis \((\mathbf{\tilde{e}}_{1},\mathbf{\tilde{e}}_{2})\) can be expressed as a linear combination of the old basis vectors \((\mathbf{e}_{1},\mathbf{e}_{2})\):
\begin{equation}
\begin{aligned}
\mathbf{\tilde{e}}_{1}\,&=\,\mathbf{e}_{1}S^{1}_{1}\,+\,\mathbf{e}_{2}S^{2}_{1}\\
\mathbf{\tilde{e}}_{2}\,&=\,\mathbf{e}_{1}S^{1}_{2}\,+\,\mathbf{e}_{2}S^{2}_{2}
\end{aligned}
\tag{1.0}
\end{equation}
(1.0) is the basis transformation formula, and the object \(S\,=\,\{S^{j}_{i},\;1\,\leq\,i,\,j\,\leq\,2\}\) is the direct transformation (here a \(2\times 2\) matrix), which can also be written in matrix form:
\begin{equation} \begin{bmatrix} \mathbf{\tilde{e}}_{1} & \mathbf{\tilde{e}}_{2} \end{bmatrix}\,=\, \begin{bmatrix} \mathbf{e}_{1} & \mathbf{e}_{2} \end{bmatrix} \begin{bmatrix} S^{1}_{1} & S^{1}_{2}\\ S^{2}_{1} & S^{2}_{2} \end{bmatrix} \end{equation}
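The matrix form above can be sketched numerically. A minimal example, using the standard basis as the old basis and a made-up transformation \(S\) (the particular entries are illustrative, not from the text):

```python
import numpy as np

# Old basis vectors e1, e2 stored as the columns of a matrix
# (here the standard basis, purely for illustration).
e = np.array([[1.0, 0.0],
              [0.0, 1.0]])

# Hypothetical direct transformation S: column j holds the
# components of the new basis vector e~_j in the old basis,
# i.e. S[i-1, j-1] corresponds to S^i_j.
S = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Basis transformation formula: [e~1 e~2] = [e1 e2] S
e_tilde = e @ S

# First new basis vector: e~1 = 2*e1 + 1*e2, matching (1.0)
print(e_tilde[:, 0])  # → [2. 1.]
```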
...
\begin{equation} \mathbf{I}(\mathbf{X})=\mathbf{X} \end{equation}
where the identity matrix of any size \(n\times n\) is defined via the Kronecker delta, e.g.
\begin{equation} \mathbf{I}_{ij}\,=\,\delta_{ij} \end{equation}
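The entrywise definition \(\mathbf{I}_{ij} = \delta_{ij}\) can be checked directly; a small sketch building the matrix from the Kronecker delta and comparing it against numpy's built-in identity:

```python
import numpy as np

n = 3
# Build I entry by entry from the Kronecker delta: I_ij = 1 if i == j, else 0
I = np.fromfunction(lambda i, j: (i == j).astype(float), (n, n))

# Matches the library-provided n x n identity matrix
print(np.array_equal(I, np.eye(n)))  # → True
```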
The inverse of a matrix \(\mathbf{A}\) is the matrix \(\mathbf{A}^{-1}\) which, when multiplied by \(\mathbf{A}\), results in the identity matrix.
\begin{equation} \mathbf{A}\mathbf{A}^{-1} = \mathbf{I} \end{equation}
e.g.
\begin{equation} \begin{bmatrix} a & b\\
c & d \end{bmatrix} \frac{1}{ad-bc} \begin{bmatrix} d & -b\\
-c & a \end{bmatrix}= \begin{bmatrix} 1 & 0\\
0 & 1 \end{bmatrix} \end{equation}
provided \(ad - bc \neq 0\).
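The \(2\times 2\) formula above can be verified numerically. A minimal sketch with an arbitrary invertible matrix (the entries are illustrative):

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])

# 2x2 inverse: swap the diagonal, negate the off-diagonal,
# and divide by the determinant ad - bc.
a, b, c, d = A.ravel()
det = a * d - b * c  # must be nonzero for A to be invertible
A_inv = np.array([[d, -b],
                  [-c, a]]) / det

# A @ A_inv should be the identity (up to floating-point error)
print(np.allclose(A @ A_inv, np.eye(2)))  # → True
```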
also known as a linear space
A collection of objects known as “vectors”. In Euclidean space these can be visualized as simple arrows with a direction and a length, but this analogy will not necessarily carry over to all spaces.
Addition and multiplication of these objects (vectors) must adhere to a set of axioms for the set to be considered a “vector space”.
Addition (+) \begin{equation} +\,:\,V\,\times\,V\,\longrightarrow\,V \end{equation}
...
To qualify as a vector space, a set \(V\) and its associated operations of addition (\(+\)) and multiplication/scaling (\(\cdot\)) must adhere to the axioms below:
Associativity \begin{equation} \mathbf{u}+(\mathbf{v}+\mathbf{w}) = (\mathbf{u} + \mathbf{v}) + \mathbf{w} \end{equation}
Commutativity \begin{equation} \mathbf{u} + \mathbf{v} = \mathbf{v} + \mathbf{u} \end{equation}
Identity of Addition There exists an element \(\mathbf{0}\,\in\,V\), called the zero vector, such that \(\mathbf{v} + \mathbf{0} = \mathbf{v}\) for all \(\mathbf{v}\,\in\,V\).
...