Let \(\boldsymbol A\) and \(\boldsymbol B\) both be matrices of order \(k \times m\). Their sum is defined componentwise: \[\boldsymbol{A} + \boldsymbol{B}
=\begin{pmatrix}
a_{11}+ b_{11} & a_{12}+ b_{12} & \cdots & a_{1m}+ b_{1m} \\
a_{21}+ b_{21} & a_{22}+ b_{22} & \cdots & a_{2m}+ b_{2m} \\
\vdots & \vdots & & \vdots \\
a_{k1}+ b_{k1} & a_{k2}+ b_{k2} & \cdots & a_{km}+ b_{km}
\end{pmatrix}.\] Two matrices can be added only if they have the same order. Example:\[\boldsymbol{A}=\begin{pmatrix}2&0\\1&5\\3&2\end{pmatrix}\,,\quad
\boldsymbol{B}=\begin{pmatrix}-1&1\\7&1\\-5&2\end{pmatrix}\,,\quad
\boldsymbol{A}+\boldsymbol{B}=\begin{pmatrix}1&1\\8&6\\-2&4\end{pmatrix}\,.\]
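The example above can be reproduced in R with the `matrix()` function (which fills the entries column by column) and the ordinary `+` operator, which acts componentwise on matrices of the same order:

```r
A <- matrix(c(2, 1, 3, 0, 5, 2), nrow = 3)   # the 3 x 2 matrix A from the example
B <- matrix(c(-1, 7, -5, 1, 1, 2), nrow = 3) # the 3 x 2 matrix B
A + B                                        # componentwise sum
```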
Matrix addition satisfies the following rules: \[\begin{array}{@{}rr@{\ }c@{\ }l@{}r@{}}
\text{(i)} & \boldsymbol{A}+\boldsymbol{B} &=& \boldsymbol{B}+\boldsymbol{A}\, & \text{(commutativity)} \\
\text{(ii)} & (\boldsymbol{A}+\boldsymbol{B})+\boldsymbol{C} &=& \boldsymbol{A}+(\boldsymbol{B}+\boldsymbol{C})\, & \text{(associativity)} \\
\text{(iii)} & \boldsymbol A + \boldsymbol 0 &=& \boldsymbol A & {\text{(identity element)}} \\
\text{(iv)} & (\boldsymbol A + \boldsymbol B)' &=& \boldsymbol A' + \boldsymbol B' &
{\text{(transposition)}}
\end{array}\]
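Rules (i)–(iv) can be checked numerically in R. The sketch below re-uses the matrices from the example above together with an arbitrary third matrix `C` (our own choice, for illustration); each comparison returns `TRUE`:

```r
A <- matrix(c(2, 1, 3, 0, 5, 2), nrow = 3)
B <- matrix(c(-1, 7, -5, 1, 1, 2), nrow = 3)
C <- matrix(1, nrow = 3, ncol = 2)        # arbitrary 3 x 2 matrix of ones

all(A + B == B + A)                       # (i)   commutativity
all((A + B) + C == A + (B + C))           # (ii)  associativity
all(A + matrix(0, 3, 2) == A)             # (iii) identity element
all(t(A + B) == t(A) + t(B))              # (iv)  transposition
```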
The inner product (also known as dot product) of two vectors \(\boldsymbol{a},\boldsymbol{b}\in\mathbb{R}^k\) is \[\boldsymbol{a}'\boldsymbol{b} = a_1 b_1+a_2b_2+\ldots+a_kb_k=\sum_{i=1}^k a_ib_i\in\mathbb{R}.\]Example:\[\boldsymbol{a}=\begin{pmatrix}1\\2\\3\end{pmatrix},\quad
\boldsymbol{b}=\begin{pmatrix}-2\\0\\2\end{pmatrix},\quad
\boldsymbol{a}'\boldsymbol{b}=1\cdot(-2)+2\cdot0+3\cdot2=4.\]
The inner product is commutative: \[\boldsymbol a' \boldsymbol b = \boldsymbol b' \boldsymbol a.\] Two vectors \(\boldsymbol a\) and \(\boldsymbol b\) are called orthogonal if \(\boldsymbol a' \boldsymbol b = 0\). The vectors \(\boldsymbol a\) and \(\boldsymbol b\) are called orthonormal if, in addition to \(\boldsymbol a'\boldsymbol b = 0\), we have \(\boldsymbol a' \boldsymbol a = 1\) and \(\boldsymbol b' \boldsymbol b=1\).
For vector multiplication in R, we use the operator %*% (recall that * is already reserved for element-wise multiplication). Let’s implement some multiplications.
y = c(2, 7, 4, 1)  # y is treated as a column vector
t(y) %*% y         # the inner product of y with itself
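The same operator verifies the inner-product example from above and the orthogonality definitions. The unit vectors `u` and `v` below are our own illustration, not taken from the text:

```r
a <- c(1, 2, 3)
b <- c(-2, 0, 2)
t(a) %*% b    # inner product: 1*(-2) + 2*0 + 3*2 = 4

u <- c(1, 0, 0)
v <- c(0, 1, 0)
t(u) %*% v    # 0, so u and v are orthogonal
t(u) %*% u    # 1, and t(v) %*% v is also 1, so u and v are orthonormal
```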
The matrix product of a \(k \times m\) matrix \(\boldsymbol{A}\) and a \(m \times n\) matrix \(\boldsymbol{B}\) is the \(k\times n\) matrix \(\boldsymbol C = \boldsymbol{A}\boldsymbol{B}\) with the components \[c_{ij} = a_{i1}b_{1j}+a_{i2}b_{2j}+\ldots+a_{im}b_{mj}=\sum_{l=1}^m a_{il}b_{lj} = \boldsymbol a_i' \boldsymbol b_j,\] where \(\boldsymbol a_i = (a_{i1}, \ldots, a_{im})'\) is the \(i\)-th row of \(\boldsymbol A\) written as a column vector, and \(\boldsymbol b_j = (b_{1j}, \ldots, b_{mj})'\) is the \(j\)-th column of \(\boldsymbol B\). The full matrix product can be written as \[
\boldsymbol A \boldsymbol B = \begin{pmatrix} \boldsymbol a_1' \\ \vdots \\ \boldsymbol a_k' \end{pmatrix}
\begin{pmatrix} \boldsymbol b_1 & \ldots & \boldsymbol b_n \end{pmatrix}
= \begin{pmatrix} \boldsymbol a_1' \boldsymbol b_1 & \ldots & \boldsymbol a_1' \boldsymbol b_n \\ \vdots & & \vdots \\ \boldsymbol a_k' \boldsymbol b_1 & \ldots & \boldsymbol a_k' \boldsymbol b_n \end{pmatrix}.
\] The matrix product is defined only if the number of columns of the first matrix equals the number of rows of the second. When this condition holds, as for the \(k \times m\) matrix \(\boldsymbol A\) and the \(m \times n\) matrix \(\boldsymbol B\) above, we say that the matrices are conformable for matrix multiplication.
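In R, `%*%` also performs matrix multiplication. A minimal sketch, re-using the \(3 \times 2\) matrix \(\boldsymbol A\) from the addition example together with an arbitrary \(2 \times 2\) matrix `B` of our own choosing:

```r
A <- matrix(c(2, 1, 3, 0, 5, 2), nrow = 3)  # 3 x 2
B <- matrix(c(1, 0, 2, 1), nrow = 2)        # 2 x 2: conformable with A
A %*% B                                     # 3 x 2 matrix product
# B %*% A  # error: non-conformable arguments (2 columns vs. 3 rows)
```

Attempting the product in the other order fails because the conformability condition is violated.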