Autonomous Robots - Linear Algebra

Motivation

Let's consider the problem of fitting a polynomial to a set of observed 2D points: $$ \begin{bmatrix} 1, x_0, x_0^2, x_0^3, \cdots, x_0^n \\ 1, x_1, x_1^2, x_1^3, \cdots, x_1^n \\ \vdots \\ 1, x_m, x_m^2, x_m^3, \cdots, x_m^n \\ \end{bmatrix} \begin{bmatrix} \beta_0 \\ \beta_1\\ \vdots \\ \beta_n\end{bmatrix} = \begin{bmatrix} y_0 \\ y_1\\ \vdots \\ y_m\end{bmatrix} $$

$(x_i, y_i)$ are the observed points and $\beta_i$ are the coefficients of the polynomial. How do we solve this equation to find the best $\beta$?

Vectors

Vector Representations

A two-dimensional vector $v \in \mathbb{R}^2$ can be written as a column vector:

$$ v = \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} $$

It represents:

  • direction and magnitude, e.g. the velocity of a point on a robot
  • position, e.g. the position of a point on a robot
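
A minimal NumPy sketch of both interpretations; the vectors `p` and `v` below are arbitrary example values, not taken from the original notebook:

```python
import numpy as np

# A 2D vector can encode a position or a velocity of a point on the robot.
p = np.array([2.0, 1.0])            # position [m]
v = np.array([1.0, 1.0])            # velocity [m/s]

speed = np.linalg.norm(v)           # magnitude of the velocity
heading = np.arctan2(v[1], v[0])    # direction of the velocity (rad)
print(f"speed: {speed:.3f} m/s, heading: {np.degrees(heading):.1f} deg")
```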

Vector Operations

  • Scaling

    $\lambda \textbf{v}$

  • Linear combinations

    $\alpha\textbf{v} + \beta\textbf{w}$

  • Dot product

    $ \begin{align} \textbf{v} \cdot \textbf{w} &= \textbf{v}^T \textbf{w} \\ &= v_1w_1+\cdots+v_nw_n \end{align} $ where $\textbf{v}, \textbf{w} \in \mathbb{R}^n$

    $ \textbf{v} \cdot \textbf{w} = \|\textbf{v}\|\|\textbf{w}\|\cos(\theta) $ where $\theta$ is the angle between the two vectors.

  • Cross product of vectors in $\mathbb{R}^3$

    $ \textbf{v} \times \textbf{w} = \begin{bmatrix} 0, -v_3, v_2\\ v_3, 0, -v_1 \\ -v_2, v_1, 0 \end{bmatrix} \begin{bmatrix} w_1 \\ w_2 \\ w_3 \\ \end{bmatrix} $ and $\|\textbf{v} \times \textbf{w}\| = \|\textbf{v}\|\|\textbf{w}\|\sin(\theta)$, where $\theta$ is the angle between the two vectors.

Examples of Vector Operations in Robotics

  • Scaling

Starting at $O$, the robot moves at a constant velocity $\textbf{v}$. After $\Delta t$, the robot will be at $D = O+\textbf{v}\Delta t$

  • Linear combinations

Starting at $O \in \mathbb{R}^2$, the robot moves with velocity $\textbf{v}$ in the $x$ direction and $\textbf{w}$ in the $y$ direction. After $\Delta t$, the robot will be at $D = O+\textbf{v}\Delta t+\textbf{w}\Delta t$, a linear combination of the two velocity vectors (see the sketch below).

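
A small sketch of the two motion examples above, assuming a constant-velocity model and arbitrary example numbers:

```python
import numpy as np

O = np.array([0.0, 0.0])      # starting position
v = np.array([1.0, 0.0])      # velocity along x [m/s]
w = np.array([0.0, 0.5])      # velocity along y [m/s]
dt = 2.0                      # elapsed time [s]

# Scaling: the displacement is the velocity vector scaled by dt.
D_scaled = O + v * dt
# Linear combination: displacements from both velocities add up.
D_combined = O + v * dt + w * dt

print(D_scaled)               # [2. 0.]
print(D_combined)             # [2. 1.]
```
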
  • Dot product

    1. Check whether two lines seen from the robot are parallel.
    2. Check whether a point is inside a rectangle (see the sketch below).
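
Both checks need only dot products; the lines, rectangle, and test point below are example values I made up, not the ones from the original notebook:

```python
import numpy as np

def are_parallel(d1, d2, tol=1e-9):
    # Two direction vectors are parallel when |cos(theta)| = 1,
    # i.e. |d1 . d2| = |d1| |d2|.
    return abs(abs(d1 @ d2) - np.linalg.norm(d1) * np.linalg.norm(d2)) < tol

def inside_rectangle(p, corner, e1, e2):
    # Project p - corner onto the two edge vectors e1, e2 of the rectangle.
    # The point is inside iff both projections fall within the edge lengths.
    d = p - corner
    s = (d @ e1) / (e1 @ e1)
    t = (d @ e2) / (e2 @ e2)
    return 0.0 <= s <= 1.0 and 0.0 <= t <= 1.0

print(are_parallel(np.array([1.0, 2.0]), np.array([-2.0, -4.0])))   # True
print(inside_rectangle(np.array([1.0, 0.5]),
                       corner=np.array([0.0, 0.0]),
                       e1=np.array([2.0, 0.0]),
                       e2=np.array([0.0, 1.0])))                    # True
```
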
  • Cross product
    1. Relate linear and angular velocity: $\textbf{v} = \boldsymbol{\omega} \times \textbf{r}$.
    2. Find the shortest distance from a point to a line (see the sketch below).
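
A sketch of both cross-product uses; the angular velocity, line, and point are arbitrary example values:

```python
import numpy as np

# 1. Linear velocity of a point on a rotating body: v = omega x r.
omega = np.array([0.0, 0.0, 1.0])     # rotation about the z-axis, 1 rad/s
r = np.array([2.0, 0.0, 0.0])         # point 2 m from the rotation axis
v = np.cross(omega, r)
print("linear velocity:", v)          # [0. 2. 0.]

# 2. Shortest distance from point p to the line through a with direction d:
#    ||d x (p - a)|| / ||d||  (parallelogram area divided by the base length).
a = np.array([0.0, 0.0, 0.0])
d = np.array([1.0, 0.0, 0.0])
p = np.array([3.0, 2.0, 0.0])
dist = np.linalg.norm(np.cross(d, p - a)) / np.linalg.norm(d)
print("distance to line:", dist)      # 2.0
```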

Matrices and Linear Equations

Matrix Representations

A $2 \times 2$ matrix can be written as a collection of two column vectors $\textbf{v} \in \mathbb{R}^2$, $\textbf{w} \in \mathbb{R}^2$:

$$ \begin{bmatrix} a_{11}, a_{12} \\ a_{21}, a_{22} \end{bmatrix} = \begin{bmatrix} \textbf{v}, \textbf{w} \end{bmatrix} $$

Similarly, a $3 \times 3$ matrix can be written as a collection of three column vectors $\textbf{u} \in \mathbb{R}^3$, $\textbf{v} \in \mathbb{R}^3$, $\textbf{w} \in \mathbb{R}^3$:

$$ \begin{bmatrix} a_{11}, a_{12}, a_{13} \\ a_{21}, a_{22}, a_{23} \\ a_{31}, a_{32}, a_{33} \end{bmatrix} = \begin{bmatrix} \textbf{u}, \textbf{v}, \textbf{w} \end{bmatrix} $$

In this post, we will only discuss $m \times m$ (square) matrices. Numerical issues aside, knowing the properties of $m \times m$ matrices is enough for solving all kinds of linear equations.

Matrix Operations

  • Matrix multiplication

    $C_{mm} = A_{mn}B_{nm}$

  • Matrix times vector as a linear combination of all column vectors

    $$ A\textbf{x} = \begin{bmatrix} a_{11}, a_{12}, a_{13} \\ a_{21}, a_{22}, a_{23} \\ a_{31}, a_{32}, a_{33} \end{bmatrix} \begin{bmatrix} {x_1}\\ {x_2}\\ {x_3} \end{bmatrix} = \begin{bmatrix} \textbf{u}, \textbf{v}, \textbf{w} \end{bmatrix} \begin{bmatrix} {x_1}\\ {x_2}\\ {x_3}\\ \end{bmatrix} = x_1\textbf{u} + x_2\textbf{v} + x_3\textbf{w} $$
  • Matrix inverse

    $A_{mm}A_{mm}^{-1} = I$; for an orthogonal matrix $Q$, $Q^{-1} = Q^{T}$

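A quick NumPy sketch of these operations with arbitrary example matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])
x = np.array([2.0, -1.0])

print(A @ B)                              # matrix multiplication
print(A @ x)                              # equals x1*A[:,0] + x2*A[:,1]
print(2.0 * A[:, 0] - 1.0 * A[:, 1])      # same linear combination of columns
print(np.linalg.inv(A) @ A)               # A^-1 A = I (up to rounding)

# For an orthogonal matrix Q, the inverse is just the transpose.
theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.allclose(Q.T, np.linalg.inv(Q)))  # True
```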

Matrix Properties

  • Linear independence

$\textbf{u}, \textbf{v}, \textbf{w}$ are linearly independent if and only if there is no non-zero $\textbf{x}$ such that:

$$ \begin{bmatrix} \textbf{u}, \textbf{v}, \textbf{w} \end{bmatrix} \begin{bmatrix} {x_1}\\ {x_2}\\ {x_3}\\ \end{bmatrix} = x_1\textbf{u} + x_2\textbf{v} + x_3\textbf{w} = \textbf{0} $$
  • Determinant and rank

For a matrix $A_{mm} = [\textbf{v}_1, \cdots, \textbf{v}_m]$,

$A^{-1}\ \text{exists} \iff \operatorname{rank}(A) = m \iff \det(A) \neq 0 \iff \textbf{v}_1, \cdots, \textbf{v}_m\ \text{are linearly independent}$

$\operatorname{rank}(A) < m \iff \det(A) = 0 \iff \textbf{v}_1, \cdots, \textbf{v}_m\ \text{are linearly dependent}$
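
A minimal sketch of how these properties can be checked numerically; the matrix is the one from the notebook output that follows:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [3, 2, 1],
              [1, 2, 1]])
print("matrix A: \n", A)
print("Rank:", np.linalg.matrix_rank(A))
print("Determinant: %.2f" % np.linalg.det(A))
```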

matrix A: 
 [[1 2 3]
 [3 2 1]
 [1 2 1]]
Rank: 3
Determinant: 8.00
  • Geometry Interpretations

All possible combinations of two independent vectors $\textbf{v}$ and $\textbf{w}$ in $\mathbb{R}^2$ form a two-dimensional space. In other words, any vector in $\mathbb{R}^2$ can be represented as a combination of $\textbf{v}$ and $\textbf{w}$. For example, an orthogonal basis of $\mathbb{R}^2$ is:

$$ \textbf{v} = \begin{bmatrix}1\\0 \end{bmatrix}, \textbf{w} = \begin{bmatrix}0\\1 \end{bmatrix} $$

Then $A_{22} = [\textbf{v}, \textbf{w}]$ will be a full rank matrix.

Suppose we have three vectors $\textbf{u} \in \mathbb{R}^3$, $\textbf{v} \in \mathbb{R}^3$ and $\textbf{w} \in \mathbb{R}^3$ as below. Because $\textbf{w} = \textbf{u} + \textbf{v}$, $A_{33} = [\textbf{u}, \textbf{v}, \textbf{w}]$ will not be a full rank matrix. Its rank is $2$, which is equal to the number of linearly independent column vectors. We can easily see in this example that $\textbf{w}$ is redundant: all vectors in the $x$-$y$ plane of a Cartesian coordinate system can already be represented by $\textbf{u}$ and $\textbf{v}$.

$$ \textbf{u} = \begin{bmatrix}1\\0\\0 \end{bmatrix}, \textbf{v} = \begin{bmatrix}0\\1\\0 \end{bmatrix}, \textbf{w} = \begin{bmatrix}1\\1\\0\end{bmatrix} $$
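
This can be verified numerically with a short sketch:

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
w = u + v                              # w is a linear combination of u and v
A = np.column_stack([u, v, w])
print(np.linalg.matrix_rank(A))        # 2: not full rank
```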

For a matrix $A_{m \times n}$, the row space and the column space both have dimension $r$ (the rank). The nullspaces of $A$ and $A^T$ have dimensions $n - r$ and $m - r$, respectively.

Four spaces

Linear Equations in Matrix Form

Linear equations can be written in matrix form $A\textbf{x}=\textbf{b}$.

$$ a_{11}x_1 + a_{12}x_2 + a_{13}x_3 = b_1\\ a_{21}x_1 + a_{22}x_2 + a_{23}x_3 = b_2\\ a_{31}x_1 + a_{32}x_2 + a_{33}x_3 = b_3 $$

$$ \begin{bmatrix} a_{11}, a_{12}, a_{13} \\ a_{21}, a_{22}, a_{23} \\ a_{31}, a_{32}, a_{33} \end{bmatrix} \begin{bmatrix} {x_1}\\ {x_2}\\ {x_3} \end{bmatrix}= \begin{bmatrix} {b_1}\\ {b_2}\\ {b_3} \end{bmatrix} $$

Solve $Ax=b$

How many solutions?

$$ A_{mm}\textbf{x} = \textbf{b} $$
  • If $A_{mm}$ is a full rank matrix (all columns are linearly independent, $det(A_{mm}) \neq 0$), then there is a unique solution. Since the column vectors form a basis of $\mathbb{R}^m$, we can always find a combination of them that equals $\textbf{b}$. For example, the column vectors of an identity matrix form an orthonormal basis, and the unique solution of the linear equations below is $\textbf{x} = [b_1, b_2, b_3]^T$.
    $$ \begin{bmatrix} 1, 0, 0\\ 0, 1, 0\\ 0, 0, 1 \end{bmatrix} \textbf{x} = \begin{bmatrix} {b_1}\\ {b_2}\\ {b_3} \end{bmatrix} $$
  • If $A_{mm}$ is not a full rank matrix, then there will be either no solution or infinitely many solutions. If the vector $\textbf{b}$ is in the column space of $A_{mm}$, there are infinitely many solutions. In the example shown below, $x_3$ can take any value as long as $x_1 = 2 - x_3$ and $x_2 = 2 - x_3$ (see the sketch after this list). However, if $\textbf{b} = [0,0,1]^T$, there is no solution.
    $$ \begin{bmatrix} 1, 0, 1\\ 0, 1, 1\\ 0, 0, 0 \end{bmatrix} \textbf{x} = \begin{bmatrix} {2}\\ {2}\\ {0} \end{bmatrix} $$
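
A small sketch checking both cases numerically; `np.linalg.solve` raises an error for a singular matrix, so comparing ranks is one simple way to distinguish the two rank-deficient outcomes:

```python
import numpy as np

# Full rank: a unique solution exists.
print(np.linalg.solve(np.eye(3), np.array([1.0, 2.0, 3.0])))   # [1. 2. 3.]

# Rank deficient: compare rank(A) with the rank of the augmented matrix [A | b].
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])
for b in (np.array([2.0, 2.0, 0.0]), np.array([0.0, 0.0, 1.0])):
    rank_A = np.linalg.matrix_rank(A)
    rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
    # Equal ranks: b is in the column space -> infinitely many solutions.
    # rank_Ab > rank_A: b is outside the column space -> no solution.
    print(rank_A, rank_Ab)      # prints "2 2", then "2 3"
```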

Pseudo Inverse

What can we do if there is no solution to the linear equations? No matter whether a solution exists or not, we can formulate it as a minimization problem:

$$\textbf{x} = \underset{x\in \mathbb{R}^n}{\operatorname{argmin}}f(\textbf{x}) = \underset{x\in \mathbb{R}^n}{\operatorname{argmin}}\|A_{mn}\textbf{x} - \textbf{b}\|^2$$

By setting $\frac{\partial f(\textbf{x})}{\partial \textbf{x}} = 2A_{mn}^T(A_{mn}\textbf{x} - \textbf{b}) = 0$, we can get:

$$ A_{mn}^T A_{mn} \textbf{x} = A_{mn}^T \textbf{b} \\ A'_{nn}\textbf{x} = \textbf{b}',\ {A'_{nn}}^T = A'_{nn}$$

As you can see, we get $A'_{nn}$ as a symmetric square matrix! This is why we only discussed square matrices in this post.

If $A'_{nn}$ is a full rank matrix, we can get a unique solution:

$$\textbf{x} = {A'_{nn}}^{-1} \textbf{b}' = {(A_{mn}^T A_{mn})}^{-1} A_{mn}^T \textbf{b} $$
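
This is exactly the tool for the polynomial-fitting problem from the motivation section. A minimal sketch with synthetic data (the polynomial degree, data, and noise level are my own choices):

```python
import numpy as np

# Fit y = b0 + b1*x + b2*x^2 to noisy observations.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 20)
y = 1.0 + 2.0 * x - 3.0 * x**2 + 0.05 * rng.standard_normal(x.size)

# Design matrix [1, x, x^2] from the motivation section.
A = np.vander(x, 3, increasing=True)

# Normal equations: (A^T A) beta = A^T b.
beta = np.linalg.solve(A.T @ A, A.T @ y)
print(beta)                               # approximately [1, 2, -3]

# np.linalg.lstsq solves the same least-squares problem more robustly.
beta_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)
print(beta_lstsq)
```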

Cholesky Decomposition

Cholesky decomposition $A_{mm} = L_{mm}L_{mm}^T$ ($L_{mm}$ is a lower triangular matrix) is a special version of LU decomposition that applies only to symmetric positive-definite matrices. Solving $A\textbf{x} = \textbf{b}$ is then equivalent to solving $L\textbf{y} = \textbf{b}$ followed by $L^T\textbf{x} = \textbf{y}$.

Given a triangular system of equations, we can solve it using forward or backward substitution.

$$ \begin{bmatrix} a_{11}, 0, 0 \\ a_{21}, a_{22}, 0 \\ a_{31}, a_{32}, a_{33} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix} $$
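
A sketch of the two-step triangular solve, assuming SciPy is available for the substitutions; the matrix is an arbitrary symmetric positive-definite example:

```python
import numpy as np
from scipy.linalg import solve_triangular

# A symmetric positive-definite matrix (e.g. A'^T A' from the normal equations).
A = np.array([[4.0, 2.0, 0.0],
              [2.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([2.0, 2.0, 5.0])

L = np.linalg.cholesky(A)                  # A = L L^T, L lower triangular
y = solve_triangular(L, b, lower=True)     # forward substitution:  L y = b
x = solve_triangular(L.T, y, lower=False)  # backward substitution: L^T x = y
print(x)                                   # approximately [ 1. -1.  3.]
print(np.allclose(A @ x, b))               # True
```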

Singular Value Decomposition (SVD)

What can we do if $A_{mn}^T A_{mn}$ is not a full rank matrix (i.e., when $rank(A_{mn}) < n$)?

  • First method: Find all independent columns and solve the new linear equations that contain only these independent columns; the entries of $\textbf{x}$ corresponding to the remaining dependent columns are set to 0. I don't think you will find this answer in any textbook, but I feel it is the most intuitive method. The dependent columns are redundant since they can be expressed as combinations of the independent columns, so we don't really need them.

  • Second method: $\textbf{x} = VZU^T\textbf{b}$, where $A = U\Sigma V^T$ is the singular value decomposition and the diagonal entries of $Z$ are

    $$ z_i=\begin{cases} \frac{1}{\sigma_i}, & \text{if $\sigma_i \neq 0$}.\\ 0, & \text{otherwise}. \end{cases} $$
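
A sketch of the second method; the matrix $A$ and vector $\textbf{b}$ are the ones appearing in the notebook output kept below:

```python
import numpy as np

# A and b taken from the notebook output shown below.
A = np.array([[1, 2, 0, 0],
              [0, 0, 0, 0],
              [0, 0, 2, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)
b = np.array([0, 0, 4, 1, 0], dtype=float)

U, sigma, Vt = np.linalg.svd(A)
print("singular values:", sigma)

# Z has shape (n, m): invert only the non-zero singular values.
z = np.array([1.0 / s if s > 1e-10 else 0.0 for s in sigma])
Z = np.zeros((A.shape[1], A.shape[0]))
Z[:z.size, :z.size] = np.diag(z)

x = Vt.T @ Z @ U.T @ b          # x = V Z U^T b
print("x:", x)                  # approximately [0. 0. 2. 1.]
# np.linalg.pinv(A) @ b gives the same minimum-norm solution.
```
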
A:
 [[1 2 0 0]
 [0 0 0 0]
 [0 0 2 0]
 [0 0 0 1]
 [0 0 0 0]]
b:
 [0 0 4 1 0]
AtA:
 [[1 2 0 0]
 [2 4 0 0]
 [0 0 4 0]
 [0 0 0 1]]
AtA rank: 3
singular,  [2.23606798 2.         1.         0.        ]
x: [0. 0. 2. 1.]
x: [0. 2. 1.]
This blog is converted from autobot-linear-algebra.ipynb
Written on May 28, 2021