# Squared error minimization formula simplification

#### Jiloc

Hello, it's my first post here. I am studying some Computer Vision and found myself back in linear algebra. I took this exam something like 10 years ago, so I'm having some difficulties. It's probably something basic.

In the following video (linked at the exact second), the presenter explains that he wants to minimize the squared error to find the best approximation for an overdetermined linear system.

Here is my problem. How can this:

$$\displaystyle || b - M a || ^ 2$$

become this:

$$\displaystyle (b - M a)^T (b - M a)$$

Thank you!

#### SDK

What do you mean? There is no computation here. This is the definition of the Euclidean norm on $\mathbb{R}^n$. For any vector $x \in \mathbb{R}^n$, its norm is defined by $\| x \| = \sqrt{x^T x}$. Now just square both sides and you get $\| x \|^2 = x^T x$.
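To see the identity numerically, here is a quick NumPy sketch (the matrix $M$ and vectors $a$, $b$ are made-up random data for illustration):

```python
import numpy as np

# Hypothetical small overdetermined system: M is 4x2, so M a = b
# generally has no exact solution and we only minimize ||b - M a||^2.
rng = np.random.default_rng(0)
M = rng.normal(size=(4, 2))
a = rng.normal(size=2)
b = rng.normal(size=4)

r = b - M @ a                      # residual vector b - M a

# ||r||^2 computed two ways: squared Euclidean norm vs. r^T r
norm_sq = np.linalg.norm(r) ** 2
inner = r.T @ r                    # for a 1-D array this is the dot product

print(np.isclose(norm_sq, inner))  # prints True
```

The two expressions agree up to floating-point rounding, which is exactly the definition $\|x\|^2 = x^T x$ applied to $x = b - Ma$.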

#### Jiloc

Thank you, I suspected it was something very basic. I didn't remember the definition at all and was confused by what I read here:
https://math.stackexchange.com/questions/507742/distance-similarity-between-two-matrices