Rough calculation of the ordinary least squares estimator for multiple linear regression.
y = X b + e
is a multiple linear regression where
y is a dependent vector,
X is an explanatory matrix and
e is an error vector.
e = y - X b
Form the sum of squared errors by multiplying e on the left by its own transpose.
eT e
= (y - X b)T (y - X b)
= yT y - 2 bT XT y + bT XT X b
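The expansion above uses the fact that yT X b and bT XT y are equal scalars, so the two cross terms combine into -2 bT XT y. A minimal numerical check of this identity, assuming NumPy is available (the matrix sizes here are arbitrary illustration values):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 2))   # explanatory matrix
y = rng.normal(size=5)        # dependent vector
b = rng.normal(size=2)        # arbitrary coefficient vector

e = y - X @ b
lhs = e @ e                                              # eT e
rhs = y @ y - 2 * b @ (X.T @ y) + b @ (X.T @ X @ b)      # expanded form
assert np.isclose(lhs, rhs)
```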
Differentiate by b.
Since the right hand side of the previous equation is quadratic (and convex) in b,
its minimum is attained where the derivative with respect to b is zero.
d(eT e) / db = - 2 XT y + 2 XT X b = 0
b = (XT X)^-1 XT y
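A short sketch of this closed-form estimator, assuming NumPy is available; the data here are simulated for illustration. Solving the normal equations XT X b = XT y with np.linalg.solve is numerically preferable to explicitly inverting XT X:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
# Explanatory matrix with an intercept column (assumed design).
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
true_b = np.array([1.0, 2.0, -0.5])
y = X @ true_b + 0.1 * rng.normal(size=n)

# b = (XT X)^-1 XT y, computed by solving XT X b = XT y.
b_hat = np.linalg.solve(X.T @ X, X.T @ y)

# At the minimum, the residuals are orthogonal to the columns of X.
residual = y - X @ b_hat
assert np.allclose(X.T @ residual, 0, atol=1e-8)
```

With low noise the recovered b_hat is close to true_b, and the residual orthogonality check is exactly the condition d(eT e)/db = 0 derived above.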
References:
https://en.wikipedia.org/wiki/Ordinary_least_squares
https://en.wikipedia.org/wiki/Proofs_involving_ordinary_least_squares#Least_squares_estimator_for_.CE.B2