## Derivative, Gradient and Jacobian unified

A summary of scalar and vector derivatives.

## Regression with squared error loss

In this article we study the solution to a regression with squared error loss. We start with the theoretical formulation before tackling the problem in practice.
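As a quick preview, the closed-form least-squares solution can be sketched in NumPy; the toy data below is made up for illustration and is not from the article:

```python
import numpy as np

# Toy data: y ≈ 2*x + 1 plus small noise (illustrative values only)
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, size=50)
y = 2 * x + 1 + 0.1 * rng.normal(size=50)

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])

# Closed-form least-squares solution: theta = (X^T X)^{-1} X^T y
theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta)  # close to [1, 2]
```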

## The geometry of the normal equations

In this article, I show that the normal equations define the orthogonal projection of a vector onto a linear subspace.
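The projection property is easy to check numerically: solving the normal equations makes the residual orthogonal to every column of the design matrix. A minimal sketch with random data (chosen here purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 3))
y = rng.normal(size=20)

# Solve the normal equations: X^T X theta = X^T y
theta = np.linalg.solve(X.T @ X, X.T @ y)

# The residual y - X theta is orthogonal to the columns of X,
# so X theta is the orthogonal projection of y onto span(X)
residual = y - X @ theta
print(np.allclose(X.T @ residual, 0))  # True
```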

## The Moore-Penrose (pseudo-inverse) matrix

The Moore-Penrose inverse of a matrix is used to approximately solve a degenerate system of linear equations.
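For a rank-deficient system, the pseudo-inverse yields the minimum-norm least-squares solution, which coincides with what `np.linalg.lstsq` returns. A small sketch with a made-up degenerate matrix:

```python
import numpy as np

# A rank-deficient (degenerate) system: the second column duplicates the first
A = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
b = np.array([1.0, 2.0, 3.1])

# The pseudo-inverse gives the minimum-norm least-squares solution
x = np.linalg.pinv(A) @ b

# Same solution via lstsq
x2, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x2))  # True
```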

## Why do we care about convexity?

In machine learning, the best parameters for a model are chosen so as to minimize the training objective. Strictly convex functions are particularly interesting because they have at most one global minimum.

## Propositional logic derived as a special case of probability calculus

In this article, I will apply the rules of probability calculus to derive the rules of propositional logic (also called propositional calculus).