
# Propositional logic derived as a special case of probability calculus

In this article, I will apply the rules of probability calculus to derive the rules of propositional logic (also called propositional calculus).

This article is very formal and equation-oriented. The goal is to show that Bayesian thinking through probability calculus is fully compatible with propositional logic. It also serves a bigger purpose: in an upcoming article, I will show how probability calculus supports inferences that can't be made using propositional logic. In particular, if we know that $A$ is a cause of $B$, written $A \Rightarrow B$, then learning that $B$ is true tells us nothing about $A$ in propositional logic. The next article will show that probability calculus does give us more information about $A$. Before we explore those extensions, let's review our basics in the light of this formalism.

Notations: given two propositions $A$ and $B$, we use the following notations.

| Notation | Meaning |
|---|---|
| $A + B$ | $A$ or $B$ |
| $AB$ | $A$ and $B$ |
| $\bar{A}$ | not $A$ |
| $A \Rightarrow B$ | $A$ implies $B$ |

Since our focus is primarily on inference, let's use probability calculus to derive the following rules of propositional calculus:

• $A \Rightarrow B \equiv \bar{A} + B$
• $A \Rightarrow B \equiv \bar{B} \Rightarrow \bar{A}$ (contraposition)

We can take an example to illustrate those rules:

• $A \Rightarrow B$: if it rains tomorrow, then I will go to the cinema;
• $\bar{A} + B$: either it won't rain tomorrow, or you can be sure I'll go to the cinema;
• $\bar{B} \Rightarrow \bar{A}$: if I don't go to the cinema tomorrow, it means it won't be raining.
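Because each connective is truth-functional, both equivalences can also be checked mechanically over all four truth assignments. Here is a minimal Python sketch (the code and names are mine, not from the article):

```python
from itertools import product

# A => B defined directly by its truth table (false only when A is true
# and B is false), so the checks below are not circular.
IMPLIES = {(True, True): True, (True, False): False,
           (False, True): True, (False, False): True}

for a, b in product([False, True], repeat=2):
    # A => B  is equivalent to  not-A + B
    assert IMPLIES[(a, b)] == ((not a) or b)
    # A => B  is equivalent to its contrapositive  not-B => not-A
    assert IMPLIES[(a, b)] == IMPLIES[(not b, not a)]

print("both equivalences hold on all 4 truth assignments")
```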

Before we dive in, let’s recall the rules of probability calculus.

## Probability calculus primer

Let $A$ and $B$ be two propositions and $e$ (as in “evidence”) a set of propositions. We note $p(A \mid e)$ our probability estimate for the truth of $A$ given evidence $e$ and we note $p_e(\cdot) = p(\cdot \mid e)$.

The following rules hold:

• negation: $p_e(\bar{A}) = 1 - p_e(A)$
• product rule: $p_e(AB) = p_e(A \mid B)\, p_e(B)$
• sum rule: $p_e(A + B) = p_e(A) + p_e(B) - p_e(AB)$

If $H_1, ..., H_n$ is a collection of $n$ propositions that are exhaustive (at least one of them is true) and mutually incompatible (no two of them can be true at the same time), then for any proposition $A$ we have:

• exhaustivity: $p_e(H_1 + ... + H_n) = 1$
• partition rule: $p_e(A) = p_e(AH_1) + ... + p_e(AH_n)$

In particular, applying these rules to the binary partitions $\{A, \bar{A}\}$ and $\{B, \bar{B}\}$ gives:

• exhaustivity: $p_e(A + \bar{A}) = p_e(A) + p_e(\bar{A}) = 1$
• binary partition rule: $p_e(A) = p_e(AB) + p_e(A\bar{B})$
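All of these rules can be verified numerically on any joint distribution over $A$ and $B$. A small Python sketch with a made-up joint table (the numbers are illustrative only, not from the article):

```python
# Joint probabilities for the four conjunctions of A and B (made-up numbers).
joint = {(True, True): 0.3, (True, False): 0.2,
         (False, True): 0.4, (False, False): 0.1}

def p(a=None, b=None):
    # Marginal or joint probability: sum over the unspecified variable(s).
    return sum(v for (x, y), v in joint.items()
               if (a is None or x == a) and (b is None or y == b))

p_A, p_B = p(a=True), p(b=True)

# negation: p(not-A) = 1 - p(A)
assert abs(p(a=False) - (1 - p_A)) < 1e-12
# product rule: p(AB) = p(B | A) p(A)
p_B_given_A = p(a=True, b=True) / p_A
assert abs(p(a=True, b=True) - p_B_given_A * p_A) < 1e-12
# sum rule: p(A + B) = p(A) + p(B) - p(AB)
p_A_or_B = 1 - p(a=False, b=False)
assert abs(p_A_or_B - (p_A + p_B - p(a=True, b=True))) < 1e-12
# binary partition rule: p(A) = p(AB) + p(A not-B)
assert abs(p_A - (p(a=True, b=True) + p(a=True, b=False))) < 1e-12

print("all four rules check out")
```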

If you want to know more about how these rules were constructed, check out this article: Probability calculus: the logic of uncertainty.

Now that we are all set up, let's tackle propositional logic.

## $A \Rightarrow B \equiv \bar{A} + B$

• Let $e$ = “$A \Rightarrow B$”
• By definition of $e$, we have: $p_e(B \mid A) = 1$
• We want to show that: $p_e(\bar{A} + B) = 1$

Use the binary partition rule:

$$p_e(B) = p_e(AB) + p_e(\bar{A}B)$$

Hence:

$$p_e(\bar{A} + B) = p_e(\bar{A}) + p_e(B) - p_e(\bar{A}B) = p_e(\bar{A}) + p_e(AB)$$

And since $p_e(AB) = p_e(B \mid A)\, p_e(A) = p_e(A)$, we get:

$$p_e(\bar{A} + B) = p_e(\bar{A}) + p_e(A) = 1$$

This completes the proof that, given $A \Rightarrow B$, we have $\bar{A} + B$. Let's prove the other direction.
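Before moving on, here is a numeric spot-check of the direction just proved: sample random joint distributions satisfying $p_e(B \mid A) = 1$ (equivalently, $p_e(A\bar{B}) = 0$) and verify that the sum-rule computation yields $p_e(\bar{A} + B) = 1$. A minimal Python sketch (the code and names are mine, not from the article):

```python
import random

random.seed(0)
for _ in range(1000):
    # Random joint distribution constrained so that p(A and not-B) = 0,
    # which is exactly what p(B | A) = 1 (the evidence "A => B") requires.
    w = [random.random() for _ in range(3)]
    s = sum(w)
    p_AB, p_nAB, p_nAnB = w[0] / s, w[1] / s, w[2] / s  # p(AB), p(-A B), p(-A -B)

    p_nA = p_nAB + p_nAnB          # partition rule: p(-A) = p(-A B) + p(-A -B)
    p_B = p_AB + p_nAB             # partition rule: p(B) = p(AB) + p(-A B)
    # sum rule: p(-A + B) = p(-A) + p(B) - p(-A B)
    p_nA_or_B = p_nA + p_B - p_nAB
    assert abs(p_nA_or_B - 1.0) < 1e-12

print("p(-A + B) = 1 in every sampled distribution")
```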

• Let $e$ = “$\bar{A} + B$ is true”
• As shown before, the binary partition rule and the product rule give: $p_e(B) = p_e(B \mid A)\,p_e(A) + p_e(\bar{A}B)$

So, on the one hand we have:

$$p_e(\bar{A} + B) = p_e(\bar{A}) + p_e(B) - p_e(\bar{A}B) = p_e(\bar{A}) + p_e(B \mid A)\,p_e(A)$$

And on the other hand we have:

$$p_e(\bar{A} + B) = 1 = p_e(\bar{A}) + p_e(A)$$

Therefore:

$$p_e(B \mid A)\,p_e(A) = p_e(A), \quad \text{hence} \quad p_e(B \mid A) = 1 \quad \text{(assuming } p_e(A) > 0\text{)}$$

Which is what had to be proved. Let's turn to our second proof.
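This converse direction can also be spot-checked numerically: for random joint distributions satisfying $p_e(\bar{A} + B) = 1$ (equivalently $p_e(A\bar{B}) = 0$), the "one hand / other hand" algebra always yields $p_e(B \mid A) = 1$. Again a minimal Python sketch, my code only:

```python
import random

random.seed(1)
for _ in range(1000):
    # Random joint with p(A and not-B) = 0, so that p(not-A + B) = 1.
    w = [random.random() for _ in range(3)]
    s = sum(w)
    p_AB, p_nAB, p_nAnB = w[0] / s, w[1] / s, w[2] / s  # p(AB), p(-A B), p(-A -B)

    p_A = p_AB                    # p(A) = p(AB) + p(A -B), and p(A -B) = 0
    p_nA = p_nAB + p_nAnB         # partition rule
    p_B = p_AB + p_nAB            # partition rule

    # One hand: p(-A + B) = p(-A) + p(B) - p(-A B) = p(-A) + p(B|A) p(A)
    one_hand = p_nA + p_B - p_nAB
    # Other hand: p(-A + B) = 1 = p(-A) + p(A), so p(B|A) p(A) = p(A).
    p_B_given_A = (one_hand - p_nA) / p_A
    assert abs(p_B_given_A - 1.0) < 1e-12

print("p(B | A) = 1 in every sampled distribution")
```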

## $A \Rightarrow B \equiv \bar{B} \Rightarrow \bar{A}$

• Let $e$ = “$A \Rightarrow B$” so that $p_e(B \mid A) = 1$
• We want to prove that: $p_e(\bar{A} \mid \bar{B}) = 1$
• We know from the previous section that: $p_e(\bar{A}+B) = 1$
• This means that: $\color{blue}{p_e(A\bar{B})} = 1 - p_e(\bar{A}+B) = 1 - 1 = 0$
• Since: $p_e(\bar{B}) = \color{blue}{p_e(A\bar{B})} + p_e(\bar{A}\bar{B})$
• We have: $p_e(\bar{B}) = \color{blue}{0} + p_e(\bar{A}\bar{B})$
• Also written: $p_e(\bar{B}) = p_e(\bar{A} \mid \bar{B})\,p_e(\bar{B})$
• And thus: $p_e(\bar{A} \mid \bar{B}) = 1$ (assuming $p_e(\bar{B}) > 0$)

We can obtain the opposite direction by replacing $A$ with $\bar{B}$ and $B$ with $\bar{A}$ in the proof above, then using $\bar{\bar{A}} = A$ and $\bar{\bar{B}} = B$.

Have you ever noticed that many people read the rule $A \Rightarrow B$ as if it also meant $B \Rightarrow A$? Probability calculus explains why: it shows that if we know $B$, then the probability of $A$ increases.
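This teaser can be made concrete with a quick simulation: for random joint distributions compatible with the evidence $A \Rightarrow B$ (i.e. $p(A\bar{B}) = 0$), conditioning on $B$ never decreases, and generally increases, the probability of $A$. A Python sketch (my code, not from the articles):

```python
import random

random.seed(0)
for _ in range(1000):
    # Random joint with p(A and not-B) = 0, encoding the evidence A => B.
    w = [random.random() for _ in range(3)]
    s = sum(w)
    p_AB, p_nAB, p_nAnB = w[0] / s, w[1] / s, w[2] / s

    p_A = p_AB                    # p(A) = p(AB) since p(A -B) = 0
    p_B = p_AB + p_nAB            # partition rule
    p_A_given_B = p_AB / p_B      # product rule
    # Learning B raises our estimate of A (it never lowers it).
    assert p_A_given_B >= p_A - 1e-12

print("conditioning on B never decreased p(A)")
```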

Read my next article to find out: Why Bayesian inference is more powerful than logic.