This note summarizes some intuitive notions of probability used in common-sense human reasoning. Most of what is said here comes from (Jaynes 2003).
Three intuitive notions of probability
Jaynes presents some forms of inference that are not possible in classical propositional or first-order logic, yet are frequent in human common-sense reasoning. Let's present the rules along with some examples:
If there exists a logical rule $A \to B$, then observing $\bar{A}$ makes $\bar{B}$ more plausible. For example, take the rule "if it rains, the terrain will be wet": if we observe that it is not raining, it becomes more plausible that the terrain is not wet. This can also be captured by the classical Bayesian rule. Let $R$ be the event that it rains and $W$ the event that the terrain is wet. Then:
$$ \mathbb{P}(W \mid \bar{R}) = \frac{\mathbb{P}(\bar{R} \mid W)\mathbb{P}(W)}{\mathbb{P}(\bar{R})} \leq \mathbb{P}(W), $$

where the inequality holds because the rule gives $\mathbb{P}(R \mid W) = \mathbb{P}(R)/\mathbb{P}(W) \geq \mathbb{P}(R)$, hence $\mathbb{P}(\bar{R} \mid W) \leq \mathbb{P}(\bar{R})$. We can also infer that if we observe that the terrain is wet, then it is more probable that it has rained:
$$ \mathbb{P}(R \mid W) = \frac{\mathbb{P}(W \mid R)\mathbb{P}(R)}{\mathbb{P}(W)} = \frac{\mathbb{P}(R)}{\mathbb{P}(W)} \geq \mathbb{P}(R), $$

since the rule $R \to W$ gives $\mathbb{P}(W \mid R) = 1$ and $\mathbb{P}(W) \leq 1$. These are natural inferences that can be made from a single logical rule. This is not deductive reasoning, but it captures many facets of human probabilistic reasoning that we would like to imbue into machines.
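As a quick sanity check, we can pick a joint distribution consistent with the rule $R \to W$ (so $\mathbb{P}(W \mid R) = 1$) and verify both inequalities numerically. The specific numbers below are made up for illustration:

```python
# Illustrative joint distribution over (rain R, wet W), consistent with
# the rule R -> W: whenever it rains, the terrain is wet, so P(W|R) = 1.
# The probabilities are made up for this example.
p = {
    (True, True): 0.30,   # raining, wet
    (True, False): 0.00,  # raining, dry (impossible under R -> W)
    (False, True): 0.10,  # not raining, but wet (e.g. sprinklers)
    (False, False): 0.60, # not raining, dry
}

def prob(pred):
    """Probability of the event described by pred(r, w)."""
    return sum(v for (r, w), v in p.items() if pred(r, w))

p_W = prob(lambda r, w: w)  # P(W) = 0.40
p_R = prob(lambda r, w: r)  # P(R) = 0.30
p_W_given_notR = prob(lambda r, w: (not r) and w) / prob(lambda r, w: not r)
p_R_given_W = prob(lambda r, w: r and w) / p_W

# Observing "not R" lowers the plausibility of W; observing W raises
# the plausibility of R.
print(p_W_given_notR, "<=", p_W)  # 0.1428... <= 0.40
print(p_R_given_W, ">=", p_R)     # 0.75 >= 0.30
```

Any joint distribution with $\mathbb{P}(W \mid R) = 1$ would exhibit the same two inequalities; this one just makes them easy to compute by hand.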
The third inference is a weaker form of the second. Suppose the rule is only "$A$ being true makes $B$ more probable", i.e. $\mathbb{P}(B \mid A) \geq \mathbb{P}(B)$, and we observe that $B$ is true. We then infer that $A$ becomes more probable, i.e. we want to prove that $\mathbb{P}(A \mid B) \geq \mathbb{P}(A)$. Indeed,
$$ \mathbb{P}(A \mid B) = \frac{\mathbb{P}(B \mid A)\mathbb{P}(A)}{\mathbb{P}(B)} \geq \frac{\mathbb{P}(B)\mathbb{P}(A)}{\mathbb{P}(B)} = \mathbb{P}(A). $$

So simple Bayes' rule allows us to encode these important patterns of reasoning!
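The weaker syllogism can be checked numerically too: pick any joint distribution in which $\mathbb{P}(B \mid A) \geq \mathbb{P}(B)$, and $\mathbb{P}(A \mid B) \geq \mathbb{P}(A)$ follows by Bayes' rule. The numbers here are arbitrary:

```python
# Arbitrary joint distribution over (A, B) chosen so that A makes B
# more probable; the numbers are made up for illustration.
p = {
    (True, True): 0.25,
    (True, False): 0.05,
    (False, True): 0.30,
    (False, False): 0.40,
}
p_A = 0.25 + 0.05          # P(A) = 0.30
p_B = 0.25 + 0.30          # P(B) = 0.55
p_B_given_A = 0.25 / p_A   # 0.833..., so P(B|A) >= P(B): A makes B more probable

# Bayes' rule: P(A|B) = P(B|A) P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B

print(p_B_given_A, ">=", p_B)  # the premise of the weak syllogism
print(p_A_given_B, ">=", p_A)  # the conclusion: observing B raises P(A)
```

Dividing Bayes' rule through by $\mathbb{P}(A)$ shows why: $\mathbb{P}(A \mid B)/\mathbb{P}(A) = \mathbb{P}(B \mid A)/\mathbb{P}(B) \geq 1$ whenever the premise holds.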
References
[1] E. T. Jaynes, "Probability Theory: The Logic of Science", Cambridge University Press, 2003.