Introduction to the probabilistic Turing Machine
Most real phenomena are better understood through a probabilistic lens. This motivates a formal model of computation that takes probability into account.
Def: Probabilistic TM
Take a nondeterministic Turing machine in which every step is a fair coin flip between two legal branches. The probability of a single branch $b$ is
$$ \mathbb{P}(b) = 2^{-k} $$
where $k$ is the length of the branch, i.e. the number of coin flips made along it. The probability that the machine accepts a word $\omega$ is then
$$ \mathbb{P}(\mathcal{M} \text{ accepts } \omega) = \sum_{b \text{ is an accepting branch}} \mathbb{P}(b) $$
Note that this is very similar to the algorithmic probability defined in Kolmogorov complexity.
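The acceptance probability above can be computed directly on a small example. The sketch below uses a hypothetical toy computation tree (nested pairs standing for fair coin flips, boolean leaves for accept/reject), which is an assumption for illustration, not a construction from the notes:

```python
from fractions import Fraction

# Toy computation tree for a hypothetical probabilistic TM: each internal
# node is a fair coin flip between two branches; each leaf is labeled
# True (accept) or False (reject).
tree = (((True, False), True), (False, (True, True)))

def accept_probability(node, depth=0):
    """Sum 2**-k over all accepting branches, where k is the branch length."""
    if isinstance(node, bool):                       # leaf: accept or reject
        return Fraction(1, 2**depth) if node else Fraction(0)
    left, right = node                               # one fair coin flip
    return accept_probability(left, depth + 1) + accept_probability(right, depth + 1)

print(accept_probability(tree))  # → 5/8
```

Exact rationals (`Fraction`) are used so that each branch contributes precisely $2^{-k}$ with no floating-point rounding.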
Such a machine therefore decides its language with some probability of error.
Bounded-error Probabilistic Polynomial Class
This is the complexity class associated with the probabilistic Turing machine, and it is loosely reminiscent of the PAC model used in machine learning.
BPP is the class of languages decided by probabilistic polynomial-time Turing machines with error probability at most $\frac{1}{3}$. The constant $\frac{1}{3}$ is arbitrary: any constant strictly below $\frac{1}{2}$ yields the same class, because running the machine several times and taking a majority vote drives the error down exponentially (by a Chernoff bound).
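Error reduction by majority vote can be seen empirically. The sketch below simulates a hypothetical base machine that answers correctly with probability $\frac{2}{3}$ (the function `base_machine` and the parameter `runs` are illustrative assumptions, not part of the notes):

```python
import random
from collections import Counter

random.seed(0)  # fixed seed so the experiment is reproducible

def base_machine(x):
    # Hypothetical BPP machine: returns the correct answer with probability 2/3.
    return x if random.random() < 2/3 else not x

def amplified(x, runs=101):
    # Run the base machine an odd number of times and take the majority vote;
    # by a Chernoff bound the error probability shrinks exponentially in `runs`.
    votes = Counter(base_machine(x) for _ in range(runs))
    return votes.most_common(1)[0][0]

# Empirically, the amplified machine is almost always correct.
trials = 1000
correct = sum(amplified(True) is True for _ in range(trials))
print(correct / trials)
```

With 101 votes, the majority is wrong only when at least 51 of the $\text{Binomial}(101, \frac{2}{3})$ runs err, an event with probability on the order of $10^{-4}$, so the printed accuracy is essentially 1.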
Formal Language Models
Formal language models are closely related to this probabilistic model, but the connection still needs to be investigated. See (Cotterell et al. 2023).
References
[1] Cotterell et al., “Formal Aspects of Language Modeling,” 2023.