Some thoughts on Bayes' formula


Under the same assumptions as the total probability formula, we obtain the celebrated Bayes' formula, a famous theorem in probability theory. It first appeared in a 1763 paper by the English scholar Thomas Bayes (1702–1761), published after his death.
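For reference, the standard statement of the formula, assuming the events B_1, B_2, …, B_n partition the sample space and each has positive probability, is:

$$
P(B_i \mid A) = \frac{P(B_i)\,P(A \mid B_i)}{\sum_{j=1}^{n} P(B_j)\,P(A \mid B_j)}, \qquad i = 1, 2, \ldots, n.
$$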

From a purely mathematical standpoint, the formula might seem unremarkable; it’s a straightforward deduction from the definition of conditional probability and the total probability formula. Its fame, however, lies in its practical and even philosophical significance.
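Indeed, the deduction takes only two steps: expand P(B_i | A) by the definition of conditional probability, then rewrite the numerator with the definition again and the denominator with the total probability formula:

$$
P(B_i \mid A) = \frac{P(A B_i)}{P(A)} = \frac{P(B_i)\,P(A \mid B_i)}{\sum_{j=1}^{n} P(B_j)\,P(A \mid B_j)}.
$$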

Let’s first consider the probabilities P(B_1), P(B_2), and so on. These represent our initial beliefs about the likelihood of each event B_i occurring, before we have any additional information (that is, before we know whether event A has happened). Now, suppose we receive new information that event A has indeed occurred. This new information allows us to update our assessment of the likelihood of each event B_i.

This kind of reassessment happens all the time in our daily lives. A situation initially thought to be unlikely can suddenly become very probable (or vice versa) once a particular event occurs. Bayes’ formula provides a way to quantify this change.

To make this more intuitive, let’s think of event A as the “effect” and the events B_1, B_2, … as the possible “causes.” In this light, the total probability formula can be seen as reasoning from cause to effect. It helps us calculate the probability of an effect (A) happening based on its possible causes (B_i).
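In symbols, this cause-to-effect direction is just the total probability formula:

$$
P(A) = \sum_{j=1}^{n} P(B_j)\,P(A \mid B_j).
$$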

Bayes’ formula, on the other hand, does the exact opposite. Its purpose is to reason from effect to cause. Now that we know the “effect” (A) has already happened, we want to figure out which of the many possible “causes” (B_i) was the most likely one. This is a common question in both everyday life and scientific research. Bayes’ formula tells us that the posterior probability of each cause is proportional to its initial probability times how likely that cause is to produce the observed effect.
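Written out, with the denominator (which is the same for every i) absorbed into the proportionality:

$$
P(B_i \mid A) \propto P(B_i)\,P(A \mid B_i).
$$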

For example, imagine a crime has been committed in a certain district. Based on existing records, the suspects might include individuals such as Tom and David. Before knowing the specific details of the crime (let’s call this “event A”), the police have an initial assessment of each person’s likelihood of being the culprit (equivalent to the initial probabilities P(B_1), P(B_2), …), based on their prior criminal records. However, once the details of the crime become known, this assessment changes. For instance, Tom, who was initially considered less likely, might now become the primary suspect.
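Here is a minimal sketch of that update in Python, with made-up priors and likelihoods chosen purely for illustration (none of the numbers come from the story above):

```python
def posterior(priors, likelihoods):
    """Bayes' formula: P(B_i | A) = P(B_i) * P(A | B_i) / sum_j P(B_j) * P(A | B_j)."""
    joint = {cause: priors[cause] * likelihoods[cause] for cause in priors}
    total = sum(joint.values())  # P(A), by the total probability formula
    return {cause: p / total for cause, p in joint.items()}

# Hypothetical priors P(B_i): the initial assessment based on prior records.
priors = {"Tom": 0.10, "David": 0.40, "others": 0.50}

# Hypothetical likelihoods P(A | B_i): how well each suspect fits the
# specific details of the crime (event A) once those details are known.
likelihoods = {"Tom": 0.90, "David": 0.10, "others": 0.05}

print(posterior(priors, likelihoods))
# Tom's posterior probability is now the largest, even though his prior was small.
```

With these illustrative numbers, Tom's probability jumps from 0.10 to roughly 0.58: exactly the kind of reassessment described above.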

From this discussion, it’s also easy to see the importance of this formula in statistics, where we rely on collected data (the “effect,” or event A) to answer the questions of interest. This is fundamentally a process of “finding the cause from the effect,” which is precisely where Bayes’ formula proves so useful.
