Bayes' Theorem

Bayes' Theorem describes the probability of an event based on prior knowledge of conditions that might be related to it.

Scenario 1: Sam takes a drug test that screens for cannabis.

Suppose this particular test kit is about 90% accurate. In any group of people tested, some percentage are expected to show a positive result even though they are not drug users.

Sam tests positive, and in her case the result is a true positive: she did smoke before the test.
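As a minimal sketch of Scenario 1, the snippet below works out how likely a positive result is to come from an actual user. The 5% share of users among test-takers is an assumed, purely illustrative figure, and "90% accuracy" is read here as both 90% sensitivity and 90% specificity; neither of these comes from the scenario itself.

```python
# Minimal sketch of Scenario 1 (illustrative numbers, not from the text):
# how likely is a positive result from a ~90%-accurate kit to come from a drug user?

def prob_user_given_positive(base_rate, sensitivity, specificity):
    """P(user | positive test) via Bayes' theorem."""
    true_pos = sensitivity * base_rate               # users who test positive
    false_pos = (1 - specificity) * (1 - base_rate)  # non-users who test positive
    return true_pos / (true_pos + false_pos)

# Assumed: 5% of test-takers are users; 90% accuracy read as 90% sensitivity
# and 90% specificity.
print(prob_user_given_positive(base_rate=0.05, sensitivity=0.90, specificity=0.90))
# ~0.32: when users are rare, most positives come from the non-user majority.
```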

Scenario 2: Even if 100% of patients with a certain cancer show a specific symptom, an individual who displays that symptom does not automatically have that type of cancer.

If the incidence rate of that cancer is 1 in 100,000, and 10 in 100,000 healthy individuals worldwide carry the exact same symptom, then the probability of having that specific cancer given the symptom is only about 9.1%; the remaining majority of symptomatic people would be "false positives".
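The 9.1% figure follows directly from Bayes' theorem. Writing C for "has the cancer" and S for "shows the symptom", and treating P(not C) as essentially 1, the calculation looks roughly like this:

```latex
\[
P(C \mid S)
  = \frac{P(S \mid C)\,P(C)}{P(S \mid C)\,P(C) + P(S \mid \neg C)\,P(\neg C)}
  \approx \frac{1 \times \frac{1}{100{,}000}}
               {1 \times \frac{1}{100{,}000} + \frac{10}{100{,}000} \times 1}
  = \frac{1}{11} \approx 9.1\%
\]
```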

Definition

Bayes' Theorem can be viewed as a framework for critical thinking. When attempting to form accurate beliefs, you start with the information you already possess. As time goes on, you update those beliefs with new evidence, while trying not to let the old information simply erode away.

Posterior odds measure how likely one hypothesis is compared to another after new evidence is taken into account, while prior odds measure how likely it was before that evidence came to light. The likelihood ratio measures how well the evidence is explained by one hypothesis compared to the other.
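In symbols, with B standing for a belief or hypothesis and E for new evidence (the same notation the cancer example later in this piece uses), the theorem and its odds form read:

```latex
\[
P(B \mid E) = \frac{P(B)\,P(E \mid B)}{P(E)}
\qquad
\underbrace{\frac{P(B_1 \mid E)}{P(B_2 \mid E)}}_{\text{posterior odds}}
  = \underbrace{\frac{P(B_1)}{P(B_2)}}_{\text{prior odds}}
    \times
    \underbrace{\frac{P(E \mid B_1)}{P(E \mid B_2)}}_{\text{likelihood ratio}}
\]
```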

History

Named after Thomas Bayes, a Presbyterian minister from the 1700s, Bayes' Theorem is a method for calculating the validity of a hypothesis based on the known evidence. His idea was that an initial belief plus new evidence equals a new and improved belief.

It’s quite simple, really.

Deep Analysis

The chronically late colleague is a simple but effective example for understanding Bayes' Theorem. Say a colleague has arrived on time only four times in the past seven days and has been late three times. How many more times does he have to be late before you assume he is always late? Real-life hypotheses are usually not that clear-cut; things aren't that simple.

Instead, you adjust your beliefs incrementally as new evidence is uncovered. That evidence can't be accepted at face value, though; keep it in context with what you already know.

In the late-colleague example, not updating incrementally would look something like:

“They’re late today, so I believe they’re always late”.

How often do you do this?

Simply put, evidence should not single-handedly determine your beliefs; it should update them.
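To make "updating incrementally" concrete, here is a small sketch of the late-colleague example using a Beta-Binomial model; the flat Beta(1, 1) starting prior and the particular sequence of days are illustrative assumptions rather than anything specified above.

```python
# Minimal sketch (illustrative assumptions): updating a belief about
# "how often is my colleague late?" one observation at a time.
# The lateness rate gets a flat Beta(1, 1) prior; each day's outcome
# nudges it, rather than a single late day deciding the question.

alpha, beta = 1.0, 1.0   # Beta prior: 1 pseudo "late" and 1 pseudo "on time" observation
week = [False, True, False, True, False, True, False]  # True = late; 3 late, 4 on time

for late in week:
    if late:
        alpha += 1       # one more observed late arrival
    else:
        beta += 1        # one more observed on-time arrival
    estimate = alpha / (alpha + beta)  # posterior mean of the lateness rate
    print(f"late today: {late!s:5}  estimated P(late) = {estimate:.2f}")

# After the week the estimate sits near 3/7, not 0 or 1: one late day
# shifts the belief a little instead of flipping it to "they're always late".
```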

A real-world application of Bayes' Theorem is perhaps best understood in the context of cancer screening.

In the real world, tests are rarely, if ever, totally reliable. So let's say your test is 99 percent reliable. That is, 99 out of 100 people who have cancer will test positive, and 99 out of 100 who are healthy will test negative. Suppose also that this cancer affects one percent of the population, so before testing, your probability of having it is .01. That's still a terrific test. If your test is positive, how probable is it that you have cancer?

Now Bayes' theorem displays its power. Most people assume the answer is 99 percent, or close to it. That's how reliable the test is, right? But the correct answer, yielded by Bayes' theorem, is only 50 percent.

Plug the data into the right side of Bayes' equation, P(B|E) = P(B) P(E|B) / P(E), to find out why. P(B), the prior probability that you have cancer, is .01. P(E|B), the probability of testing positive if you have cancer, is .99. So P(B) times P(E|B) equals .01 times .99, or .0099. This is the probability that you will get a true positive, a test that correctly shows you have cancer.

What about the denominator, P(E)? Here is where things get tricky. P(E) is the probability of testing positive whether or not you have cancer. In other words, it includes false positives as well as true positives.

To calculate the probability of a false positive, you multiply the test's false-positive rate, one percent or .01, by the proportion of people who don't have cancer, .99. The total comes to .0099. Yes, your terrific, 99-percent-accurate test yields as many false positives as true positives.

Let's finish the calculation. To get P(E), add the true and false positives for a total of .0198. Dividing .0099 by .0198 gives .5. So once again, P(B|E), the probability that you have cancer if you test positive, is 50 percent.

If you get tested again, you can reduce your uncertainty enormously, because your probability of having cancer, P(B), is now 50 percent rather than one percent. If your second test also comes up positive, Bayes' theorem tells you that your probability of having cancer is now 99 percent, or .99. As this example shows, iterating Bayes' theorem can yield extremely precise information.

But if the reliability of your test is 90 percent, which is still pretty good, your chances of actually having cancer even after testing positive twice are still less than 50 percent.

Most people, including physicians, have a hard time understanding these odds, which helps explain why we are overdiagnosed and overtreated for cancer and other disorders. This example suggests that the Bayesians are right: the world would indeed be a better place if more people, or at least more health-care consumers and providers, adopted Bayesian reasoning.

On the other hand, Bayes' theorem is just a codification of common sense. As Eliezer Yudkowsky writes toward the end of his tutorial on Bayes' theorem: “By this point, Bayes' theorem may seem blatantly obvious or even tautological, rather than exciting and new. If so, this introduction has entirely succeeded in its purpose.”
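The numbers in that walkthrough are easy to check with a few lines of Python. The sketch below simply reproduces the 50 percent, 99 percent, and under-50-percent figures, reading "reliability" as both the test's sensitivity and its specificity; the function name is mine.

```python
# Re-running the arithmetic from the walkthrough: a 1% prior prevalence,
# with the test's reliability applied as both sensitivity and specificity.

def update(prior, sensitivity, specificity):
    """Return P(cancer | positive test) for a given prior P(cancer)."""
    true_pos = sensitivity * prior
    false_pos = (1 - specificity) * (1 - prior)
    return true_pos / (true_pos + false_pos)

# 99%-reliable test, 1% prior: a single positive only gets you to ~0.50.
after_one = update(0.01, 0.99, 0.99)
# Feed that posterior back in as the new prior: a second positive gives ~0.99.
after_two = update(after_one, 0.99, 0.99)
print(round(after_one, 2), round(after_two, 2))   # 0.5 0.99

# 90%-reliable test: even two positives leave the probability below 50%.
after_two_90 = update(update(0.01, 0.9, 0.9), 0.9, 0.9)
print(round(after_two_90, 2))                     # 0.45
```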

Application in Business

In order to make informed business decisions, you must assess your current environment and build a plan based on your observations. Bayes' Theorem suggests that to build an effective plan, you need to understand your current situation as fully as possible before taking on new information. You may never capture everything about your current situation, but the more you know, the easier it is to absorb new information, and that translates into a greater capacity to adapt to rapidly changing business environments.

Application in Education

People go to school to expand their sphere of knowledge. In education, Bayes' Theorem can be applied to the journey a person takes over the course of their educational career.

What they know is built upon over years in class, and upon graduation, the lessons they have studied for so long become knowledge they can tap into at will.

If they choose to pursue further education or a profession, their ideas and beliefs will be further altered by the various experiences these paths engage them in.
