# Hacking The Monty Hall Problem

My nephew Caleb asked me if I could explain the Monty Hall problem.

It's a tricky problem that has stumped many very smart people, including myself.

But by the time you finish this article it will all make perfect sense.

## What is the Monty Hall Problem?

You may remember the old TV show *Let's Make a Deal*, which premiered on American television in 1963.

Monty Hall hosted the show and offered deals to audience members in the form of choices, dares, or other challenges. If the audience member succeeded, they won a nice prize like a new car. If not, they got the booby prize: a dirty shoe, an old goat, whatever.

The Monty Hall Problem became famous in 1990 when a reader asked the *Ask Marilyn* column in *Parade Magazine* the following question:

> Suppose you're on a game show, and you're given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what's behind the doors, opens another door, say No. 3, which has a goat. He then says to you, "Do you want to pick door No. 2?" Is it to your advantage to switch your choice?

Our intuition tells us that switching and picking door number two probably won't help. After all, you're still picking between two doors, so the chances seem like 50-50 whether you switch or not.

It turns out, however, that **if you switch, your chance of winning goes from one third (33.3%) to two thirds (66.7%).**

What?!

According to Wikipedia, when the *Ask Marilyn* column announced the answer as 66.7%, over one thousand PhDs wrote to tell her that she was wrong.

Clearly, the Monty Hall problem is a brain bender.

But fear not, you'll understand by the end of this article.

## Start with Probability

Sometimes the only way I can gain my bearings with a statistics problem is to go back to the most basic definition of probability:

Probability is a measure of the likelihood that one specific outcome will occur from the pool of all possible outcomes.

One of the simplest examples of probability is a coin toss.

What is the probability that a coin will come up heads?

Well, most of us know that it's 50%.

But why?

Look back at the definition of probability. In order to calculate a probability you must know the total number of possible outcomes.

How many possible outcomes do we have when flipping a coin?

Heads or tails, right? Two states.

And based on experience we think those two states are equally likely. 50-50 odds.

Now suppose someone glued a small ball bearing to the tail side of the coin. What would happen now?

The coin would tend to land with the heavy side down.

The pool of possible outcomes is still two, heads or tails. But thanks to the extra weight, the head side will be up more often than tails. The odds are no longer 50-50.
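You can see this effect in a quick simulation. This Python sketch is purely illustrative; the 0.8 weight for the doctored coin is an arbitrary assumption, not a measured value:

```python
import random

def flip(p_heads):
    """Flip a coin that lands heads with probability p_heads."""
    return "heads" if random.random() < p_heads else "tails"

random.seed(0)
trials = 100_000

# A fair coin: two outcomes, equally likely.
fair_heads = sum(flip(0.5) == "heads" for _ in range(trials)) / trials

# A weighted coin: still only two outcomes, but no longer 50-50.
weighted_heads = sum(flip(0.8) == "heads" for _ in range(trials)) / trials

print(f"fair coin heads:     {fair_heads:.3f}")      # close to 0.5
print(f"weighted coin heads: {weighted_heads:.3f}")  # close to 0.8
```

The pool of outcomes never changes; only the likelihood of each outcome does.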

**Armed with this basic definition of probability we'll approach the Monty Hall problem by counting the possible outcomes.**

We're also going to pay close attention to the assumptions that are made in the Monty Hall problem.

## You Know What They Say About Assumptions

When solving problems in math or science, we often make assumptions because they make the problem easier to solve.

When I said a coin flip has only two possible outcomes, you probably had visions in your mind of the following:

- The coin landing on its side. Neither tails nor heads.
- The coin falling through a crack in the floor.
- The coin landing in the ocean because you were on a boat and slipped.
- The coin being carried off by a previously unladen swallow who snatched it out of mid air.

All of these scenarios are *possible* outcomes in the real world, however unlikely.

But if we consider all of those scenarios, the problem becomes too hard, if not impossible, to solve.

We could assume, however, that in a controlled laboratory environment, using a thin coin that couldn't balance on its edge, there would be only two possible outcomes.

Assumptions allow us to begin with simpler problems and advance from there.

**When solving the Monty Hall problem we'll use assumptions to start simple and advance from there.**

## What's an Outcome in the Monty Hall Problem?

The Monty Hall Problem is more complex than a coin toss. But we know that the first question we have to answer is how many possible outcomes are there?

But before that, we need to know what an outcome is!

Your first guess might be that there are two possible outcomes. You win the car or you don't.

You could absolutely calculate the probability of winning the car, and we'll get to that below.

But we need to start with something simpler. We need to break down the problem and deal with each step by itself.

**This is a multi-step probability problem.**

## Problem #1: You Choose a Door

__PROBLEM__: You are asked to pick one of three doors. If you pick the right door you win a car.

This is almost as simple as flipping a coin, but you're the coin, a three-sided coin.

You have no information to help you decide which door to pick so it's equally likely that you'll pick any one of the three doors.

Three equally likely outcomes means that you have a 33% chance of picking the winning door. (From here on, we'll assume the car is behind door #1.)
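Counting the outcomes here takes only a few lines. A minimal Python sketch, assuming (as the rest of this article does) that the car is behind door #1:

```python
doors = [1, 2, 3]
car = 1  # assumption used throughout: the car is behind door #1

# Each door you might pick is one equally likely outcome.
winning_outcomes = sum(1 for pick in doors if pick == car)
total_outcomes = len(doors)

print(winning_outcomes, "of", total_outcomes, "outcomes win")  # 1 of 3 outcomes win
print(round(winning_outcomes / total_outcomes, 3))             # 0.333
```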

## Problem #2: You Choose a Door, Then Clueless Monty Chooses a Door

__PROBLEM__: You are asked to pick one of three doors. After you pick, Monty, who doesn't know where the car is, will pick one of the two remaining doors.

This is almost as simple as flipping two coins in a row. You are a three-sided coin and Monty is a two-sided coin.

We know there's a 33% chance you'll pick the door with the car behind it.

But what are the odds Monty will pick the door with the car behind it?

If you already picked the door with the car then he has no shot, but if you didn't then what are his odds?

The table below will help us. It shows all possible outcomes of you picking, followed by Monty picking. Remember, Monty can only choose between the two doors that you did *not* choose. A ✓ marks the outcomes where your door is the one with the car (door #1).

| Row | Your pick | Monty's pick | You have the car |
|-----|-----------|--------------|------------------|
| 1   | Door #1   | Door #2      | ✓                |
| 2   | Door #1   | Door #3      | ✓                |
| 3   | Door #2   | Door #1      |                  |
| 4   | Door #2   | Door #3      |                  |
| 5   | Door #3   | Door #1      |                  |
| 6   | Door #3   | Door #2      |                  |

In two of the six possible outcomes, you picked door #1 which means you have a 2 in 6 (33%) probability of winning the car behind door #1.

Remember, in this problem Monty is as clueless as you about the location of the car. Therefore his choices between the two remaining doors are random.
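This counting can be done mechanically, too. Here's a sketch that enumerates every (your pick, Monty's pick) pair, again with the car assumed to be behind door #1:

```python
from itertools import product

doors = [1, 2, 3]
car = 1  # assumption: the car is behind door #1

# Clueless Monty can pick either of the two doors you did not pick.
outcomes = [(you, monty)
            for you, monty in product(doors, doors)
            if monty != you]

print(len(outcomes))                            # 6 possible outcomes
print(sum(you == car for you, _ in outcomes))   # 2 outcomes where you hold the car
```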

## Problem #3: You Choose a Door, then Monty Chooses a Door, But Never the One With the Car

__PROBLEM:__ You are asked to pick one of three doors. After you pick, Monty, who *DOES* know where the car is, will pick one of the two remaining doors, but it won't be the one with the car.

So now Monty is no longer clueless about the location of the car. He knows where the car is and he specifically chooses to *NOT* choose the door with the car.

This is almost as simple as flipping a three-sided coin, followed by flipping a two-sided coin, followed by manually turning over all heads so they are now tails.

I'm confused...

The point is that what Monty is doing in *this* problem is not as simple as *just* flipping a coin. It's not random anymore.

There is an intervention because he applies his knowledge about where the car is located.

But, all is not lost.

We just need to count the possible outcomes.

**This is the critical piece of the entire problem and where most people fall off the rails. (I did myself.)**

So, before I show you the correct way and explain why, let me show you the wrong way.

In the following table (the wrong way), I have just removed all outcomes where Monty picked door #1 with the car behind it. After all, we just said that he's NOT allowed to pick door #1, so we should scratch those possible outcomes, right?

| Row   | Your pick   | Monty's pick | You have the car |
|-------|-------------|--------------|------------------|
| 1     | Door #1     | Door #2      | ✓                |
| 2     | Door #1     | Door #3      | ✓                |
| ~~3~~ | ~~Door #2~~ | ~~Door #1~~  |                  |
| 4     | Door #2     | Door #3      |                  |
| ~~5~~ | ~~Door #3~~ | ~~Door #1~~  |                  |
| 6     | Door #3     | Door #2      |                  |

If we do that, it would leave us with four possible outcomes because rows 3 and 5 would be removed.

One way we know this is not the correct solution is to look at the check marks in the remaining four rows (1, 2, 4, 6) that are not scratched out.

We see two outcomes where you selected door #1, then just one outcome each for when you selected door #2 and door #3.

This suggests that you selected door #1 two out of four times or 50% of the time.

But we know from Problem #1 that you will select each of the doors one third of the time.

We've broken our probabilities. Whatever we do with our list of outcomes it will always have to preserve that 33% number. There must always be one third of the check marks on each door.

OK. So, why is this wrong again?

You still have to choose your first door with a one third probability.

Think about how it would actually go in the real world if you ran the Monty Hall scenario a bunch of times. Here are two possible tries.

- You pick door #2. Monty picks door #1. Oops, that's the one with the car so he can't pick it. He picks door #3 instead.
- You pick door #2. Monty picks door #3. #3 is OK because it doesn't have the car.

Do you see?

If we removed row 3 from the table above, we are effectively erasing all the times he would have picked a door but had to switch because it had the car behind it. We're pretending it never happened and just throwing it out.

But it did happen. Both scenarios in the list will happen. It's just that, with our new rule that Monty can't pick the door with the car, he'll have to pick door #3 in both cases.

What this means is that instead of removing outcomes from the table, we should just alter them. If Monty would have picked the door with the car, then we need to move his choice to the other door without the car.

So, it really is like flipping a coin, seeing that it's heads and manually turning it over to be tails.

This doesn't really have anything to do with statistics; it's just a rule of the game. Monty will only choose a door without the car, so that's the outcome we have to record.

The following table, then, shows how our outcome list *should* look.

| Row | Your pick | Monty's pick | You have the car |
|-----|-----------|--------------|------------------|
| 1   | Door #1   | Door #2      | ✓                |
| 2   | Door #1   | Door #3      | ✓                |
| 3   | Door #2   | Door #3      |                  |
| 4   | Door #2   | Door #3      |                  |
| 5   | Door #3   | Door #2      |                  |
| 6   | Door #3   | Door #2      |                  |

Monty's choice in row #3 is moved from door #1 to door #3, and his choice in row #5 is moved from door #1 to door #2.

You see now that Monty's options have been limited by restricting him to NOT choosing the door with the car.

He now has a zero probability of picking the door with the car (because he planned it that way).

And you still have just a 33% chance of winning the car. In two of the six possible outcomes you have chosen door #1.

So far, nothing weird has happened. Our odds are still just one third.

But we have made sure that our number of outcomes is correct.
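The same adjustment is easy to express in code. This sketch starts from the six clueless-Monty outcomes and, whenever Monty's pick would have landed on the car, slides it to the remaining goat door (car behind door #1, as before):

```python
doors = [1, 2, 3]
car = 1  # assumption: the car is behind door #1

outcomes = []
for you in doors:
    for monty in doors:
        if monty == you:
            continue  # Monty can't open your door
        if monty == car:
            # Monty knows better: move his pick to the other unchosen door.
            monty = next(d for d in doors if d != you and d != car)
        outcomes.append((you, monty))

print(len(outcomes))                              # still 6 outcomes
print(sum(you == car for you, _ in outcomes))     # still 2 outcomes where you hold the car
print(any(monty == car for _, monty in outcomes)) # False: Monty never reveals the car
```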

Let's push ahead.

## Problem #4: You Choose a Door. Monty Chooses A Goat. You Decide Whether to Switch Doors.

__PROBLEM:__ You are asked to pick one of three doors. After you pick, Monty, who *DOES* know where the car is, will pick one of the two remaining doors, but it won't be the one with the car. Finally, you can choose to stay with your choice or switch to the other remaining door.

We've finally reached the scenario that matches the Monty Hall Problem.

We've progressed from a simple problem to successively more complex problems while counting the possible outcomes for each.

This final step where you get to switch to another door is pretty simple. It just doubles the number of possible outcomes from the previous problem because for every outcome where you were stuck with your original choice you now have an additional possible outcome where you switch.

In the table below, the rows come in pairs: one row in each pair is you staying with your original choice, and the other is you switching.

We now have twelve possible outcomes in total.

| Row | Your pick | Monty's pick | Stay or switch? | Your final door | You have the car |
|-----|-----------|--------------|-----------------|-----------------|------------------|
| 1   | Door #1   | Door #2      | Stay            | Door #1         | ✓                |
| 2   | Door #1   | Door #2      | Switch          | Door #3         |                  |
| 3   | Door #1   | Door #3      | Stay            | Door #1         | ✓                |
| 4   | Door #1   | Door #3      | Switch          | Door #2         |                  |
| 5   | Door #2   | Door #3      | Stay            | Door #2         |                  |
| 6   | Door #2   | Door #3      | Switch          | Door #1         | ✓                |
| 7   | Door #2   | Door #3      | Stay            | Door #2         |                  |
| 8   | Door #2   | Door #3      | Switch          | Door #1         | ✓                |
| 9   | Door #3   | Door #2      | Stay            | Door #3         |                  |
| 10  | Door #3   | Door #2      | Switch          | Door #1         | ✓                |
| 11  | Door #3   | Door #2      | Stay            | Door #3         |                  |
| 12  | Door #3   | Door #2      | Switch          | Door #1         | ✓                |

If you look at the last column, you can see two check marks on rows where you stayed and four on rows where you switched.

There are two outcomes where you win the car if you stick with your first choice. And there are four outcomes where you win if you switch.

Still, only six of the twelve outcomes have you winning. That's only a 50% chance of winning.

But didn't I say something about 67% at the top of this article?

Fear not, we're answering a different question. This table shows that you have a 50% chance overall of winning the car if you choose your first door randomly and then also randomly choose whether or not to switch doors at the end.

But the question asked by the Monty Hall problem is *should you switch* in the last step of the challenge?

In order to figure that out let's compare the situation when you never switch to the case where you always switch.

In Table 4.2 below you can see that when you *never* switch, your odds of picking the right door stay at 33% all the way through from the beginning to the end of the game.

**Table 4.2: You never switch**

| Row | Your pick | Monty's pick | Your final door | You have the car |
|-----|-----------|--------------|-----------------|------------------|
| 1   | Door #1   | Door #2      | Door #1         | ✓                |
| 2   | Door #1   | Door #3      | Door #1         | ✓                |
| 3   | Door #2   | Door #3      | Door #2         |                  |
| 4   | Door #2   | Door #3      | Door #2         |                  |
| 5   | Door #3   | Door #2      | Door #3         |                  |
| 6   | Door #3   | Door #2      | Door #3         |                  |

But looking at Table 4.3 below, where you *always* switch, you can see that your odds jump up to 67%. Four out of the six outcomes have you winning the car.

**Table 4.3: You always switch**

| Row | Your pick | Monty's pick | Your final door | You have the car |
|-----|-----------|--------------|-----------------|------------------|
| 1   | Door #1   | Door #2      | Door #3         |                  |
| 2   | Door #1   | Door #3      | Door #2         |                  |
| 3   | Door #2   | Door #3      | Door #1         | ✓                |
| 4   | Door #2   | Door #3      | Door #1         | ✓                |
| 5   | Door #3   | Door #2      | Door #1         | ✓                |
| 6   | Door #3   | Door #2      | Door #1         | ✓                |

**If you switch, you are twice as likely to win the car!**
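If the counting argument still feels slippery, you can let a computer play the game many thousands of times. A minimal Monte Carlo sketch (the trial count and seed are arbitrary choices):

```python
import random

def play(switch, rng):
    """Play one round of the Monty Hall game; return True if you win the car."""
    doors = [1, 2, 3]
    car = rng.choice(doors)
    you = rng.choice(doors)
    # Monty opens a door that is neither yours nor the one hiding the car.
    monty = rng.choice([d for d in doors if d != you and d != car])
    if switch:
        you = next(d for d in doors if d != you and d != monty)
    return you == car

rng = random.Random(42)
trials = 100_000
stay = sum(play(switch=False, rng=rng) for _ in range(trials)) / trials
switch = sum(play(switch=True, rng=rng) for _ in range(trials)) / trials

print(f"stay:   {stay:.3f}")    # roughly 0.333
print(f"switch: {switch:.3f}")  # roughly 0.667
```

The simulation agrees with the outcome counting: switching wins about twice as often as staying.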

## Summary

I hope this walk through the land of possibilities has been enlightening and helpful.

Every statistics problem you will ever encounter is about calculating the number of total possible outcomes and the likelihood of one or a few specific outcomes among all the possibilities.

This is a pretty easy problem when you consider that we're only dealing with at most twelve possible outcomes. Statistical calculations get *way* more complex than this.

But there's no doubt this is a tricky problem that has tripped up many very intelligent people.