# Project Euler Problem 436 – Unfair Wager

Spoiler alert! This blog entry gives away the solution to problem 436 of Project Euler. Please don’t read any further if you have yet to attempt to solve the problem on your own. The information is intended for those who failed to solve the problem and are looking for hints or test data to help them track down bugs. It is not posted before the problem already has a hundred solvers.

First, I provide references for the major theoretical background that you will probably need to solve the problem. The last section presents the solution from start to finish.

The post reflects my approach to the problem. The fact that the final result was accepted by Project Euler does not mean the information here is correct or elegant. The algorithms won’t necessarily be the most efficient ones, but they are guaranteed to run within the time limit of one minute.

## References

This is a pen and paper problem that requires very little knowledge of maths. You mainly need to calculate conditional probabilities and use the distribution of the sum of uniform variables.

## Notation

I’ll use the following notation:

• $u_i$: the $i$-th uniform number.
• $s_i$: the sum after $i$ draws.
• $n$: the number of draws to get the sum over 1.
• $m$: the number of draws to get the sum over 2.
• $p$: the first sum over 1, i.e. $p = s_n$.
• $x$: the number that the first player gets.
• $y$: the number that the second player gets.

The capital letters denote the corresponding random variables and $f_X(x)$ denotes the probability density function of $X$ at $x$.

## Solution

I use the following steps to solve the problem:

1. Sum after i picks
2. First Player – Distribution of x
3. Second Player – Distribution of y
4. Joint distribution of x and y
5. Probability of y bigger than x

### Sum after i picks

Let

$$s_i = u_1 + u_2 + \dots + u_i.$$

The probability density function (pdf) for the sum after one pick corresponds to the pdf of the continuous uniform distribution:

$$f_{S_1}(s) = \mathbf{1}_{[0,1]}(s),$$

where $\mathbf{1}_{A}$ denotes the indicator function.

For each subsequent pick, the sum $S_n$ is obtained by adding a uniform random variable to $S_{n-1}$. The uniform random variable is independent of the sum $S_{n-1}$, and therefore the pdf of $S_n$ is obtained by convolution of $f_{S_{n-1}}$ and $f_U$:

$$f_{S_n}(s) = \int_{-\infty}^{\infty} f_{S_{n-1}}(s-u)\,f_U(u)\,du = \int_0^1 f_{S_{n-1}}(s-u)\,du.$$

By induction we get, for $0 \leq s \leq 1$,

$$f_{S_n}(s) = \frac{s^{n-1}}{(n-1)!}. \tag{1}$$

(We will only need the distribution below one.)
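Equation 1 is easy to sanity-check by simulation. Integrating it gives $P(S_n \leq t) = t^n/n!$ for $t \leq 1$, which the following sketch (function name is my own) compares against a Monte Carlo estimate:

```python
import random
from math import factorial

def mc_cdf_below_one(n, t, trials=200_000, seed=1):
    """Empirical P(S_n <= t) for the sum of n uniform draws, t <= 1."""
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(trials)
        if sum(rng.random() for _ in range(n)) <= t
    )
    return hits / trials

# Integrating f_{S_n}(s) = s^(n-1)/(n-1)! gives P(S_n <= t) = t^n/n! for t <= 1.
for n, t in [(2, 0.5), (3, 1.0)]:
    exact = t**n / factorial(n)
    print(n, t, round(mc_cdf_below_one(n, t), 4), round(exact, 4))
```

With 200,000 trials the empirical CDF agrees with $t^n/n!$ to about three decimal places.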

### First Player – Distribution of x

Let $n$ denote the number of picks it takes to get the sum over 1. Consequently, $s_n$ is the first sum over 1, i.e., the sum where the second player takes over. The last summand of $s_n$ is the number $x^{(n)}$ that the first player gets if she draws $n$ numbers:

$$x^{(n)} = s_n - s_{n-1}.$$

Because

$$f_{X^{(n)},S_n}(x,s) = f_{S_{n-1},U_n}(s-x,\,x) = f_{S_{n-1}}(s-x)\,\mathbf{1}_{[0,1]}(x)$$

and because $s_{n-1} \leq 1$ and $s_n > 1$, we know that

$$s - x \leq 1 < s,$$

so the joint probability density of $x^{(n)}$ and $S_n$ is

$$f_{X^{(n)},S_n}(x,s) = \frac{(s-x)^{n-2}}{(n-2)!}\,\mathbf{1}_{\{s-x \leq 1 < s\}}\,\mathbf{1}_{[0,1]}(x). \tag{2}$$

The previous density applies to a given $n$. Let $x$ be the number of the first player, and let $p$ be the first sum over 1. To get the unconditional density $f_{P,X}$, we sum up $f_{X^{(n)},S_n}(x,s)$ (2) over all potential $n$ (a single pick can never exceed 1, so $n \geq 2$):

$$f_{P,X}(p,x) = \sum_{n=2}^{\infty} \frac{(p-x)^{n-2}}{(n-2)!} = e^{p-x} \quad \text{for } 0 \leq x \leq 1,\ 1 < p \leq 1+x.$$
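Integrating $e^{p-x}$ over $p \in (1, 1+x]$ gives the marginal $f_X(x) = e - e^{1-x}$, and hence the CDF $F_X(x) = e\,x + e^{1-x} - e$. This is easy to verify by simulating the first player directly (a sketch; names are my own):

```python
import random
from math import e, exp

def mc_first_player_x(trials=200_000, seed=2):
    """Simulate the first player: draw uniforms until the sum exceeds 1,
    and record the last draw x that pushed the sum over 1."""
    rng = random.Random(seed)
    xs = []
    for _ in range(trials):
        s = 0.0
        while s <= 1.0:
            u = rng.random()
            s += u
        xs.append(u)  # the loop always runs at least once, so u is defined
    return xs

# Compare the empirical CDF at x = 0.5 with F_X(0.5) = 0.5*e + e^{0.5} - e.
xs = mc_first_player_x()
emp = sum(1 for x in xs if x <= 0.5) / len(xs)
print(round(emp, 4), round(0.5 * e + exp(0.5) - e, 4))
```

Both values come out around 0.29, as expected.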

### Second Player – Distribution of y

The first player leaves the sum at $p$ and the second player needs to get the sum over 2. I prefer to shift the values to the origin: let the second player start at 0 and let $y$ be the first value over $2-p$. Also let $s_i'$ be the sum of the first $i$ numbers that the second player picked. In other words, if the first player picked $n$ numbers that add up to $s_n$, and the second player picks $m$ numbers that bring the total to $s_{n+m}$, then $s_m' = s_{n+m} - s_n$.

From Equation 1 we already know the distribution of $s_i'$ below $2-p$ (note that $2-p \leq 1$):

$$f_{S_i'}(s) = \frac{s^{i-1}}{(i-1)!} \quad \text{for } 0 \leq s \leq 2-p.$$

To get to the distribution of $y$ we have to distinguish two fundamentally different cases.

#### Case 1: y is less than or equal to 2-p

If $y$ is at most $2-p$, then the second player needed at least two picks to get the sum over $2-p$, and the penultimate sum is greater than $2-p-y$ (since adding $y$ brings the sum over $2-p$) and at most $2-p$ (because if the sum were bigger than that, it would already be the last sum).

Assume that the second player needs $m > 1$ picks and her last number is $y^{(m)}$. Therefore, we have

$$f_{Y^{(m)}|P}(y \mid p) = \int_{2-p-y}^{2-p} f_{S_{m-1}'}(s)\,ds = \int_{2-p-y}^{2-p} \frac{s^{m-2}}{(m-2)!}\,ds.$$

Hence, the pdf of $y$ is:

$$f_{Y|P}(y \mid p) = \sum_{m=2}^{\infty} \int_{2-p-y}^{2-p} \frac{s^{m-2}}{(m-2)!}\,ds = \int_{2-p-y}^{2-p} e^{s}\,ds = e^{2-p} - e^{2-p-y}.$$

#### Case 2: y is larger than 2-p

The second case is special, because this time the second player has a chance to get her number with the first pick. We can cover this case by defining

$$s_0' := 0,$$

which means the sum of zero uniform random values is always 0, and the corresponding distribution puts all of its mass at 0 and none anywhere else.

Secondly, the lower bound of the integration needs to be adjusted: since $2-p-y$ is always smaller than zero in this case, and the pdfs of $S_i'$ are zero for negative values, we start integrating at 0.

The rest of the calculation is the same, except that this time we include $m = 1$. Assume that the second player needs $m \geq 1$ picks and her last number is $y^{(m)}$. Therefore, we have

$$f_{Y^{(m)}|P}(y \mid p) = \int_{0}^{2-p} f_{S_{m-1}'}(s)\,ds,$$

where the $m=1$ term equals 1 because all the mass of $S_0'$ lies in the integration range. Hence, the pdf of $y$ is:

$$f_{Y|P}(y \mid p) = 1 + \sum_{m=2}^{\infty} \int_{0}^{2-p} \frac{s^{m-2}}{(m-2)!}\,ds = 1 + \int_{0}^{2-p} e^{s}\,ds = e^{2-p}.$$

So, all in all, the pdf of $y$ given $P$ is:

$$f_{Y|P}(y \mid p) = \begin{cases} e^{2-p} - e^{2-p-y} & \text{for } 0 \leq y \leq 2-p, \\ e^{2-p} & \text{for } 2-p < y \leq 1. \end{cases}$$
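The piecewise pdf can be checked for a fixed $p$: integrating the $y > 2-p$ branch gives $P(Y > 2-p \mid P=p) = (p-1)\,e^{2-p}$. A quick simulation sketch (names are my own):

```python
import random
from math import exp

def mc_second_player_y(p, trials=200_000, seed=3):
    """Simulate the second player for a fixed starting sum p: draw uniforms
    until the total exceeds 2, and record the last draw y."""
    rng = random.Random(seed)
    ys = []
    for _ in range(trials):
        s = p
        while s <= 2.0:
            u = rng.random()
            s += u
        ys.append(u)  # the loop always runs at least once for p < 2
    return ys

# For p = 1.5, the piecewise pdf predicts P(Y > 0.5 | P=1.5) = 0.5 * e^{0.5}.
p = 1.5
ys = mc_second_player_y(p)
emp = sum(1 for y in ys if y > 2 - p) / len(ys)
print(round(emp, 4), round((p - 1) * exp(2 - p), 4))
```

Both values come out around 0.82.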

### Joint distribution of x and y

The joint distribution of the picks of the first and second player is:

$$f_{X,Y}(x,y) = \int_{1}^{1+x} f_{P,X}(p,x)\,f_{Y|P}(y \mid p)\,dp = \int_{1}^{1+x} e^{p-x}\left(e^{2-p} - e^{2-p-y}\,\mathbf{1}_{\{y \leq 2-p\}}\right)dp.$$

Since $p \leq 1 + x$ we have

$$2 - p \geq 1 - x.$$

The indicator is therefore always one if $y < 1-x$, and

$$f_{X,Y}(x,y) = \int_{1}^{1+x} e^{p-x}\left(e^{2-p} - e^{2-p-y}\right)dp = x\,e^{2-x}\left(1 - e^{-y}\right).$$

If, on the other hand, $x + y \geq 1$, then the indicator is only one if $p < 2 - y$:

$$f_{X,Y}(x,y) = \int_{1}^{2-y} e^{p-x}\left(e^{2-p} - e^{2-p-y}\right)dp + \int_{2-y}^{1+x} e^{p-x}\,e^{2-p}\,dp.$$

And we have:

$$f_{X,Y}(x,y) = (1-y)\,e^{2-x}\left(1 - e^{-y}\right) + (x+y-1)\,e^{2-x} = e^{2-x}\left(x - (1-y)\,e^{-y}\right).$$
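As a consistency check, integrating the piecewise joint density over $y$ should recover the first player's marginal $f_X(x) = e - e^{1-x}$ (obtained by integrating $f_{P,X}(p,x) = e^{p-x}$ over $p \in (1, 1+x]$). A numeric sketch with a plain Simpson rule, splitting the inner integral at the kink $y = 1-x$ (function names are my own):

```python
from math import e, exp

def f_joint(x, y):
    """Joint density of the two players' last draws, piecewise in x + y."""
    if x + y < 1:
        return x * exp(2 - x) * (1 - exp(-y))
    return exp(2 - x) * (x - (1 - y) * exp(-y))

def simpson(f, a, b, n=200):
    """Composite Simpson rule; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b) + sum((4 if i % 2 else 2) * f(a + i * h) for i in range(1, n))
    return s * h / 3

# Integrate over y, splitting at y = 1 - x where the formula changes,
# and compare with the marginal f_X(x) = e - e^{1-x}.
for x in (0.2, 0.5, 0.8):
    total = (simpson(lambda y: f_joint(x, y), 0.0, 1.0 - x)
             + simpson(lambda y: f_joint(x, y), 1.0 - x, 1.0))
    print(round(total, 6), round(e - exp(1 - x), 6))
```

The two columns agree to the printed precision, which supports the piecewise formula.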

### Probability of y bigger than x

Player two wins if $y$ is greater than $x$:

$$P(Y > X) = \iint_{y > x} f_{X,Y}(x,y)\,dy\,dx.$$
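Before evaluating this integral, the whole derivation can be checked end to end by simply playing the game many times (a sketch; names are my own):

```python
import random

def mc_game(trials=300_000, seed=4):
    """Play the game directly: player 1 draws uniforms until the sum exceeds 1,
    player 2 continues until it exceeds 2; count how often y > x."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        s = 0.0
        while s <= 1.0:          # first player's turn
            x = rng.random()
            s += x
        while s <= 2.0:          # second player's turn
            y = rng.random()
            s += y
        if y > x:
            wins += 1
    return wins / trials

print(round(mc_game(), 3))
```

The estimate lands a bit above one half, which is why the wager is unfair.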

First, we tackle the case $x+y<1$. Since also $y>x$, we have $0 \leq x < 1/2$ and $x < y < 1-x$, and therefore:

$$\int_{0}^{1/2} \int_{x}^{1-x} x\,e^{2-x}\left(1 - e^{-y}\right)dy\,dx. \tag{3}$$

There are two possibilities for the case $x+y\geq 1$ :

1. $0 \leq x \leq 1/2$ and $1-x \leq y \leq 1$
2. $1/2 \leq x \leq 1$ and $x \leq y \leq 1$

Therefore,

$$\int_{0}^{1/2} \int_{1-x}^{1} e^{2-x}\left(x - (1-y)\,e^{-y}\right)dy\,dx + \int_{1/2}^{1} \int_{x}^{1} e^{2-x}\left(x - (1-y)\,e^{-y}\right)dy\,dx. \tag{4}$$

So finally, adding Equations 3 and 4, we have:

$$P(Y > X) = \int_{0}^{1/2}\!\int_{x}^{1-x} x\,e^{2-x}\left(1 - e^{-y}\right)dy\,dx + \int_{0}^{1/2}\!\int_{1-x}^{1} e^{2-x}\left(x - (1-y)\,e^{-y}\right)dy\,dx + \int_{1/2}^{1}\!\int_{x}^{1} e^{2-x}\left(x - (1-y)\,e^{-y}\right)dy\,dx \approx 0.5276662759.$$
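The integrals can be evaluated numerically with nested composite Simpson rules, splitting the inner integral at the kink $x+y=1$ and the outer one at $x = 1/2$ (a sketch; all function names are my own):

```python
from math import exp

def f_joint(x, y):
    """Joint density of the two players' last draws, piecewise in x + y."""
    if x + y < 1:
        return x * exp(2 - x) * (1 - exp(-y))
    return exp(2 - x) * (x - (1 - y) * exp(-y))

def simpson(f, a, b, n=400):
    """Composite Simpson rule; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b) + sum((4 if i % 2 else 2) * f(a + i * h) for i in range(1, n))
    return s * h / 3

def win_probability():
    """P(Y > X): integrate f_joint over the triangle y > x,
    splitting the inner integral at y = 1 - x and the outer one at x = 1/2."""
    def inner(x):
        if x < 0.5:
            return (simpson(lambda y: f_joint(x, y), x, 1.0 - x)
                    + simpson(lambda y: f_joint(x, y), 1.0 - x, 1.0))
        return simpson(lambda y: f_joint(x, y), x, 1.0)
    return simpson(inner, 0.0, 0.5) + simpson(inner, 0.5, 1.0)

print(round(win_probability(), 10))
```

The integrand is smooth on each sub-region, so the quadrature converges quickly and comfortably within the one-minute limit.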