# Is the JavaScript coin fair?

It is a simple question: is the JavaScript coin fair ^{1}? Questions
about coins appear in the first chapter of almost every statistics
book, so it should be relatively straightforward to answer. We will
use the Bayesian method. In the Bayesian framework the question has no
definite answer: we can never be certain that the coin is fair.
However, we can compute our belief that the coin is fair, given our
observations.

We wish to compute \(P(p|D,M)\), i.e. the posterior Probability Density Function (PDF) over the parameter \(p\) given the data \(D\) and some model \(M\). Let \(D = (r\text{ heads}, n - r\text{ tails})\); from Bayes' theorem:

\begin{equation} P(p|D,M) = \frac{P(D| p,M) P(p| M)}{P(D| M)} \end{equation}i.e. the posterior \(P(p|D,M)\) is the likelihood multiplied by the prior, divided by a normalization constant. Let's assume that all coin tosses are independent and that the probability of heads depends only on a single parameter \(p\). We can then model the likelihood as a binomial distribution. For \(r\) heads in \(n\) tosses the likelihood is:

\begin{equation} P(D| p,M) = {{n}\choose{r}} p^r (1 - p)^{n-r} \end{equation}with

\begin{equation} {{n}\choose{r}} = \frac{n!}{r!(n - r)!} \end{equation}and \(r \le n\).
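To make the likelihood concrete, here is a small JavaScript sketch (the function names are my own, not from the post) that evaluates it. Computing \(n!\) directly overflows double precision well before \(n = 1000\), so the binomial coefficient is evaluated via log-factorials:

```javascript
// Sum of logs: log(k!) = log(2) + log(3) + ... + log(k).
// Avoids overflow that a direct factorial would hit around n = 171.
function logFactorial(k) {
  var s = 0;
  for (var i = 2; i <= k; ++i) s += Math.log(i);
  return s;
}

// Binomial likelihood P(D|p,M) for r heads in n tosses.
function binomialLikelihood(r, n, p) {
  if (p <= 0) return r === 0 ? 1 : 0;   // degenerate edge cases
  if (p >= 1) return r === n ? 1 : 0;
  var logChoose = logFactorial(n) - logFactorial(r) - logFactorial(n - r);
  return Math.exp(logChoose + r * Math.log(p) + (n - r) * Math.log(1 - p));
}
```

As a sanity check, summing `binomialLikelihood(r, n, p)` over all \(r\) from 0 to \(n\) should give 1 for any fixed \(p\).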

For the prior we will assume a uniform distribution (or a Beta distribution with \(\alpha = 1\) and \(\beta = 1\), if you are so inclined), since we are uninformed about the fairness and we wish the data \(D\) to dominate the prior.

\begin{equation} P(p| M) = \left\{ \begin{array}{l l} 1 & \quad \text{if $0 \le p \le 1$}\\ 0 & \quad \text{otherwise} \end{array} \right. \end{equation}simplifying yields:

\begin{equation} P(p| D,M) = \frac{1}{Z}p^r(1-p)^{n-r} \end{equation}with \(Z\) being some normalizing constant.

Let's generate some testing data using Firefox 27 (`Mozilla/5.0 (Macintosh; Intel Mac OS X 10.9; rv:27.0) Gecko/20100101 Firefox/27.0`):

```javascript
var r = 0;
var n = 1E3;
for (var i = 0; i < n; ++i) {
  if (Math.random() < .5) ++r;
}
```

Here I picked \(n = 10^3\), but to do this properly you would have to do a statistical power analysis.
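As a rough stand-in for a real power analysis (a back-of-the-envelope normal approximation only; the helper name is mine, not from the post): near \(p = 0.5\) the standard error of the sample frequency \(r/n\) is \(\sqrt{p(1-p)/n} \approx 1/(2\sqrt{n})\), so to resolve a bias of size \(\delta\) you need on the order of \((1/(2\delta))^2\) tosses:

```javascript
// Order-of-magnitude sample size: the n at which the standard error
// of r/n (about 1/(2*sqrt(n)) near p = 0.5) equals the bias delta.
// A real power analysis would also fold in significance and power levels.
function roughSampleSize(delta) {
  return Math.ceil(Math.pow(1 / (2 * delta), 2));
}

// roughSampleSize(0.05) -> 100
// roughSampleSize(0.01) -> 2500
```

So \(n = 10^3\) is enough to notice a bias of a few percent, but not a subtle one.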

```r
r <- 491                 # from Javascript
n <- 1E3
Nsamp <- 201             # no. of points to sample at
p <- seq(from=0, to=1, length.out=Nsamp)
deltap <- 1/(Nsamp-1)    # step size between samples of p
pdense <- dbinom(x=r, size=n, prob=p)
pdense <- pdense/(deltap*sum(pdense))  # normalize posterior
plot(p, pdense, type="l", xlab="p", ylab="P(p|D,M)")
title(main=paste("r=",r), line=0.3, cex.main=1.0)
p.mean <- sum(p*pdense)/sum(pdense)
abline(v=p.mean, col="red")
```

Note that the posterior was normalized to integrate to 1 ^{2}.
Looks pretty fair to me: the point estimate sits solidly at 0.5 and the credible interval is narrow.
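To put a number on "narrow", here is a JavaScript sketch (variable names illustrative, mirroring the R analysis above) that evaluates the posterior on a grid and reads a central 95% credible interval off the discretized CDF. For \(n = 1000\) the unnormalized posterior \(p^{491}(1-p)^{509}\) underflows double precision, so the density is computed in log space and shifted by its maximum before exponentiating; the shift is absorbed into \(Z\):

```javascript
var r = 491, n = 1000, Nsamp = 2001;
var deltap = 1 / (Nsamp - 1);

// Unnormalized log-posterior on the grid, tracking its maximum.
var logd = [], maxLog = -Infinity;
for (var i = 0; i < Nsamp; ++i) {
  var p = i * deltap;
  var l = (p > 0 && p < 1)
    ? r * Math.log(p) + (n - r) * Math.log(1 - p)
    : -Infinity;
  logd.push(l);
  if (l > maxLog) maxLog = l;
}

// Exponentiate relative to the maximum to avoid underflow, then normalize.
var dens = logd.map(function (l) { return Math.exp(l - maxLog); });
var Z = dens.reduce(function (a, b) { return a + b; }, 0);

// Walk the discrete CDF: lo and hi approximate the 2.5% and 97.5% quantiles.
var cdf = 0, lo = 0, hi = 1;
for (var i = 0; i < Nsamp; ++i) {
  cdf += dens[i] / Z;
  if (cdf < 0.025) lo = i * deltap;
  if (cdf < 0.975) hi = i * deltap;
}
```

With these counts the interval comes out roughly \([0.46, 0.52]\): a few percent wide, and comfortably containing 0.5.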

After some digging it looks like Firefox implements a linear congruential generator (LCG) as its PRNG. Which is fair enough (pun intended), but not cryptographically secure, as others have noted:

> LCGs should not be used for applications where high-quality randomness is critical. For example, it is not suitable for a Monte Carlo simulation because of the serial correlation (among other things). They should also not be used for cryptographic applications; see cryptographically secure pseudo-random number generator for more suitable generators. If a linear congruential generator is seeded with a character and then iterated once, the result is a simple classical cipher called an affine cipher; this cipher is easily broken by standard frequency analysis.
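To see why the stream is so predictable, here is a minimal LCG sketch (the constants are the well-known Numerical Recipes parameters, purely illustrative; Firefox's actual parameters differ). Each state is a fixed linear function of the previous one, so the whole sequence is determined by the seed:

```javascript
// Minimal 32-bit LCG: state <- (a*state + c) mod m.
// a * state stays below 2^53, so the arithmetic is exact in doubles.
function makeLCG(seed) {
  var m = 4294967296;          // 2^32
  var a = 1664525;             // Numerical Recipes multiplier (illustrative)
  var c = 1013904223;          // Numerical Recipes increment (illustrative)
  var state = seed >>> 0;
  return function () {
    state = (a * state + c) % m;
    return state / m;          // uniform in [0, 1)
  };
}

var g1 = makeLCG(42);
var g2 = makeLCG(42);
// g1() === g2() on every call: same seed, same stream.
```

Knowing (or recovering) the seed reproduces every "random" value, which is exactly why such generators are unsuitable for cryptography.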

## Footnotes:

^{1}

Thanks to woudje for bringing it up!

^{2}

The code was directly taken from http://www.mpia-hd.mpg.de/~calj/astrostats2013/astrostats2013_bayes.pdf