The entropy of a random system measures the amount of randomness it contains. The most commonly used entropy function is Shannon entropy, which measures the average randomness of a system. A different entropy function is the min-entropy, which depends only on the probability of the most likely outcome and therefore governs the chance of correctly guessing the system's output. Min-entropy is the relevant quantity in secure random number generation: we must know how much randomness a source provides before we can use it safely.
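As a small illustration of the two quantities, the following sketch computes both entropies for a known distribution. Shannon entropy averages the surprise over all outcomes, while min-entropy is determined solely by the most probable one (the distribution used is an arbitrary example, not one from this project):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: the average surprise over all outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def min_entropy(probs):
    """Min-entropy in bits: depends only on the most probable outcome."""
    return -math.log2(max(probs))

# An example biased 4-outcome distribution.
probs = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(probs))  # 1.75 bits on average
print(min_entropy(probs))      # 1.0 bit: the best guess succeeds half the time
```

Note that the min-entropy (1 bit) is smaller than the Shannon entropy (1.75 bits); min-entropy is the more conservative measure, which is why it is preferred for security guarantees.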
The aim of this project is to study different ways of estimating the min-entropy of simple distributions. Given a finite sample from an unknown distribution, how do we determine its min-entropy? In particular, we will compare Bayesian and frequentist estimators of the min-entropy, examining the bias and variance of each.
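One way the comparison could be set up is sketched below, under stated assumptions: the frequentist estimator is taken to be the plug-in estimate from empirical frequencies, and the Bayesian estimator plugs in posterior-mean probabilities under a symmetric Dirichlet prior (one of several possible Bayesian point estimates; the project may use a different one). Bias and variance are then measured by Monte Carlo simulation:

```python
import math
import random

def plugin_min_entropy(counts):
    """Frequentist (plug-in) estimate: empirical frequencies used directly."""
    n = sum(counts)
    return -math.log2(max(counts) / n)

def bayesian_min_entropy(counts, alpha=1.0):
    """Illustrative Bayesian estimate: posterior-mean probabilities under a
    symmetric Dirichlet(alpha) prior, plugged into the min-entropy formula."""
    n, k = sum(counts), len(counts)
    return -math.log2((max(counts) + alpha) / (n + k * alpha))

def bias_variance(true_probs, sample_size, trials=2000, seed=0):
    """Monte Carlo bias and variance of both estimators on i.i.d. samples."""
    rng = random.Random(seed)
    truth = -math.log2(max(true_probs))
    outcomes = list(range(len(true_probs)))
    plug, bayes = [], []
    for _ in range(trials):
        sample = rng.choices(outcomes, weights=true_probs, k=sample_size)
        counts = [sample.count(i) for i in outcomes]
        plug.append(plugin_min_entropy(counts))
        bayes.append(bayesian_min_entropy(counts))
    def stats(xs):
        mean = sum(xs) / len(xs)
        var = sum((x - mean) ** 2 for x in xs) / len(xs)
        return mean - truth, var  # (bias, variance)
    return stats(plug), stats(bayes)

(plug_bias, plug_var), (bayes_bias, bayes_var) = bias_variance(
    [0.5, 0.3, 0.2], sample_size=50)
print(f"plug-in:  bias={plug_bias:+.4f}  var={plug_var:.4f}")
print(f"bayesian: bias={bayes_bias:+.4f}  var={bayes_var:.4f}")
```

The plug-in estimator tends to overstate the largest probability in small samples (and hence understate the min-entropy), while the Dirichlet prior shrinks the estimate toward uniform; quantifying this trade-off is exactly the kind of bias/variance comparison the project describes.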