# An efficient probability distribution data structure

A probability distribution is a list of nodes, each holding an object together with its positive weight. Whenever we ask the data structure for an object (sample the distribution), it returns an object with probability proportional to that object's weight. For example, if the list is $\langle (A, 1.0), (B, 1.0), (C, 3.0) \rangle$, each sampling returns $A$ or $B$ with a 20% chance each, and $C$ with a 60% chance.

One obvious implementation uses an array-based list (such as ArrayList in Java, or vector in C++). Both of these list data structures are implemented in such a way that appending a new object/weight pair runs in amortized constant time, yet sampling and removing an element run in $\Theta(n)$ worst-case time.

In this post I will explain how to implement a data structure that supports all three fundamental operations (insertion, sampling, removal) in logarithmic time in all cases. In addition, I will supply three other data structures for the sake of comparison.

#### The application programming interface

Clearly, in our probability distribution data structures, we need at least three methods:

• addElement,
• sampleElement,
• removeElement.

Now, let us proceed to defining an abstract base class:

net.coderodde.stat.AbstractProbabilityDistribution.java
package net.coderodde.stat;

import java.util.Objects;
import java.util.Random;

public abstract class AbstractProbabilityDistribution<E> {

protected double totalWeight;
protected final Random random;

protected AbstractProbabilityDistribution() {
this(new Random());
}

protected AbstractProbabilityDistribution(Random random) {
this.random =
Objects.requireNonNull(random,
"The random number generator is null.");
}

public abstract boolean isEmpty();
public abstract int size();
public abstract boolean addElement(E element, double weight);
public abstract E sampleElement();
public abstract boolean contains(E element);
public abstract boolean removeElement(E element);
public abstract void clear();

protected void checkWeight(double weight) {
if (Double.isNaN(weight)) {
throw new IllegalArgumentException("The element weight is NaN.");
}

if (weight <= 0.0) {
throw new IllegalArgumentException(
"The element weight must be positive. Received " + weight);
}

if (Double.isInfinite(weight)) {
// Once here, 'weight' is positive infinity.
throw new IllegalArgumentException(
"The element weight is infinite.");
}
}

protected void checkNotEmpty(int size) {
if (size == 0) {
throw new IllegalStateException(
"This probability distribution is empty.");
}
}
}


#### Trivial implementation: ArrayProbabilityDistribution.java

First, we start with a naïve implementation: the distribution is backed by a vector of entries, each holding an actual element and its (positive) weight. When adding a new entry, it is appended to the end of the underlying vector. When removing an entry, we remove it from the vector and shift the right part of the vector one position to the left. Clearly, addition runs in amortized constant time when using list data structures such as ArrayList or vector. As for removal, it unfortunately runs in $\Theta(n)$ worst- and average-case time. Also, in order to sample an element, we may need to visit as many entries as there are in the data structure, which yields linear worst- and average-case time for sampling as well.

All in all, the trivial implementation looks like this:

net.coderodde.stat.support.ArrayProbabilityDistribution.java
package net.coderodde.stat.support;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Random;
import net.coderodde.stat.AbstractProbabilityDistribution;

public class ArrayProbabilityDistribution<E>
extends AbstractProbabilityDistribution<E> {

private static final class Entry<E> {

private final E element;
private double weight;

Entry(E element, double weight) {
this.element = element;
this.weight = weight;
}

E getElement() {
return element;
}

double getWeight() {
return weight;
}

void setWeight(double weight) {
this.weight = weight;
}
}

private final List<Entry<E>> storage = new ArrayList<>();
private final Map<E, Entry<E>> map = new HashMap<>();

public ArrayProbabilityDistribution() {
this(new Random());
}

public ArrayProbabilityDistribution(Random random) {
super(random);
}

@Override
public boolean addElement(E element, double weight) {
checkWeight(weight);
Entry<E> entry = map.get(element);

if (entry != null) {
entry.setWeight(entry.getWeight() + weight);
} else {
entry = new Entry<>(element, weight);
map.put(element, entry);
// Without this, the entry would never be visible to sampleElement.
storage.add(entry);
}

totalWeight += weight;
return true;
}

@Override
public E sampleElement() {
checkNotEmpty();
double value = random.nextDouble() * totalWeight;
int distributionSize = storage.size();

for (int i = 0; i < distributionSize; ++i) {
Entry<E> entry = storage.get(i);
double currentWeight = entry.getWeight();

if (value < currentWeight) {
return entry.getElement();
}

value -= currentWeight;
}

throw new IllegalStateException("Should not get here.");
}

@Override
public boolean removeElement(E element) {
Entry<E> entry = map.remove(element);

if (entry == null) {
return false;
}

totalWeight -= entry.getWeight();
storage.remove(entry);
return true;
}

@Override
public void clear() {
map.clear();
storage.clear();
totalWeight = 0.0;
}

@Override
public boolean isEmpty() {
return storage.isEmpty();
}

@Override
public int size() {
return storage.size();
}

@Override
public boolean contains(E element) {
return map.containsKey(element);
}

protected void checkNotEmpty() {
checkNotEmpty(storage.size());
}

public static void main(String[] args) {
AbstractProbabilityDistribution<Integer> pd =
new ArrayProbabilityDistribution<>();

pd.addElement(0, 1.0);
pd.addElement(1, 1.0);
pd.addElement(2, 1.0);
pd.addElement(3, 3.0);

int[] counts = new int[4];

for (int i = 0; i < 1000; ++i) {
Integer myint = pd.sampleElement();
counts[myint]++;
}

System.out.println(Arrays.toString(counts));
}
}


#### Improving the trivial implementation: BinarySearchProbabilityDistribution.java

The next probability distribution extends the one above by storing an accumulated weight in each entry. Since the accumulated weights form a non-decreasing sequence as we march from the beginning of the storage list towards its end, and the storage list supports random access in $\Theta(1)$, we can apply binary search to the list in order to sample an element.

Basically, the binary search version has the same running times for the insertion and removal operations as ArrayProbabilityDistribution, yet it improves both the average- and worst-case running times of sampling to $\Theta(\log n)$. The code follows:

package net.coderodde.stat.support;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Random;
import net.coderodde.stat.AbstractProbabilityDistribution;

/**
* This class implements a probability distribution data structure that
* maintains an accumulated sum of weights and thus allows sampling the elements
* in worst-case logarithmic time.
*
* @author Rodion "rodde" Efremov
* @version 1.61 (Sep 30, 2016)
*/
public class BinarySearchProbabilityDistribution<E>
extends AbstractProbabilityDistribution<E> {

/**
* This class implements the actual entry in the distribution.
*
* @param <E> the actual element type.
*/
private static final class Entry<E> {

private final E element;

private double weight;

private double accumulatedWeight;

Entry(E element, double weight, double accumulatedWeight) {
this.element = element;
this.weight = weight;
this.accumulatedWeight = accumulatedWeight;
}

E getElement() {
return element;
}

double getWeight() {
return weight;
}

void setWeight(double weight) {
this.weight = weight;
}

double getAccumulatedWeight() {
return accumulatedWeight;
}

void addAccumulatedWeight(double delta) {
accumulatedWeight += delta;
}
}

/**
* This map maps each element stored in this probability distribution to its
* respective entry.
*/
private final Map<E, Entry<E>> map = new HashMap<>();

/**
* Holds the actual distribution entries.
*/
private final List<Entry<E>> storage = new ArrayList<>();

/**
* Constructs this probability distribution with default random number
* generator.
*/
public BinarySearchProbabilityDistribution() {
this(new Random());
}

/**
* Constructs this probability distribution with given random number
* generator.
*
* @param random the random number generator.
*/
public BinarySearchProbabilityDistribution(Random random) {
super(random);
}

/**
* {@inheritDoc }
*/
@Override
public boolean addElement(E element, double weight) {
checkWeight(weight);
Entry<E> entry = map.get(element);

if (entry == null) {
entry = new Entry<>(element, weight, totalWeight);
map.put(element, entry);
storage.add(entry);
} else {
entry.setWeight(entry.getWeight() + weight);

// Shift the accumulated weights of all the entries after 'entry'.
for (int i = storage.indexOf(entry) + 1; i < storage.size(); ++i) {
storage.get(i).addAccumulatedWeight(weight);
}
}

totalWeight += weight;
return true;
}

/**
* {@inheritDoc }
*/
@Override
public E sampleElement() {
checkNotEmpty();
double value = totalWeight * random.nextDouble();

int left = 0;
int right = storage.size() - 1;

while (left < right) {
int middle = left + ((right - left) >> 1);
Entry<E> middleEntry = storage.get(middle);
double lowerBound = middleEntry.getAccumulatedWeight();
double upperBound = lowerBound + middleEntry.getWeight();

if (lowerBound <= value && value < upperBound) {
return middleEntry.getElement();
}

if (value < lowerBound) {
right = middle - 1;
} else {
left = middle + 1;
}
}

return storage.get(left).getElement();
}

/**
* {@inheritDoc }
*/
@Override
public boolean contains(E element) {
return map.containsKey(element);
}

/**
* {@inheritDoc }
*/
@Override
public boolean removeElement(E element) {
Entry<E> entry = map.remove(element);

if (entry == null) {
return false;
}

int index = storage.indexOf(entry);
storage.remove(index);

// Shift the accumulated weights of the entries past the removed one.
for (int i = index; i < storage.size(); ++i) {
storage.get(i).addAccumulatedWeight(-entry.getWeight());
}

totalWeight -= entry.getWeight();
return true;
}

/**
* {@inheritDoc }
*/
@Override
public void clear() {
map.clear();
storage.clear();
totalWeight = 0.0;
}

/**
* {@inheritDoc }
*/
@Override
public boolean isEmpty() {
return storage.isEmpty();
}

/**
* {@inheritDoc }
*/
@Override
public int size() {
return storage.size();
}

private void checkNotEmpty() {
checkNotEmpty(storage.size());
}

public static void main(String[] args) {
binarySearchProbabilityDistributionDemo();
}

private static void binarySearchProbabilityDistributionDemo() {
BinarySearchProbabilityDistribution<Integer> pd =
new BinarySearchProbabilityDistribution<>();

pd.addElement(0, 1.0);
pd.addElement(1, 1.0);
pd.addElement(2, 1.0);
pd.addElement(3, 3.0);

int[] counts = new int[4];

for (int i = 0; i < 100; ++i) {
Integer myint = pd.sampleElement();
counts[myint]++;
System.out.println(myint);
}

System.out.println(Arrays.toString(counts));
}
}


#### Efficient removal of elements: LinkedListProbabilityDistribution.java

The third data structure improves the removal operation to constant time in all cases, yet runs the sampling operation in linear time in both the worst and average cases. The idea is to use a doubly-linked list for storing the entries. Additionally, we maintain a hash-table-based map from each element in the distribution to its node in the linked list. In order to add a new element, we create a new linked list node for it, append the node to the list, and put it into the map; both actions run in constant time, with the possible exception of the situation where the hash map has to be expanded, which is $\Theta(n)$; amortized over all insertions, however, adding to a hash map runs in constant time. As for the removal operation, we access the linked list node through the map and unlink it; both actions run in constant time as well. However, in order to sample an element, on average we need to march through half the list, and in the worst case through all of it, which is clearly linear time. The implementation follows:

package net.coderodde.stat.support;

import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;
import java.util.Random;
import net.coderodde.stat.AbstractProbabilityDistribution;

/**
* This class implements a probability distribution relying on a linked list.
* The running times of the main methods are as follows:
*
*
* <table>
* <tr>
* <td>Method</td>
* <td>Complexity</td>
* </tr>
* <tr>
* <td><tt>addElement</tt></td>
* <td>amortized constant time,</td>
* </tr>
* <tr>
* <td><tt>sampleElement</tt></td>
* <td><tt>O(n)</tt>,</td>
* </tr>
* <tr>
* <td><tt>removeElement</tt></td>
* <td><tt>O(1)</tt>.</td>
* </tr>
* </table>
*
* This probability distribution class is best used whenever it is modified
* frequently compared to the number of queries made.
*
* @param <E> the actual type of the elements stored in this probability
*            distribution.
*
* @author Rodion "rodde" Efremov
* @version 1.61 (Sep 30, 2016)
*/
public class LinkedListProbabilityDistribution<E>
        extends AbstractProbabilityDistribution<E> {

    private static final class LinkedListNode<E> {

        private final E element;
        private double weight;
        private LinkedListNode<E> prev;
        private LinkedListNode<E> next;

        LinkedListNode(E element, double weight) {
            this.element = element;
            this.weight  = weight;
        }

        E getElement() {
            return element;
        }

        double getWeight() {
            return weight;
        }

        void setWeight(double weight) {
            this.weight = weight;
        }

        LinkedListNode<E> getPrev() {
            return prev;
        }

        LinkedListNode<E> getNext() {
            return next;
        }

        void setPrev(LinkedListNode<E> node) {
            prev = node;
        }

        void setNext(LinkedListNode<E> node) {
            next = node;
        }
    }

    /**
     * This map maps the elements to their respective linked list nodes.
     */
    private final Map<E, LinkedListNode<E>> map = new HashMap<>();

    /**
     * Stores the very first linked list node in this probability distribution.
     */
    private LinkedListNode<E> linkedListHead;

    /**
     * Stores the very last linked list node in this probability distribution.
     */
    private LinkedListNode<E> linkedListTail;

    /**
     * Constructs a new probability distribution.
     */
    public LinkedListProbabilityDistribution() {
        super();
    }

    /**
     * Constructs a new probability distribution using the input random number
     * generator.
     *
     * @param random the random number generator to use.
     */
    public LinkedListProbabilityDistribution(Random random) {
        super(random);
    }

    /**
     * {@inheritDoc }
     */
    @Override
    public boolean addElement(E element, double weight) {
        checkWeight(weight);
        LinkedListNode<E> node = map.get(element);

        if (node == null) {
            node = new LinkedListNode<>(element, weight);

            if (linkedListHead == null) {
                linkedListHead = node;
            } else {
                linkedListTail.setNext(node);
                node.setPrev(linkedListTail);
            }

            linkedListTail = node;
            map.put(element, node);
        } else {
            node.setWeight(node.getWeight() + weight);
        }

        totalWeight += weight;
        return true;
    }

    /**
     * {@inheritDoc }
     */
    @Override
    public E sampleElement() {
        checkNotEmpty(map.size());
        double value = random.nextDouble() * totalWeight;

        for (LinkedListNode<E> node = linkedListHead;
                node != null;
                node = node.getNext()) {
            if (value < node.getWeight()) {
                return node.getElement();
            }

            value -= node.getWeight();
        }

        throw new IllegalStateException("Should not get here.");
    }

    /**
     * {@inheritDoc }
     */
    @Override
    public boolean contains(E element) {
        return map.containsKey(element);
    }

    /**
     * {@inheritDoc }
     */
    @Override
    public boolean removeElement(E element) {
        LinkedListNode<E> node = map.remove(element);

        if (node == null) {
            return false;
        }

        unlink(node);
        totalWeight -= node.getWeight();
        return true;
    }

    /**
     * {@inheritDoc }
     */
    @Override
    public void clear() {
        totalWeight = 0.0;
        map.clear();
        linkedListHead = null;
        linkedListTail = null;
    }

    /**
     * {@inheritDoc }
     */
    @Override
    public boolean isEmpty() {
        return map.isEmpty();
    }

    /**
     * {@inheritDoc }
     */
    @Override
    public int size() {
        return map.size();
    }

    // Unlinks the input node from the doubly-linked list in constant time.
    private void unlink(LinkedListNode<E> node) {
        LinkedListNode<E> left  = node.getPrev();
        LinkedListNode<E> right = node.getNext();

        if (left != null) {
            left.setNext(right);
        } else {
            linkedListHead = right;
        }

        if (right != null) {
            right.setPrev(left);
        } else {
            linkedListTail = left;
        }
    }

    public static void main(String[] args) {
        LinkedListProbabilityDistribution<Integer> pd =
                new LinkedListProbabilityDistribution<>();

        pd.addElement(0, 1.0);
        pd.addElement(1, 1.0);
        pd.addElement(2, 1.0);
        pd.addElement(3, 3.0);

        int[] counts = new int[4];

        for (int i = 0; i < 100; ++i) {
            Integer myint = pd.sampleElement();
            counts[myint]++;
        }

        System.out.println(Arrays.toString(counts));
    }
}


#### Efficient binary tree based implementation: BinaryTreeProbabilityDistribution.java

Finally, we can discuss the tree-based probability distribution. The main idea is to organize the element entries into a balanced binary tree. For that we define two different categories of nodes:

• leaf nodes,
• relay nodes.

Only leaf nodes store the actual elements. Each relay or leaf node $n$ is augmented with the following fields:

• element: stores the actual element in a leaf node; unused in the relay nodes,
• weight: if the node is a leaf node, stores the weight of element. Otherwise (if the node is a relay node), stores the sum of all weights in all the leaves of the subtree rooted at $n$,
• isRelayNode: set to true only in relay nodes,
• leftChild: the left subtree of $n$, or null if this node is a leaf node,
• rightChild: the right subtree of $n$, or null if this node is a leaf node,
• parent: the parent relay node of this node,
• numberOfLeafNodes: if $n$ is a leaf node, the field is set to one, otherwise it stores the number of leaf nodes in the subtree rooted at $n$.

Next, the invariant is that relay nodes have exactly two child nodes. Any of the two children of a relay node can be relay or leaf nodes.

###### Inserting an element

In order to insert an element into the data structure, we start from the root node. If the root is null (the distribution is empty), we create a leaf and set it as the root. Otherwise, if the root is a leaf $n$ (only one element in the structure), we create a new relay node $r$, attach $n$ and the new leaf as its children, and set $r$ as the root. In case the root node is a relay node, we consult its two children and descend to the one with the smaller numberOfLeafNodes value. That way we keep the entire tree balanced. Note that this is possible because we do not have to maintain any order in the tree: upon insertion, we simply put the new leaf node into the less populated subtree. See Figure 1 for a demonstration of inserting some elements into the tree.

Figure 1: Inserting elements

###### Removing an element

In order to delete an element from the binary tree probability distribution, we proceed as follows: let $n$ be the tree node containing the element requested for removal. If $n$ is the root, we just set the root to null. Otherwise, by the data structure invariant, the parent of $n$ is a relay node $r$. We extract the other child of $r$ (call it $n'$) and replace $r$ with $n'$. Note that even if we keep removing, say, the rightmost leaf node, the entire tree remains balanced (try reading Figure 1 "backwards" for an illustration).

###### Sampling an element

Finally, as for sampling, we draw a uniformly random value $c \in [0, \sum_i w_i)$ and descend the tree from the root until we reach the leaf node whose "bucket" contains $c$. Since the depth of the tree is logarithmic in its size, sampling runs in $\Theta(\log n)$ in all cases. The Java implementation follows:

package net.coderodde.stat.support;

import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;
import java.util.Random;
import net.coderodde.stat.AbstractProbabilityDistribution;

public class BinaryTreeProbabilityDistribution<E>
extends AbstractProbabilityDistribution<E> {

private static final class Node<E> {

private final E element;
private double weight;
private final boolean isRelayNode;
private Node<E> leftChild;
private Node<E> rightChild;
private Node<E> parent;
private int numberOfLeafNodes;

Node(E element, double weight) {
this.element           = element;
this.weight            = weight;
this.numberOfLeafNodes = 1;
this.isRelayNode       = false;
}

Node(double weight) {
this.element           = null;
this.weight            = weight;
this.numberOfLeafNodes = 1;
this.isRelayNode       = true;
}

@Override
public String toString() {
if (isRelayNode) {
return "[" + String.format("%.3f", getWeight()) + " : "
+ numberOfLeafNodes + "]";
}

return "(" + String.format("%.3f", getWeight()) + " : "
+ element + ")";
}

E getElement() {
return element;
}

double getWeight() {
return weight;
}

void setWeight(double weight) {
this.weight = weight;
}

int getNumberOfLeaves() {
return numberOfLeafNodes;
}

void setNumberOfLeaves(int numberOfLeaves) {
this.numberOfLeafNodes = numberOfLeaves;
}

Node<E> getLeftChild() {
return leftChild;
}

void setLeftChild(Node<E> block) {
this.leftChild = block;
}

Node<E> getRightChild() {
return rightChild;
}

void setRightChild(Node<E> block) {
this.rightChild = block;
}

Node<E> getParent() {
return parent;
}

void setParent(Node<E> block) {
this.parent = block;
}

boolean isRelayNode() {
return isRelayNode;
}
}

private final Map<E, Node<E>> map = new HashMap<>();
private Node<E> root;

public BinaryTreeProbabilityDistribution() {
this(new Random());
}

public BinaryTreeProbabilityDistribution(Random random) {
super(random);
}

@Override
public boolean addElement(E element, double weight) {
checkWeight(weight);
Node<E> node = map.get(element);

if (node == null) {
node = new Node<>(element, weight);
insert(node);
map.put(element, node);
} else {
// Update the weight of the leaf and of every relay node above it;
// otherwise sampling would descend using stale subtree weights.
updateMetadata(node, weight, 0);
}

totalWeight += weight;
return true;
}

@Override
public boolean contains(E element) {
return map.containsKey(element);
}

@Override
public E sampleElement() {
checkNotEmpty(map.size());
double value = totalWeight * random.nextDouble();
Node<E> node = root;

while (node.isRelayNode()) {
if (value < node.getLeftChild().getWeight()) {
node = node.getLeftChild();
} else {
value -= node.getLeftChild().getWeight();
node = node.getRightChild();
}
}

return node.getElement();
}

@Override
public boolean removeElement(E element) {
Node<E> node = map.remove(element);

if (node == null) {
return false;
}

delete(node);
totalWeight -= node.getWeight();
return true;
}

@Override
public void clear() {
root = null;
map.clear();
totalWeight = 0.0;
}

@Override
public boolean isEmpty() {
return map.isEmpty();
}

@Override
public int size() {
return map.size();
}

private void bypassLeafNode(Node<E> leafNodeToBypass,
Node<E> newNode) {
Node<E> relayNode = new Node<>(leafNodeToBypass.getWeight());
Node<E> parentOfCurrentNode = leafNodeToBypass.getParent();

relayNode.setLeftChild(leafNodeToBypass);
relayNode.setRightChild(newNode);

leafNodeToBypass.setParent(relayNode);
newNode.setParent(relayNode);

if (parentOfCurrentNode == null) {
root = relayNode;
} else if (parentOfCurrentNode.getLeftChild() == leafNodeToBypass) {
relayNode.setParent(parentOfCurrentNode);
parentOfCurrentNode.setLeftChild(relayNode);
} else {
relayNode.setParent(parentOfCurrentNode);
parentOfCurrentNode.setRightChild(relayNode);
}

updateMetadata(relayNode, newNode.getWeight(), 1);
}

private void insert(Node<E> node) {
if (root == null) {
root = node;
return;
}

Node<E> currentNode = root;

while (currentNode.isRelayNode()) {
if (currentNode.getLeftChild().getNumberOfLeaves() <
currentNode.getRightChild().getNumberOfLeaves()) {
currentNode = currentNode.getLeftChild();
} else {
currentNode = currentNode.getRightChild();
}
}

bypassLeafNode(currentNode, node);
}

private void delete(Node<E> leafToDelete) {
Node<E> relayNode = leafToDelete.getParent();

if (relayNode == null) {
root = null;
return;
}

Node<E> parentOfRelayNode = relayNode.getParent();
Node<E> siblingLeaf = relayNode.getLeftChild() == leafToDelete ?
relayNode.getRightChild() :
relayNode.getLeftChild();

if (parentOfRelayNode == null) {
root = siblingLeaf;
siblingLeaf.setParent(null);
return;
}

if (parentOfRelayNode.getLeftChild() == relayNode) {
parentOfRelayNode.setLeftChild(siblingLeaf);
} else {
parentOfRelayNode.setRightChild(siblingLeaf);
}

siblingLeaf.setParent(parentOfRelayNode);
updateMetadata(parentOfRelayNode, -leafToDelete.getWeight(), -1);
}

private void updateMetadata(Node<E> node,
double weightDelta,
int nodeDelta) {
while (node != null) {
node.setNumberOfLeaves(node.getNumberOfLeaves() + nodeDelta);
node.setWeight(node.getWeight() + weightDelta);
node = node.getParent();
}
}

public static void main(String[] args) {
AbstractProbabilityDistribution<Integer> pd =
new BinaryTreeProbabilityDistribution<>();

pd.addElement(0, 1.0);
pd.addElement(1, 1.0);
pd.addElement(2, 1.0);
pd.addElement(3, 3.0);

int[] counts = new int[4];

for (int i = 0; i < 1000; ++i) {
Integer myint = pd.sampleElement();
counts[myint]++;
}

System.out.println(Arrays.toString(counts));
}
}

ArrayProbabilityDistribution

| Operation | Worst case | Average case | Best case |
|---|---|---|---|
| addElement | $\Theta(n)$ | $\Theta(1)$ | $\Theta(1)$ |
| sampleElement | $\Theta(n)$ | $\Theta(n)$ | $\Theta(1)$ |
| removeElement | $\Theta(n)$ | $\Theta(n)$ | $\Theta(1)$ |

BinarySearchProbabilityDistribution

| Operation | Worst case | Average case | Best case |
|---|---|---|---|
| addElement | $\Theta(n)$ | $\Theta(1)$ | $\Theta(1)$ |
| sampleElement | $\Theta(\log n)$ | $\Theta(\log n)$ | $\Theta(1)$ |
| removeElement | $\Theta(n)$ | $\Theta(n)$ | $\Theta(1)$ |

LinkedListProbabilityDistribution

| Operation | Worst case | Average case | Best case |
|---|---|---|---|
| addElement | $\Theta(n)$ | $\Theta(1)$ | $\Theta(1)$ |
| sampleElement | $\Theta(n)$ | $\Theta(n)$ | $\Theta(1)$ |
| removeElement | $\Theta(1)$ | $\Theta(1)$ | $\Theta(1)$ |

BinaryTreeProbabilityDistribution

| Operation | Worst case | Average case | Best case |
|---|---|---|---|
| addElement | $\Theta(\log n)$ | $\Theta(\log n)$ | $\Theta(\log n)$ |
| sampleElement | $\Theta(\log n)$ | $\Theta(\log n)$ | $\Theta(\log n)$ |
| removeElement | $\Theta(\log n)$ | $\Theta(\log n)$ | $\Theta(\log n)$ |

# Happy physics time

In this post, I consider a charged bar of length $w$ carrying a finite charge, and a charged particle, and examine what happens to the electrostatic pull between the two as the bar grows infinitely long. The situation is no fuzzier than this:

Given a point $x$ on the $x$-axis (representing the charged bar), the distance between it and the charged particle $Q_2$ is given by

$\displaystyle d(x) = \sqrt{d^2 + x^2},$

where $d$ is the distance of the charged particle from the charged bar. By Coulomb’s law the scalar force between two charged objects with charges $Q_1$ and $Q_2$ is

$\displaystyle F = \frac{k_{\varepsilon}Q_1 Q_2}{r^2},$

where $k_{\varepsilon}$ is Coulomb’s constant, and $r$ is the distance between the two particles.

We assume that the charged bar has constant charge density: all subchunks of equal length have equal charge. Now it is easy to see that if a basic bar component of length $\Delta x$ has the charge $\frac{\Delta x}{w} Q_1$, where $w$ is the length of the bar, we can write

$\displaystyle \begin{aligned} F &= \sum \Delta F \\ &= \sum \frac{k_{\varepsilon} Q_1 \frac{\Delta x}{w} Q_2}{d^2(x)} \\ &= \int_{-w/2}^{w/2} \frac{k_{\varepsilon} Q_1 Q_2 \, \mathrm{d}x}{w d^2(x)} \\ &= \frac{k_{\varepsilon} Q_1 Q_2}{w} \int_{-w/2}^{w/2} \frac{\mathrm{d}x}{x^2 + d^2} \\ &= \frac{k_{\varepsilon} Q_1 Q_2}{w} \Bigg[ \frac{\arctan(x / d)}{d} \Bigg]_{x = -w/2}^{x = w/2} \\ &= \frac{k_{\varepsilon} Q_1 Q_2}{dw} \Bigg[ \arctan(x / d) \Bigg]_{x = -w/2}^{x = w/2} \\ &= \frac{2 k_{\varepsilon} Q_1 Q_2}{dw} \arctan\Bigg(\frac{w}{2d}\Bigg). \end{aligned}$

Since $d$, $Q_1$, $Q_2$ and $k_{\varepsilon}$ are considered constants, the above expression approaches zero as $w \rightarrow \infty$: the arctangent factor tends to $\pi/2$, while the $1/w$ factor tends to zero.
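As a quick sanity check (not part of the original derivation), the closed form can be compared against a direct numerical integration of Coulomb's law over the bar. The class and method names below are made up for the illustration, and $k_{\varepsilon}$ is set to 1 for simplicity:

```java
// Compares F = (2 k Q1 Q2 / (d w)) * arctan(w / (2 d)) (with k = 1)
// against a midpoint-rule approximation of the integral it came from.
public final class BarForceCheck {

    // Closed-form force derived above, with k = 1.
    static double closedForm(double q1, double q2, double d, double w) {
        return 2.0 * q1 * q2 / (d * w) * Math.atan(w / (2.0 * d));
    }

    // Midpoint-rule approximation of (q1 q2 / w) * Int_{-w/2}^{w/2} dx / (x^2 + d^2).
    static double numeric(double q1, double q2, double d, double w, int steps) {
        double h = w / steps;
        double sum = 0.0;

        for (int i = 0; i < steps; ++i) {
            double x = -w / 2.0 + (i + 0.5) * h;
            sum += h / (x * x + d * d);
        }

        return q1 * q2 * sum / w;
    }

    public static void main(String[] args) {
        // With q1 = q2 = d = 1 and w = 2, the closed form is arctan(1) = pi / 4.
        System.out.println(closedForm(1.0, 1.0, 1.0, 2.0));
        System.out.println(numeric(1.0, 1.0, 1.0, 2.0, 100_000));
    }
}
```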

# Funky triangle area

In this post, I will deal with an interesting high school math problem. Suppose we are given the following triangle:

Figure 1

We have two parameters:

• $n$, the number of circles touching a side of the triangle,
• $r$, the radius of each circle.

Figure 2

In order to find out the area of the triangle given the parameters $n$, $r$, we need to find the length of each side. Basically, we must calculate $d$ in Figure 2 as a function of $r$, which is fairly easy since we must have

$\displaystyle \begin{aligned} \tan 30^{\circ} &= \frac{r}{d}, \\ d &= \frac{r}{\tan 30^{\circ}} \\ &= \frac{r}{\frac{1}{\sqrt{3}}} \\ &= \sqrt{3}r. \end{aligned}$

Next, we can define the length of a side of the triangle as a function of $n$ and $r$:

$\displaystyle \begin{aligned} s(n, r) &= 2d + 2r(n - 1) \\ &= 2\sqrt{3}r + 2r(n - 1) \\ &= 2r(n + \sqrt{3} - 1). \end{aligned}$

If we set $p(n, r) = 3s(n, r)/2$ (i.e., $p(n, r)$ is half the perimeter of the triangle),
by Heron’s formula, we have that the area of the triangle is

$\displaystyle \begin{aligned} A(n, r) &= \sqrt{p(n, r)[p(n, r) - s(n, r)]^3} \\ &= \sqrt{\frac{3}{2}s(n, r)\Bigg[\frac{3}{2}s(n, r) - s(n, r)\Bigg]^3} \\ &= \sqrt{\frac{3}{2}s(n, r)\Bigg[\frac{1}{2}s(n, r)\Bigg]^3} \\ &= \sqrt{\frac{3}{2}s(n, r)\frac{1}{8}s^3(n, r)} \\ &= \sqrt{\frac{3}{16}s^4(n, r)} \\ &= \frac{\sqrt{3}}{4}s^2(n, r) \\ &= \frac{\sqrt{3}}{4} [2r(n + \sqrt{3} - 1)]^2 \\ &= \frac{\sqrt{3}}{4} 4r^2 (n + \sqrt{3} - 1)^2 \\ &= \sqrt{3}r^2 (n + \sqrt{3} - 1)^2. \end{aligned}$
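As a numeric cross-check (the class and method names are made up for the illustration), the closed form agrees with evaluating Heron's formula directly on the side length:

```java
// Cross-checks A(n, r) = sqrt(3) r^2 (n + sqrt(3) - 1)^2 against Heron's formula.
public final class TriangleAreaCheck {

    // Side length s(n, r) = 2 r (n + sqrt(3) - 1) derived above.
    static double side(int n, double r) {
        return 2.0 * r * (n + Math.sqrt(3.0) - 1.0);
    }

    // The simplified closed form of the area.
    static double closedForm(int n, double r) {
        double t = n + Math.sqrt(3.0) - 1.0;
        return Math.sqrt(3.0) * r * r * t * t;
    }

    // Heron's formula specialized to an equilateral triangle with side s.
    static double heron(double s) {
        double p = 3.0 * s / 2.0; // half the perimeter
        return Math.sqrt(p * Math.pow(p - s, 3.0));
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 3; ++n) {
            System.out.println(closedForm(n, 1.0) + " vs " + heron(side(n, 1.0)));
        }
    }
}
```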

# A question of mind

There are a couple of aspects of mind that have bothered me for a while. First of all, I suspect that mind (or soul, or psyche) is matter-agnostic, i.e., its “state” does not depend on the actual particles comprising the brain, but rather on the configuration of those particles. As a thought experiment, consider the following “inductive proof”: suppose you can substitute an atom in a person’s brain with another atom of the same substance without disturbing any other atoms in the brain. Will the mind deviate from its original course as a response to that atomic intervention? I suppose not, at least not in this idealized setting. Next, why not swap another atom, a third, a fourth, and so on? Eventually, we end up with the same psyche “running on other hardware.” If that works, we may conclude that, ultimately, psyche is just information. Next, can we “resurrect” a person by “recording” some of their memories while the person is alive and, after their physical death, bioprinting a new body and “writing” the memories back into its brain?

# Computing standard deviation in one pass

Given a sequence of real numbers $x_1, x_2, \dots, x_n$, the sample standard deviation of the sequence is

$\displaystyle \sigma = \sqrt{\frac{\sum_{i = 1}^n (x_i - \mu)^2}{n - 1}},$

where

$\displaystyle \mu = \frac{1}{n}\sum_{i = 1}^n x_i$

is the average of all the given values. At first sight, one might think that the average has to be computed first, after which a second pass computes that funky sum, divides it by $n - 1$, and takes the square root. This is, however, not necessary, and here is why:

$\displaystyle \begin{aligned} \sum_{i = 1}^n (x_i - \mu)^2 &= \sum_{i = 1}^n (x_i^2 - 2\mu x_i + \mu^2) \\ &= \sum_{i = 1}^n x_i^2 - 2\mu \sum_{i = 1}^n x_i + n \mu^2 \\ &= \sum_{i = 1}^n x_i^2 - 2 \Bigg( \frac{\sum_{i = 1}^n x_i}{n} \Bigg) \sum_{i = 1}^n x_i + n \Bigg( \frac{\sum_{i = 1}^n x_i}{n} \Bigg)^2 \\ &= \sum_{i = 1}^n x_i^2 - 2 \Bigg( \frac{\sum_{i = 1}^n x_i}{n} \Bigg) \sum_{i = 1}^n x_i + \frac{1}{n}\Bigg( \sum_{i = 1}^n x_i \Bigg)^2 \\ &= \sum_{i = 1}^n x_i^2 - \frac{2}{n} \Bigg( \sum_{i = 1}^n x_i \Bigg) \sum_{i = 1}^n x_i + \frac{1}{n}\Bigg( \sum_{i = 1}^n x_i \Bigg)^2 \\ &= \sum_{i = 1}^n x_i^2 - \frac{2}{n} \Bigg( \sum_{i = 1}^n x_i \Bigg)^2 + \frac{1}{n}\Bigg( \sum_{i = 1}^n x_i \Bigg)^2 \\ &= \sum_{i = 1}^n x_i^2 - \frac{1}{n} \Bigg( \sum_{i = 1}^n x_i \Bigg)^2. \end{aligned}$

So, whenever computing the standard deviation, just keep two sums: one of the values $x_i$ and another of the squares $x_i^2$. As soon as you have them, return

$\displaystyle \sqrt{\frac{1}{n - 1}\Bigg( \sum_{i = 1}^n x_i^2 - \frac{1}{n} \Big( \sum_{i = 1}^n x_i \Big)^2 \Bigg)}.$
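The recipe above can be sketched in a few lines of Java (the class name is made up for the illustration; note, as a caveat, that when the mean is large relative to the spread, the subtraction below can lose precision):

```java
// One-pass sample standard deviation using the identity derived above:
// sum of squared deviations = sum(x_i^2) - (sum(x_i))^2 / n.
public final class OnePassStd {

    static double std(double[] xs) {
        double sum = 0.0;          // accumulates each x_i
        double sumOfSquares = 0.0; // accumulates each x_i^2

        for (double x : xs) {
            sum += x;
            sumOfSquares += x * x;
        }

        int n = xs.length;
        return Math.sqrt((sumOfSquares - sum * sum / n) / (n - 1));
    }

    public static void main(String[] args) {
        double[] data = { 2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0 };
        System.out.println(std(data)); // sqrt(32 / 7), roughly 2.1381
    }
}
```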

# Adaptive tree sort for integers in Java

### Introduction

In this post, I will discuss a tree sort algorithm for sorting (primitive) integer arrays a little more efficiently than java.util.Arrays.sort. A general, stable tree sort scans the requested array range, putting each scanned element into a (hopefully) balanced binary search tree. Whenever we scan an element that is already in the tree, there are two cases to consider:

• if we are sorting an array of primitive component type (int, float, etc.) we could just hold a counter variable in each tree node, counting the number of occurrences of the tree node key so far in the range being scanned, or
• if we are sorting an array of non-primitive component type (Objects in general), I see no option other than converting the counter variable mentioned in the above list item into an ordered list.

As for sorting objects, after we have built the tree mapping each distinct object to the list of all objects considered “equal” to it, we traverse the tree keys in order, and at each key we dump its list into the input array range. The same goes for sorting primitive-type arrays: take the least key (say, $x$), copy it to the beginning of the input range $k$ times, where $k$ is the number of times $x$ occurred in the input range; then move to the second least key, dump it, and so on.

### The algorithm

Clearly, under assumption that the underlying tree is balanced, the worst case running time is $\Theta(n \log n)$. We can, however, improve this to $\Theta(n + k \log k)$, where $k$ is the number of distinct integers in the requested array range. In the sort, beyond the tree data structure mapping each distinct key to the number of its occurrences so far, we maintain a hash table mapping individual integers to their respective tree nodes; given an integer $x$, we first ask the hash table whether $x$ is already mapped to a tree node. If so, we just increment the counter of the tree node. Obviously, adding an integer we have already added takes constant time. If, however, $x$ is not in the tree (and the table), we add it to both of them in total time $\mathcal{O}(\log k)$.

The last important point: for sufficiently small $k$, even on inherently random data, this tree sort outperforms java.util.Arrays.sort by a factor of about four. However, measurements revealed that on fully random data with large $k$ the hidden constants are huge, and the sort becomes impractical. To work around this, our implementation switches to java.util.Arrays.sort as soon as the number of distinct integers encountered exceeds a particular threshold.

net.coderodde.util.sorting.IntTreeSort.java
package net.coderodde.util.sorting;

import java.util.Arrays;

/**
* This class implements a funky tree sort algorithm for sorting integers.
*
* @author Rodion "rodde" Efremov
* @version 1.6 (Feb 21, 2016)
*/
public class IntTreeSort {

public static void sort(int[] array) {
sort(array, 0, array.length);
}

public static void sort(int[] array, int fromIndex, int toIndex) {
if (toIndex - fromIndex < 2) {
return;
}

new IntTreeSort(array, fromIndex, toIndex).sort();
}

private final int[] array;
private final int fromIndex;
private final int toIndex;
private final HashTableEntry[] table;
private TreeNode root;

private IntTreeSort(int[] array, int fromIndex, int toIndex) {
this.array     = array;
this.fromIndex = fromIndex;
this.toIndex   = toIndex;

int capacity   = computeCapacity(toIndex - fromIndex);

this.table     = new HashTableEntry[capacity];
}

private static int computeCapacity(int length) {
int ret = 1;

while (ret < length) {
ret <<= 1;
}

return ret;
}

private static final class TreeNode {
int key;
int count;
int height;
TreeNode left;
TreeNode right;
TreeNode parent;

TreeNode(int key) {
this.key = key;
this.count = 1;
}
}

private static final class HashTableEntry {
int key;
TreeNode treeNode;
HashTableEntry nextEntry;

HashTableEntry(int key, TreeNode treeNode, HashTableEntry nextEntry) {
this.key = key;
this.treeNode = treeNode;
this.nextEntry = nextEntry;
}
}

private static int height(TreeNode node) {
return node == null ? -1 : node.height;
}

private int index(int element) {
// The table capacity is a power of two, so masking with
// (capacity - 1) yields a valid, non-negative bucket index.
return element & (table.length - 1);
}

private TreeNode findTreeNode(int element, int elementHash) {
HashTableEntry entry = table[elementHash];

while (entry != null && entry.treeNode.key != element) {
entry = entry.nextEntry;
}

return entry == null ? null : entry.treeNode;
}

private void sort() {
int maximumAllowedNodes = (toIndex - fromIndex) / 240;
int treeNodeCount = 0;
int initialKey = array[fromIndex];
root = new TreeNode(initialKey);
table[index(initialKey)] = new HashTableEntry(initialKey,
root,
null);

for (int i = fromIndex + 1; i < toIndex; ++i) {
int currentElement = array[i];
int currentElementHash = index(currentElement);

TreeNode treeNode = findTreeNode(currentElement,
currentElementHash);

if (treeNode != null) {
treeNode.count++;
} else {
++treeNodeCount;

if (treeNodeCount > maximumAllowedNodes) {
Arrays.sort(array, fromIndex, toIndex);
return;
}

TreeNode newnode = addTreeNode(currentElement);
HashTableEntry newentry =
new HashTableEntry(currentElement,
newnode,
table[currentElementHash]);
table[currentElementHash] = newentry;
}
}

TreeNode node = minimum(root);
int index = fromIndex;

while (node != null) {
int key = node.key;
int count = node.count;

for (int i = 0; i < count; ++i) {
array[index++] = key;
}

node = successor(node);
}
}

private TreeNode minimum(TreeNode node) {
while (node.left != null) {
node = node.left;
}

return node;
}

private TreeNode successor(TreeNode node) {
if (node.right != null) {
return minimum(node.right);
}

TreeNode parent = node.parent;

while (parent != null && parent.right == node) {
node = parent;
parent = parent.parent;
}

return parent;
}

private TreeNode addTreeNode(int key) {
TreeNode parent = null;
TreeNode node = root;

while (node != null) {
if (key < node.key) {
parent = node;
node = node.left;
} else if (key > node.key) {
parent = node;
node = node.right;
} else {
break;
}
}

TreeNode newnode = new TreeNode(key);

if (key < parent.key) {
parent.left = newnode;
} else {
parent.right = newnode;
}

newnode.parent = parent;
fixAfterInsertion(parent);
return newnode;
}

private void fixAfterInsertion(TreeNode node) {
TreeNode parent = node.parent;
TreeNode grandParent;
TreeNode subTree;

while (parent != null) {
if (height(parent.left) == height(parent.right) + 2) {
grandParent = parent.parent;

if (height(parent.left.left) >= height(parent.left.right)) {
subTree = rightRotate(parent);
} else {
subTree = leftRightRotate(parent);
}

if (grandParent == null) {
root = subTree;
} else if (grandParent.left == parent) {
grandParent.left = subTree;
} else {
grandParent.right = subTree;
}

if (grandParent != null) {
grandParent.height = Math.max(
height(grandParent.left),
height(grandParent.right)) + 1;
}

return;
} else if (height(parent.right) == height(parent.left) + 2) {
grandParent = parent.parent;

if (height(parent.right.right) >= height(parent.right.left)) {
subTree = leftRotate(parent);
} else {
subTree = rightLeftRotate(parent);
}

if (grandParent == null) {
root = subTree;
} else if (grandParent.left == parent) {
grandParent.left = subTree;
} else {
grandParent.right = subTree;
}

if (grandParent != null) {
grandParent.height =
Math.max(height(grandParent.left),
height(grandParent.right)) + 1;
}

return;
}

parent.height = Math.max(height(parent.left),
height(parent.right)) + 1;
parent = parent.parent;
}
}

private TreeNode leftRotate(TreeNode node1) {
TreeNode node2 = node1.right;
node2.parent = node1.parent;
node1.parent = node2;
node1.right = node2.left;
node2.left = node1;

if (node1.right != null) {
node1.right.parent = node1;
}

node1.height = Math.max(height(node1.left), height(node1.right)) + 1;
node2.height = Math.max(height(node2.left), height(node2.right)) + 1;
return node2;
}

private TreeNode rightRotate(TreeNode node1) {
TreeNode node2 = node1.left;
node2.parent = node1.parent;
node1.parent = node2;
node1.left = node2.right;
node2.right = node1;

if (node1.left != null) {
node1.left.parent = node1;
}

node1.height = Math.max(height(node1.left), height(node1.right)) + 1;
node2.height = Math.max(height(node2.left), height(node2.right)) + 1;
return node2;
}

private TreeNode rightLeftRotate(TreeNode node1) {
TreeNode node2 = node1.right;
node1.right = rightRotate(node2);
return leftRotate(node1);
}

private TreeNode leftRightRotate(TreeNode node1) {
TreeNode node2 = node1.left;
node1.left = leftRotate(node2);
return rightRotate(node1);
}
}

Demo.java
import java.util.Arrays;
import java.util.Random;
import net.coderodde.util.sorting.IntTreeSort;

public class Demo {

private static final int CONSOLE_WIDTH = 80;
private static final int DISTINCT_INTS = 1000;
private static final int LENGTH = 10_000_000;

public static void main(String[] args) {
System.out.println(title("Small number of distinct integers"));
int[] array = new int[LENGTH];
long seed = System.nanoTime();
Random random = new Random(seed);

for (int i = 0; i < array.length; ++i) {
array[i] = 3 * random.nextInt(DISTINCT_INTS);
}

System.out.println("Seed = " + seed);

profile(array);
}

private static void profile(int[] array) {
int[] array2 = array.clone();

long startTime = System.nanoTime();
Arrays.sort(array);
long endTime = System.nanoTime();

System.out.printf("Arrays.sort in %.2f milliseconds.\n",
(endTime - startTime) / 1E6);

startTime = System.nanoTime();
IntTreeSort.sort(array2);
endTime = System.nanoTime();

System.out.printf("IntTreeSort.sort in %.2f milliseconds.\n",
(endTime - startTime) / 1E6);

System.out.println("Equals: " + Arrays.equals(array, array2));
}

public static String title(String text) {
return title(text, '=');
}

private static String title(String text, char c) {
StringBuilder sb = new StringBuilder();

int left = (CONSOLE_WIDTH - 2 - text.length()) / 2;
int right = CONSOLE_WIDTH - 2 - text.length() - left;

for (int i = 0; i < left; ++i) {
sb.append(c);
}

sb.append(' ').append(text).append(' ');

for (int i = 0; i < right; ++i) {
sb.append(c);
}

return sb.toString();
}
}


# Discrete event simulation of a prioritized lunch queue in Java

### Introduction

In this post, we will devise a discrete event simulation of a prioritized lunch queue in Java. We begin with the problem setting: we are considering a lunch queue on a university campus. There are four categories of customers (listed in decreasing priority):

1. doctors (PhD),
2. masters (MSc),
3. bachelors (BSc),
4. undergraduates.

Now, whenever, say, a master arrives at the cafeteria, he skips all bachelors and undergraduates and joins the tail of the subqueue of masters. As usual, we want to find out the following statistics within each category:

• the minimum waiting time,
• the average waiting time,
• the maximum waiting time,
• the standard deviation of the waiting times.

The waiting times will include the actual service times. Also, we will consider idle time statistics of the cafeteria cashier.

### Internals of a discrete event simulation program

In the simulation, we first create the population, which is a simple collection of persons and their arrival times to the cafeteria. We assume that the time points at which people go for a lunch exhibit a normal distribution. We sort the collection of persons by their arrival times. In case two or more persons arrive at precisely the same moment, we break the ties by their academic degree.
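With JDK 8 comparator chaining, this two-level ordering (arrival time first, academic priority as the tie-break) can be written directly. The Arrival class below is a hypothetical stand-in for a person paired with an arrival time, using an integer priority rank (0 = doctor, 1 = master, and so on):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public final class ArrivalOrderingSketch {

    // Hypothetical stand-in for (Person, arrivalTime).
    static final class Arrival {
        final int arrivalTime;
        final int priorityRank;

        Arrival(int arrivalTime, int priorityRank) {
            this.arrivalTime = arrivalTime;
            this.priorityRank = priorityRank;
        }
    }

    // Primary key: arrival time; tie-break: academic priority.
    static final Comparator<Arrival> ORDER =
            Comparator.<Arrival>comparingInt(a -> a.arrivalTime)
                      .thenComparingInt(a -> a.priorityRank);

    public static void main(String[] args) {
        List<Arrival> list = new ArrayList<>();
        list.add(new Arrival(100, 2)); // a bachelor at t = 100
        list.add(new Arrival(100, 0)); // a doctor at t = 100
        list.add(new Arrival(50, 1));  // a master at t = 50

        list.sort(ORDER);
        // The earliest person (the master) comes first; the two
        // simultaneous arrivals are ordered doctor before bachelor.
        System.out.println(list.get(0).arrivalTime);  // 50
        System.out.println(list.get(1).priorityRank); // 0
    }
}
```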

Since we are simulating a prioritized lunch queue, we need a simple priority queue data structure: it maintains a FIFO (first in, first out) queue for each privilege level (doctors, masters, and so on). Pushing a person appends him or her to the queue of his or her peers. The pop operation scans the queues from most privileged to least privileged and pops from the first non-empty queue.

The last important component of the simulation is the actual simulation procedure. In pseudocode, it looks something like this:

let P be the empty prioritized queue
let Q be the queue of persons sorted by arrival times
persons_pending = Q.size

while persons_pending > 0:
# Load the persons that arrived during the service
# of the previously served person
while Q is not empty and head(Q).arrival_time <= current_clock:
enqueue(P, dequeue(Q))

if P is empty:
# The cashier is idle; jump forward in time
# to the next arrival
current_clock = head(Q).arrival_time
enqueue(P, dequeue(Q))

# Pop the most prioritized + earliest person
current_event = pop(P)
service_time = get_service_time()
current_clock += service_time
persons_pending -= 1
# Served!


At this point you might be asking why we need the internal queue (P in the above algorithm). Why do we not just sort the entire population first by privilege level, and each subportion by arrival time? The answer should be obvious after considering an example: suppose the entire population consists of only two bachelors and a master; the bachelors arrive earlier, and the master arrives while the second bachelor is being served. If we had just sorted the population, the master would be served before he even arrives, effectively “travelling back in time.” This issue requires us to simulate the actual lunch queue, i.e., the actual set of persons waiting in the queue, not the entire population.
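The example can be checked mechanically. With a fixed 10-second service time, two bachelors arriving at t = 0 and t = 5, and a master arriving at t = 12, the event-driven loop serves the persons in arrival order, whereas a priority-first sort would put the master first, before he has even arrived. The sketch below (all names are mine) encodes each person as an (arrivalTime, priorityRank) pair:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.PriorityQueue;
import java.util.Queue;

public final class WhyInternalQueue {

    static final int SERVICE_TIME = 10;

    // Each person is {arrivalTime, priorityRank}; input is sorted
    // by arrival time. Returns the arrival times in serve order.
    public static List<Integer> serveOrder(int[][] arrivals) {
        Queue<int[]> input = new ArrayDeque<>();
        for (int[] a : arrivals) input.add(a);

        // Persons actually waiting: best priority first, then FIFO.
        PriorityQueue<int[]> waiting = new PriorityQueue<>(
                (a, b) -> a[1] != b[1] ? a[1] - b[1] : a[0] - b[0]);

        List<Integer> order = new ArrayList<>();
        int clock = input.peek()[0];

        while (!input.isEmpty() || !waiting.isEmpty()) {
            // Load everyone who arrived by the current moment.
            while (!input.isEmpty() && input.peek()[0] <= clock) {
                waiting.add(input.remove());
            }
            if (waiting.isEmpty()) {
                clock = input.peek()[0]; // the cashier idles
                continue;
            }
            order.add(waiting.remove()[0]);
            clock += SERVICE_TIME;
        }
        return order;
    }
}
```

For the example above, serveOrder returns the arrival times in order [0, 5, 12]: the master (rank 0, arrived at 12) is served last, because he was not yet in the queue when the bachelors were admitted.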

Now we are ready to start coding. It makes sense to start from the enumeration of academic degrees:

net.coderodde.simulation.lunch.AcademicDegree.java
package net.coderodde.simulation.lunch;

/**
* This class implements an enumeration over academic degrees.
*
* @author Rodion "rodde" Efremov
* @version 1.6 (Dec 2, 2015)
 */
public enum AcademicDegree {

// The order denotes priority from highest to lowest.
DOCTOR       ("PhD"),
MASTER       ("MSc"),
BACHELOR     ("BSc"),
UNDERGRADUATE("undergrad");

private final String description;

@Override
public String toString() {
return description;
}

private AcademicDegree(String description) {
this.description = description;
}
}


In Java (since JDK 5), every enumeration is implicitly derived from the abstract class java.lang.Enum<E extends Enum<E>>. When you declare the constants of an enumeration, behind the scenes you create instances of that very enumeration type, so that there is exactly one instance per constant. The runtime also associates an ordinal value with each constant: the very first declared (topmost) constant receives the value zero, the second one receives one, and so on. Furthermore, enumerations implement java.lang.Comparable, and comparing the constants relies on their respective ordinal values. By declaring the constants in the way we did, we imply the order of the academic degrees, in which doctors precede masters, masters precede bachelors, and so on.
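A minimal demonstration of these ordinal and Comparable rules (Rank is a stand-in for our AcademicDegree):

```java
// The first declared constant gets ordinal 0, so compareTo makes it
// the "smallest" constant, i.e., the highest priority.
enum Rank { DOCTOR, MASTER, BACHELOR }

public final class EnumOrderDemo {
    public static void main(String[] args) {
        System.out.println(Rank.DOCTOR.ordinal());                    // 0
        System.out.println(Rank.MASTER.ordinal());                    // 1
        // compareTo relies on the ordinals.
        System.out.println(Rank.DOCTOR.compareTo(Rank.BACHELOR) < 0); // true
    }
}
```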

Next we need a class that represents a person:

net.coderodde.simulation.lunch.Person.java
package net.coderodde.simulation.lunch;

import java.util.Objects;

/**
* This class implements a record for a person.
*
* @author Rodion "rodde" Efremov
* @version 1.6 (Dec 2, 2015)
*/
public final class Person implements Comparable<Person> {

private final String firstName;
private final String lastName;
private final AcademicDegree degree;
private final String stringRepresentation;
private final String identity;

public static LastNameSelector withFirstName(String firstName) {
Objects.requireNonNull(firstName,
"The first name of a person is null.");
Configuration configuration = new Configuration();
configuration.firstName = firstName;
return new LastNameSelector(configuration);
}

public static final class LastNameSelector {

private final Configuration configuration;

private LastNameSelector(Configuration configuration) {
this.configuration = configuration;
}

public AcademicDegreeSelector withLastName(String lastName) {
Objects.requireNonNull(lastName,
"The last name of a person is null.");
configuration.lastName = lastName;
return new AcademicDegreeSelector(configuration);
}
}

public static final class AcademicDegreeSelector {

private final Configuration configuration;

private AcademicDegreeSelector(Configuration configuration) {
this.configuration = configuration;
}

public Person withAcademicDegree(AcademicDegree degree) {
Objects.requireNonNull(degree, "The academic degree is null.");
return new Person(configuration.firstName,
configuration.lastName,
degree);
}
}

private Person(String firstName,
String lastName,
AcademicDegree degree) {
this.firstName      = firstName;
this.lastName       = lastName;
this.degree         = degree;
this.stringRepresentation = "[" + firstName + " " + lastName + ", " +
degree + "]";
this.identity = firstName + " " + lastName;
}

public String getFirstName() {
return firstName;
}

public String getLastName() {
return lastName;
}

public AcademicDegree getAcademicDegree() {
return degree;
}

@Override
public String toString() {
return stringRepresentation;
}

@Override
public int compareTo(Person o) {
return degree.compareTo(o.degree);
}

@Override
public int hashCode() {
return identity.hashCode();
}

@Override
public boolean equals(Object obj) {
if (obj == null) {
return false;
}

if (getClass() != obj.getClass()) {
return false;
}

Person other = (Person) obj;
return Objects.equals(identity, other.identity);
}

private static final class Configuration {
private String firstName;
private String lastName;
}
}


The identity of each person consists of two attributes: the first name and the last name. Once again, we make Person Comparable so that persons are ordered by their respective academic degrees. Also, we opt for a strong fluent API for constructing instances of Person:

Person person = Person.withFirstName("Jack")
.withLastName("Cooley")
.withAcademicDegree(AcademicDegree.MASTER);


In a weak fluent API, the order in which the user fills out the attributes (first name, last name, academic degree) is not fixed, which makes it possible for the user to forget some attributes. In a strong fluent API, the order of filling in the attributes is fixed, which ensures that the client programmer cannot omit any required attribute. Regardless of whether we use a strong or a weak fluent API, sticking to one makes the code more self-explanatory.
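The staged pattern behind a strong fluent API fits in a few lines: each stage exposes exactly one method, so the attributes can only be supplied in one order, and none can be omitted. Point is a hypothetical product type with two required attributes:

```java
public final class Point {

    private final double x;
    private final double y;

    private Point(double x, double y) {
        this.x = x;
        this.y = y;
    }

    // Stage 1: the only way to start building a Point.
    public static YSelector withX(double x) {
        return new YSelector(x);
    }

    // Stage 2: the only method available after withX, so the client
    // cannot skip the y attribute or reorder the calls.
    public static final class YSelector {
        private final double x;

        private YSelector(double x) {
            this.x = x;
        }

        public Point withY(double y) {
            return new Point(x, y);
        }
    }

    public double getX() { return x; }
    public double getY() { return y; }
}
```

Usage: `Point p = Point.withX(1.0).withY(2.0);` — omitting `withY` simply does not compile, which is the whole point of the strong variant.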

Since we will put Persons into a couple of hash tables, we must implement Person.equals(Object) and Person.hashCode() in a way that takes the identity (first and last names) into account.
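The contract can be demonstrated in isolation: two distinct instances that agree on the identity fields must locate the same hash table entry. Key below is a hypothetical stand-in for Person:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

public final class IdentityDemo {

    static final class Key {
        final String first;
        final String last;

        Key(String first, String last) {
            this.first = first;
            this.last = last;
        }

        // equals and hashCode agree on the same identity fields,
        // so equal keys land in the same bucket and match there.
        @Override
        public boolean equals(Object o) {
            if (!(o instanceof Key)) {
                return false;
            }
            Key other = (Key) o;
            return first.equals(other.first) && last.equals(other.last);
        }

        @Override
        public int hashCode() {
            return Objects.hash(first, last);
        }
    }

    public static void main(String[] args) {
        Map<Key, Integer> arrivalTimes = new HashMap<>();
        arrivalTimes.put(new Key("Jack", "Cooley"), 100);
        // A freshly created but equal key finds the same entry.
        System.out.println(arrivalTimes.get(new Key("Jack", "Cooley"))); // 100
    }
}
```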

Next, we proceed to the class that represents a cashier. We do not attach any identity information to a Cashier. All a Cashier knows is its mean service time and the standard deviation of the service time:

net.coderodde.simulation.lunch.Cashier.java
package net.coderodde.simulation.lunch;

import java.util.Objects;
import java.util.Random;
import static net.coderodde.simulation.lunch.Utils.checkMean;
import static net.coderodde.simulation.lunch.Utils.checkStandardDeviation;

/**
* This class models the action of a cashier.
*
* @author Rodion "rodde" Efremov
* @version 1.6 (Dec 4, 2015)
*/
public final class Cashier {

private final double meanServiceTime;
private final double standardDeviationOfServiceTime;
private final Random random;

/**
* Initiates a strong fluent API for creating a {@code Cashier}.
*
* @param  random the random number generator to use.
* @return the mean service time selector.
*/
public static MeanServiceTimeSelector withRandom(Random random) {
Objects.requireNonNull(random, "The input Random is null.");
Configuration configuration = new Configuration();
configuration.random = random;
return new MeanServiceTimeSelector(configuration);
}

/**
* Initiates a strong fluent API for creating a {@code Cashier} using a
* default random number generator.
*
* @return the mean service time selector.
*/
public static MeanServiceTimeSelector withDefaultRandom() {
return withRandom(new Random());
}

public final static class MeanServiceTimeSelector {

private final Configuration configuration;

private MeanServiceTimeSelector(Configuration configuration) {
this.configuration = configuration;
}

/**
* Selects the mean service time and returns a standard deviation
* selector.
*
* @param  meanServiceTime the mean service time in seconds.
* @return a standard deviation selector.
*/
public StandardDeviationSelector
withMeanServiceTime(double meanServiceTime) {
checkMean(meanServiceTime);
configuration.meanServiceTime = meanServiceTime;
return new StandardDeviationSelector(configuration);
}
}

public final static class StandardDeviationSelector {

private final Configuration configuration;

private StandardDeviationSelector(Configuration configuration) {
this.configuration = configuration;
}

/**
* Selects a standard deviation for the service time and returns the
* {@code Cashier} using the gathered parameters.
*
* @param  standardDeviationOfServiceTime the standard deviation of the
*                                        service time in seconds.
* @return a {@code Cashier} object.
*/
public Cashier withStandardDeviationOfServiceTime(
double standardDeviationOfServiceTime) {
checkStandardDeviation(standardDeviationOfServiceTime);
return new Cashier(configuration.meanServiceTime,
standardDeviationOfServiceTime,
configuration.random);
}
}

private Cashier(double meanServiceTime,
double standardDeviationOfServiceTime,
Random random) {
this.meanServiceTime = meanServiceTime;
this.standardDeviationOfServiceTime = standardDeviationOfServiceTime;
this.random = random;
}

public int getServiceTime() {
return (int)(Math.round(meanServiceTime +
standardDeviationOfServiceTime *
random.nextGaussian()));
}

private static final class Configuration {
private Random random;
private double meanServiceTime;
}
}


Once again, as Cashier is exposed to a potential client programmer, we opt for a strong fluent API for creating instances of Cashier:

Cashier cashier = Cashier.withRandom(random)
.withMeanServiceTime(15.0)
.withStandardDeviationOfServiceTime(2.0);
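getServiceTime above simply rescales a standard Gaussian sample to the requested mean and standard deviation. The transform can be verified empirically; the 0.1 tolerances below are loose statistical bounds (the standard error at this sample size is under 0.01), not exact values:

```java
import java.util.Random;

public final class GaussianScalingDemo {

    public static void main(String[] args) {
        Random random = new Random(1L);
        double mean = 15.0;
        double sd = 2.0;
        int n = 100_000;
        double sum = 0.0;
        double sumOfSquares = 0.0;

        for (int i = 0; i < n; ++i) {
            // The same transform as in Cashier.getServiceTime().
            double sample = mean + sd * random.nextGaussian();
            sum += sample;
            sumOfSquares += sample * sample;
        }

        double empiricalMean = sum / n;
        double empiricalSd =
                Math.sqrt(sumOfSquares / n - empiricalMean * empiricalMean);

        System.out.println(Math.abs(empiricalMean - mean) < 0.1); // true
        System.out.println(Math.abs(empiricalSd - sd) < 0.1);     // true
    }
}
```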


Next, we need a class for representing a population:

net.coderodde.simulation.lunch.Population.java
package net.coderodde.simulation.lunch;

import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.Queue;
import java.util.Set;
import static net.coderodde.simulation.lunch.Utils.checkTime;

/**
* This class represents simulated population.
*
* @author Rodion "rodde" Efremov
* @version 1.6 (Dec 3, 2015)
*/
public final class Population {

private final Map<Person, Integer> arrivalTimeMap = new HashMap<>();

public final class ArrivalTimeSelector {
private final Person person;

ArrivalTimeSelector(Person person) {
this.person = Objects.requireNonNull(person,
"The input person is null.");
}

public boolean withArrivalTime(int arrivalTime) {
checkTime(arrivalTime);

if (arrivalTimeMap.containsKey(person)) {
return false;
}

arrivalTimeMap.put(person, arrivalTime);
return true;
}
}

public ArrivalTimeSelector addPerson(Person person) {
return new ArrivalTimeSelector(person);
}

public int size() {
return arrivalTimeMap.size();
}

Set<Person> getPersonSet() {
return Collections.<Person>unmodifiableSet(arrivalTimeMap.keySet());
}

Queue<LunchQueueEvent> toEventQueue() {
List<LunchQueueEvent> eventList = new ArrayList<>(size());

getPersonSet().stream().forEach((person) -> {
eventList.add(new LunchQueueEvent(person,
arrivalTimeMap.get(person)));
});

Collections.sort(eventList,
(event1, event2) -> {
// Try to compare by the time stamps of the events.
int cmp = event1.compareTo(event2);

if (cmp != 0) {
return cmp;
}

// The two input events have same time stamp, break ties by person
// priority.
return event1.getPerson().compareTo(event2.getPerson());
});

return new ArrayDeque<>(eventList);
}
}


Basically, all Population does is keep track of the persons and their respective arrival times. Adding a person to the population looks like this:

Person person = Person.withFirstName("Jack")
.withLastName("Cooley")
.withAcademicDegree(AcademicDegree.MASTER);
Population population = new Population();
population.addPerson(person).withArrivalTime(100);


Since it would be really laborious to construct a large population by hand, we need a class for generating random populations:

net.coderodde.simulation.lunch.RandomPopulationGenerator.java
package net.coderodde.simulation.lunch;

import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.Random;
import static net.coderodde.simulation.lunch.Utils.checkMean;
import static net.coderodde.simulation.lunch.Utils.checkStandardDeviation;

/**
* This class facilitates random generation of population.
*
* @author Rodion "rodde" Efremov
* @version 1.6 (Dec 2, 2015)
*/
public final class RandomPopulationGenerator {

private final Random random;
private final Map<AcademicDegree, Integer> distribution;
private final double meanLunchTime;
private final double standardDeviationOfLunchTime;

/**
* Initiates the strong fluent API for constructing a
* {@code RandomPopulationGenerator}.
*
* @param  random the random number generator to use.
* @return a degree selector.
*/
public static DegreeCountSelector withRandom(Random random) {
Objects.requireNonNull(random, "The input Random is null.");
Configuration configuration = new Configuration();
configuration.random = random;
return new DegreeCountSelector(configuration);
}

/**
* Initiates the strong fluent API for constructing a
* {@code RandomPopulationGenerator} using a default {@code Random}.
*
* @return a degree selector.
*/
public static DegreeCountSelector withDefaultRandom() {
return withRandom(new Random());
}

public static final class DegreeCountSelector {

private final Configuration configuration;

DegreeCountSelector(Configuration configuration) {
this.configuration = configuration;
}

/**
* Starts constructing a population group with a selected academic degree.
*
* @param  count the number of persons for a degree group.
* @return a degree selector for the group being constructed.
*/
public DegreeSelector with(int count) {
if (count < 0) {
throw new IllegalArgumentException(
"The people count is negative: " + count);
}

return new DegreeSelector(configuration, count);
}

/**
* Terminates creation of groups and selects a mean time at which people
* go for a lunch. (Lunch time does not mean the duration of a lunch.)
*
* @param  meanLunchTime the mean of lunch times
* @return a standard deviation selector.
*/
public StandardDeviationSelector
withMeanLunchTime(double meanLunchTime) {
checkMean(meanLunchTime);
configuration.meanLunchTime = meanLunchTime;
return new StandardDeviationSelector(configuration);
}
}

public static final class DegreeSelector {

private final Configuration configuration;
private final int count;

DegreeSelector(Configuration configuration, int count) {
this.configuration = configuration;
this.count = count;
}

public DegreeCountSelector withDegree(AcademicDegree degree) {
Objects.requireNonNull(degree, "The input degree is null.");
configuration.distribution.put(degree, count);
return new DegreeCountSelector(configuration);
}
}

public static final class StandardDeviationSelector {

private final Configuration configuration;

StandardDeviationSelector(Configuration configuration) {
this.configuration = configuration;
}

/**
* Selects the standard deviation and generates a population with
* specified parameters.
*
* @param  lunchTimeStandardDeviation the standard deviation of the
*                                    times at which people go to lunch.
* @return a population.
*/
public Population withLunchTimeStandardDeviation(
double lunchTimeStandardDeviation) {
checkStandardDeviation(lunchTimeStandardDeviation);
return new RandomPopulationGenerator(
configuration.random,
configuration.distribution,
configuration.meanLunchTime,
lunchTimeStandardDeviation).generate();
}
}

private RandomPopulationGenerator(Random random,
Map<AcademicDegree, Integer> distribution,
double meanLunchTime,
double standardDeviationOfLunchTime) {
this.random       = random;
this.distribution = distribution;
this.meanLunchTime = meanLunchTime;
this.standardDeviationOfLunchTime = standardDeviationOfLunchTime;
}

public Population generate() {
int populationSize = 0;

for (Map.Entry<AcademicDegree, Integer> entry : distribution.entrySet()) {
populationSize += entry.getValue();
}

List<Person> allPersonList =
new ArrayList<>(FIRST_NAMES.length * LAST_NAMES.length);

List<AcademicDegree> degreeList = new ArrayList<>(populationSize);

for (AcademicDegree degree : AcademicDegree.values()) {
int count = distribution.getOrDefault(degree, 0);

for (int i = 0; i < count; ++i) {
degreeList.add(degree);
}
}

int i = 0;

outer:
for (String firstName : FIRST_NAMES) {
for (String lastName : LAST_NAMES) {
if (i == degreeList.size()) {
break outer;
}

allPersonList.add(Person.withFirstName(firstName)
.withLastName(lastName)
.withAcademicDegree(degreeList.get(i)));
++i;
}
}

Collections.shuffle(allPersonList, random);
populationSize = Math.min(populationSize, allPersonList.size());

Population population = new Population();

for (i = 0; i < populationSize; ++i) {
population.addPerson(allPersonList.get(i))
.withArrivalTime(getRandomLunchTime());
}

return population;
}

private int getRandomLunchTime() {
return (int)(meanLunchTime + standardDeviationOfLunchTime *
random.nextGaussian());
}

private static final class Configuration {
private final Map<AcademicDegree, Integer> distribution =
new HashMap<>();

private Random random;
private double meanLunchTime;
}

private static final String[] FIRST_NAMES = {
"Alice",
"Al",
"Alma",
"Alvin",
"Amanda",
"Bob",
"Brandon",
"Brooke",
"Bruce",
"Camilla",
"Cecilia",
"Carl",
"David",
"Elsa",
"Ida",
"Jack",
"John",
"Nathan",
"Nick",
"Phoebe",
"Rachel",
"Richard",
"Rodion",
"Roger",
"Roland",
"Rolf",
"Roy",
"Terence",
"Terry",
"Viola"
};

private static final String[] LAST_NAMES = {
"Abbey",
"Ackerman",
"Bonham",
"Cantrell",
"Carter",
"Dawkins",
"Dawson",
"Edison",
"Efremov",
"Fay",
"Fleming",
"Garrett",
"Hallman",
"Irvine",
"Jacobson",
"Kidd",
"Lacey",
"Marlow",
"Nelson",
"Oliver",
"Parks",
"Pearson",
"Peterson",
"Quincey",
"Ridley",
"Saunders",
"Thompson",
"Walton",
"Wilkerson"
};
}


A sample call to the generator looks like this:

Population population =
RandomPopulationGenerator
.withRandom(random)
.with(10).withDegree(AcademicDegree.DOCTOR)
.with(20).withDegree(AcademicDegree.MASTER)
.with(30).withDegree(AcademicDegree.BACHELOR)
.withMeanLunchTime(10800.0)
.withLunchTimeStandardDeviation(1200.0);


Next, we need a class for representing events:

net.coderodde.simulation.lunch.LunchQueueEvent.java
package net.coderodde.simulation.lunch;

/**
* This class describes a lunch queue event.
*
* @author Rodion "rodde" Efremov
* @version 1.6 (Dec 2, 2015).
*/
final class LunchQueueEvent implements Comparable<LunchQueueEvent> {

private final Person person;
private final int timeStamp;

LunchQueueEvent(Person person, int timeStamp) {
this.person = person;
this.timeStamp = timeStamp;
}

Person getPerson() {
return person;
}

int getTimestamp() {
return timeStamp;
}

@Override
public int compareTo(LunchQueueEvent anotherEvent) {
return Integer.compare(timeStamp, anotherEvent.timeStamp);
}
}


Note that a lunch queue event comprises only a person and a time stamp. You might ask: aren't we supposed to have a third attribute indicating which of the two possible event types (arrival, being served) is in question? No: we will maintain two distinct data structures holding the events, one for arrival events and one for service events. Also, since LunchQueueEvent is not supposed to be exposed to the client programmer, its access modifier is package private.

Now, the actual prioritized queue:

net.coderodde.simulation.lunch.PrioritizedQueue.java
package net.coderodde.simulation.lunch;

import java.util.ArrayDeque;
import java.util.EnumMap;
import java.util.Map;
import java.util.NoSuchElementException;
import java.util.Queue;

/**
* This class implements a FIFO queue over priority categories. Not to be
* confused with a priority queue.
*
* @author Rodion "rodde" Efremov
* @version 1.6 (Dec 3, 2015)
*/
final class PrioritizedQueue {

private final Map<AcademicDegree, Queue<LunchQueueEvent>> map =
new EnumMap<>(AcademicDegree.class);

private int size;

void push(LunchQueueEvent event) {
AcademicDegree degree = event.getPerson().getAcademicDegree();
map.putIfAbsent(degree, new ArrayDeque<>());
map.get(degree).add(event);
++size;
}

boolean isEmpty() {
return size == 0;
}

LunchQueueEvent pop() {
if (isEmpty()) {
throw new NoSuchElementException(
"Popping from an empty prioritized queue.");
}

for (Queue<LunchQueueEvent> queue : map.values()) {
if (!queue.isEmpty()) {
--size;
return queue.remove();
}
}

throw new IllegalStateException(
"This should never happen. Please debug.");
}
}
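The pop order above relies on java.util.EnumMap iterating its keys in the declaration (ordinal) order of the enum, regardless of the order in which entries were inserted. A quick demonstration (Level is a stand-in for AcademicDegree):

```java
import java.util.ArrayList;
import java.util.EnumMap;
import java.util.List;
import java.util.Map;

public final class EnumMapOrderDemo {

    // Adding a new level only means adding a constant here; every
    // EnumMap keyed on the enum keeps iterating in ordinal order.
    enum Level { HIGH, MEDIUM, LOW }

    public static void main(String[] args) {
        Map<Level, String> map = new EnumMap<>(Level.class);
        // The insertion order is deliberately scrambled.
        map.put(Level.LOW, "low");
        map.put(Level.HIGH, "high");
        map.put(Level.MEDIUM, "medium");

        List<Level> iterationOrder = new ArrayList<>(map.keySet());
        // Prints [HIGH, MEDIUM, LOW]: declaration order, not
        // insertion order.
        System.out.println(iterationOrder);
    }
}
```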


The implementation is trivial. Yet, if we were to add new privilege levels to the simulation, the above implementation would not have to be modified; recompilation would suffice. At this point we are ready to proceed to the actual simulator class:

net.coderodde.simulation.lunch.Simulator.java
package net.coderodde.simulation.lunch;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.Queue;

/**
* This class runs the lunch queue simulation.
*
* @author Rodion "rodde" Efremov
* @version 1.6 (Dec 2, 2015)
*/
public final class Simulator {

//// Internals.
private final Map<Person, LunchQueueEvent> arrivalEventMap =
new HashMap<>();
private final Map<Person, LunchQueueEvent> servedEventMap =
new HashMap<>();
private final Map<AcademicDegree, Integer> groupCounts = new HashMap<>();

private final Map<AcademicDegree, Integer> mapMinimumWaitTime =
new HashMap<>();
private final Map<AcademicDegree, Integer> mapMaximumWaitTime =
new HashMap<>();
private final Map<AcademicDegree, Integer> mapAverageWaitTime =
new HashMap<>();
private final Map<AcademicDegree, Integer> mapWaitTimeSum     =
new HashMap<>();
private final Map<AcademicDegree, Integer> mapWaitTimeDeviation =
new HashMap<>();

private final List<Integer> cashierIdleIntervals = new ArrayList<>();
private Population population;

public static PopulationSelector simulate() {
return new PopulationSelector();
}

public static final class PopulationSelector {

public CashierSelector withPopulation(Population population) {
Objects.requireNonNull(population, "The input population is null.");
return new CashierSelector(population);
}
}

public static final class CashierSelector {

private final Population population;

CashierSelector(Population population) {
this.population = population;
}

public SimulationResult withCashier(Cashier cashier) {
Objects.requireNonNull(cashier, "The input cashier is null.");
return new Simulator().simulate(population, cashier);
}
}

private SimulationResult simulate(Population population, Cashier cashier) {
this.population = population;
Queue<LunchQueueEvent> inputEventQueue = population.toEventQueue();
preprocess(inputEventQueue);

if (population.size() == 0) {
return new SimulationResult(arrivalEventMap, servedEventMap);
}

PrioritizedQueue QUEUE = new PrioritizedQueue();
int currentClock = inputEventQueue.peek().getTimestamp();

for (int personsPending = population.size();
personsPending > 0;
personsPending--) {
// Load all hungry people that arrived during the service of the
// previously served person.
while (!inputEventQueue.isEmpty()
&& inputEventQueue.peek().getTimestamp()
<= currentClock) {
QUEUE.push(inputEventQueue.remove());
}

if (QUEUE.isEmpty()) {
// The cashier is idle; record the idle interval and
// jump forward in time to the next arrival event.
int nextArrivalTime = inputEventQueue.peek().getTimestamp();
cashierIdleIntervals.add(nextArrivalTime - currentClock);
currentClock = nextArrivalTime;
QUEUE.push(inputEventQueue.remove());
} else {
cashierIdleIntervals.add(0);
}

// Admit an earliest + highest priority person to the cashier.
LunchQueueEvent currentEvent = QUEUE.pop();
Person currentPerson = currentEvent.getPerson();

// Serving...
int serviceTime = cashier.getServiceTime();
currentClock += serviceTime;
LunchQueueEvent servedEvent = new LunchQueueEvent(currentPerson,
currentClock);
servedEventMap.put(currentPerson, servedEvent);
// Served!
}

return postprocess();
}

private void preprocess(Queue<LunchQueueEvent> inputEventQueue) {
// groupCounts.keySet() will now list only those academic degrees that
// are present in the population.
for (LunchQueueEvent event : inputEventQueue) {
Person person = event.getPerson();
AcademicDegree degree = person.getAcademicDegree();
arrivalEventMap.put(person, event);
groupCounts.put(degree, groupCounts.getOrDefault(degree, 0) + 1);
}
}

private SimulationResult postprocess() {
// Start computing system statistics.

for (AcademicDegree degree : groupCounts.keySet()) {
mapMinimumWaitTime.put(degree, Integer.MAX_VALUE);
mapMaximumWaitTime.put(degree, Integer.MIN_VALUE);
mapWaitTimeSum.put(degree, 0);
}

// Computing minimum/maximum wait time for each academic degree.
for (Person person : population.getPersonSet()) {
AcademicDegree degree = person.getAcademicDegree();
LunchQueueEvent arrivalEvent = arrivalEventMap.get(person);
LunchQueueEvent servedEvent  = servedEventMap.get(person);

int waitTime = servedEvent.getTimestamp() -
arrivalEvent.getTimestamp();

if (mapMinimumWaitTime.get(degree) > waitTime) {
mapMinimumWaitTime.put(degree, waitTime);
}

if (mapMaximumWaitTime.get(degree) < waitTime) {
mapMaximumWaitTime.put(degree, waitTime);
}

mapWaitTimeSum.put(degree, mapWaitTimeSum.get(degree) + waitTime);
}

// Computing the average waiting time for each academic degree.
for (AcademicDegree degree : groupCounts.keySet()) {
int average = (int) Math.round(1.0 * mapWaitTimeSum.get(degree) /
groupCounts.get(degree));

mapAverageWaitTime.put(degree, average);
mapWaitTimeDeviation.put(degree, 0);
}

for (Person person : population.getPersonSet()) {
AcademicDegree degree = person.getAcademicDegree();
int duration = servedEventMap.get(person).getTimestamp() -
arrivalEventMap.get(person).getTimestamp();

int contribution = duration - mapAverageWaitTime.get(degree);

contribution *= contribution;
mapWaitTimeDeviation.put(degree,
mapWaitTimeDeviation.get(degree) +
contribution);
}

for (AcademicDegree degree : groupCounts.keySet()) {
int sum = mapWaitTimeDeviation.get(degree);
// Promote to double before dividing; integer division would
// silently truncate the variance.
mapWaitTimeDeviation.put(degree,
(int) Math.round(
Math.sqrt(1.0 * sum / groupCounts.get(degree))));
}

SimulationResult result = new SimulationResult(arrivalEventMap,
servedEventMap);

for (AcademicDegree degree : groupCounts.keySet()) {
result.putWaitMinimumTime(degree, mapMinimumWaitTime.get(degree));
result.putWaitMaximumTime(degree, mapMaximumWaitTime.get(degree));
result.putAverageWaitTime(degree, mapAverageWaitTime.get(degree));
result.putWaitTimeStandardDeviation(degree,
mapWaitTimeDeviation
.get(degree));
}

// Process cashier idle time statistics:
if (cashierIdleIntervals.isEmpty()) {
return result;
}

int sum = 0;
int min = cashierIdleIntervals.get(0);
int max = cashierIdleIntervals.get(0);

for (int value : cashierIdleIntervals) {
sum += value;

if (min > value) {
min = value;
} else if (max < value) {
max = value;
}
}

double average = 1.0 * sum / cashierIdleIntervals.size();

// Compute the standard deviation; accumulate in a double so the
// squared deviations are not truncated to int on every iteration.
double squareSum = 0.0;

for (int value : cashierIdleIntervals) {
double diff = average - value;
squareSum += diff * diff;
}

int standardDeviation =
(int) Math.round(
Math.sqrt(squareSum / cashierIdleIntervals.size()));

result.putCashierMinimumIdleTime(min);
result.putCashierAverageIdleTime((int)(Math.round(average)));
result.putCashierMaximumIdleTime(max);
result.putCashierStandardDeviation(standardDeviation);

return result;
}
}
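
The clock-advancing loop in simulate() above is an ordinary discrete-event, single-server queue simulation. The following stripped-down sketch (my own stand-in names; plain int arrival times and a fixed service time instead of the Person and Cashier objects) shows the same idle-interval bookkeeping in isolation:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

public final class MiniQueueSimulation {

    // Returns the service-completion times; 'idleIntervals' collects the
    // cashier idle periods (0 whenever a customer was already waiting).
    public static List<Integer> run(int[] arrivalTimes,
                                    int serviceTime,
                                    List<Integer> idleIntervals) {
        Queue<Integer> arrivals = new ArrayDeque<>();

        for (int t : arrivalTimes) {
            arrivals.add(t);
        }

        List<Integer> servedTimes = new ArrayList<>();
        Queue<Integer> waiting = new ArrayDeque<>();
        int clock = arrivals.peek();

        while (!arrivals.isEmpty() || !waiting.isEmpty()) {
            // Admit everyone who arrived while the previous customer
            // was being served.
            while (!arrivals.isEmpty() && arrivals.peek() <= clock) {
                waiting.add(arrivals.remove());
            }

            if (waiting.isEmpty()) {
                // The cashier idles until the next arrival.
                int next = arrivals.remove();
                idleIntervals.add(next - clock);
                clock = next;
                waiting.add(next);
            } else {
                idleIntervals.add(0);
            }

            waiting.remove();      // Serve one customer.
            clock += serviceTime;
            servedTimes.add(clock);
        }

        return servedTimes;
    }

    public static void main(String[] args) {
        List<Integer> idle = new ArrayList<>();
        // Arrivals at t = 0, 1, 30; each service takes 10 seconds.
        List<Integer> served = run(new int[]{ 0, 1, 30 }, 10, idle);
        System.out.println(served); // [10, 20, 40]
        System.out.println(idle);   // [0, 0, 10]
    }
}
```

The third customer arrives at t = 30 while the cashier has been free since t = 20, so one idle interval of 10 seconds is recorded, exactly as simulate() does with cashierIdleIntervals.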


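The wait-time and idle-time statistics in postprocess() follow a classic two-pass scheme: one pass sums the values to get the mean, a second pass sums the squared deviations from that mean. A minimal self-contained version (the WaitTimeStats name is hypothetical), including the 1.0 * promotion that guards against integer division:

```java
public final class WaitTimeStats {

    // First pass: arithmetic mean of the wait times.
    public static double mean(int[] waitTimes) {
        long sum = 0;
        for (int w : waitTimes) {
            sum += w;
        }
        return 1.0 * sum / waitTimes.length;
    }

    // Second pass: population standard deviation, rounded to whole
    // seconds as in the simulator.
    public static int roundedStandardDeviation(int[] waitTimes) {
        double mu = mean(waitTimes);
        double squareSum = 0.0;

        for (int w : waitTimes) {
            double diff = w - mu;
            squareSum += diff * diff;
        }

        // Without a double operand here, int / int would truncate.
        return (int) Math.round(Math.sqrt(squareSum / waitTimes.length));
    }

    public static void main(String[] args) {
        int[] waits = { 10, 20, 30, 40 };
        System.out.println(mean(waits));                     // 25.0
        System.out.println(roundedStandardDeviation(waits)); // 11
    }
}
```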
Of course, we need a class for holding the simulation statistics:

net.coderodde.simulation.lunch.SimulationResult.java
package net.coderodde.simulation.lunch;

import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/**
* This class holds the statistics of a simulation.
*
* @author Rodion "rodde" Efremov
* @version 1.6 (Dec 2, 2015)
*/
public final class SimulationResult {

private static final String NL = "\n";
private static final String SKIP = "    ";
private static final int NO_DATA = -1;

private final Map<AcademicDegree, Integer> waitAverageMap = new HashMap<>();
private final Map<AcademicDegree, Integer> waitStandardDeviationMap =
new HashMap<>();

private final Map<AcademicDegree, Integer> waitMinMap = new HashMap<>();
private final Map<AcademicDegree, Integer> waitMaxMap = new HashMap<>();

private final Map<Person, LunchQueueEvent> arrivalEventMap;
private final Map<Person, LunchQueueEvent> servedEventMap;

private int cashierMinimumIdleTime = NO_DATA;
private int cashierAverageIdleTime = NO_DATA;
private int cashierMaximumIdleTime = NO_DATA;
private int cashierStandardDeviation = NO_DATA;

public int getMinimumWaitTime(AcademicDegree degree) {
return waitMinMap.getOrDefault(degree, NO_DATA);
}

public int getWaitAverage(AcademicDegree degree) {
return waitAverageMap.getOrDefault(degree, NO_DATA);
}

public int getMaximumWaitTime(AcademicDegree degree) {
return waitMaxMap.getOrDefault(degree, NO_DATA);
}

public int getWaitStandardDeviation(AcademicDegree degree) {
return waitStandardDeviationMap.getOrDefault(degree, NO_DATA);
}

public int getCashierMinimumIdleTime() {
return cashierMinimumIdleTime;
}

public int getCashierAverageIdleTime() {
return cashierAverageIdleTime;
}

public int getCashierMaximumIdleTime() {
return cashierMaximumIdleTime;
}

public int getCashierStandardDeviation() {
return cashierStandardDeviation;
}

SimulationResult(Map<Person, LunchQueueEvent> arrivalEventMap,
Map<Person, LunchQueueEvent> servedEventMap) {
this.arrivalEventMap = arrivalEventMap;
this.servedEventMap = servedEventMap;
}

void putWaitMinimumTime(AcademicDegree degree, int minimumWaitTime) {
waitMinMap.put(degree, minimumWaitTime);
}

void putAverageWaitTime(AcademicDegree degree, int averageTime) {
waitAverageMap.put(degree, averageTime);
}

void putWaitMaximumTime(AcademicDegree degree, int maximumWaitTime) {
waitMaxMap.put(degree, maximumWaitTime);
}

void putWaitTimeStandardDeviation(AcademicDegree degree,
int timeStandardDeviation) {
waitStandardDeviationMap.put(degree, timeStandardDeviation);
}

void putCashierMinimumIdleTime(int cashierMinimumIdleTime) {
this.cashierMinimumIdleTime = cashierMinimumIdleTime;
}

void putCashierAverageIdleTime(int cashierAverageIdleTime) {
this.cashierAverageIdleTime = cashierAverageIdleTime;
}

void putCashierMaximumIdleTime(int cashierMaximumIdleTime) {
this.cashierMaximumIdleTime = cashierMaximumIdleTime;
}

void putCashierStandardDeviation(int cashierStandardDeviation) {
this.cashierStandardDeviation = cashierStandardDeviation;
}

@Override
public String toString() {
StringBuilder sb = new StringBuilder();
List<Person> personList = new ArrayList<>(arrivalEventMap.keySet());

Collections.sort(personList, (p1, p2) -> {
int waitTime1 = servedEventMap.get(p1).getTimestamp()
- arrivalEventMap.get(p1).getTimestamp();
int waitTime2 = servedEventMap.get(p2).getTimestamp()
- arrivalEventMap.get(p2).getTimestamp();
return Integer.compare(waitTime1, waitTime2);
});

for (Person person : personList) {
sb.append(person.toString())
.append(", wait time: ")
.append((int)(servedEventMap.get(person).getTimestamp() -
arrivalEventMap.get(person).getTimestamp()))
.append(" seconds.")
.append(NL);
}

// Append the per-degree statistics via the private helper below.
// (Assumes AcademicDegree is an enum; degrees absent from the
// population report NO_DATA.)
for (AcademicDegree degree : AcademicDegree.values()) {
toString(sb, degree);
}

sb.append("Cashier:")
.append(NL)
.append(SKIP)
.append("Minimum idle time:  ")
.append(getCashierMinimumIdleTime())
.append(" seconds.")
.append(NL)
.append(SKIP)
.append("Average idle time:  ")
.append(getCashierAverageIdleTime())
.append(" seconds.")
.append(NL)
.append(SKIP)
.append("Maximum idle time:  ")
.append(getCashierMaximumIdleTime())
.append(" seconds.")
.append(NL)
.append(SKIP)
.append("Standard deviation: ")
.append(getCashierStandardDeviation())
.append(" seconds.");

return sb.toString();
}

private void toString(StringBuilder sb, AcademicDegree degree) {
sb.append(degree.toString()).append(":").append(NL);

sb.append(SKIP)
.append("Minimum wait time:  ")
.append(getMinimumWaitTime(degree))
.append(" seconds.")
.append(NL);

sb.append(SKIP)
.append("Average wait time:  ")
.append(getWaitAverage(degree))
.append(" seconds.")
.append(NL);

sb.append(SKIP)
.append("Maximum wait time:  ")
.append(getMaximumWaitTime(degree))
.append(" seconds.")
.append(NL);

sb.append(SKIP)
.append("Standard deviation: ")
.append(getWaitStandardDeviation(degree))
.append(" seconds.")
.append(NL);
}
}


Validating your input is a good practice. The next class provides some miscellaneous validation routines:

net.coderodde.simulation.lunch.Utils.java
package net.coderodde.simulation.lunch;

/**
* This class contains miscellaneous utilities.
*
* @author Rodion "rodde" Efremov
* @version 1.6 (Dec 2, 2015)
*/
final class Utils {

public static void checkMean(double mean) {
if (Double.isNaN(mean)) {
throw new IllegalArgumentException(
"The mean is NaN (not-a-number).");
}

if (Double.isInfinite(mean)) {
throw new IllegalArgumentException("The mean is infinite: " + mean);
}
}

public static void checkStandardDeviation(double deviation) {
if (Double.isNaN(deviation)) {
throw new IllegalArgumentException(
"The standard deviation is NaN (not-a-number).");
}

if (Double.isInfinite(deviation)) {
throw new IllegalArgumentException(
"The standard deviation is infinite: " + deviation);
}

if (deviation < 0.0) {
throw new IllegalArgumentException(
"The standard deviation is negative: " + deviation);
}
}

public static void checkTime(double time) {
if (Double.isNaN(time)) {
throw new IllegalArgumentException(
"The input time is NaN (not-a-number).");
}

if (Double.isInfinite(time)) {
throw new IllegalArgumentException(
"The input time is infinite: " + time);
}
}
}


Finally, the only missing block is the actual demonstration program:

Demo.java
import java.util.Random;
import net.coderodde.simulation.lunch.Cashier;
import net.coderodde.simulation.lunch.Population;
import net.coderodde.simulation.lunch.RandomPopulationGenerator;
import net.coderodde.simulation.lunch.SimulationResult;
import net.coderodde.simulation.lunch.Simulator;

public class Demo {

public static void main(final String... args) {
long seed = System.nanoTime();
Random random = new Random(seed);

Population population =
RandomPopulationGenerator
.withRandom(random)
.withMeanLunchTime(10800.0)
.withLunchTimeStandardDeviation(1200.0);

// The cashier serves in 15 seconds on average, s.d. 2 seconds.
Cashier cashier = Cashier.withRandom(random)
.withMeanServiceTime(15.0)
.withStandardDeviationOfServiceTime(2.0);

System.out.println("Seed = " + seed);

long startTime = System.nanoTime();
SimulationResult result = Simulator.simulate()
.withPopulation(population)
.withCashier(cashier);
long endTime = System.nanoTime();

System.out.printf("Simulated in %.2f milliseconds.\n",
(endTime - startTime) / 1e6);

System.out.println(result);
}
}
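
The Simulator.simulate().withPopulation(...).withCashier(...) chain used above is a "step builder": each step returns a small selector type whose only method is the next legal step, so the compiler enforces the call order. A minimal sketch of the pattern, with hypothetical Greeting/NameSelector/PunctuationSelector names:

```java
public final class Greeting {

    private final String name;
    private final String punctuation;

    private Greeting(String name, String punctuation) {
        this.name = name;
        this.punctuation = punctuation;
    }

    // The entry point returns the first selector.
    public static NameSelector greet() {
        return new NameSelector();
    }

    // Each selector exposes exactly one method: the next legal step.
    public static final class NameSelector {
        public PunctuationSelector withName(String name) {
            return new PunctuationSelector(name);
        }
    }

    public static final class PunctuationSelector {
        private final String name;

        PunctuationSelector(String name) {
            this.name = name;
        }

        public Greeting withPunctuation(String punctuation) {
            return new Greeting(name, punctuation);
        }
    }

    @Override
    public String toString() {
        return "Hello, " + name + punctuation;
    }

    public static void main(String[] args) {
        // Omitting or reordering a step is a compile-time error.
        System.out.println(Greeting.greet()
                                   .withName("world")
                                   .withPunctuation("!")); // Hello, world!
    }
}
```

Compared to a plain constructor, this trades a few small classes for an API that cannot be called with the arguments in the wrong order or left half-configured.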


After running the above demonstration, you may get output like the following:

Seed = 389000045239110
Simulated in 334.89 milliseconds.
[Bob Carter, Undergraduate], wait time: 11 seconds.
[Brandon Peterson, BSc], wait time: 11 seconds.
[Alice Efremov, Undergraduate], wait time: 12 seconds.
.
.
.
[Alma Marlow, Undergraduate], wait time: 2250 seconds.
[Brandon Fay, Undergraduate], wait time: 2251 seconds.
[Carl Walton, Undergraduate], wait time: 2253 seconds.
PhD:
Minimum wait time:  12 seconds.
Average wait time:  20 seconds.
Maximum wait time:  31 seconds.
Standard deviation: 5 seconds.
MSc:
Minimum wait time:  15 seconds.
Average wait time:  26 seconds.
Maximum wait time:  50 seconds.
Standard deviation: 8 seconds.
BSc:
Minimum wait time:  11 seconds.
Average wait time:  40 seconds.
Maximum wait time:  133 seconds.
Standard deviation: 26 seconds.
Undergraduate:
Minimum wait time:  11 seconds.
Average wait time:  1408 seconds.
Maximum wait time:  2253 seconds.
Standard deviation: 813 seconds.
Cashier:
Minimum idle time:  0 seconds.
Average idle time:  4 seconds.
Maximum idle time:  644 seconds.
Standard deviation: 37 seconds.


Now, we are ready for simulation. Please feel free to give me some feedback or ask a question!