Information theory · May 10, 2026 · 4 min read

How Much Information Is in a Lottery Ticket?

A Shannon-entropy view: a Powerball ticket carries about 28 bits of information. Compared to your phone, that's nothing.

Shannon defined the information content of an event as:

I(event) = −log₂(P(event)) bits

For a Powerball jackpot win:

I(jackpot) = −log₂(1 / 292,201,338)
= log₂(292,201,338)
≈ 28.12 bits

A Powerball jackpot win conveys about 28 bits of information. That's the answer to: "How many bits do I need to specify which combination won, from among all possible combinations?"
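As a sanity check, the arithmetic is one line in Python (a minimal sketch; the only input is Powerball's published odds of 1 in 292,201,338):

    import math

    # One winning combination out of 292,201,338 equally likely outcomes
    ODDS = 292_201_338

    # Shannon information content of the event: I = -log2(P)
    info_bits = -math.log2(1 / ODDS)

    print(f"I(jackpot) ≈ {info_bits:.2f} bits")  # ≈ 28.12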

28 bits in context

  • A single ASCII character: 7–8 bits.
  • Four random lowercase letters: ~19 bits (4 × log₂ 26 ≈ 18.8); an actual English word carries fewer, since letter frequencies and the dictionary constraint shrink the possibilities.
  • A Powerball jackpot win: ~28 bits.
  • A typical password (8 chars, mixed case + digits): ~48 bits (8 × log₂ 62).
  • A 256-bit AES key: 256 bits.
  • The state of a 1MB file: 8,000,000 bits.
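Apart from the key and the file, which are counted directly, these all fall out of the same log₂ formula. A minimal sketch reproducing them, assuming a 128-symbol ASCII alphabet and a 62-symbol password alphabet:

    import math

    # Information content of a uniform choice = log2(number of possibilities)
    print(f"ASCII character:   {math.log2(128):.1f} bits")      # 7-bit ASCII
    print(f"4 random letters:  {4 * math.log2(26):.1f} bits")   # a-z, uniform
    print(f"Powerball jackpot: {math.log2(292_201_338):.2f} bits")
    print(f"8-char password:   {8 * math.log2(62):.1f} bits")   # A-Z, a-z, 0-9
    print("AES-256 key:       256 bits")
    print(f"1 MB file:         {1_000_000 * 8:,} bits")         # 10^6 bytes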

By the information-theoretic measure, your Powerball jackpot win is less surprising than a moderately strong password being correctly guessed on the first try.

What this perspective tells us

Lottery jackpots feel impossibly improbable in human terms. But quantitatively, they're well within the range of events computers handle every day. Your phone generates 256-bit cryptographic keys for TLS connections as a matter of routine; that's about nine times the entropy of a Powerball draw.

The reason a lottery win feels so improbable isn't the number of bits. It's that we don't experience cryptographic events viscerally; we don't see 256-bit keys. A 28-bit lottery win is a small information-theoretic event with a huge financial payout. That mismatch — large payout, small information — is exactly the property that makes lotteries entertaining and economically dangerous.

The entropy of a random combination

When you generate a quantum-seeded Powerball combination, you're producing about 28 bits of entropy: the information content of selecting one of ~292M outcomes uniformly at random. If your generator is biased (favors some numbers over others), the entropy is lower than 28 bits, and your combination is in principle more predictable.

This is the formal version of the quantum vs. pseudo-randomness point from an earlier article: a uniform RNG produces full entropy; a biased one doesn't. For lottery picks, a uniformly random ticket has exactly the entropy of the draw itself, about 28 bits. Any departure from uniformity costs you some of those bits.
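To make "non-uniform means fewer bits" concrete, here is a minimal sketch; the biased distributions below are invented toy examples, not models of any real generator:

    import math

    def shannon_entropy(probs):
        """Shannon entropy H = -sum(p * log2 p), in bits."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Small-scale illustration: a six-outcome draw
    uniform = [1/6] * 6
    biased  = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]  # generator favors outcome 0

    print(f"uniform H = {shannon_entropy(uniform):.3f} bits")  # log2(6) ≈ 2.585
    print(f"biased  H = {shannon_entropy(biased):.3f} bits")   # ≈ 2.161, lower

    # Powerball-scale toy bias: half the mass on 1% of the 292,201,338
    # combinations. Grouping the outcomes into two equal-probability
    # classes collapses the entropy sum to two terms.
    N = 292_201_338
    k = N // 100
    H = -(0.5 * math.log2(0.5 / k) + 0.5 * math.log2(0.5 / (N - k)))
    print(f"biased Powerball H ≈ {H:.2f} bits")  # ≈ 25.8, below uniform 28.12

Even this fairly mild bias shaves off more than two bits, which is all "more predictable" means here.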
