How secure is your passphrase?


xkcd.com

Analyze it all in your browser.

High-entropy passphrase:
correct / horse / battery / staple
Low-entropy passphrase:
Once / upon / a / time

Try it out yourself!
The greener, the better; orange is okay.



How it works

Entropy is calculated from word-based Markov models of modern printed English, built from the Google Books ngram dataset.
The code for building the statistical model is available on GitHub, along with code for a data structure more space-efficient than a Bloom filter for compactly representing the model as a set.
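The core idea can be sketched in a few lines: score a passphrase by its surprisal (negative log-probability, in bits) under a word-bigram model. This is only a toy illustration, not the site's actual code; the counts, the corpus size, and the additive smoothing are all assumptions standing in for the real Google Books ngram statistics.

```python
import math

# Hypothetical word-bigram and unigram counts standing in for the
# Google Books ngram data (made-up numbers, for illustration only).
bigram_counts = {
    ("once", "upon"): 900, ("upon", "a"): 950, ("a", "time"): 800,
    ("correct", "horse"): 1, ("horse", "battery"): 1, ("battery", "staple"): 1,
}
unigram_counts = {
    "once": 1000, "upon": 1000, "a": 50000, "time": 5000,
    "correct": 1000, "horse": 1000, "battery": 1000, "staple": 100,
}
TOTAL = 10_000_000   # assumed corpus size (tokens)
VOCAB = 1_000_000    # assumed vocabulary size, for smoothing

def entropy_bits(words, alpha=1.0):
    """Surprisal (-log2 P) of a passphrase under a smoothed bigram model."""
    # First word: smoothed unigram probability.
    p = (unigram_counts.get(words[0], 0) + alpha) / (TOTAL + alpha * VOCAB)
    bits = -math.log2(p)
    # Each following word: smoothed probability given the previous word.
    for prev, cur in zip(words, words[1:]):
        num = bigram_counts.get((prev, cur), 0) + alpha
        den = unigram_counts.get(prev, 0) + alpha * VOCAB
        bits += -math.log2(num / den)
    return bits

low = entropy_bits(["once", "upon", "a", "time"])
high = entropy_bits(["correct", "horse", "battery", "staple"])
# The familiar phrase is far more probable under the model,
# so it yields far fewer bits of surprisal than the random one.
```

A phrase the model finds predictable is exactly a phrase an attacker with the same statistics can guess cheaply, which is why low surprisal maps to a red score and high surprisal to green.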

© 2012 Lee Butterman.