Wordle Entropy¶

Wordle on the board
Goal¶
Students solve Wordles at the beginning of a class.
Time¶
5’
Concepts¶
Gini impurity
Entropy
Mutual Information
The Game¶
Wordle is a great warm-up game that can be reused lesson after lesson. It is also fantastic for statistics, as it illustrates conditional probability and information theory.
Good words for getting started are MEDIAN, RANGE, TABLE, PIVOT, EXCEL, and BAYES.
Lesson Plan¶
For a deeper dive into the Wordle topic, you might want to apply some information theory. Using the SOWPODS list, an official list of allowed Scrabble words, you can work through the steps below to choose the optimal word in a given situation.
As a simpler metric you could also calculate the Gini impurity. In any case, the calculation on a larger body of words requires some more time and skills in a programming language, because the task gets rather tedious in a spreadsheet; the Python sketches after the list below show one possible approach.
download the SOWPODS list of Scrabble words
calculate the entropy for each position
apply a condition, calculate entropy again on the remaining words
count frequent combinations of characters
calculate mutual information of two columns
discuss strategies to find the best guess
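The following sketch illustrates the first two steps together with the Gini impurity mentioned above. It assumes the SOWPODS list has been saved as a local file named sowpods.txt with one word per line and that only the five-letter words are kept; both the file name and the length filter are assumptions made for illustration, not part of the lesson.

```python
# Entropy and Gini impurity of the letter distribution at each position.
# Assumption: "sowpods.txt" holds one word per line; only 5-letter words are used.
from collections import Counter
from math import log2

def load_words(path="sowpods.txt", length=5):
    with open(path) as f:
        return [w.strip().upper() for w in f if len(w.strip()) == length]

def entropy(counts):
    """Shannon entropy (in bits) of a letter-frequency distribution."""
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values() if n)

def gini(counts):
    """Gini impurity of a letter-frequency distribution (the simpler metric)."""
    total = sum(counts.values())
    return 1 - sum((n / total) ** 2 for n in counts.values())

words = load_words()
for pos in range(5):
    letters = Counter(w[pos] for w in words)
    print(f"position {pos + 1}: entropy = {entropy(letters):.3f} bits, "
          f"Gini = {gini(letters):.3f}")
```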
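The third step can be sketched the same way: filter the word list with whatever has been learned from a guess and recompute the entropy on the remaining words. The condition used here (the word contains an E, but not in the first position) is a hypothetical example.

```python
# Recompute per-position entropy after applying a condition from a guess.
# Assumption: "sowpods.txt" as above; the condition itself is only an example.
from collections import Counter
from math import log2

with open("sowpods.txt") as f:
    words = [w.strip().upper() for w in f if len(w.strip()) == 5]

def entropy(counts):
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values() if n)

# Hypothetical feedback: the word contains an E, but not at position 1.
remaining = [w for w in words if "E" in w and w[0] != "E"]
print(f"{len(remaining)} candidate words remain")

for pos in range(5):
    letters = Counter(w[pos] for w in remaining)
    print(f"position {pos + 1}: entropy = {entropy(letters):.3f} bits")
```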
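Steps four and five can be combined: count how often pairs of letters occur together and compute the mutual information between two letter positions (the "columns"). The two positions compared here are chosen arbitrarily for illustration.

```python
# Frequent letter pairs and mutual information between two positions.
# Assumption: "sowpods.txt" as above; positions 1 and 2 are an arbitrary choice.
from collections import Counter
from math import log2

with open("sowpods.txt") as f:
    words = [w.strip().upper() for w in f if len(w.strip()) == 5]

i, j = 0, 1  # compare the first and the second letter position
n = len(words)

pairs = Counter((w[i], w[j]) for w in words)
print("most frequent combinations:", pairs.most_common(5))

px = Counter(w[i] for w in words)
py = Counter(w[j] for w in words)
mi = sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
         for (x, y), c in pairs.items())
print(f"mutual information between positions {i + 1} and {j + 1}: {mi:.3f} bits")
```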
Note
The idea of using Wordle in the classroom was brought to me by Christina.