Ok, so I was just pondering this and I'm getting a little confused. Math is not my strong suit.
If there isn't any difference (mathematically) between a pool of six "2" tokens and six "0" tokens and a pool of twelve "1" tokens, how can that be? A draw pool of twelve "1" tokens yields ten monsters on ten draws 100% of the time, while the pool of six "2" tokens and six "0" tokens surely won't; after all, you won't always draw exactly five zeroes and five twos.
I guess the answer is that we're talking averages here, but is there some way to mathematically represent the difference between those two situations?
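To make the difference concrete, here's a quick Python sketch (assuming ten draws without replacement from the twelve tokens) that works out the exact distribution of monsters for the 2s-and-0s pool. The "1"-token pool always gives exactly ten; the other pool has the same average but a spread around it:

```python
from math import comb

# Pool A: twelve "1" tokens -- ten draws always total exactly 10 monsters.
# Pool B: six "2" tokens and six "0" tokens -- same average, but the total varies.
# The distribution for pool B follows the hypergeometric formula
# (choosing which 10 of the 12 tokens get drawn).

total_ways = comb(12, 10)  # all ways to choose which 10 tokens are drawn

mean = 0.0
mean_sq = 0.0
for twos_drawn in range(4, 7):   # only 2 tokens stay in the bag, so 4-6 of the six "2"s must come out
    ways = comb(6, twos_drawn) * comb(6, 10 - twos_drawn)
    p = ways / total_ways
    monsters = 2 * twos_drawn    # each "2" token is worth two monsters
    print(f"monsters = {monsters:2d}: probability = {p:.4f}")
    mean += p * monsters
    mean_sq += p * monsters * monsters

variance = mean_sq - mean ** 2
print(f"mean = {mean:.2f}, variance = {variance:.2f}")
# Pool A: mean 10, variance 0.  Pool B: mean 10, variance ~1.82.
```

So the averages really are identical, and the mathematical way to express the difference is the variance (or the full probability distribution): zero for the all-"1" pool, nonzero for the 2s-and-0s pool.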