Is there an easy way to calculate standard deviation for dice rolls?
So I’ve got all these polyhedral dice for role-playing games. For a single die roll, I can use classical statistics to calculate the uniform distribution and its mean quite easily.
Now if I roll the same die several times and add the results, the probability of any particular total starts to form a bell-shaped curve.
For example, if I roll a six-sided die three times, I get a sort of bell-shaped curve with a mean of 10.5 and extremes of 3 and 18.
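Just as a sanity check on those 3d6 figures, here is a quick brute-force enumeration sketch in Python (there are only 6³ = 216 equally likely outcomes, so no billion rolls needed):

```python
# Enumerate every outcome of rolling a six-sided die three times,
# then check the mean (10.5) and extremes (3 and 18) directly.
from itertools import product

totals = [sum(dice) for dice in product(range(1, 7), repeat=3)]

mean = sum(totals) / len(totals)
variance = sum((t - mean) ** 2 for t in totals) / len(totals)
sd = variance ** 0.5

print(min(totals), max(totals))  # extremes of the distribution
print(mean)                      # mean of the distribution
print(sd)                        # standard deviation of the distribution
```

Running this does give a mean of 10.5 with extremes 3 and 18, and it spits out a standard deviation too, so at least for small dice pools the exact answer is cheap to get by enumeration.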
It seems to me that I should be able to calculate the standard deviation for this sort of bell-shaped curve without too much trouble, but our statistics teacher thinks the only way to do it is to roll the dice about a billion times and work through these complicated equations for each individual roll. Isn’t there an easier way?
What a mess! I might as well make a table for each one and estimate it off of that!
Posted by: ye_river_xiv