Math Central Quandaries & Queries

Question from Shahzada, a student: When and why do we use standard deviation in data analysis?

Standard deviation is a measure of the variability, or dispersion, in a data set. It is used widely in data analysis, sometimes in a purely descriptive sense. Here is an example.

My friend Penny teaches two classes of algebra 1, class A and class B, each with the same number of students. She gave the same test to both classes and each class scored a class average of 65%. Class A had a standard deviation of 15 and class B had a standard deviation of 5. Thus the grades in class B were much more uniform than the grades in class A. This may result in Penny looking more closely at the grades in class A. Why is there so much dispersion?
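As a rough sketch of this idea, here is how the two class summaries could be computed in Python. The grade lists below are invented for illustration (they are not Penny's actual data); they are chosen so both classes average 65%, with class A's grades much more spread out than class B's.

```python
import statistics

# Hypothetical grade lists for illustration (not Penny's actual data):
# both classes average 65%, but class A's grades are far more dispersed.
class_a = [35, 50, 57, 65, 65, 73, 80, 95]   # widely spread out
class_b = [58, 61, 63, 65, 65, 67, 69, 72]   # tightly clustered

for name, grades in [("A", class_a), ("B", class_b)]:
    mean = statistics.mean(grades)
    sd = statistics.stdev(grades)  # sample standard deviation
    print(f"Class {name}: mean = {mean:.1f}, standard deviation = {sd:.1f}")
```

Both classes report the same mean, but class A's standard deviation comes out several times larger than class B's, which is exactly the pattern that would prompt a closer look at class A.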

Another place standard deviation is used is in estimation.

A large factory wants to estimate the amount of carbon dioxide emission it produces in a day. Twenty-five days are randomly selected and the amount of emission is measured on each of these days. The mean is found to be 2.7 tons per day, so my estimate of the daily carbon dioxide emission is 2.7 tons. This is only an estimate, however, and there is a margin of error: I should say my estimate is 2.7 tons plus or minus a margin of error. The margin of error depends on how variable the 25 measurements are, and this variability is measured using the standard deviation.
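A minimal sketch of this calculation in Python, using invented measurements scaled so their mean lands at the 2.7 tons mentioned above. The margin of error here uses the common rough rule of two standard errors (roughly a 95% level); the factory's actual analysis might use a t-based multiplier instead.

```python
import math
import statistics

# Hypothetical daily CO2 measurements in tons for 25 randomly chosen days.
# These values are invented for illustration, centred at 2.7 tons per day.
emissions = [2.7 + 0.1 * d for d in
             [-4, -3, -3, -2, -2, -1, -1, -1, 0, 0, 0, 0, 0,
               0, 0, 0, 1, 1, 1, 1, 2, 2, 3, 3, 3]]

n = len(emissions)
mean = statistics.mean(emissions)
sd = statistics.stdev(emissions)       # sample standard deviation
margin = 2 * sd / math.sqrt(n)         # rough 95% margin of error

print(f"estimate: {mean:.2f} tons/day, plus or minus {margin:.2f} tons")
```

Note that the margin shrinks as the measurements become less variable (smaller standard deviation) or as more days are sampled (larger n).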

I hope this helps,
Harley

Math Central is supported by the University of Regina and The Pacific Institute for the Mathematical Sciences.