What is standard deviation?
Standard deviation is a statistical measure that quantifies how spread out values are in a dataset relative to the mean. A low standard deviation means the data points cluster tightly around the average, while a high standard deviation indicates the values are more dispersed. It is one of the most widely used measures of variability in statistics, finance, and science.
How do you calculate standard deviation step by step?
To calculate standard deviation from a dataset, follow these steps:
- Find the mean (average) of all numbers
- Subtract the mean from each number to get the deviations
- Square each deviation
- Find the average of the squared deviations (this gives you the variance)
- Take the square root of the variance
For example, given the numbers 2, 4, 4, 4, 5, 5, 7, 9: the mean is 5, the squared deviations sum to 32, the population variance is 32 ÷ 8 = 4, and the population standard deviation is √4 = 2.
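The steps above can be sketched in a few lines of Python. This is a minimal illustration (the function name `population_std` is just for this example); it reproduces the worked result for 2, 4, 4, 4, 5, 5, 7, 9:

```python
import math

def population_std(data):
    """Population standard deviation, following the steps above."""
    mean = sum(data) / len(data)                 # 1. find the mean
    deviations = [x - mean for x in data]        # 2. subtract the mean from each value
    squared = [d ** 2 for d in deviations]       # 3. square each deviation
    variance = sum(squared) / len(data)          # 4. average the squared deviations
    return math.sqrt(variance)                   # 5. take the square root

print(population_std([2, 4, 4, 4, 5, 5, 7, 9]))  # 2.0
```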
What is the difference between population and sample standard deviation?
The key difference lies in what you divide by when calculating variance. Population standard deviation (σ) divides by N (the total number of values) and is used when your data represents the entire population. Sample standard deviation (s) divides by n − 1 instead and is used when your data is a sample from a larger population. The n − 1 correction, known as Bessel's correction, compensates for the fact that a sample tends to underestimate the true variability. In most real-world scenarios, you are working with a sample, so the sample standard deviation is the appropriate choice.
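Python's standard library exposes both versions directly, which makes the effect of Bessel's correction easy to see on the dataset from the earlier example:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]
print(statistics.pstdev(data))  # population (divides by N):   2.0
print(statistics.stdev(data))   # sample (divides by n - 1):  ~2.138
```

The sample value is always slightly larger, reflecting the correction for a sample's tendency to underestimate true variability.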
What are some standard deviation examples?
Consider two classrooms of test scores. Class A scores: 70, 72, 74, 76, 78 — the mean is 74 and the population standard deviation is 2.83. Class B scores: 50, 60, 74, 88, 98 — the mean is also 74, but the population standard deviation is 17.57. Even though both classes have the same average, Class B has far more variability in performance. You can use our mean, median, and mode calculator to find the averages before analyzing spread.
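You can verify the classroom comparison with a short snippet — same mean, very different spread:

```python
import statistics

class_a = [70, 72, 74, 76, 78]
class_b = [50, 60, 74, 88, 98]

print(statistics.mean(class_a), statistics.pstdev(class_a))  # 74, ~2.83
print(statistics.mean(class_b), statistics.pstdev(class_b))  # 74, ~17.57
```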
When is standard deviation useful?
Standard deviation has practical applications across many fields. In finance, it measures investment risk — a stock with a higher standard deviation has more volatile returns. In manufacturing, it drives quality control by determining whether products fall within acceptable tolerances. In science, it quantifies measurement precision and experimental reliability. In education, it helps interpret test score distributions and grade curves. Paired with a Z-score calculator, standard deviation lets you determine how unusual any individual data point is within its distribution.
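A Z-score is simply a data point's distance from the mean measured in standard deviations. A minimal sketch (the helper name `z_score` is hypothetical, and this uses the population standard deviation):

```python
import statistics

def z_score(x, data):
    """How many population standard deviations x lies from the mean."""
    return (x - statistics.mean(data)) / statistics.pstdev(data)

scores = [50, 60, 74, 88, 98]
print(z_score(98, scores))  # ~1.37: about 1.4 standard deviations above the mean
```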
What is variance and how does it relate to standard deviation?
Variance is the average of the squared deviations from the mean — in other words, it is the standard deviation squared. Population variance is denoted σ² and sample variance is denoted s². While variance is mathematically useful (it is additive for independent variables), standard deviation is more intuitive because it is expressed in the same units as the original data. For instance, if you measure heights in centimeters, the variance is in cm² while the standard deviation is in cm. Use our percentage calculator if you need to express deviations as a proportion of the mean.
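The squared relationship is easy to confirm numerically with the standard library:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]
var = statistics.pvariance(data)  # sigma^2 = 4.0, in squared units (e.g. cm^2)
std = statistics.pstdev(data)     # sigma   = 2.0, in the original units (e.g. cm)
print(var == std ** 2)            # True
```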
How should you interpret standard deviation results?
The empirical rule (68-95-99.7 rule) provides a useful framework for normally distributed data: approximately 68% of values fall within one standard deviation of the mean, 95% within two, and 99.7% within three. A standard deviation of zero means all values are identical. When comparing datasets, a smaller standard deviation relative to the mean indicates more consistent data. The coefficient of variation (standard deviation divided by the mean, expressed as a percentage) is useful for comparing variability between datasets with different scales or units.
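The coefficient of variation can be computed in one line; applied to the two classrooms from earlier, it makes the difference in relative consistency explicit (the helper name `coefficient_of_variation` is just for this sketch):

```python
import statistics

def coefficient_of_variation(data):
    """Population standard deviation as a percentage of the mean."""
    return statistics.pstdev(data) / statistics.mean(data) * 100

class_a = [70, 72, 74, 76, 78]
class_b = [50, 60, 74, 88, 98]
print(round(coefficient_of_variation(class_a), 1))  # 3.8  -> very consistent
print(round(coefficient_of_variation(class_b), 1))  # 23.7 -> highly variable
```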