Risk Measure: Standard Deviation

Standard deviation is probably the most widely used measure of a security's risk. It is a statistical quantity, computed from a security's historical rates of return, that gives a good indication of the security's volatility. It simply quantifies how much a series of the security's returns varies around its mean, or average.

Standard deviation is a way of condensing a security's performance swings into a single number, regardless of whether the security delivers good or bad returns. The more a security's returns fluctuate from month to month, the greater its standard deviation and the riskier the security is considered to be. Standard deviation captures both systematic (market) risk and unsystematic (specific) risk.

To illustrate the concept, let's review the following examples.

A security that gained 1% each and every month over the past 36 months would have a standard deviation of zero, because its monthly returns didn't change from one month to the next.

A security that lost 1% each and every month would also have a standard deviation of zero, because, again, its returns didn't vary.

A security that gained 5% one month, 25% the next, and -7% the next would have a much higher standard deviation; its returns have been more varied. 
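The three cases above can be checked numerically. The sketch below uses Python's standard `statistics` module; the monthly figures are the illustrative ones from the text, not real data:

```python
import statistics

# Monthly returns (%) for the three illustrative securities above
steady_gain = [1.0] * 36      # gained 1% every month
steady_loss = [-1.0] * 36     # lost 1% every month
volatile = [5.0, 25.0, -7.0]  # widely varying returns

# Sample standard deviation quantifies dispersion around the mean
print(statistics.stdev(steady_gain))          # 0.0 -- returns never vary
print(statistics.stdev(steady_loss))          # 0.0
print(round(statistics.stdev(volatile), 1))   # 16.2 -- far more varied
```

Note that `statistics.stdev` computes the sample standard deviation (dividing by n − 1); `statistics.pstdev` is the population version. For return series like these, the sample version is the usual choice.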

Investors like using standard deviation because it provides a precise measure of how varied a security's returns have been over a particular time period in the past. Using the historical standard deviation, you can estimate the range of returns your security is likely to generate in the future.

For most securities, future monthly returns will fall within the following ranges:

  • Within one standard deviation of the average return about 68% of the time.
  • Within two standard deviations about 95% of the time.

For instance, suppose a security has a standard deviation of 4% and an average return of 10% per year. Then it's predicted that:

68% of the time, the security's future returns would range between 6% and 14% (its 10% average plus or minus its 4% standard deviation).

95% of the time, its returns would fall between 2% and 18%, or within two standard deviations. 
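The arithmetic above can be sketched in a few lines; the 10% mean and 4% standard deviation are the hypothetical figures from the example:

```python
mean_return = 10.0  # average annual return (%), hypothetical
std_dev = 4.0       # standard deviation (%), hypothetical

# ~68% of returns expected within one standard deviation of the mean
one_sigma = (mean_return - std_dev, mean_return + std_dev)
# ~95% of returns expected within two standard deviations
two_sigma = (mean_return - 2 * std_dev, mean_return + 2 * std_dev)

print(one_sigma)  # (6.0, 14.0)
print(two_sigma)  # (2.0, 18.0)
```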

It's possible to own a security with a low standard deviation and still lose money, although that's rare. Over short time frames, securities with modest standard deviations tend to lose less money than those with high standard deviations. 

It is possible for one security to have a higher average return than another while also having a smaller standard deviation. The tradeoff between standard deviation and average return doesn't hold for individual securities, but it does hold across asset classes.

A drawback of standard deviation is that it isn't intuitive. A standard deviation of 8% is obviously higher than one of 6%, but are those figures high or low? Because a security's standard deviation is not measured relative to other securities or to a benchmark, it is not very useful without some context. You should look at similar securities, those in the same category as the one you're examining. You can also compare a fund's standard deviation with that of a relevant index, such as the S&P 500, a common benchmark for large-cap funds.
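Such a benchmark comparison is easy to sketch. In the snippet below, both monthly return series are made up purely for illustration (the index series is a stand-in for a benchmark like the S&P 500):

```python
import statistics

# Hypothetical monthly returns (%) -- illustrative only, not real data
fund_returns = [2.1, -1.4, 3.0, 0.5, -2.2, 1.8]
index_returns = [1.2, -0.6, 1.5, 0.4, -0.9, 1.0]

fund_sd = statistics.stdev(fund_returns)
index_sd = statistics.stdev(index_returns)

# A ratio above 1 means the fund has been more volatile than its benchmark
print(round(fund_sd / index_sd, 2))  # fund is roughly twice as volatile here
```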