The recent economic situation has caused many people to suggest that the U.S. stock market experienced a “lost decade” from 2000 to 2009. The Standard & Poor’s 500 index, in particular, was relatively flat, starting — on a dividend-adjusted basis — at 1170.9 and ending at 1073.9. But has the whole decade really been lost? The answer depends on your approach to investing. The majority of investors are constantly accumulating assets (i.e., dollar-cost averaging) by making monthly contributions to a retirement account. Did these investors lose 10 years of return? The answer is no.
Suppose you purchase an asset for $100, and one year later that asset is worth $50. The first-year return is minus 50 percent. Also suppose that in the second year, the asset gains in value, ending at $75. That second-year return is 50 percent. Taking the straight (arithmetic) average, you see a return of zero percent, which suggests that you came out even. This is obviously not the case; you invested $100, and two years later you only had $75 — a loss of 25 percent.
However, on a compound (geometric) basis, your average return is minus 13.4 percent. This loss is more reflective of the actual return that you earned. The difference between the two averages is driven by volatility and the compounding of volatile returns.
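The gap between the two averages can be verified directly. Here is a minimal sketch, using only the hypothetical numbers from the example above ($100 falling to $50, then recovering to $75):

```python
# The two yearly returns from the example: -50%, then +50%.
returns = [-0.50, 0.50]

# Arithmetic average: simple mean of the period returns.
arithmetic_avg = sum(returns) / len(returns)

# Geometric average: compound the returns, then take the n-th root.
growth = 1.0
for r in returns:
    growth *= 1 + r
geometric_avg = growth ** (1 / len(returns)) - 1

print(f"Arithmetic average: {arithmetic_avg:.1%}")   # 0.0%
print(f"Geometric average:  {geometric_avg:.1%}")    # -13.4%
print(f"Ending value of $100: ${100 * growth:.2f}")  # $75.00
```

The geometric average correctly reflects that $100 compounded at minus 13.4 percent for two years ends at $75, while the arithmetic average paints a misleadingly flat picture.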
What if, instead, you dollar-cost averaged and invested $100 each period? You would not need to earn 100 percent to offset the original loss. The original $100 investment would be worth $50 at the end of the first year, at which time another $100 would be added. At the end of the second year, the $150 would have grown to $225, which is greater than the absolute amount of $200 that was invested over the time period.
This scenario shows a positive return for the period because more money was invested during the better year. That return is referred to as the dollar-weighted return. The return for our example would be 8.11 percent per year. Even though the market was down, dollar-cost averaging would produce a positive average return. This is the benefit of volatility — it allows investors to purchase more shares at relatively lower prices. Not all volatility is bad, particularly if the dollar-cost averaging approach is used.
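The dollar-weighted return is the internal rate of return (IRR) on the cash flows: $100 invested at the start of each year and $225 received at the end. A minimal sketch, solving for the rate by bisection (the solver here is an illustration, not the study's method):

```python
def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-10):
    """Find the rate r at which the net present value of the cash flows
    is zero, via bisection. cash_flows[t] occurs at time t (in years)."""
    def npv(r):
        return sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows))
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid) > 0:   # NPV falls as the rate rises for these flows
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Outflows (investments) are negative; the final value is an inflow.
rate = irr([-100, -100, 225])
print(f"Dollar-weighted return: {rate:.2%}")  # 8.11%
```

Because the second $100 was invested at the depressed $50 price, it bought twice as many shares, pulling the dollar-weighted return positive even though a buy-and-hold investor lost 25 percent.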
DCA and the lost decade
Given the potential benefit of dollar-cost averaging, or DCA, the question now becomes whether the lost decade was truly lost, or if some investors saw gains during that period. To examine the issue, I have analyzed monthly dividend-adjusted closing prices for the S&P 500 from Jan. 3, 2000, through Jan. 4, 2010, the period of time most commonly associated with the lost decade. The monthly arithmetic average return is 0.03 percent and the geometric average is minus 0.08 percent. This mirrors the earlier example — the geometric (or compounded) average is always less than the arithmetic average when volatility is present. It also underscores the downside of volatility. However, for a DCA investor, the monthly return is 0.04 percent. While this is not large, it shows that DCA enables an investor to offset the negative impact of volatility on compounded returns by taking advantage of buying opportunities. In the end, it’s safe to say that the decade, while not strong, was not completely lost.
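The same three measures — arithmetic average, geometric average, and the DCA result — can be computed from any monthly price series. The sketch below uses made-up prices for illustration only (the actual study used dividend-adjusted S&P 500 closes, which are not reproduced here):

```python
# Hypothetical monthly closing prices: the market ends below where it
# started, as in the lost decade.
prices = [100.0, 80.0, 90.0, 85.0, 95.0]

# Monthly returns from consecutive closes.
monthly = [prices[i + 1] / prices[i] - 1 for i in range(len(prices) - 1)]

arithmetic = sum(monthly) / len(monthly)

growth = 1.0
for r in monthly:
    growth *= 1 + r
geometric = growth ** (1 / len(monthly)) - 1

# DCA: buy $1 of the index at every close except the last, then value
# the accumulated shares at the final price.
shares = sum(1.0 / p for p in prices[:-1])
invested = len(prices) - 1
final_value = shares * prices[-1]

print(f"Arithmetic avg: {arithmetic:.2%}")
print(f"Geometric avg:  {geometric:.2%}")
print(f"Buy-and-hold:   {growth - 1:.2%}")
print(f"DCA gain:       {final_value / invested - 1:.2%}")
```

Even in this toy series the pattern from the study appears: the geometric average sits below the arithmetic average, buy-and-hold loses money, yet the DCA investor comes out ahead because the dips bought extra shares.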
View the full research paper, “The lost (or found) decade: Capturing volatility benefits using DCA.”