Enter the phrase "investment forecasts" on Google and you will be bombarded with over 37 million hits. Append a single word to make your search "unsubstantiated investment forecasts," or try a similar phrase, and you will likely not find a single hit! Yet it should be obvious that far too many investment forecasts prove to be less than helpful.
It is easy to locate investment forecasts, whether they are actually labeled "forecasts" or not. Broadly defined, they can be found anywhere someone expresses a recommendation for (or against) one or more particular investments. Implicit is the assumption that the forecast provider has some type of objective data, or at a bare minimum a presumably knowledgeable opinion, bearing favorably or unfavorably on what the investment's future performance is likely to be. If a time frame is not explicitly stated, the target audience assumes a reasonable one for the prediction.
Sadly, what is missing from the picture is that most of these forecasts will never be revisited to determine how accurate they turned out to be. Predictions, obviously, are intended to cast some light on what the future will show. If a source issuing investment predictions has shown itself, through follow-up data, to have been relatively accurate, its forecasts might be said to show at least some degree of predictive validity. If the opposite, the forecasts would be said to show little or no validity. It stands to reason that it can be dangerous for investors to act upon predictions coming from sources that cannot document some sort of prior predictive validity for their forecasts.
In order to demonstrate the potential usefulness of a newly issued prediction coming from a given source, two requirements must be met: 1) in the past, the source must not only have stated a prediction, but the event predicted must have been revisited at some later date to assess its accuracy; 2) ideally, the source must be able to show at least several accurate predictions over a stretch of time to ensure that a single instance of predictive success was not merely the result of luck.
As investors, it behooves us to always be mentally "on guard" whenever we see an investment forecast. Is a particular investment forecast to be trusted, or do we take almost all investment forecasts with a grain of salt? If the former, have we based our judgment on the above two requirements, or have we merely assumed the forecaster's expertise? If the latter, perhaps we are giving short shrift to some forecasts that have indeed demonstrated relative accuracy in the past.
As you are likely aware, my investment newsletter, like most others, aims to help investors make sound choices. By making specific recommendations, I am really making predictions about which choices will ultimately turn out to be good and which will not.
I don't think it makes much sense for me to make recommendations as to what investments one should choose to achieve good future results without examining how my prior such choices have done. (If someone's past forecasts didn't pan out, why should you believe that their future ones will?)
While, granted, it is impossible for anyone to predict with anything near 100% accuracy, if a forecaster's positive and/or negative recommendations turn out to be only about 50% correct, there is really very little, or even no, value in considering them; merely guessing, or even flipping a coin, should yield a similar result. Of course, some forecasters' predictions are correct even less than 50% of the time. This likely results from the often counterintuitive nature of investing: just when things look about as bad (or as good) as they can be for an investment, things can turn around, leaving even some who might seem the most expert with egg on their faces.
Along this line, if predictions fail to pan out, one could argue that something totally unexpected by nearly everyone occurred which caused the predictions to go awry. This might pass as an acceptable explanation on occasion. But too many near-chance predictive outcomes suggest that the predictions aren't working and therefore should not be given much credence. At best, it suggests that the kind of outcome being predicted is too complex and likely cannot be predicted by anyone, which some argue is the case for all investment forecasting.
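The coin-flip benchmark above can be made concrete. A short sketch (the numbers here are illustrative, not drawn from any particular forecaster) shows how often a forecaster with no skill at all would still compile a seemingly impressive record by pure chance, using the binomial tail probability:

```python
from math import comb

def chance_of_luck(n, k, p=0.5):
    """Probability of getting k or more calls right out of n by pure
    chance, assuming each call is an independent 50/50 coin flip."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# A no-skill forecaster gets 7 or more of 10 calls right surprisingly often:
print(round(chance_of_luck(10, 7), 3))   # → 0.172

# But sustaining a 70% hit rate over 50 calls by luck alone is far rarer:
print(round(chance_of_luck(50, 35), 3))
```

This is why the second requirement above matters: a handful of correct calls proves little, while the same hit rate sustained over a long record is much harder to attribute to chance.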
Yet the great majority of investment recommendations we are exposed to in newspapers, magazines, other media, or even by financial advisors, do not give us a good idea of what the success rate was for these sources' prior predictions. Perhaps this is why some investors turn to investment newsletters, which, in my experience, do tend to show their prior forecasting track record. Of course, such a continuing source of predictions also allows longer-term readers to judge for themselves through time how accurate the recommendations previously given have proven to be.
Here is but one concrete example of the myriad investment forecasts we all likely come across regularly. It was chosen not to single out this particular source, but to illustrate our premise:
CNBC.com, among the most popular websites in the U.S., published an article entitled "10 Best Mutual Funds for 2012" near the close of 2011. Apparently, the list was derived from another popular financial website's rating of approximately 25,000 funds. The following table shows the 10 funds recommended and their performance since the start of the year. (Note that neither CNBC nor its associated website is, as far as we know, providing you with this follow-up evidence regarding the predictive accuracy of their "10 best" list; I am. If a reader has knowledge of the source's own follow-up data on its predictions, I would appreciate hearing about it.)
Morningstar Data for CNBC.com's "10 Best Mutual Funds for 2012," Ordered by Explicit Forecast Preference

| Top Five Funds | Tot. Ret. | Rank in Cat. | 2nd Best Five Funds | Tot. Ret. | Rank in Cat. |
|---|---|---|---|---|---|
| 1. Permanent Portfolio* (PRPFX) | 2.17% | 92% | 6. Schwab Tax-Free Bond Fund* (SWNTX) | 3.02% | 31% |
| 2. Bruce Fund* (BRUFX) | 2.99% | 86% | 7. Appleseed Fund Investor (APPLX) | 4.26% | 86% |
| 3. ING Value Choice A (PAVAX) | -6.50% | 99% | 8. Vanguard Wellesley Income Inv* (VWINX) | 5.19% | 31% |
| 4. Nuveen Tradewinds Value Opp A (NVOAX) | -6.06% | 99% | 9. Matthews Asia Dividend Fund Inv (MAPIX) | 8.76% | 10% |
| 5. Tilson Dividend (TILDX) | 7.17% | 47% | 10. ING Morgan Stanley Global Franch A (IGFAX) | 8.09% | 17% |
This list of recommendations for 2012, which represents a forecast (although, as the author points out, not a guarantee) of good performance, has certainly not panned out as one might have hoped thus far this year. As can be seen, the four most highly recommended funds are doing extremely poorly vs. their category peers. Of the 10 funds, exactly half might be said to be doing satisfactorily while the other half certainly aren't. This represents at best a 50% success rate.
If one had invested equally in each of the 10 recommended funds, one's six-month total return would currently be 2.91%. This compares with the year-to-date return of 9.41% on the unmanaged Vanguard 500 Index and 2.37% for the Vanguard Total Bond Market Index. With 2012 half over, it appears unlikely that the forecasted recommendations will wind up living up to their billing as the "10 best" for 2012.
To see how my Newsletter's choices of funds, also made near the close of 2011, have done thus far in 2012, you can visit my site at http://funds-newsletter.com. The July Newsletter also presents our current Model Portfolios for both stock and bond funds.