Macro-economic forecasting, value investing and asset management are often seen as distinct fields by DACH region institutional investors. Carsten-Patrick Meier, managing director of Kiel Economics and former head of the German economic forecasting unit at the Kiel Institute for the World Economy, thinks about these areas comprehensively. On behalf of FONDSBOUTIQUEN.DE, Markus Hill discusses portfolio management, Big Data, expectations formation and risk management with him. Subjects such as quantitative investment strategies, beauty contests, financial valuation ratios and statistical methods are also touched upon, as are the pros and cons of rules-based investing.
Hill: What is the role of asset management in the economy?
Meier: Professional asset managers direct capital from savers to investors, i.e. from where there is a surplus to where it is scarce, not unlike banks do. They continually obtain information and, on this basis, decide which investment projects their capital is allocated to. Additionally, they provide liquidity to markets: For instance, when savers want to sell off shares after a crash in order to adjust their portfolio to their personal risk preference or regulatory requirements, asset managers take up the other side of the deal. By buying these assets, they allow these transactions to be carried out. For this provision of liquidity, they are compensated by the market via higher average returns. Thus, in the economy as a whole, more capital can be utilised which increases productivity and, in addition, wealth can be transferred to the future in a profitable, diversified, and liquid way.
Hill: As an applied business cycle researcher, do you have a distinct view of the financial industry?
Meier: The macro-economy and financial markets are related via corporate earnings. In the long term, stock prices can only grow in line with overall nominal GDP growth. In the short term, modern capital market theory holds that the price level in the stock market is determined by corporate earnings multiplied by the “stochastic discount factor”, which depends on the state the world is currently in (e.g. risky or not-so-risky). The business cycle affects both of these terms, earnings and the stochastic discount factor. In business cycle forecasting as well as in capital market strategies, essential building blocks are predictions about macroeconomic phenomena and the decisions that follow from those predictions. In applied macroeconomics these decisions concern the policy choices of central banks and other economic policy makers. In active asset management they are trading strategies and trading rules. The theoretical foundation is largely the same in both disciplines, as are the statistical methodologies. The data differ along some lines, but not fundamentally so. A major difference is that risk management plays a much larger role in asset management than in applied macroeconomics.
Hill: Is a good business cycle analyst automatically a good capital market strategist?
Meier: Yes and no. Yes, because business cycle analysts with an academic background are always “quants”, too. The quantitative skills that currently make freshly graduated econometricians attractive to Google and Facebook are also applicable in the classic “big data” field of capital market analysis. Adding to this, business cycle researchers are sensitive to the intricacies of macroeconomic data: which data are available, how and since when they have been collected, how often and at what frequency they are published and revised, and so on.
Hill: And why not?
Meier: Because a macroeconomic world view does not automatically translate into a valuable assessment of capital market dynamics. This notion can already be found in chapter eight of Graham and Dodd’s “Security Analysis” of 1934, and it is just as relevant now as it was back then. Markets typically run ahead of the overall business cycle. For this reason, future market movements that have not yet been priced in can only be derived from a macro prediction over an even longer forecast horizon – which is, of course, even more uncertain than shorter-term predictions. Additionally, the co-movement of the macro-economy and stock prices is not that close. In Germany since 1950, the highest correlation between the yearly DAX return and the nominal growth rate of GDP is found with a lead (!) of the DAX over GDP by one year, with a coefficient of only about one third.
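The lead–lag pattern Meier describes can be checked in a few lines of Python. The sketch below uses simulated series, not the actual DAX or GDP data, and the helper name `lead_corr` is my own – it only illustrates the mechanics of scanning for the lead with the highest correlation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 70                                          # simulated "years" of hypothetical data
dax_return = rng.normal(0.08, 0.20, n)          # yearly index returns (simulated)
noise = rng.normal(0.04, 0.03, n)
gdp_growth = noise.copy()
gdp_growth[1:] += 0.3 * dax_return[:-1]         # GDP growth built to follow returns with a one-year lag

def lead_corr(x, y, lead):
    """Correlation of x[t] with y[t + lead], i.e. x leading y by `lead` periods."""
    if lead == 0:
        return np.corrcoef(x, y)[0, 1]
    return np.corrcoef(x[:-lead], y[lead:])[0, 1]

for lead in (0, 1, 2):
    print(f"lead {lead}: corr = {lead_corr(dax_return, gdp_growth, lead):+.2f}")
```

On real data one would scan a range of leads in the same way and report where the correlation peaks.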
Hill: Why is that?
Meier: Keynes explained this with his game-theoretic parable about the “beauty contest”. In a competition, a prize is awarded to the person who chooses from a number of photos the one selected by most of the other participants as the most beautiful. The winning strategy in this game is not to choose the photo according to your own taste, but according to the presumed taste of the majority of participants. It is similar with regards to the correlation between economic forecasts and capital market predictions: In order to forecast the trajectory of the market, it is more important to anticipate what the majority of market participants believe about the development of macroeconomic production, price levels, corporate profits, etc., than to know the actual future development of these variables. In short: expectation beats reality in the forecasting of asset prices. This is the subtle difference between business cycle research and macroeconomic capital market analysis.
Hill: What does this mean for the choice of market strategy?
Meier: It means that the focus must be on variables that reflect the expectations of the different capital market players. Surveys of market participants, like the Sentix survey in Germany or Gallup or Duke University’s CFO survey in the US, can be used as direct indicators of expectations. In addition, there are surveys on the broader economy, such as those of ifo, DIHK, or GfK, but also macroeconomic forecasts – although the market’s opinion ought to matter more than one’s own assessment. Furthermore, classical valuation ratios such as the price-earnings ratio, the price-to-book ratio and other ratios derived from balance sheet and investment data, calculated for the market as a whole, are also suitable for market timing.
Hill: How do you deduce market dynamics from valuation measures and expectation indicators?
Meier: Here, time series come in handy. For example, if I observe that, over many years or even decades, the stock market’s dividend yield leads the market’s rate of return, I can estimate the average correlation between dividend yield and return rate by the partial correlation coefficient. A particular value of the dividend yield in the current period then implies a certain statistical expected value for the rate of return in the following period. Since I can also see how widely return “forecasts” of this type scattered around the actual values in the past, I can use this to calculate an interval in which the rate of return is likely to lie with a certain probability, e.g. 95 percent. I can then build the risk management system around this information.
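A minimal sketch of this procedure – fit the historical relationship, form a point forecast from the current predictor value, and widen it into an interval using the scatter of past residuals. The dividend-yield and return series below are simulated and the coefficients hypothetical; nothing here reproduces the actual Kiel Economics models:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60                                             # simulated annual observations
dy = rng.normal(3.0, 1.0, n)                       # dividend yield (percent) in year t
ret = 2.0 + 2.5 * dy + rng.normal(0.0, 10.0, n)    # return (percent) in year t+1, simulated link

# Fit the linear relationship ret = a + b * dy on the historical sample
b, a = np.polyfit(dy, ret, 1)
resid = ret - (a + b * dy)
sigma = resid.std(ddof=2)                          # residual standard error

# Point forecast and a rough 95 percent interval for a current yield of 4 percent
dy_now = 4.0
point = a + b * dy_now
lo, hi = point - 1.96 * sigma, point + 1.96 * sigma
print(f"expected return {point:.1f} %, 95 % interval [{lo:.1f} %, {hi:.1f} %]")
```

The interval width comes directly from how widely past “forecasts” of this type scattered around the realised values, which is exactly the information a risk management system can be built around.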
Hill: This means that, alongside knowledge of statistical methods, you first of all need time series of the relevant variables?
Meier: Ideally, we would like to have data covering many decades, i.e. as many market and economic cycles as possible. But such data are not easy to find for Germany. National accounts data for Germany as a whole begin in 1991, and the DAX was calculated for the first time in 1987. Many macroeconomic researchers content themselves with working with data from this period only. This is dangerous, however, as two or three economic cycles allow robust conclusions only to a limited extent. And there is no reason to settle for these short series, because statistical data have been collected in Germany for a long time and stock markets existed well before 1987. Given expertise in the statistical concepts and, from time to time, some investigative work, one can construct much longer time series. The series in our database typically begin in the 1950s, and some date back as far as the turn of the last century.
Hill: Do analyses over long periods of time – sixty years or more – not entail the risk that the uncovered relationships are unstable across time?
Meier: First of all, it is remarkable how stable many macro-economic relationships are over time. However, there is always the risk of structural breaks. But to refrain from using as long a series of data as possible for just this reason would mean throwing out the proverbial baby with the bathwater. Instabilities within the analysis period can be detected by statistical tests and then dealt with, for example by shortening the analysis period or modelling the break explicitly. Instabilities in the forecast period are more dangerous, as they cannot be predicted. In the words of former US Secretary of Defense Donald Rumsfeld, these are the “unknown unknowns”: the part of the system that we do not know and do not even know that we do not know. We reduce this risk in the traditional way: through diversification. That is, we do not rely on a single indicator and a single relationship, but on a very large set of indicators, indices, and relationships. The likelihood that all these relations will break down during the forecasting period ought to be rather low – but this simply cannot be known for sure.
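One classical test for such in-sample instabilities is the Chow test: fit the relationship on the full sample and on the two sub-samples around a candidate break date, and compare the residual sums of squares. The sketch below uses simulated data with a deliberate slope change and a helper name of my own choosing; it is an illustration of the test, not of any specific model in use:

```python
import numpy as np
from scipy import stats

def chow_test(x, y, split):
    """Chow test for a structural break at index `split` in the regression y = a + b*x."""
    def ssr(xs, ys):
        b, a = np.polyfit(xs, ys, 1)
        r = ys - (a + b * xs)
        return (r ** 2).sum()

    ssr_pooled = ssr(x, y)                       # one regression over the whole sample
    ssr_split = ssr(x[:split], y[:split]) + ssr(x[split:], y[split:])
    k = 2                                        # parameters per regime (intercept, slope)
    n = len(x)
    f = ((ssr_pooled - ssr_split) / k) / (ssr_split / (n - 2 * k))
    p = stats.f.sf(f, k, n - 2 * k)              # p-value from the F distribution
    return f, p

rng = np.random.default_rng(2)
x = rng.normal(size=100)
slope = np.where(np.arange(100) < 50, 1.0, 3.0)  # slope jumps at t = 50
y = slope * x + rng.normal(0.0, 0.5, 100)
f_stat, p_val = chow_test(x, y, 50)
print(f"F = {f_stat:.1f}, p = {p_val:.4f}")
```

A significant result would then motivate shortening the sample or modelling the break explicitly, as described above.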
Hill: Since this brings us right into the topic of risk management: which factors do you think are especially important in portfolio management?
Meier: A major advantage of a strictly rules-based quantitative approach is that the risk can be quantified to a large extent – using Monte Carlo methods and assuming that the above-mentioned approach succeeds in marginalizing the influence of the “unknown unknowns”. The multivariate forecast distribution determined in this way can then be fed into optimization methods such as the Markowitz approach. Note, however, that the expected values, variances, and covariances are then based on a forecast distribution, not on averages of the past. Admittedly, Markowitz rests on the assumption that variance is a suitable measure of risk, which can be questioned. Firstly, most investors cope much more easily with upward deviations from the mean return than with downward deviations. Secondly, variance only correctly reflects the risk of an investment if returns are normally distributed which, as is well known, is not the case: heavy losses occur too often for that. It therefore makes sense to base the construction of a portfolio on risk measures that take such extreme losses into account better than variance does. This also applies analogously when using the Kelly criterion.
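In the simplest unconstrained case, the Markowitz-optimal weights based on forecast moments have a closed form, w* = Σ⁻¹μ / γ, where μ and Σ come from the forecast distribution rather than from historical averages. The numbers below are purely hypothetical forecast moments, chosen for illustration:

```python
import numpy as np

# Hypothetical forecast moments (not historical averages):
mu = np.array([0.05, 0.03])          # forecast expected excess returns of two assets
cov = np.array([[0.04, 0.01],
                [0.01, 0.02]])       # forecast covariance matrix
gamma = 4.0                          # assumed risk-aversion parameter

# Unconstrained mean-variance optimum: w* = (1/gamma) * Sigma^{-1} mu
w = np.linalg.solve(cov, mu) / gamma
print("risky weights:", w, "-- remainder in the risk-free asset:", 1 - w.sum())
```

With these particular numbers the solution puts 25 percent into each risky asset and leaves half the wealth in the risk-free asset; changing γ scales the risky allocation up or down.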
Hill: Investment strategies based on the Kelly criterion are said to be very risky…
Meier: The discussion of the Kelly criterion in the investment community is rather confused. For example, it often fails to differentiate between a one-off investment, such as the purchase of an option that expires on a certain date, and an investment for an indefinite period, such as the purchase or sale of a share or a stock index ETF. In the latter case, the allocation of wealth to the risky and the risk-free asset according to Kelly corresponds exactly to that of Markowitz – given a certain risk preference and under the, albeit doubtful, assumption of normally distributed returns. This is all the more remarkable because the two approaches come from different schools of thought. Harry Markowitz developed his approach as a long-term, static strategy, while Kelly explicitly assumed constantly changing opportunities and therefore described a dynamic strategy. This may be the reason why Nassim Taleb considers the Kelly criterion to be “anti-fragile” and Markowitz to be “fragile”. Be that as it may, the equivalence of Kelly and Markowitz for continuous investments makes it clear that adherence to one of the two criteria does not in and of itself imply a more or less risky strategy. Just as you can leverage a Markowitz portfolio umpteen times over or not at all, you can implement Kelly strategies at 100 or 200 percent, or at 50 or 25 percent and even less, depending on your willingness to take risks. However, it can be shown theoretically that Kelly strategies at more than 100 percent are inefficient, i.e. the additional risk is not compensated by higher returns.
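For a continuously rebalanced position with approximately normal excess returns, the Kelly fraction is f* = μ/σ² – formally the unconstrained Markowitz weight with a risk aversion of one, which is the equivalence Meier refers to. The expected log-growth curve g(f) = fμ − f²σ²/2 also makes the inefficiency of over-100-percent Kelly visible: beyond f*, growth falls while risk keeps rising. The numbers below are hypothetical:

```python
# Hypothetical forecast moments for a single risky asset:
mu = 0.05       # expected excess return
sigma2 = 0.04   # return variance

# Full Kelly fraction and a fractional ("half") Kelly variant
f_full = mu / sigma2          # f* = mu / sigma^2
f_half = 0.5 * f_full         # same rule, half the risk

def growth(f, mu, sigma2):
    """Expected log-growth rate of wealth at leverage fraction f."""
    return f * mu - 0.5 * f ** 2 * sigma2

print("full Kelly:", f_full, " half Kelly:", f_half)
print("growth at f*:", growth(f_full, mu, sigma2),
      " growth at 2*f*:", growth(2 * f_full, mu, sigma2))
```

At twice the Kelly fraction the expected log-growth rate drops all the way back to zero, while the variance has quadrupled – the extra risk is not compensated, which is the theoretical inefficiency mentioned above.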
Hill: You already mentioned the advantages of a rules-based asset allocation for risk management. I assume these are not the only advantages of fixed decision rules?
Meier: Paraphrasing Graham and Dodd, one can say that asset management is always systematic – otherwise it is just gambling. It must rest on a systematic foundation. The often-made distinction between “rules-based” and “non-rules-based” investment strategies is therefore misleading. Rather, one should speak of explicitly regulated strategies and strategies that follow less explicit rules. Binding oneself to explicit rules does indeed have advantages beyond risk management. The most important is that it brings discipline into the investment process. It replaces gut feelings with a rational process precisely where psychologists have shown that such feelings are not helpful and may lead to misguided investment decisions. Essentially, the point is that when things get dicey, one is not tempted to dwell on the isolated case at hand but instead conceives of the problem as a process. Moreover, binding processes to rules means that nothing is forgotten or misplaced. That may sound trivial, but in many professions where risk plays a role, rules have proven effective in ensuring a systematic way of working – from surgery and civil aviation to car repair. In addition to all this, explicit rule orientation has the advantage that it allows new investment strategies and risk management systems to be developed and validated by backtesting.
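The backtesting point follows directly from explicitness: a rule that is written down can be replayed mechanically over history. The sketch below backtests a simple hypothetical 200-day moving-average rule on simulated prices – it demonstrates the mechanics of a lookahead-free backtest, not a recommended strategy:

```python
import numpy as np

rng = np.random.default_rng(3)
ret = rng.normal(0.0003, 0.01, 2000)           # simulated daily returns, for illustration only
price = 100 * np.cumprod(1 + ret)              # simulated price index

# Explicit rule: hold the index when the price is above its 200-day moving average
ma = np.convolve(price, np.ones(200) / 200, mode="valid")  # ma[j] averages price[j : j+200]
signal = (price[199:-1] > ma[:-1]).astype(float)           # signal known at the end of day t
strat_ret = signal * ret[200:]                             # applied to the NEXT day's return

print("buy & hold:", round(ret[200:].sum(), 3),
      "  rule-based:", round(strat_ret.sum(), 3))
```

Note the one-day shift between signal and return: the rule only ever uses information available at the time of the decision, which is what makes such a replay a valid test of the explicit rule.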
Source: www.institutional-investment.com
Photo: www.pixabay.com