
A brief history of quantitative equity strategies


Understanding quantitative equity investments means understanding a significant portion of market positions. Motivated by the apparent failure of the capital asset pricing model and the efficient market hypothesis, a large share of equity investors follows stylized “factors” that are expected to outperform the market portfolio in the long run. Yet, popularity and past performance of such factors can be self-defeating, while published research is prone to selection bias and overfitting. Big data has introduced greater information efficiency with respect to textual analysis, picking up short-term sentiment but without clear and documented benefits for long-term investment so far. In the future greater emphasis may be placed on dynamic factor models that – in principle – can apply plausible performance factors at the times that they matter.

Becker, Ying and Marc Reinganum (2018), "The Current State of Quantitative Equity Investing," CFA Institute Research Foundation Literature Review, Volume 13, Issue 1, June 2018.

The post ties in with SRSV’s summary lecture on information inefficiency.
Below are quotes from the paper. Emphasis and italic text have been added.

Failure of theoretical market efficiency

“The current approaches and products of quantitative equity investing stand on the shoulders of major theoretical and empirical contributions in financial economics…

  • The ‘capital asset pricing model’ (CAPM)…demonstrated that…the expected return of a risky asset equals the risk-free rate of interest plus a risk premium, where the risk premium is proportional to the asset’s beta (the covariance between the security return and the market return scaled by the variance of the market return)…There may be many risky events that could affect the realized returns of securities, but only beta risk is systematically priced…because other risks can be ‘diversified away.’…The market portfolio in the CAPM is the unique portfolio that attains the maximum value of the Sharpe ratio and offers investors the best possible risk–return trade-off…
  • According to the efficient market hypothesis, price changes over time are uncorrelated because current prices ‘fully’ reflect all relevant information.”
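For concreteness, the CAPM relation in the first bullet above can be sketched numerically. The snippet below is a minimal illustration with simulated return series and an assumed risk-free rate; none of the numbers come from the paper.

```python
import numpy as np

# Simulated monthly returns (illustrative only, not data from the paper)
rng = np.random.default_rng(0)
market = rng.normal(0.006, 0.04, 240)              # market excess returns
stock = 0.8 * market + rng.normal(0.0, 0.03, 240)  # stock with a "true" beta of 0.8

# Beta: covariance of stock and market returns scaled by the market's variance
beta = np.mean((stock - stock.mean()) * (market - market.mean())) / np.var(market)

# CAPM expected return: risk-free rate plus beta times the market risk premium
risk_free = 0.002                 # assumed monthly risk-free rate
market_premium = market.mean()    # average market excess return
expected_return = risk_free + beta * market_premium
print(f"beta = {beta:.2f}, expected monthly return = {expected_return:.4f}")
```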

“By the late 1970s and early 1980s…the initial foundations of the CAPM and the efficient market hypothesis were showing cracks that might be exploited by investors…

  • Information contained in price-earnings ratios did not appear to be ‘fully reflected’ in prices…
  • Portfolios of small-cap stocks outperformed portfolios of large-cap stocks on a beta risk-adjusted basis—the so-called size effect…
  • Revisions in analysts’ forecasts of earnings could be used to earn abnormal returns in the two months following the release…
  • Book-to-price ratios could help investors exploit pricing errors after controlling for several risk indexes.”

Factor investing

“The era of equity factor investing…[might be] attributed to the empirically based three-factor model [of] Fama and French [which] combined previous research on the size effect, the value effect, and the overall market factor into one cross-sectional equation…The three factors are [1] the difference in return between the cap-weighted market portfolio and the risk-free rate of interest, [2] the difference in returns between a portfolio of small-cap stocks and a portfolio of large-cap stocks; and [3] the difference in returns between a portfolio with high book values relative to market values of equity and a portfolio with low [book-price ratios].”
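The three-factor cross-sectional equation can be made concrete with a small regression sketch. The series below are simulated placeholders for the market excess return, SMB (small minus big) and HML (high minus low book-to-price) factors; the loadings are estimated by ordinary least squares.

```python
import numpy as np

# Simulated monthly factor returns (placeholders for actual MKT-RF, SMB, HML data)
rng = np.random.default_rng(1)
T = 120
mkt_rf = rng.normal(0.005, 0.04, T)   # market return minus risk-free rate
smb = rng.normal(0.002, 0.03, T)      # small-cap minus large-cap portfolio return
hml = rng.normal(0.003, 0.03, T)      # high minus low book-to-price portfolio return

# A hypothetical stock's excess return loading on all three factors
excess = 0.001 + 1.0 * mkt_rf + 0.5 * smb + 0.3 * hml + rng.normal(0, 0.02, T)

# OLS estimates of alpha and the three factor loadings
X = np.column_stack([np.ones(T), mkt_rf, smb, hml])
alpha, b_mkt, b_smb, b_hml = np.linalg.lstsq(X, excess, rcond=None)[0]
print(f"alpha={alpha:.4f}, market beta={b_mkt:.2f}, SMB={b_smb:.2f}, HML={b_hml:.2f}")
```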

“Following Fama and French (1993), the floodgates for factor investing seemed to open…

  • Over short and intermediate time horizons, stock returns exhibit momentum…documented…in more than two centuries of price data…
  • At longer, three- to five-year, investment horizons…prior stock market losers tend to become subsequent winners relative to prior winners…
  • Illiquidity premiums [are seen] as a significant influence on expected returns because even seemingly small differences in transaction costs could make a meaningful difference in asset values.
  • Substantial increases in capital investments tend to be associated with subsequent negative abnormal returns…empire building might be hazardous to shareholder wealth…
  • The most puzzling cross-sectional anomaly is risk itself…Unexpectedly large returns are seen in…low-volatility stocks, versus high-volatility ones, and…low-beta stocks, versus high-beta ones…
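To illustrate the momentum effect noted in the first bullet of the list above, the sketch below builds a stylized 12-1 month cross-sectional momentum signal on a simulated price panel. The lookback window, quantile cutoffs and ticker names are arbitrary choices for the example, not specifications from the cited research.

```python
import numpy as np
import pandas as pd

# Simulated monthly price panel: rows are month-ends, columns are tickers
rng = np.random.default_rng(2)
dates = pd.date_range("2015-01-31", periods=60, freq="M")
prices = pd.DataFrame(
    100 * np.exp(np.cumsum(rng.normal(0.005, 0.05, (60, 20)), axis=0)),
    index=dates, columns=[f"stock_{i}" for i in range(20)],
)

# Classic 12-1 momentum: return over the past 12 months, skipping the latest month
momentum = prices.shift(1) / prices.shift(12) - 1

# Cross-sectional signal: long the top quintile, short the bottom quintile
ranks = momentum.rank(axis=1, pct=True)
signal = (ranks >= 0.8).astype(float) - (ranks <= 0.2).astype(float)

# Next-month return of the equally weighted long-short momentum portfolio
fwd_returns = prices.pct_change().shift(-1)
gross = signal.abs().sum(axis=1).replace(0, np.nan)
factor_return = (signal * fwd_returns).sum(axis=1) / gross
```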

“The backlash against the proliferation of potential factors emerged…[Critics] raised the issue of data snooping in tests of asset pricing models…Concern about overfitting and potential solutions to it has only grown over time.”

Smart beta

“The term ‘smart beta’ was probably introduced into the quantitative equity management lexicon by Towers Watson, a global consulting firm, in 2013….The smart factors can be implemented in a variety of weighting schemes. But the smart factors and the strategies based on them all scream out at least one common chorus: ‘I am not cap-weighted!’”

“The capitalization-weighted market portfolio became synonymous with passive investing; active portfolios were defined as portfolios whose security weights differed from those of capitalization weights…[Academic research] suggested that asset owners may be disappointed by their aggregate active performance because they may be overpaying by 43% on average for their active risk, given that this part of risk could be obtained through low-cost factor solutions.”

“If not capitalization-weighted, then what?…

  • ‘Fundamental indexes’…use such items as gross revenue, equity book value, and total employment to calculate security weights…[in order to] deliver superior mean–variance performance relative to capitalization-weighted indexes…
  • The risk parity approach yields weights… such that each risky asset has the same risk allocation…[in order to achieve] better diversification and more portfolio efficiency…

From its humble and outcast beginning nearly 40 years ago, factor investing has now gone mainstream.”
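As a simple illustration of the non-cap-weighted schemes above, the snippet below compares capitalization weights with a naive risk-parity variant that uses inverse-volatility weights and ignores correlations; full risk parity equalizes risk contributions using the covariance matrix. All numbers are made up for the example.

```python
import numpy as np

# Hypothetical annualized volatilities and market caps for four assets
vols = np.array([0.25, 0.18, 0.12, 0.30])
caps = np.array([800.0, 400.0, 250.0, 50.0])   # e.g. USD billions

# Capitalization weights: the 'passive' benchmark allocation
cap_weights = caps / caps.sum()

# Naive risk parity: weights inversely proportional to volatility, so each
# asset contributes roughly the same stand-alone risk (correlations ignored)
inv_vol = 1.0 / vols
rp_weights = inv_vol / inv_vol.sum()

print("cap-weighted :", np.round(cap_weights, 3))
print("inverse-vol  :", np.round(rp_weights, 3))
```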

Big data

“Big data is often discussed in terms of the four Vs: volume, variety, velocity, and (more recently) veracity. With advances in technology, the different types of information that quantitative equity [managers] might find useful have exploded. In practice, most ‘big data’ analyses to date in quantitative equity have focused on unstructured data emanating from text-based sources, with various degrees of credibility and curation…

  • Momentum [appears to be] exacerbated by news coverage. That is, prior stock market winners with excessively high media coverage experience returns substantially greater than the returns of prior losers with excessive media coverage over about nine months…The return differential between winners and losers is much smaller for companies with excessively low media coverage compared with high-media coverage companies…
  • Tone changes in the management discussion and analysis…do convey information not embedded in the regular financial statements…
  • The information content of Twitter sentiment has been predictive of market returns around FOMC meetings…”

“Most published work on big data in equity management has focused on investor sentiment extracted from natural language processing algorithms applied to social media, official documents, press releases, and company conference calls. The preponderance of published evidence indicates that to the extent that big data does contain useful sentiment information, it is for the most part short-lived in terms of profitable stock trading. Although these data may be quite relevant for market makers and trade desks, they do not seem to contain hidden, easy-to-exploit gems of information. Indeed, for long-term investors, it is not yet clear that big data per se is a big deal for their investment processes.”
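The tone and sentiment measures mentioned above are typically extracted with natural language processing; a very simple word-count approach is sketched below. The word lists are toy placeholders (practical work relies on curated financial lexicons and more sophisticated models).

```python
import re

# Toy word lists (placeholders; real applications use curated finance lexicons)
POSITIVE = {"growth", "improve", "strong", "beat", "record"}
NEGATIVE = {"decline", "weak", "impairment", "loss", "litigation"}

def tone_score(text: str) -> float:
    """Net tone: (positive minus negative word count) scaled by total words."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / len(words)

# Comparing the tone of two (hypothetical) management statements
print(tone_score("Record growth and strong margins this quarter"))
print(tone_score("Weak demand led to a decline and an impairment charge"))
```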

Dynamic factor models

“Dynamic modeling is a current, promising area of quantitative research, with roots dating back to the 1980s and models of time-varying expected returns. Evidence of predictable returns can be found in very recent research as well…Insights may turn out to be centered much more on what factors are rewarded at given points in time than on what factors are rewarded on average over time. Some might label this approach ‘factor timing.’”

“[Academic research] suggested a dynamic factor-weighting framework to respond to changes in factor predictability. Incorporating classification tree analysis…[a] multifactor dynamic approach generated reward-to-risk ratios nearly four times greater than those generated by static approaches.”
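The classification-tree idea can be illustrated with a stylized sketch: fit a shallow tree that maps lagged state variables to which factor subsequently outperformed, then tilt factor weights toward the predicted winner. The state variables, factors and weights below are hypothetical, and the sketch is not the specific model of the cited research.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical monthly state variables (e.g. standardized yield-curve slope,
# credit spread, trailing volatility) and returns of two factors
rng = np.random.default_rng(3)
T = 240
states = rng.normal(0, 1, (T, 3))
value_ret = rng.normal(0.002, 0.03, T)
momentum_ret = rng.normal(0.003, 0.04, T)

# Label each month by which factor outperforms in the following month
winner_next = (momentum_ret[1:] > value_ret[1:]).astype(int)

# Fit a shallow classification tree on the lagged states, then tilt the
# factor allocation toward the factor predicted to win next month
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(states[:-1], winner_next)
predicted = tree.predict(states[-1:])[0]
weights = {"momentum": 0.7, "value": 0.3} if predicted == 1 else {"momentum": 0.3, "value": 0.7}
print(weights)
```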

“Momentum, a statistical factor, continues to be a bedrock of factor investing, yet it doesn’t always work and sometimes crashes dramatically. [Recent academic research] demonstrated that a dynamic momentum strategy can double the alpha and the Sharpe ratio of a static momentum strategy….[Other research] concluded that there are cycles in low-volatility investing and that the performance of low volatility is time varying and influenced by the economic environment.”
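One common form of dynamic momentum in this literature is volatility scaling, i.e. cutting exposure when the factor's recent volatility is high, which is when momentum crashes tend to occur. The sketch below targets a constant volatility on a simulated momentum return series; the volatility target, lookback window and leverage cap are arbitrary illustrative choices, not those of the cited studies.

```python
import numpy as np
import pandas as pd

# Simulated monthly returns of a static long-short momentum factor
rng = np.random.default_rng(4)
mom = pd.Series(rng.standard_t(df=4, size=360) * 0.03 + 0.005)

# Realized volatility over the prior six months, annualized and lagged
realized_vol = mom.rolling(6).std().shift(1) * np.sqrt(12)

# Scale exposure to target a constant annualized volatility, capping leverage
target_vol = 0.12
leverage = (target_vol / realized_vol).clip(upper=2.0)
dynamic_mom = leverage * mom
```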

 

Editor
https://research.macrosynergy.com
Ralph Sueppel is managing director for research and trading strategies at Macrosynergy. He has worked in economics and finance since the early 1990s for investment banks, the European Central Bank, and leading hedge funds.