For all major economies, statistics offices publish wide arrays of economic data series, often with changing definitions, elaborate adjustments, multiple revisions and occasional large distortions. Monitoring economic data consistently is tedious and expensive. Most professional investors find it easier to trade on data surprises than on actual macro trends. It is not uncommon for investment managers to consider an economic report only with respect to its presumed effect on other investors’ expectations and positions, and to forget its contents within hours of its release.
What makes monitoring economies difficult is that there is usually no single series that represents a broad macroeconomic trend on its own in a timely and consistent fashion. For example, even something apparently simple, such as an inflation trend, requires watching many different data series: consumer price growth, “core” inflation measures, price surveys, wage increases, labour market conditions, household spending, exchange rates and inflation derivatives in financial markets. Market conventions have developed on which data releases to follow and which to neglect. However, even following a reduced subset can be challenging, as indicators can carry overlapping information, often send contradictory messages and may be published with varying and frustratingly long time lags.
Therefore, good macroeconomic trend indicators condense information. There are two complementary methods to accomplish this.
- First, logical connections and sound judgment can provide theoretical structure that allows combining different data series in plausible ways, so as to generate a combined, more intuitive conceptual indicator.
- Second, statistical methods estimate cross-variable and intertemporal relations, thereby helping to shed uninformative data (“dimension reduction”) and to update important trends quickly (“nowcasting”).
Theoretical structure establishes a plausible relation between the observed data and the conceived macroeconomic trend. This is the opposite of data mining: it requires that we set out a formula based on our understanding of the data and the economy before we explore the actual data.
- As a simple example, we can combine production data from different sectors by giving each sector the weight it carries in the country’s gross domestic product.
- As a more advanced example, we can check whether rising consumer price inflation is associated with stronger or weaker demand. This helps distinguish between supply and demand shocks, making it easier to judge whether price pressures will last (view post here).
- In principle, modern macroeconomic theory can also help. True, dynamic stochastic general equilibrium models are often too complex and ambiguous for practical insights. However, simplified static models of the New Keynesian type incorporate important features of dynamic models, while still allowing us to analyze the effect of macro shocks on interest rates, exchange rates and asset prices in simple diagrams (view post here for interest rates and here for exchange rates).
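The first, simple example above can be sketched in a few lines of Python. The sector names, growth rates and GDP weights below are hypothetical, purely for illustration:

```python
# Combine sectoral production growth into an aggregate activity indicator
# by weighting each sector with its (hypothetical) share in GDP.

# Hypothetical year-on-year production growth by sector, in %
sector_growth = {"manufacturing": 2.5, "construction": -1.0, "services": 3.2}

# Hypothetical GDP shares of the sectors (must sum to 1)
gdp_weights = {"manufacturing": 0.25, "construction": 0.05, "services": 0.70}

def weighted_activity(growth, weights):
    """GDP-weighted average of sectoral growth rates, in %."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(growth[s] * weights[s] for s in growth)

print(round(weighted_activity(sector_growth, gdp_weights), 3))  # -> 2.815
```

The same logic extends to any plausible aggregation scheme, such as trade weights for external demand or population weights for regional data.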
Statistical methods become useful where our prior knowledge of the data structure ends. They necessarily rely on the available data sample. With respect to economic trends, they can accomplish two major goals: dimension reduction and nowcasting.
- Dimension reduction condenses the information content of a multitude of data series into a small, manageable set of factors or functions. This reduction is important for forecasting with macro variables because many data series have only limited and highly correlated information content (view post here).
- Nowcasting tracks a meaningful macroeconomic trend in a timely and consistent fashion. An important challenge for macro trend indicators is timeliness. Unlike financial market data, economic series typically have monthly or quarterly frequency, giving only 4-12 observations per year. For example, GDP growth, the broadest measure of economic activity, is typically published only quarterly and with a one- to three-month delay. Hence, it is necessary to integrate lower- and higher-frequency indicators and to make use of data releases with different time lags.
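A minimal sketch of dimension reduction is principal component analysis: the code below, using simulated data with an assumed one-factor structure, extracts the first principal component from a panel of correlated indicators and checks how much variance it captures. All numbers are synthetic, not actual economic data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 120 monthly observations of 10 indicators driven by one common factor
T, N = 120, 10
factor = rng.standard_normal(T)              # unobserved common trend
loadings = rng.uniform(0.5, 1.5, N)          # each series' exposure to the trend
noise = 0.5 * rng.standard_normal((T, N))    # idiosyncratic variation
panel = np.outer(factor, loadings) + noise   # observed data panel

# First principal component via SVD of the demeaned panel
X = panel - panel.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
pc1 = U[:, 0] * S[0]                         # scores on the first component

# Share of total variance explained by the first component
explained = S[0] ** 2 / (S ** 2).sum()
print(f"variance explained by PC1: {explained:.1%}")

# PC1 should track the simulated common factor closely (up to sign)
corr = abs(np.corrcoef(pc1, factor)[0, 1])
print(f"|correlation| with true factor: {corr:.2f}")
```

With ten series carrying largely the same signal, one component preserves most of the usable information, which is exactly the point of the reduction.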
In recent years, dynamic factor models have become a popular method for both dimension reduction and nowcasting. Dynamic factor models extract the common underlying factor behind timely economic reports and translate the information of many data series into a single underlying trend (view post here and here). This single underlying trend is then interpreted conceptually, for example as “broad economic growth” or “inflation expectations”. Also, the financial conditions of an economy can be estimated by using dynamic factor models that distill a broad array of financial variables (view post here). The estimation process may look daunting, but its basics are intuitive and the calculations are executable in the statistical programming language R.
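Those basics can be illustrated with a deliberately simplified sketch (in Python here, though the logic carries over to R): a panel has a “ragged edge” because some series are not yet published for the latest month, and the missing cells are filled iteratively with the fit of a single principal-component factor, in the spirit of expectation-maximization. The data are simulated and the one-factor structure is an assumption of the sketch, not a claim about any particular model in the posts:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a one-factor panel of 8 monthly series over 100 months
T, N = 100, 8
factor = np.cumsum(rng.standard_normal(T)) * 0.1   # persistent common trend
loadings = rng.uniform(0.8, 1.2, N)
panel = np.outer(factor, loadings) + 0.3 * rng.standard_normal((T, N))

# Ragged edge: the latest observation of half the series is not yet published
data = panel.copy()
data[-1, : N // 2] = np.nan

# EM-style imputation with one principal component
X = data.copy()
mask = np.isnan(X)
col_means = np.nanmean(X, axis=0)
X[mask] = np.take(col_means, np.where(mask)[1])    # initial fill: column means

for _ in range(50):
    Xd = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xd, full_matrices=False)
    fitted = np.outer(U[:, 0] * S[0], Vt[0]) + X.mean(axis=0)  # rank-1 fit
    X_new = np.where(mask, fitted, data)           # replace only missing cells
    if np.max(np.abs(X_new - X)) < 1e-8:           # stop once fills stabilize
        break
    X = X_new

# Nowcast of the unpublished cells vs. their "true" simulated values
err = np.abs(X[-1, : N // 2] - panel[-1, : N // 2]).mean()
print(f"mean absolute nowcast error: {err:.3f}")
```

A full dynamic factor model adds an explicit law of motion for the factor and a Kalman filter, but the core idea is the same: series that have already been released pin down the common trend, which in turn fills in the series that have not.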
It is important to measure local macroeconomic trends with a global perspective. Just looking at domestic indicators is almost never appropriate in an integrated global economy. As a simple example, inflation trends have increasingly become a global phenomenon, as a consequence of globalization and convergent monetary policy regimes. Over the past three decades, local inflation has typically drifted back towards global trends in the wake of deviations (view post here). As an example of the global effects of small-country shocks, “capital flow deflection” is a useful conceptual factor for emerging markets: it stipulates that one country’s capital inflow restrictions are likely to increase the inflows into other similar countries (view post here). Measuring this effect requires building a time series of capital controls in all major economies in order to distill the specific impact on a single currency.
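The drift of local inflation back towards global trends can be made concrete with a toy error-correction sketch. The simulation below assumes, purely for illustration, that the gap between local and global inflation shrinks by a fixed fraction each month; the convergence speed is then recovered by regressing the change in the gap on its lagged level. All series and parameters are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate local inflation that drifts back towards a slow-moving global trend:
# gap_{t} = (1 - speed) * gap_{t-1} + shock   (assumed error-correction process)
T, speed = 240, 0.2
global_infl = 2.0 + np.cumsum(0.05 * rng.standard_normal(T))  # global trend, %
gap = np.zeros(T)
for t in range(1, T):
    gap[t] = (1 - speed) * gap[t - 1] + 0.2 * rng.standard_normal()
local_infl = global_infl + gap

# Recover the convergence speed: regress the gap change on the lagged gap;
# a negative slope means local inflation reverts towards the global trend
g = local_infl - global_infl
beta = np.polyfit(g[:-1], np.diff(g), 1)[0]
print(f"estimated monthly convergence speed: {-beta:.3f}")
```

The same regression applied to actual local-versus-global inflation gaps is one simple way to quantify how strongly a country's inflation is anchored to the global trend.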