Recovering Stars in Macroeconomics

Many key macroeconomic variables needed for policy analysis, such as the NAIRU, potential GDP, and the neutral real rate of interest, are latent. Collectively, these latent variables are known as ‘stars’ and are typically estimated with the Kalman filter or smoother from models that can be expressed in state-space form. When such a model contains more shocks than observed variables, it is ‘short’, which can create problems in recovering the star variable of interest from the observed data. Recovery issues can occur even when the model is correctly specified and its parameters are known. In this paper, we summarize the literature on shock reco..

Econometric Time Series

A Combination Forecast for Nonparametric Models with Structural Breaks

Structural breaks in time series can render the conventional OLS estimator inconsistent in forecasting. Recent research suggests that combining pre- and post-break estimators of a linear model can yield an optimal estimator under weak breaks. However, this approach is limited to linear models. In this paper, we propose a weighted local linear estimator for a nonlinear model. This estimator assigns each observation a weight based both on its distance to the predictor covariates and on its location in time. We investigate the asymptotic properties of the proposed estimator and choose the optimal tuning parameters using multifold cross-validation to account for the dependence structure in time seri..
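As a rough illustration of the weighting idea (our own simplified reading, not the paper's estimator: the Gaussian kernel, geometric time-decay, and bandwidth are arbitrary choices):

```python
# Local linear estimator with weights combining a covariate-distance kernel
# and a time-decay factor (illustrative sketch; kernels and rates are
# hypothetical choices, not taken from the paper).
import numpy as np

def weighted_local_linear(x, y, x0, h=0.3, decay=0.999):
    t = np.arange(len(x))
    k_x = np.exp(-0.5 * ((x - x0) / h) ** 2)        # distance in covariate space
    k_t = decay ** (len(x) - 1 - t)                 # downweight older observations
    w = k_x * k_t
    X = np.column_stack([np.ones_like(x), x - x0])  # local linear design
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]                                  # fitted value at x0

rng = np.random.default_rng(5)
x = rng.uniform(-1, 1, size=400)
y = np.sin(2 * x) + 0.1 * rng.normal(size=400)
est = weighted_local_linear(x, y, x0=0.5)
print(round(est, 2))
```

Setting `decay=1` recovers an ordinary kernel-weighted local linear fit; values below one shift weight toward the most recent (post-break) observations.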

Econometric Time Series

Introducing the $\sigma$-Cell: Unifying GARCH, Stochastic Fluctuations and Evolving Mechanisms in RNN-based Volatility Forecasting

This paper introduces the $\sigma$-Cell, a novel Recurrent Neural Network (RNN) architecture for financial volatility modeling. Bridging traditional econometric approaches like GARCH with deep learning, the $\sigma$-Cell incorporates stochastic layers and time-varying parameters to capture dynamic volatility patterns. Our model serves as a generative network, approximating the conditional distribution of latent variables. We employ a log-likelihood-based loss function and a specialized activation function to enhance performance. Experimental results demonstrate superior forecasting accuracy compared to traditional GARCH and Stochastic Volatility models, marking the next step in integrating do..

Econometric Time Series

The Local Projection Residual Bootstrap for AR(1) Models

This paper contributes to a growing literature on confidence interval construction for impulse response coefficients based on the local projection (LP) approach. We propose an LP-residual bootstrap method to construct confidence intervals for the impulse response coefficients of AR(1) models. The method uses the LP approach and a residual bootstrap procedure to compute critical values. We present two theoretical results. First, we prove the uniform consistency of the LP-residual bootstrap under general conditions, which implies that the proposed confidence intervals are uniformly asymptotically valid. Second, we show that the LP-residual bootstrap can provide asymptotic refinements to the co..
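The general recipe (LP point estimate, then residual-bootstrap critical values) can be sketched as follows. This is our own simplified illustration for an AR(1), not the paper's exact procedure; refinements such as lag augmentation or studentization are omitted:

```python
# Sketch of an LP confidence interval for an AR(1) impulse response using a
# residual bootstrap (simplified illustration of the general recipe).
import numpy as np

rng = np.random.default_rng(8)

def simulate_ar1(rho, T, innov):
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = rho * y[t - 1] + innov[t]
    return y

def lp_coef(y, h):
    """Local projection of y_{t+h} on y_t; estimates rho**h for an AR(1)."""
    Y, X = y[h:], y[:len(y) - h]
    return (X @ Y) / (X @ X)

T, rho, h = 400, 0.7, 4
y = simulate_ar1(rho, T, rng.normal(size=T))
beta_hat = lp_coef(y, h)

# Residual bootstrap: re-draw the AR(1) residuals, rebuild the series,
# and re-run the local projection on each pseudo-sample.
rho_hat = lp_coef(y, 1)
resid = y[1:] - rho_hat * y[:-1]
boot = []
for _ in range(500):
    e = rng.choice(resid - resid.mean(), size=T, replace=True)
    boot.append(lp_coef(simulate_ar1(rho_hat, T, e), h))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"IRF at h={h}: {beta_hat:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```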

Econometric Time Series

Consistency, distributional convergence, and optimality of score-driven filters

We study the in-fill asymptotics of score-driven time series models. For general forms of model mis-specification, we show that score-driven filters are consistent for the Kullback-Leibler (KL) optimal time-varying parameter path, which minimizes the pointwise KL divergence between the statistical model and the unknown dynamic data generating process. This directly implies that for a correctly specified predictive conditional density, score-driven filters consistently estimate the time-varying parameter path even if the model is mis-specified in other respects. We also obtain distributional convergence results for the filtering errors and derive the filter that minimizes the asymptotic filte..
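A concrete instance of the score-driven filters studied above is the Gaussian GAS(1,1) variance recursion, sketched below (parameter values are illustrative, not estimated):

```python
# Minimal score-driven (GAS) volatility filter for Gaussian data.
# The parameters omega, alpha, beta are hypothetical, not calibrated.
import numpy as np

def gas_variance_filter(y, omega=0.05, alpha=0.1, beta=0.9):
    """Score-driven update for a time-varying variance f_t.

    For a Gaussian density, the scaled score is (y_t^2 - f_t), so the
    recursion f_{t+1} = omega + beta*f_t + alpha*(y_t^2 - f_t) mirrors a
    GARCH(1,1) written in deviation form.
    """
    f = np.empty(len(y))
    f[0] = np.var(y)                      # initialize at the sample variance
    for t in range(len(y) - 1):
        f[t + 1] = omega + beta * f[t] + alpha * (y[t] ** 2 - f[t])
    return f

rng = np.random.default_rng(2)
y = rng.normal(scale=1.0, size=500)
f = gas_variance_filter(y)
print(round(f.mean(), 2))
```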

Econometric Time Series

Robust bootstrap inference for linear time-varying coefficient models: Some Monte Carlo evidence

We propose two robust bootstrap-based simultaneous inference methods for time series models featuring time-varying coefficients and conduct an extensive simulation study to assess their performance. Our exploration covers a wide range of scenarios, encompassing serially correlated, heteroscedastic, endogenous, nonlinear, and nonstationary error processes. Additionally, we consider situations where the regressors exhibit unit roots, thus delving into a nonlinear cointegration framework. We find that the proposed moving block bootstrap and sieve wild bootstrap methods show superior, robust small sample performance, in terms of empirical coverage and length, compared to the sieve bootstrap intr..

Econometric Time Series

Estimating Pipeline Pressures in New Keynesian Phillips Curves: A Bayesian VAR-GMM Approach

This paper considers a vertical production chain in an otherwise canonical sticky price model and estimates the New Keynesian Phillips Curve with vertical production stages (PS-NKPC) using commodity-flow-based U.S. price data. We employ a Bayesian VAR-GMM method and compare the PS-NKPC with the canonical NKPC based on a quasi-marginal likelihood criterion, which is robust under weakly identified parameters. Our results thus add to the empirical relevance of the so-called ``pipeline price pressures'' incurred at upstream stages of production. Our estimates suggest that (i) the PS-NKPC performs better than the canonical New Keynesian Phillips Curve in terms of quasi-marginal likeliho..

Econometric Time Series

The Bayesian Context Trees State Space Model for time series modelling and forecasting

A hierarchical Bayesian framework is introduced for developing rich mixture models for real-valued time series, along with a collection of effective tools for learning and inference. At the top level, meaningful discrete states are identified as appropriately quantised values of some of the most recent samples. This collection of observable states is described as a discrete context-tree model. Then, at the bottom level, a different, arbitrary model for real-valued time series - a base model - is associated with each state. This defines a very general framework that can be used in conjunction with any existing model class to build flexible and interpretable mixture models. We call this the Ba..

Econometric Time Series

Financial Condition Indices in an Incomplete Data Environment

We construct a Financial Conditions Index (FCI) for the United States using a dataset that features many missing observations. The novel combination of probabilistic principal component techniques and a Bayesian factor-augmented VAR model resolves the challenges posed by data points being unavailable within a high-frequency dataset. Even with up to 62% of the data missing, the new approach yields a less noisy FCI that tracks the movement of 22 underlying financial variables more accurately both in-sample and out-of-sample.

Econometric Time Series

Quantile Time Series Regression Models Revisited

This article discusses recent developments in the literature on quantile time series models for both stationary and nonstationary underlying stochastic processes.

Econometric Time Series

High Dimensional Time Series Regression Models: Applications to Statistical Learning Methods

These lecture notes provide an overview of existing methodologies and recent developments for estimation and inference with high-dimensional time series regression models. First, we present the main limit theory results for high-dimensional dependent data, relevant both to covariance matrix structures and to dependent time series sequences. Second, we present the main aspects of the asymptotic theory for time series regression models with many covariates. Third, we discuss various applications of statistical learning methodologies to time series analysis.

Econometric Time Series

Linear Regression with Weak Exogeneity

This paper studies linear time series regressions with many regressors. Weak exogeneity is the most commonly used identifying assumption in time series: it requires the structural error to have zero conditional expectation given present and past regressor values, allowing errors to correlate with future regressor realizations. We show that weak exogeneity in time series regressions with many controls may produce substantial biases and can even render the least squares (OLS) estimator inconsistent. The bias arises in settings with many regressors because the normalized OLS design matrix remains asymptotically random and correlates with the regression error when only weak (but not strict) ..

Econometric Time Series

GARHCX-NoVaS: A Model-free Approach to Incorporate Exogenous Variables

In this work, we further explore the forecasting ability of a recently proposed normalizing and variance-stabilizing (NoVaS) transformation after incorporating exogenous variables. In practice, especially in financial econometrics, extra information such as fundamentals- and sentiment-based measures can improve the prediction accuracy of market volatility if incorporated into the forecasting process. Classically, practitioners apply GARCHX-type methods to include such exogenous variables. Being a model-free prediction method, NoVaS has been shown to be more accurate and stable than classical GARCH-type methods. We are interested in whether the..

Econometric Time Series

Target PCA: Transfer Learning Large Dimensional Panel Data

This paper develops a novel method to estimate a latent factor model for a large target panel with missing observations by optimally using the information from auxiliary panel data sets. We refer to our estimator as target-PCA. Transfer learning from auxiliary panel data allows us to deal with a large fraction of missing observations and weak signals in the target panel. We show that our estimator is more efficient and can consistently estimate weak factors, which are not identifiable with conventional methods. We provide the asymptotic inferential theory for target-PCA under very general assumptions on the approximate factor model and missing patterns. In an empirical study of imputing data..

Econometric Time Series

Vector Autoregression in Cryptocurrency Markets: Unraveling Complex Causal Networks

Methodologies for inferring financial networks from the price series of speculative assets vary; however, they generally involve bivariate or multivariate predictive modelling to reveal causal and correlational structures within the time series data. The required model complexity is intimately related to the underlying market efficiency: one expects a highly developed and efficient market to display very few simple relationships in price data. This has spurred research into the application of complex nonlinear models for developed markets. However, it remains unclear whether simple models can provide meaningful and insightful descriptions of the dependency and interconnectedness of the rapidly dev..

Econometric Time Series

SGMM: Stochastic Approximation to Generalized Method of Moments

We introduce a new class of algorithms, Stochastic Generalized Method of Moments (SGMM), for estimation and inference on (overidentified) moment restriction models. Our SGMM is a novel stochastic approximation alternative to the popular Hansen (1982) (offline) GMM, and offers fast and scalable implementation with the ability to handle streaming datasets in real time. We establish the almost sure convergence, and the (functional) central limit theorem for the inefficient online 2SLS and the efficient SGMM. Moreover, we propose online versions of the Durbin-Wu-Hausman and Sargan-Hansen tests that can be seamlessly integrated within the SGMM framework. Extensive Monte Carlo simulations show tha..
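The streaming flavor of such stochastic-approximation estimation can be sketched for a just-identified IV moment condition (a Robbins-Monro style update of our own devising, not the paper's SGMM algorithm):

```python
# Stochastic-approximation update for the just-identified IV moment
# E[z*(y - x*theta)] = 0, processing one observation at a time.
# Illustrative sketch only; the DGP and step-size rule are arbitrary.
import numpy as np

rng = np.random.default_rng(4)
n = 50_000
u = rng.normal(size=n)
z = rng.normal(size=n)                        # instrument, independent of u
x = 0.8 * z + 0.5 * u + rng.normal(size=n)    # endogenous regressor
y = 2.0 * x + u                               # true theta = 2.0

theta = 0.0
for t in range(n):
    gamma = 1.0 / (t + 10)                    # decaying step size
    theta += gamma * z[t] * (y[t] - x[t] * theta)
print(round(theta, 2))
```

Each observation is touched once and then discarded, which is what makes this kind of update attractive for streaming data.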

Econometric Time Series

Exact Likelihood for Inverse Gamma Stochastic Volatility Models

We obtain a novel analytic expression for the likelihood of a stationary inverse gamma Stochastic Volatility (SV) model. This allows us to obtain the Maximum Likelihood Estimator for this nonlinear, non-Gaussian state-space model. Further, we obtain both the filtering and smoothing distributions of the inverse volatilities as mixtures of gammas, and can therefore provide smoothed estimates of the volatility. We show that integrating out the volatilities yields a model that resembles a GARCH in the sense that the formulas are similar, which simplifies computations significantly. The model allows for fat tails in the observed data. We provide empirical applications u..

Econometric Time Series

Principal Component Analysis and Hidden Markov Model for Forecasting Stock Returns

This paper presents a method for predicting stock returns using principal component analysis (PCA) and the hidden Markov model (HMM), and tests the results of trading stocks based on this approach. PCA is applied to the covariance matrix of stock returns for companies listed in the S&P 500 index, and, interpreting the principal components as factor returns, we fit an HMM to them. We then use the transition probability matrix and the state-conditional means to forecast the factor returns. Mapping these factor-return forecasts back to stock returns via the eigenvectors, we obtain forecasts of the stock returns. We find that, with the right hyperparameters, our model yields ..
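The forecasting step described above can be sketched in a few lines (all numbers below are toy inputs, not estimates; the fitted HMM and PCA loadings are assumed given):

```python
# Sketch of the forecast step: HMM one-step-ahead factor means, mapped back
# to stock returns via PCA eigenvectors. All inputs are made-up toy values.
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])             # HMM transition probabilities
mu = np.array([[0.5, -0.1],            # state-conditional means of the
               [-0.3, 0.2]])           # factor returns (state x factor)
pi_t = np.array([0.7, 0.3])            # filtered state probabilities at t

# One-step-ahead factor-return forecast: E[f_{t+1}] = (pi_t P) mu
factor_fcst = (pi_t @ P) @ mu

# Map factor forecasts back to (here, 3) stock returns via the eigenvectors
V = np.array([[0.6, 0.2],
              [0.5, -0.4],
              [0.3, 0.7]])             # loadings (stocks x factors)
stock_fcst = V @ factor_fcst
print(stock_fcst.shape)
```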

Econometric Time Series

The Yule-Frisch-Waugh-Lovell Theorem

This paper traces the historical and analytical development of what is known in the econometrics literature as the Frisch-Waugh-Lovell theorem. This theorem demonstrates that the coefficients on any subset of covariates in a multiple regression are equal to the coefficients in a regression of the residualized outcome variable on the residualized subset of covariates, where residualization uses the complement of that subset of covariates. In this paper, I suggest that the theorem be renamed the Yule-Frisch-Waugh-Lovell (YFWL) theorem to recognize the pioneering contribution of the statistician G. Udny Yule to its development. Second, I highlight recent work by the statisti..
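The theorem is easy to verify numerically; the following sketch (with simulated data and hypothetical variable names) checks that the full-regression coefficients match the residualized regression:

```python
# Numerical check of the Yule-Frisch-Waugh-Lovell theorem on simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 200
X1 = rng.normal(size=(n, 2))            # covariates of interest
X2 = rng.normal(size=(n, 3))            # controls to be partialled out
y = X1 @ [1.5, -0.7] + X2 @ [0.3, 0.0, 2.0] + rng.normal(size=n)

X = np.column_stack([X1, X2])
beta_full = np.linalg.lstsq(X, y, rcond=None)[0][:2]   # full regression

# Residualize y and X1 on X2, then regress residuals on residuals.
P2 = X2 @ np.linalg.solve(X2.T @ X2, X2.T)             # projection onto X2
M2 = np.eye(n) - P2                                    # annihilator of X2
beta_fwl = np.linalg.lstsq(M2 @ X1, M2 @ y, rcond=None)[0]

assert np.allclose(beta_full, beta_fwl)                # coefficients agree
```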

Econometric Time Series

Stationarity with Occasionally Binding Constraints

This paper studies a class of multivariate threshold autoregressive models, known as censored and kinked structural vector autoregressions (CKSVAR), which are notably able to accommodate series that are subject to occasionally binding constraints. We develop a set of sufficient conditions for the processes generated by a CKSVAR to be stationary, ergodic, and weakly dependent. Our conditions relate directly to the stability of the deterministic part of the model, and are therefore less conservative than those typically available for general vector threshold autoregressive (VTAR) models. Though our criteria refer to quantities, such as refinements of the joint spectral radius, that cannot feas..

Econometric Time Series

Noise reduction for functional time series

A novel method for noise reduction in the setting of curve time series with error contamination is proposed, based on extending the framework of functional principal component analysis (FPCA). We employ the underlying, finite-dimensional dynamics of the functional time series to separate the serially dependent dynamical part of the observed curves from the noise. Upon identifying the subspaces of the signal and idiosyncratic components, we construct a projection of the observed curve time series along the noise subspace, resulting in an estimate of the underlying denoised curves. This projection is optimal in the sense that it minimizes the mean integrated squared error. By applying our meth..
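A stripped-down version of the projection idea, with plain PCA standing in for the serial-dependence-based separation of the paper (so this is only a caricature of the method, under our own toy setup):

```python
# Toy sketch: project discretized noisy curves onto a leading principal
# subspace ("signal") and discard the rest as noise. Plain PCA is used here
# in place of the paper's dynamics-based subspace identification.
import numpy as np

rng = np.random.default_rng(7)
T, p = 200, 50                        # 200 curves on a 50-point grid
grid = np.linspace(0, 1, p)
scores = rng.normal(size=(T, 2))      # low-dimensional dynamics
basis = np.vstack([np.sin(np.pi * grid), np.cos(np.pi * grid)])
curves = scores @ basis + 0.3 * rng.normal(size=(T, p))   # noisy curves

Xc = curves - curves.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2                                 # retained signal dimension
denoised = curves.mean(axis=0) + (Xc @ Vt[:k].T) @ Vt[:k]

mse_raw = np.mean((curves - scores @ basis) ** 2)
mse_den = np.mean((denoised - scores @ basis) ** 2)
assert mse_den < mse_raw              # the projection removes most noise
```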

Econometric Time Series

Gaussian semiparametric estimation of two-dimensional intrinsically stationary random fields

We consider Gaussian semiparametric estimation (GSE) for two-dimensional intrinsically stationary random fields (ISRFs) observed on a regular grid and derive its asymptotic properties. GSE was originally proposed for estimating long memory time series models semiparametrically, in either stationary or nonstationary cases. We extend GSE from time series to anisotropic ISRFs observed on a two-dimensional lattice, a class that includes as special cases isotropic fractional Brownian fields (FBF), which have been employed to describe many physical spatial behaviours. The GSE extended to ISRFs is consistent and has a limiting normal distribution with variance independent of any unknown parameter..

Econometric Time Series

"Generalized Extreme Value Approximation to the CUMSUMQ Test for Constant Unconditional Variance in Heavy-Tailed Time Series".

This paper focuses on testing the stability of the unconditional variance when the stochastic processes may have heavy-tailed distributions. Finite sample distributions that depend both on the effective sample size and the tail index are approximated using Extreme Value distributions and summarized using response surfaces. A modification of the Iterative Cumulative Sum of Squares (ICSS) algorithm to detect the presence of multiple structural breaks is suggested, adapting the algorithm to the tail index of the underlying distribution of the process. We apply the algorithm to eighty absolute log-exchange rate returns, finding evidence of (i) infinite variance in about a third of the cases, (ii..
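The centered cumulative-sum-of-squares statistic underlying ICSS-type algorithms can be sketched as follows (the detection rule here is a bare-bones illustration, without the paper's tail-index adjustment or iteration over segments):

```python
# Centered cumulative sum of squares: max_k |C_k/C_T - k/T|, whose argmax
# is a candidate variance-break date. Simplified illustration only.
import numpy as np

def cusumq_stat(x):
    """Return the max of |C_k/C_T - k/T| and its argmax (candidate break)."""
    c = np.cumsum(x ** 2)
    T = len(x)
    d = c / c[-1] - np.arange(1, T + 1) / T
    k = int(np.argmax(np.abs(d)))
    return np.abs(d[k]), k

rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(scale=1, size=300),
                    rng.normal(scale=3, size=300)])   # variance break at 300
stat, k_hat = cusumq_stat(x)
print(k_hat)   # should fall near the true break at t = 300
```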

Econometric Time Series

Random Subspace Local Projections

We show how random subspace methods can be adapted to estimate local projections with many controls. Random subspace methods have their roots in the machine learning literature and are implemented by averaging over regressions estimated on different subsets of these controls. We document three key results: (i) Our approach can successfully recover the impulse response function in a Monte Carlo exercise where we simulate data from a real business cycle model with fiscal foresight. (ii) Our results suggest that random subspace methods are more accurate than factor models if the underlying large data set has a factor structure similar to typical macroeconomic data sets such ..
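The averaging step can be sketched in a few lines (an illustrative toy setup: the DGP, number of draws, and subset size are our own arbitrary choices):

```python
# Random-subspace local projection at horizon 0: average the OLS shock
# coefficient over regressions on random subsets of the controls.
# Illustrative sketch with a simulated DGP.
import numpy as np

rng = np.random.default_rng(1)
T, K = 300, 20
shock = rng.normal(size=T)                 # identified shock series
controls = rng.normal(size=(T, K))         # many control variables
y = 0.8 * shock + 0.5 * controls[:, 0] + rng.normal(size=T)

n_draws, subset_size = 200, 5
betas = []
for _ in range(n_draws):
    idx = rng.choice(K, size=subset_size, replace=False)
    X = np.column_stack([shock, controls[:, idx]])
    betas.append(np.linalg.lstsq(X, y, rcond=None)[0][0])

irf_h0 = np.mean(betas)                    # averaged impulse response at h=0
print(round(irf_h0, 2))
```

For longer horizons the same averaging is applied with `y` led by `h` periods, as in a standard local projection.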

Econometric Time Series

Robust Impulse Responses using External Instruments: the Role of Information

External-instrument identification leads to biased responses when the shock is not invertible and measurement error is present. We propose to use this identification strategy in a structural Dynamic Factor Model, which we call the Proxy DFM. In a simulation analysis, we show that the Proxy DFM always successfully retrieves the true impulse responses, while the Proxy SVAR systematically fails to do so when the model is misspecified, does not include all relevant information, or is contaminated by measurement error. In an application to US monetary policy, the Proxy DFM shows that a tightening shock is unequivocally contractionary, with deteriorations in domestic demand, labor, credit, hous..

Econometric Time Series

Information-Theoretic Time-Varying Density Modeling

We present a comprehensive framework for constructing dynamic density models by combining optimization with concepts from information theory. Specifically, we propose to recursively update a time-varying conditional density by maximizing the log-likelihood contribution of the latest observation subject to a Kullback-Leibler divergence (KLD) regularization centered at the one-step ahead predicted density. The resulting Relative Entropy Adaptive Density (READY) update has attractive optimality properties, is reparametrization invariant and can be viewed as an intuitive regularized estimator of the pseudo-true density. Popular existing models, such as the ARMA(1, 1) and GARCH(1, 1), can be retr..
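As a worked special case (our own illustration, assuming a Gaussian location model with known variance $\sigma^2$, predicted mean $\mu_{t|t-1}$, and KLD weight $\lambda$), the regularized update described above has a closed form:

```latex
\mu_t \;=\; \arg\max_{\mu}\;\Big\{ -\tfrac{(y_t-\mu)^2}{2\sigma^2}
\;-\;\lambda\,\mathrm{KL}\big(\mathcal{N}(\mu,\sigma^2)\,\big\|\,\mathcal{N}(\mu_{t|t-1},\sigma^2)\big)\Big\},
\qquad
\mathrm{KL} \;=\; \tfrac{(\mu-\mu_{t|t-1})^2}{2\sigma^2}.
```

The first-order condition $(y_t-\mu)/\sigma^2 - \lambda(\mu-\mu_{t|t-1})/\sigma^2 = 0$ gives

```latex
\mu_t \;=\; \frac{y_t + \lambda\,\mu_{t|t-1}}{1+\lambda}
\;=\; \mu_{t|t-1} + \frac{1}{1+\lambda}\,\big(y_t - \mu_{t|t-1}\big),
```

an exponentially weighted recursion, consistent with the claim that familiar recursive models emerge as special cases of the framework.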

Econometric Time Series

Estimating and Testing for Functional Coefficient Quantile Cointegrating Regression

This paper proposes a generalized quantile cointegrating regressive model for nonstationary time series, allowing coefficients to be unknown functions of informative covariates at each quantile level. Using a local polynomial quantile regressive method, we obtain the estimator for the functional coefficients at each quantile level, which is shown to be nonparametrically super-consistent. To alleviate the endogeneity of the model, this paper proposes a fully modified local polynomial quantile cointegrating regressive estimator which is shown to follow a mixed normal distribution asymptotically. We then propose two types of test statistics related to functional coefficient quantile cointegrati..

Econometric Time Series

Flexible Bayesian MIDAS: time-variation, group-shrinkage and sparsity

We propose a mixed-frequency regression prediction approach that models a time-varying trend, stochastic volatility and fat tails in the variable of interest. The coefficients of high-frequency indicators are regularised via a shrinkage prior that accounts for the grouping structure and within-group correlation among lags. A new sparsification algorithm on the posterior, motivated by Bayesian decision theory, derives inclusion probabilities over lag groups, thus making the results easy to communicate without imposing sparsity a priori. An empirical application on nowcasting UK GDP growth suggests that group-shrinkage in combination with the time-varying components substantially inc..

Econometric Time Series

Uniform Inference for Cointegrated Vector Autoregressive Processes

Uniformly valid inference for cointegrated vector autoregressive processes has so far proven difficult due to certain discontinuities arising in the asymptotic distribution of the least squares estimator. We show how asymptotic results from the univariate case can be extended to multiple dimensions and how inference can be based on these results. Furthermore, we show that the novel instrumental variable procedure proposed by [20] (IVX) yields uniformly valid confidence regions for the entire autoregressive matrix. The results are applied to two specific examples for which we verify the theoretical findings and investigate finite sample properties in simulation experiments.

Econometric Time Series

Improving the accuracy of bubble date estimators under time-varying volatility

In this study, we consider a four-regime bubble model under time-varying volatility and propose an algorithm for estimating the break dates with a volatility correction: first, we estimate the emergence date of the explosive bubble, its collapse date, and the date of recovery to the normal market under the assumption of homoskedasticity; second, we collect the residuals and employ WLS-based estimation of the bubble dates. We demonstrate by Monte Carlo simulations that the accuracy of the break-date estimators improves significantly with this two-step procedure in some cases, compared to estimators based on OLS.
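The generic two-step logic (OLS fit, residual-based variance estimate, then reweighting) can be sketched as follows; this is a simplified illustration with a single volatility shift, not the paper's bubble-date estimator:

```python
# Two-step OLS-then-WLS under a volatility shift: fit by OLS, estimate a
# time-varying error variance from squared residuals, then reweight.
# Illustrative sketch; the rolling-window variance estimate is crude.
import numpy as np

rng = np.random.default_rng(6)
T = 400
x = rng.normal(size=T)
sigma = np.where(np.arange(T) < 200, 0.5, 2.0)   # volatility shift at t=200
y = 1.0 + 0.5 * x + sigma * rng.normal(size=T)

X = np.column_stack([np.ones(T), x])
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]     # step 1: OLS
resid = y - X @ b_ols

# step 2: rolling estimate of the error variance, then WLS
win = 50
var_hat = np.convolve(resid ** 2, np.ones(win) / win, mode="same")
w = 1.0 / var_hat
b_wls = np.linalg.lstsq(X * np.sqrt(w)[:, None], y * np.sqrt(w),
                        rcond=None)[0]
print(np.round(b_wls, 2))
```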

Econometric Time Series