ICEEE 2025 PALERMO: ELEVENTH ITALIAN CONGRESS OF ECONOMETRICS AND EMPIRICAL ECONOMICS
PROGRAM FOR THURSDAY, MAY 29TH

14:00-15:40 Session 1A: Empirical Macroeconomics I
14:00
Large datasets for the Euro Area and its member countries and the dynamic effects of the common monetary policy
PRESENTER: Lorenzo Tonni

ABSTRACT. We present and describe a new publicly available large dataset which encompasses quarterly and monthly macroeconomic time series for both the Euro Area (EA) as a whole and its ten primary member countries. The dataset, called EA-MD-QD, includes more than 800 time series and spans the period from January 2000 to the latest available month. Since January 2024, EA-MD-QD has been updated monthly and constantly revised, making it an essential resource for conducting policy analysis related to economic outcomes in the EA. To illustrate the usefulness of EA-MD-QD, we study the country-specific impulse responses to the EA-wide monetary policy shock by means of the Common Component VAR combined with either Instrumental Variables or Sign Restrictions identification schemes. The results reveal asymmetries in the transmission of the monetary policy shock across countries, particularly between core and peripheral countries. Additionally, we find that comovements across Euro Area countries' business cycles are driven mostly by real rather than nominal variables.

14:25
Piecing the puzzle: real exchange rates and long-run fundamentals

ABSTRACT. This paper examines the structural determinants of real exchange rates, emphasizing the persistent low-frequency movements that traditional models, such as Purchasing Power Parity (PPP) and Uncovered Interest Parity (UIP), often fail to capture. To address this, we propose a structural VAR model with common trends, enabling a clear distinction between transitory and long-term effects of structural shocks. Estimated using Bayesian techniques and applied to Canada and Norway — two resource-rich economies — the model reveals that productivity shifts and commodity market trends significantly influence domestic activity and the real exchange rate in both countries. Importantly, the model also avoids the delayed overshooting puzzle commonly associated with recursive VARs in response to monetary policy shocks. Instead, it generates exchange rate dynamics consistent with the UIP hypothesis, characterized by immediate overshooting followed by a gradual depreciation to equilibrium.

14:50
Online Monitoring of Policy Optimality

ABSTRACT. We present a method for online evaluation of the optimality of the current stance of monetary policy given the most up-to-date data available. The framework combines estimates of the causal effects of monetary policy tools on inflation and the unemployment gap with forecasts for these target variables. The forecasts are generated with a nowcasting model, incorporating new data as it becomes available, while using entropy tilting to anchor the long end of the forecast at long-run survey expectations. In a retrospective analysis of the Fed's monetary policy decisions in the lead-up to the Great Recession, we find that we can reject the optimality of the policy stance as early as the beginning of February 2008. This early detection stems from the timely nowcasting of the deteriorating unemployment outlook.
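
The entropy-tilting step lends itself to a compact illustration. Below is a minimal univariate sketch, assuming the nowcasting model delivers equal-probability draws of long-horizon inflation and that a survey expectation supplies the target mean; the paper's multivariate implementation will differ in detail.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def entropy_tilt(draws, target_mean):
    """Reweight equal-probability draws so the weighted mean equals target_mean,
    staying as close as possible (in Kullback-Leibler divergence) to equal weights."""
    x = np.asarray(draws, dtype=float)
    # Dual of the KL projection: a one-dimensional convex problem in lambda.
    dual = lambda lam: np.log(np.mean(np.exp(lam * (x - target_mean))))
    lam = minimize_scalar(dual, bounds=(-10.0, 10.0), method="bounded").x
    w = np.exp(lam * (x - target_mean))
    return w / w.sum()

# Tilt hypothetical long-horizon inflation draws toward a 2% survey expectation.
draws = np.random.default_rng(0).normal(2.8, 1.0, size=5000)
w = entropy_tilt(draws, target_mean=2.0)
print(w @ draws)  # approximately 2.0
```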

15:15
Cast out the pure? Inflation and relative prices on both sides of the Atlantic
PRESENTER: Chiara Osbat

ABSTRACT. What drives inflation – domestic monetary policy or relative price shocks? After decades of low inflation in advanced economies, large relative price shocks – notably those related to energy prices – seem to have accounted for the bulk of inflation movements. We illustrate how even aggregate shocks can generate persistent relative price changes in the presence of heterogeneity in price flexibility. We then estimate the role of “pure” inflation versus relative prices, using a Bayesian dynamic factor model on disaggregated, comparable price data for the euro area and the United States. We document the different responses of pure inflation and relative prices to various aggregate and sectoral shocks in both monetary areas. We find that relative prices substantially explain the movements of inflation over the past 20 years, with an even more sizeable role since 2021. Our analysis finds little support for pure inflation being a material cause of recent inflation dynamics and arguments to that effect should, we contend, be cast out.

14:00-15:40 Session 1B: Cointegration
14:00
Inference on the cointegration and the attractor spaces via functional approximation
PRESENTER: Massimo Franchi

ABSTRACT. This paper discusses semiparametric inference on hypotheses on the cointegration and the attractor spaces for $I(1)$ linear processes, using canonical correlation analysis and functional approximation of Brownian Motions. It proposes inference criteria based on the estimation of the number of common trends in various subsets of variables, and compares them to sequences of tests of hypotheses. The exact limit distribution for one of the test statistics is derived in the univariate case. An application on exchange rates illustrates the proposed procedures.

14:25
Common Trends and Long-Run Identification in Nonlinear Structural VARs
PRESENTER: James Duffy

ABSTRACT. While it is widely recognised that linear (structural) VARs may fail to capture important aspects of economic time series, the use of nonlinear SVARs has to date been almost entirely confined to the modelling of stationary time series, because of a lack of understanding as to how common stochastic trends may be accommodated within nonlinear models. This has unfortunately circumscribed the range of series to which such models can be applied -- and/or required that these series be first transformed to stationarity, a potential source of misspecification -- and prevented the use of long-run identifying restrictions in these models. To address these problems, we develop a flexible class of additively time-separable nonlinear SVARs, which subsume models with threshold-type endogenous regime switching, both of the piecewise linear and smooth transition varieties. We extend the Granger-Johansen representation theorem to this class of models, obtaining conditions that specialise exactly to the usual ones when the model is linear. We further show that, as a corollary, these models are capable of supporting the same kinds of long-run identifying restrictions as are available in linearly cointegrated SVARs.

14:50
Robust Multivariate Observation-Driven Filtering for a Common Stochastic Trend: Theory and Application

ABSTRACT. We introduce a nonlinear semi-parametric model that allows for the robust filtering of a common stochastic trend in a multivariate system of cointegrated time series. The observation-driven stochastic trend can be specified using flexible updating mechanisms. The model provides a general approach to obtain an outlier-robust trend-cycle decomposition in a cointegrated multivariate process. A simple two-stage procedure for the estimation of the parameters of the model is proposed. In the first stage, the loadings of the common trend are estimated via ordinary least squares. In the second stage, the other parameters are estimated via Gaussian quasi-maximum likelihood. We formally derive the theory for the consistency of the estimators in both stages and show that the observation-driven stochastic trend can also be consistently estimated. A simulation study illustrates how such robust methodology can enhance the filtering accuracy of the trend compared to a linear approach as considered in previous literature. The practical relevance of the method is shown by means of an application to spot prices of oil-related commodities.
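
As a stylized illustration of the two-stage procedure, the sketch below simulates a small cointegrated system, estimates the loadings by OLS in a first stage, and then maximizes a Gaussian quasi-likelihood for an observation-driven trend filter. The Gaussian (non-robust) update is an assumption made for brevity; the paper's robust specifications replace it with flexible updating mechanisms.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
T, N = 500, 3
mu = np.cumsum(rng.normal(size=T))              # latent common stochastic trend
B = np.array([1.0, 0.7, 1.3])                   # loadings; first normalized to one
Y = mu[:, None] * B + rng.normal(scale=0.5, size=(T, N))

# Stage 1: loadings by OLS of each series on the first (normalized) series.
b_hat = np.array([np.linalg.lstsq(Y[:, [0]], Y[:, i], rcond=None)[0][0]
                  for i in range(N)])

# Stage 2: Gaussian QML for an observation-driven filter of the common trend.
def negloglik(theta):
    alpha, log_sigma = theta
    s2 = np.exp(2.0 * log_sigma)
    m, nll = Y[0, 0], 0.0
    for t in range(T):
        v = Y[t] - b_hat * m                    # one-step prediction errors
        nll += 0.5 * (v @ v) / s2 + 0.5 * N * np.log(2 * np.pi * s2)
        m += alpha * (b_hat @ v) / (b_hat @ b_hat)  # observation-driven update
    return nll

alpha_hat, log_sigma_hat = minimize(negloglik, x0=[0.5, 0.0],
                                    method="Nelder-Mead").x
```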

15:15
Integrated Modified OLS Estimation and Fixed-b Inference for Cointegrating Multivariate Polynomial Regressions
PRESENTER: Martin Wagner

ABSTRACT. This paper shows that the integrated modified OLS (IM-OLS) estimator developed for cointegrating linear regressions in Vogelsang and Wagner (2014a) can be extended to cointegrating multivariate polynomial regressions. These are regression models that include as regressors deterministic variables, integrated processes and products of (non-negative) integer powers of these variables. The stationary errors are allowed to be serially correlated and the regressors are allowed to be endogenous. The setting thus overcomes, for polynomial-type functions, the assumption of additive separability between integrated regressors that is almost omnipresent in the nonlinear cointegration literature. The IM-OLS estimator is tuning-parameter free and does not require the estimation of any long-run variances. A scalar long-run variance, however, has to be estimated and scaled out when using IM-OLS for inference. In this respect, we consider both standard asymptotic inference as well as fixed-b inference. Fixed-b inference requires that the regression model is of full design. The results are interesting, e.g., for estimating Translog relationships or for specification testing of cointegrating relationships, with RESET-type specification tests following immediately. The simulation section zooms in on RESET specification testing and illustrates that the performance of IM-OLS is qualitatively comparable to its performance in cointegrating linear regressions.
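
For intuition, the IM-OLS regression itself is simple to set up. The sketch below is a minimal single-regressor polynomial example under assumptions chosen purely for illustration (exogenous i.i.d. errors, quadratic relation); it shows the defining step of regressing partial sums on partial sums plus the original stochastic regressors, with inference (and the scalar long-run variance correction) omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 400
x = np.cumsum(rng.normal(size=T))            # I(1) regressor
y = 0.5 + 1.0 * x + 0.3 * x**2 + rng.normal(size=T)   # cointegrating polynomial

X = np.column_stack([np.ones(T), x, x**2])   # deterministics and integer powers
Sy = np.cumsum(y)
SX = np.cumsum(X, axis=0)

# IM-OLS: regress the partial sum of y on the partial sums of all regressors
# plus the original stochastic regressors, which corrects for endogeneity
# without tuning parameters or long-run variance estimation.
Z = np.column_stack([SX, X[:, 1:]])
theta, *_ = np.linalg.lstsq(Z, Sy, rcond=None)
beta_im = theta[:X.shape[1]]                 # estimates of (0.5, 1.0, 0.3)
```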

14:00-15:40 Session 1C: Microeconometric Methods
14:00
Identifying Causal Effects of Discrete, Ordered and Continuous Treatments using Multiple Instrumental Variables

ABSTRACT. Inferring causal relationships from observational data is often challenging due to endogeneity. This paper provides new identification results for causal effects of discrete, ordered and continuous treatments using multiple binary instruments. The key contribution is the identification of a new causal parameter that has a straightforward interpretation with a positive weighting scheme and is applicable in many settings due to a mild monotonicity assumption. This paper further leverages recent advances in causal machine learning for both estimation and the detection of local violations of the underlying monotonicity assumption. The methodology is applied to estimate the returns to education and assess the impact of having an additional child on female labor market outcomes.

14:25
A Local Differencing Test for the Credibility of Selection-on-Observables

ABSTRACT. One of the most common research designs employed for the identification of causal parameters from observational data is selection-on-observables. However, justifying the underlying assumptions can be challenging and the common practice of controlling for a large number of covariates may not always suffice. This paper introduces a local differencing test designed to evaluate the credibility of selection-on-observables within specific applications. Our procedure leverages the possibility of constructing, under selection-on-observables, two consistent and asymptotically normal estimators of the conditional average treatment effect (CATE) function. We propose to locally test the null hypothesis that identification is credible at some prediction point by employing the ratio of the difference between the two estimates evaluated at that prediction point to its standard error. We show that an “oracle” version of this statistic follows an asymptotic standard normal distribution under the null hypothesis when either OLS regressions or honest regression forests are employed for CATE estimation. We then demonstrate that the oracle version closely approximates the behavior of a “feasible” statistic in large samples. Our theoretical results are validated by a simulation exercise.

14:50
A Test for Bayesian-Nash Behavior in Binary Games with Incomplete Information and Correlated Types
PRESENTER: Elia Lapenta

ABSTRACT. We provide a test to check if the distribution of the observed data can be rationalized by a unique Bayesian-Nash equilibrium of a binary game with incomplete information, where agents’ types can be mutually correlated. This hypothesis is common in empirical models of games with incomplete information and is key to identify the fundamentals of the game. The game structure is nonparametrically specified. We construct an Integrated Conditional Moment statistic. Our statistic depends on preliminary nonparametric estimators constructed by a multi-step procedure. Under the null hypothesis our statistic converges to a functional of a Gaussian process. Since the null asymptotic distribution of the statistic depends on unknown features of the data, we obtain the critical value by a novel multinomial bootstrap and prove its validity. This scheme resamples the observations by imposing that a unique Bayesian-Nash equilibrium is played. A Monte Carlo experiment shows the good small-sample performance of the test.

14:00-15:40 Session 1D: Time Series Theory I
14:00
Quantile Granger Causality in the Presence of Instability
PRESENTER: Alexander Mayer

ABSTRACT. We propose a new framework for assessing Granger causality in quantiles in unstable environments, for a fixed quantile or over a continuum of quantile levels. Our proposed test statistics are consistent against fixed alternatives, have nontrivial power against local alternatives, and are pivotal in certain important special cases. In addition, we show the validity of a bootstrap procedure when asymptotic distributions depend on nuisance parameters. Monte Carlo simulations reveal that the proposed test statistics have correct empirical size and high power, even in the absence of structural breaks. Moreover, we propose a procedure, based on our new tests, that provides additional insight into the timing of Granger-causal regimes. Finally, an empirical application in energy economics highlights the applicability of our method, as the new tests provide stronger evidence of Granger causality.
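
As a point of reference, testing Granger non-causality at a single fixed quantile in a stable environment reduces to a significance test in a quantile regression, as in the sketch below (statsmodels' QuantReg on simulated data of our own). The paper's framework goes well beyond this special case, handling instability, continua of quantile levels, and bootstrap critical values.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
T = 600
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.3 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()

# Median regression of y_t on its own lag and the lag of x.
design = sm.add_constant(np.column_stack([y[:-1], x[:-1]]))
fit = sm.QuantReg(y[1:], design).fit(q=0.5)
# Non-causality in the 0.5-quantile: the coefficient on the lag of x is zero.
print(fit.params[2] / fit.bse[2])            # t-ratio for the lag of x
```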

14:25
Pseudo-variance quasi-maximum likelihood estimation of semi-parametric time series models
PRESENTER: Mirko Armillotta

ABSTRACT. We propose a novel estimation approach for a general class of semi-parametric time series models where the conditional expectation is modeled through a parametric function. The proposed class of estimators is based on a Gaussian quasi-likelihood function and it relies on the specification of a parametric pseudo-variance that can contain parametric restrictions with respect to the conditional expectation. The specification of the pseudo-variance and the parametric restrictions follow naturally in observation-driven models with bounds in the support of the observable process, such as count processes and double-bounded time series. We derive the asymptotic properties of the estimators and a validity test for the parameter restrictions. We show that the results remain valid irrespective of the correct specification of the pseudo-variance. The key advantage of the restricted estimators is that they can achieve higher efficiency compared to alternative quasi-likelihood methods that are available in the literature. Furthermore, the testing approach can be used to build specification tests for parametric time series models. We illustrate the practical use of the methodology in a simulation study and two empirical applications featuring integer-valued autoregressive processes, where assumptions on the dispersion of the thinning operator are formally tested, and autoregressions for double-bounded data with application to a realized correlation time series.

14:50
Inference on breaks in weak location time series models with quasi-Fisher scores

ABSTRACT. Based on Godambe's theory of estimating functions, we propose a class of cumulative sum (CUSUM) statistics to detect breaks in the dynamics of time series under weak assumptions. First, we assume a parametric form for the conditional mean, but make no specific assumption about the data-generating process (DGP) or even about the other conditional moments. The CUSUM statistics we consider depend on a sequence of weights that influence their asymptotic accuracy. Data-driven procedures are proposed for the optimal choice of the sequence of weights, in the sense of Godambe. We also propose modified versions of the tests that make it possible to detect breaks in the dynamics even when the conditional mean is misspecified. Our results are illustrated using Monte Carlo experiments and real financial data.
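
A minimal sketch of the basic building block, assuming equal weights and a simple mean model: the standardized CUSUM of residuals, whose supremum behaves like that of a Brownian bridge under the null of no break (asymptotic 5% critical value roughly 1.36). The paper's statistics replace the equal weights with Godambe-optimal, data-driven ones.

```python
import numpy as np

def cusum_stat(resid):
    """Sup-|CUSUM| of residuals; large values signal a break in the mean dynamics."""
    e = np.asarray(resid, dtype=float)
    e = e - e.mean()
    bridge = np.cumsum(e) / (e.std() * np.sqrt(e.size))
    return np.abs(bridge).max()

rng = np.random.default_rng(3)
print(cusum_stat(rng.normal(size=500)))                                  # no break
print(cusum_stat(np.r_[rng.normal(size=250), 1.0 + rng.normal(size=250)]))  # break
```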

14:00-15:40 Session 1E: Machine Learning Methods I
14:00
Probabilistic Partial Least Squares
PRESENTER: Miguel Herculano

ABSTRACT. Despite its strengths and uses across the sciences, Partial Least Squares (PLS) lacks a probabilistic foundation, which limits its ability to account for parameter uncertainty, model misspecification, and patterns of missing data. We address all these limitations by developing a probabilistic version of PLS (PPLS), along with an expectation-maximization (EM) algorithm to learn the parameters and forecast the targets of interest. In a simulation exercise, we show that PPLS outperforms PLS at recovering the common underlying factors affecting both features and target variables, and also provides valid forecasts under contamination, such as measurement error or outliers. Finally, we provide two applications in Economics and Finance where PPLS performs competitively compared with PLS and Principal Component Analysis (PCA) at forecasting out-of-sample.

14:25
Locally Robust Estimation of the Intergenerational Elasticity

ABSTRACT. Numerous insights from the mobility literature rely on accurately estimating the intergenerational elasticity (IGE). This paper shows non-parametric identification of the IGE in the presence of incomplete income data. We construct a locally robust moment function based on the identifying moment, thereby providing a consistent debiased machine learning estimator of the intergenerational elasticity. We illustrate through simulations that our estimator performs well in finite samples under different scenarios.

14:50
Double Machine Learning for Static Panel Data with Instrumental Variables: New Method and Applications

ABSTRACT. When the instrumental variable (IV) requires conditioning on covariates to be valid, controlling for flexible functional forms of the confounding variables can be crucial for the identification of the causal effect (Abadie, 2003). We employ the power of machine learning (ML) algorithms to overcome the 'curse of dimensionality' that would make any traditional estimation technique (e.g., two stage least squares) infeasible. We propose an extension of the approaches developed by Clarke and Polselli (2023) for partially linear panel regression (PLPR) models with fixed effects to cases where the treatment or main regressor is endogenous. These are built on the double machine learning (DML) method introduced by Chernozhukov et al. (2018) for cross-sectional data. Panel data are commonly adopted in empirical studies, and endogeneity arises in many cases. Thus, extending the DML framework to panel data when an IV is required to conduct valid causal analysis significantly enriches the toolkit available to empirical researchers. We discuss four empirical applications to illustrate the applicability of this method with different regression specifications and data structures, and offer guidance on when to use DML and how to interpret the results. This can potentially raise the interest of applied researchers in complementing their analyses with these alternative methods, which are designed to capture the complexity of the data.
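
For concreteness, the sketch below shows the cross-sectional DML building block that such extensions start from: cross-fitted partialling-out of the outcome, treatment and instrument on covariates, followed by just-identified IV on the residuals. The function name and the random-forest learner are our choices for illustration; the panel version with fixed effects adds within-transformations along the lines of Clarke and Polselli (2023).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold

def dml_plr_iv(y, d, z, X, n_splits=5, seed=0):
    """Cross-fitted partialling-out: residualize y, d and z on X with a
    flexible learner, then run just-identified IV on the residuals."""
    ry, rd, rz = (np.zeros_like(v, dtype=float) for v in (y, d, z))
    for train, test in KFold(n_splits, shuffle=True, random_state=seed).split(X):
        for target, resid in ((y, ry), (d, rd), (z, rz)):
            m = RandomForestRegressor(random_state=seed).fit(X[train], target[train])
            resid[test] = target[test] - m.predict(X[test])
    return (rz @ ry) / (rz @ rd)               # IV estimate of the causal effect
```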

14:00-15:40 Session 1F: Macroeconometric Methods I
14:00
Inference with Local Projections

ABSTRACT. This paper studies the asymptotic properties of local projection (LP-) IV estimators. In particular, we show that, in contrast to the local projection estimator without an external instrument, the LP-IV estimator is consistent for the true impulse response of any multivariate short memory process, as long as a valid and relevant external instrument is available. Moreover, with the use of a martingale approximation, we derive the asymptotic distribution, including an analytic expression for the asymptotic variance of the resulting LP-IV estimator. To our knowledge, this limit distribution has not been derived before under a general linear process without imposing a particular parametric model for the data. The derived distribution is valid even when the dimension of the system and the horizon are infinite. The resulting analytic expression for the asymptotic variance should be useful to researchers who routinely employ LP-IV regressions for empirical analysis. For example, we show how to conduct valid inference by modifying the IV-based standard errors in order to obtain a consistent estimator for the asymptotic variance. We demonstrate that the modified inference procedure delivers valid and sharp inference for any short memory process and performs better than the widely used HAC standard errors.
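
As a minimal illustration of the estimator under study (our notation; no controls or lags, which any serious application would include), each horizon-h LP-IV coefficient is the just-identified IV slope of y_{t+h} on the policy variable, instrumented by the external instrument:

```python
import numpy as np

def lp_iv_irf(y, p, z, H):
    """LP-IV impulse responses: for each horizon h, the just-identified IV
    slope of y_{t+h} on the policy variable p_t, instrumented by z_t."""
    irf = []
    for h in range(H + 1):
        yh, pt, zt = y[h:], p[:p.size - h], z[:z.size - h]
        zc = zt - zt.mean()
        irf.append((zc @ yh) / (zc @ pt))
    return np.array(irf)
```

The paper's contribution concerns the asymptotic variance of exactly these coefficients; the modified standard errors it derives are not reproduced here.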

14:25
Functional Linear Projection and Impulse Response Analysis
PRESENTER: Won-Ki Seo

ABSTRACT. This paper proposes econometric methods for studying economic dynamics involving functional variables. Our methods are developed based on linear projection estimation of predictive regression models with a function-valued predictor and other control variables. We show that our linear projection coefficient associated with the functional variable allows for an impulse response interpretation in a functional structural vector autoregressive model under a certain identification scheme, similar to the well-known causal chain of Sims (1972), but with nontrivial complications in our functional setup. We employ the proposed statistical inference methodology to study the responses of macroeconomic variables when (i) there is a certain shift in the economic sentiment distribution and (ii) a monetary policy shock is applied to the yield curve.

14:50
Nonparametric Local Projections
PRESENTER: Elena Pesavento

ABSTRACT. Nonlinearities play an increasingly important role in applied work when studying the responses of macroeconomic aggregates to policy shocks. Seemingly natural adaptations of the popular local linear projection estimator to nonlinear settings may fail to recover the population responses of interest. In this paper we study the properties of an alternative nonparametric local projection estimator of the conditional and unconditional responses of an outcome variable to an observed identified shock. We discuss alternative ways of implementing this estimator and how to allow for data-dependent tuning parameters. Our results are based on data generating processes that involve, respectively, nonlinearly transformed regressors, state-dependent coefficients, and nonlinear interactions between shocks and state variables. Monte Carlo simulations show that a local-linear specification of the estimator tends to work well in reasonably large samples and is robust to nonlinearities of unknown form.

15:40-16:40 Coffee Break
15:40-16:40 Session 2: Poster Session I
Counterfactual and Synthetic Control Method: Causal Inference with Instrumented Principal Component Analysis

ABSTRACT. In this paper, we propose a novel method for causal inference within the framework of counterfactual and synthetic control. Building on the generalized synthetic control method developed by Xu (2016), our instrumented principal component analysis method instruments factor loadings with predictive covariates rather than including them as regressors. These instrumented factor loadings exhibit time-varying dynamics, offering a better economic interpretation. Covariates are instrumented through a transformation matrix, $\Gamma$; when the number of covariates is large, this matrix can easily be reduced in accordance with a small number of latent factors, helping us handle high-dimensional datasets effectively and keeping the model parsimonious. Finally, this novel way of handling covariates is less exposed to model misspecification and achieves better prediction accuracy. Our simulations show that the method is less biased in the presence of unobserved covariates compared to other mainstream approaches. In the empirical application, we use the proposed method to evaluate the effect of Brexit on foreign direct investment in the UK.

Fiscal Shocks and the Surge of Inflation

ABSTRACT. Using a structural vector autoregression, I document the dominant role of fiscal policy in the recent surge of inflation in the United States. The comovement of output, prices, and primary deficit yields unique restrictions that allow me to identify the causal effects of an exogenous fiscal stimulus. While fiscal shocks have historically been important determinants of inflation, their dominant role in the recent spike reflects the unprecedented scale of fiscal interventions. In the Euro Area, inflation also has a fiscal component, with the timing of interventions explaining its lag behind the United States. I show that a model with monetary and fiscal policy interactions – where Ricardian equivalence fails due to finite planning horizons – can account for fiscal inflation in the recent period. My analysis supports the need to place fiscal policy at the center of the current macroeconomic agenda and calls for a deeper understanding of its transmission mechanisms.

Testing Conditional Moment Restrictions: A Partitioning Approach

ABSTRACT. This paper proposes $\chi^2$ tests for assessing the specification of regression models or general conditional moment restrictions. The data is partitioned according to the explanatory variables into several cells, and the tests evaluate whether the difference between the observed average of the dependent variable and its expected value under the model specification arises by chance. In contrast to existing omnibus procedures, $\chi^2$ tests are asymptotically pivotal and fairly insensitive to the curse of dimensionality. The computation is straightforward and does not require bootstrapping or smoothing techniques. Importantly, the asymptotic properties of the test are invariant to sample-dependent partitions, which can be chosen to favor certain alternatives. A Monte Carlo study provides evidence of the good performance of the tests using samples of small or moderate size compared with existing omnibus alternatives, particularly with many explanatory variables. Empirical illustrations in difference-in-difference settings with continuous treatments and in static binary treatment designs with numerous covariates complement the finite sample study.
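
A stylized sketch of the partitioning idea, under simplifying assumptions of our own (a fixed one-dimensional sorting index, no adjustment of the degrees of freedom for parameter estimation): partition observations into cells and cumulate squared standardized cell-level discrepancies into a chi-squared statistic.

```python
import numpy as np
from scipy.stats import chi2

def partition_test(y, yhat, index, n_cells=10):
    """Chi-squared specification test: split observations into cells along a
    (possibly data-dependent) index and compare observed vs fitted cell sums."""
    cells = np.array_split(np.argsort(index), n_cells)
    stat = 0.0
    for c in cells:
        e = y[c] - yhat[c]
        stat += e.sum() ** 2 / (e.var(ddof=1) * e.size)
    return stat, chi2.sf(stat, df=n_cells)
```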

It is All About Demand and Supply: a Dualistic View of the Euro Area Business Cycle
PRESENTER: Marco Mazzali

ABSTRACT. We study the nature of Euro Area business cycle drivers. We build an extensive dataset of quarterly time series covering EA aggregates and the most important member countries. We find that two shocks are enough to explain the majority of the variability of EA aggregates, and we show that those two cyclical shocks can be interpreted as classic demand-side and supply-side shocks. Additionally, we (i) document a high synchronization in the responses to such shocks across the different EA members, (ii) provide a demand-supply historical decomposition of the EA variables, with a particular focus on the recent inflation surge, and (iii) study the reasons behind the flattening of the EA Phillips curve. On this topic, our results indicate that the EA Phillips curve is alive and attribute its flattening to a stricter orientation of the monetary policy stance towards inflation targeting.

Improved Inference for Nonparametric Regression and Regression-Discontinuity Designs
PRESENTER: Edoardo Zanelli

ABSTRACT. We consider inference for (possibly) non-linear conditional expectations in the setup of nonparametric regression and regression-discontinuity designs. In this context, inference is challenging due to asymptotic bias of local polynomial estimators. We propose a novel approach to restore valid inference by means of proper implementations of the bootstrap. Specifically, we show conditions under which, even if the bootstrap test statistic is not able to mimic the behavior of the asymptotic bias – making the bootstrap fail using standard arguments – the large sample distribution of the bootstrap p-value only depends on nuisance parameters which are easily estimable. We introduce two bootstrap algorithms, namely the local polynomial (LP) and fixed-local (FL) bootstrap, which deliver asymptotically valid confidence intervals (CIs) for both interior and boundary points without requiring undersmoothing or direct bias correction. We demonstrate the theoretical validity and analyze the efficiency properties of these methods, highlighting the asymptotic equivalence of the FL bootstrap-based CIs with robust bias correction (RBC) intervals, while showing that LP bootstrap-based CIs achieve greater efficiency. Monte Carlo simulations confirm the practical relevance of our methods.

Partially Identified Rankings from Pairwise Interactions
PRESENTER: Federico Crippa

ABSTRACT. This paper considers the problem of ranking objects based on their latent merits using data from pairwise interactions. Existing approaches rely on the restrictive assumption that all the interactions are either observed or missing at random. We investigate what can be inferred about rankings when this assumption is relaxed. First, we demonstrate that in parametric models, such as the popular Bradley-Terry-Luce model, rankings are point-identified if and only if the tournament graph is connected. Second, we show that in nonparametric models based on strong stochastic transitivity, rankings in a connected tournament are only partially identified. Finally, we propose two statistical tests to determine whether a ranking belongs to the identified set. One test is valid in finite samples but computationally intensive, while the other is easy to implement and valid asymptotically. We illustrate our procedure using Brazilian employer-employee data to test whether male and female workers rank firms differently when making job transitions.
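
The point-identification condition in the parametric case is easy to operationalize. The sketch below, with a made-up wins matrix, checks connectivity of the tournament graph and fits the Bradley-Terry-Luce model as a binomial GLM on merit differences; the paper's partial-identification analysis and proposed tests go beyond this.

```python
import numpy as np
import networkx as nx
import statsmodels.api as sm

# wins[i, j] = number of times object i beat object j in observed interactions.
wins = np.array([[0, 3, 1],
                 [1, 0, 4],
                 [2, 0, 0]])
n = wins.shape[0]

# Point identification in the BTL model requires a connected tournament graph:
# an undirected edge wherever a pair was observed at least once.
G = nx.Graph([(i, j) for i in range(n) for j in range(i + 1, n)
              if wins[i, j] + wins[j, i] > 0])
print("connected:", nx.is_connected(G))

# BTL as a binomial GLM on merit differences, normalizing the last merit to 0.
X, counts = [], []
for i in range(n):
    for j in range(i + 1, n):
        if wins[i, j] + wins[j, i] > 0:
            d = np.zeros(n - 1)
            if i < n - 1:
                d[i] += 1.0
            if j < n - 1:
                d[j] -= 1.0
            X.append(d)
            counts.append([wins[i, j], wins[j, i]])
merits = sm.GLM(np.array(counts), np.array(X),
                family=sm.families.Binomial()).fit().params
```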

Distributional Difference-in-Differences with Multiple Time Periods

ABSTRACT. Researchers are often interested in evaluating the impact of a policy on the entire (or specific parts of the) distribution of the outcome of interest. In this paper, I provide a method to recover the whole distribution of the untreated potential outcome for the treated group in non-experimental settings with staggered treatment adoption by generalizing the existing quantile treatment effects on the treated (QTT) estimator proposed by Callaway and Li (2019). Besides the QTT, I consider different approaches that anonymously summarize the quantiles of the distribution of the outcome of interest (such as tests for stochastic dominance rankings) without relying on rank invariance assumptions. The finite-sample properties of the proposed estimator are analyzed via different Monte Carlo simulations. Despite being slightly biased for relatively small sample sizes, the proposed method's performance improves substantially as the sample size increases.

Climate and Macroeconomic Variability
PRESENTER: Marco Tibullo

ABSTRACT. Using a novel global-to-local identification strategy, this paper examines the impact of temperature shocks on inflation and its volatility across a diverse set of countries, focusing on developing economies. We find that a 1°C increase in temperature anomalies leads to significant and persistent rises in both headline and food inflation, particularly in vulnerable developing regions, where inflation surges by up to 3%. Our results also highlight the importance of considering stochastic volatility, as variance notably influences inflation dynamics.

Are Hysteresis Effects Nonlinear?

ABSTRACT. This paper investigates the nonlinear effects of aggregate demand dynamics over medium- and long-term horizons, focusing on whether negative aggregate demand shocks have distinct long-lasting impacts compared to positive shocks (sign dependence). We begin by identifying a long-term demand shock, termed the 'hysteresis' shock, within a structural vector autoregression framework. To assess sign dependence, we employ local projections with a nonlinear transformation of the shock. This methodology is applied to a quarterly U.S. macroeconomic dataset that includes variables related to the productivity and labor market channels of hysteresis. Our findings indicate that, for productivity-related variables, negative hysteresis shocks have larger initial effects than positive shocks. However, while responses from negative shocks tend to fade over time, positive shocks begin to exert significant influence in the medium to long term. For labor market variables, negative shocks appear to have the strongest impact. When we disaggregate labor market indicators across worker groups, results become more symmetric for disadvantaged workers, suggesting that a 'high-pressure economy' may partially reverse the scarring effects of demand-induced recessions.

16:40-18:20 Session 3A: Structural VAR Methods I
16:40
Estimation of Non-Gaussian SVAR Using Tensor Singular Value Decomposition

ABSTRACT. This paper introduces a tensor singular value decomposition (TSVD) approach for estimating non-Gaussian Structural Vector Autoregressive (SVAR) models. The methodology is applicable in cases of complete and partial identification of structural shocks. The estimation procedure relies on moment conditions that capture variation in third- and/or fourth-order cumulants. We establish the asymptotic distribution of the estimator. A simulation study demonstrates its highly competitive performance in small samples compared to alternative methods under complete identification. In cases of partial identification, our estimator also exhibits very good performance in small samples. An application is proposed to illustrate the usefulness of the procedure in the case of partial identification.

17:05
Bayesian Inference for Heteroskedastic Proxy-SVARs
PRESENTER: Tommaso Tornese

ABSTRACT. In this paper we develop simulation methods for Bayesian inference on a class of SVAR models characterized by conditionally heteroskedastic shocks identified through non-triangular zero restrictions, which include Proxy-SVARs as a special case. The algorithm we propose is a Gibbs sampler with a Sequential Monte Carlo (SMC) step that exploits an auxiliary homoskedastic block triangular parameterization to form a proposal density and transform it gradually to resemble the posterior distribution implied by the heteroskedastic model. We assess the accuracy and efficiency of the proposed sampler through a Monte Carlo exercise and we use it to revisit well known empirical studies taken from the literature.

17:30
Generalised External-Instrument SVAR Analysis
PRESENTER: Giovanni Ricco

ABSTRACT. We extend the SVAR-IV (Proxy-SVAR) method to handle noninvertible and nonrecoverable shocks, providing tests for recoverability and invertibility. Our approach allows the estimation of unit variance shocks, absolute response functions, and variance decompositions for recoverable shocks. When the shock is invertible, the method aligns with the standard approach. Using simulations, we show that our method outperforms the Internal-Instrument approach, due to its greater flexibility. Applying our method to a monetary policy VAR, we find sizeable effects of monetary policy on prices, in contrast to previous findings.

17:55
A test of exogeneity in Structural Vector Autoregressions with external instruments
PRESENTER: Luca Fanelli

ABSTRACT. This paper introduces a novel test for instrument exogeneity in Structural Vector Autoregressions with external instruments (SVAR-IV, or proxy-SVARs) that is based on standard asymptotics, is robust to proxy strength and does not require auxiliary information beyond SVAR restrictions and the instruments themselves. The test, which can be applied when $r$ proxies are available for $k$ target structural shocks, $r \geq k \geq 1$, leverages the same moment conditions used, under instrument exogeneity, to build weak-instrument robust confidence sets for the (normalized) on-impact responses of the variables to the target shocks. To construct our Wald-type test statistic, we need an instrument-free, consistent estimator of these (normalized) on-impact responses, which we call elasticity parameters. We derive the instrument-free estimator of the elasticity parameters from a proper representation of the VAR as a simultaneous system of equations, under a set of overidentifying restrictions known as "Byron's restrictions". The test is implemented as a sequential procedure: we preliminarily assess the validity of Byron's restrictions from which the instrument-free estimator of the elasticity parameters is derived and, conditional on not rejecting Byron's restrictions, we proceed to test instrument exogeneity using the Wald test statistic. We prove that the testing procedure controls the overall Type I error asymptotically and consistently detects instrument contamination. Interestingly, when instrument exogeneity is rejected, our approach offers a natural, straightforward solution to "decontaminate" the proxies by removing the influence of the non-target shock. Extensive Monte Carlo simulations are conducted to evaluate its finite-sample performance, and we illustrate its practical utility by revisiting several prominent proxy-SVARs from the literature.

16:40-18:20 Session 3B: Time Series Theory II
16:40
Change-Point Detection in Time Series Using Mixed Integer Programming
PRESENTER: Anton Skrobotov

ABSTRACT. We use cutting-edge mixed integer optimization (MIO) methods to develop a framework for detection and estimation of structural breaks in time series regression models. The framework is constructed based on the least squares problem subject to a penalty on the number of breakpoints. We restate the $\ell_0$-penalized regression problem as a quadratic programming problem with integer- and real-valued arguments and show that MIO is capable of finding provably optimal solutions using a well-known optimization solver. Compared to the popular $\ell_1$-penalized regression (LASSO) and other classical methods, the MIO framework permits simultaneous estimation of the number and location of structural breaks as well as regression coefficients, while accommodating the option of specifying a given or minimal number of breaks. We derive the asymptotic properties of the estimator and demonstrate its effectiveness through extensive numerical experiments, confirming a more accurate estimation of multiple breaks as compared to popular non-MIO alternatives. Two empirical examples demonstrate the usefulness of the framework in applications from business and economic statistics.
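
In stylized form (our notation, for a regression with coefficient path $\beta_t$ and at most $m$ breaks), the $\ell_0$-penalized problem restated as a mixed integer quadratic program reads:

$$\min_{\{\beta_t\},\,\{\delta_t\}} \sum_{t=1}^{T} \left(y_t - x_t'\beta_t\right)^2 \quad \text{s.t.} \quad \|\beta_t - \beta_{t-1}\|_{\infty} \leq M\,\delta_t, \quad \delta_t \in \{0,1\}, \quad \sum_{t=2}^{T} \delta_t \leq m,$$

where $M$ is a big-$M$ constant and $\delta_t = 1$ marks a breakpoint; the binary variables are what make the problem solvable to provable optimality by standard MIO solvers.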

17:05
A Matrix-valued model with Time-varying Volatility
PRESENTER: Federico Carlini

ABSTRACT. In finance, economics and many other fields, observations in matrix form are often recorded over time. For example, several key economic indicators are regularly reported in different countries; similarly, various financial characteristics of many companies are reported over time. The resulting time series are inherently matrix-valued, as different features (rows) are observed for each subject (columns) over time. Standard approaches to model these data rely on the vectorization of matrix observations, resulting in highly parametrised and hardly interpretable models, as the information in the inherent structure of the data is lost. In fact, the rows and columns of the matrix typically represent different types of structures that are closely interrelated. Moreover, the volatility of economic and financial time series is related to the business cycle and geo-political events, which makes the common assumption of homoskedasticity not plausible. To address these issues, this article follows the observation-driven approach for modelling time-varying parameter models. It proposes a novel matrix-valued time-varying volatility model that maintains and utilises the matrix structure to achieve more significant dimensional reduction and find more explicit and interpretable variance/covariance structures. The probabilistic properties of the matrix-valued model are investigated. Estimation procedures with their theoretical properties are presented, and their effectiveness is demonstrated on simulated and real-data examples.

17:30
Sequential Monte Carlo for Noncausal Processes

ABSTRACT. This paper proposes a Sequential Monte Carlo approach for the Bayesian estimation of mixed causal and noncausal models. Unlike previous Bayesian estimation methods developed for these models, Sequential Monte Carlo offers extensive parallelization opportunities, significantly reducing estimation time and mitigating the risk of becoming trapped in local minima, a common issue in noncausal processes. Simulation studies demonstrate the strong ability of the algorithm to produce accurate estimates and correctly identify the process. In particular, we propose a novel identification methodology that leverages the Marginal Data Density and the Bayesian Information Criterion. Unlike previous studies, this methodology determines not only the causal and noncausal polynomial orders but also the error term distribution that best fits the data. Finally, Sequential Monte Carlo is applied to a bivariate process containing S&P Europe 350 ESG Index and Brent crude oil prices.

17:55
First-order integer-valued autoregressive processes with Generalized Katz innovations
PRESENTER: Roberto Casarin

ABSTRACT. A new integer-valued autoregressive process (INAR) with Generalised Lagrangian Katz (GLK) innovations is defined. This process family provides a flexible modelling framework for count data, allowing for under- and over-dispersion, asymmetry, and excess kurtosis, and includes standard INAR models such as Generalized Poisson and Negative Binomial as special cases. We show that the GLK-INAR process is discrete semi-self-decomposable, infinitely divisible and stable by aggregation, and we provide stationarity conditions. Some extensions are discussed, such as the Markov-Switching and the zero-inflated GLK-INARs. A Bayesian inference framework and an efficient posterior approximation procedure are introduced. The proposed models are applied to 130 time series from Google Trends, which proxy the worldwide public concern about climate change. New evidence is found of heterogeneity across time, countries and keywords in the persistence, uncertainty, and long-run public awareness level.
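
For readers unfamiliar with INAR recursions, a minimal simulation sketch with binomial thinning follows; Poisson innovations stand in here as a simple special case nested in the GLK family, whose sampler is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_inar1(T, alpha, innov):
    """INAR(1): X_t = alpha ∘ X_{t-1} + eps_t, where '∘' is binomial thinning."""
    x = np.zeros(T, dtype=int)
    for t in range(1, T):
        x[t] = rng.binomial(x[t - 1], alpha) + innov()
    return x

# Poisson innovations as a simple special case of the GLK family.
series = simulate_inar1(300, alpha=0.6, innov=lambda: rng.poisson(2.0))
```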

16:40-18:20 Session 3C: Financial Econometrics: Theory and Empirics I
16:40
Joint extreme Value-at-Risk and Expected Shortfall dynamics with a single integrated tail shape parameter
PRESENTER: Enzo D'Innocenzo

ABSTRACT. We propose a robust semi-parametric framework for persistent time-varying extreme tail behavior, including extreme Value-at-Risk (VaR) and Expected Shortfall (ES). The framework builds on Extreme Value Theory and uses a conditional version of the Generalized Pareto Distribution (GPD) for peaks-over-threshold (POT) dynamics. Unlike earlier approaches, our model (i) has unit root-like, i.e., integrated autoregressive dynamics for the GPD tail shape, and (ii) re-scales POTs by their thresholds to obtain a more parsimonious model with only one time-varying parameter to describe the entire tail. We establish parameter regions for stationarity, ergodicity, and invertibility for the integrated time-varying parameter model and its filter, and formulate conditions for consistency and asymptotic normality of the maximum likelihood estimator. Using four exchange rate series, we illustrate how the new model captures the dynamics of extreme VaR and ES.

17:05
Asset pricing models with downside risk
PRESENTER: Elisa Ossola

ABSTRACT. Under the asset pricing constraint where expected returns are a linear combination of risk premia, we propose a linear factor model for asset returns that incorporates downside risk. Specifically, we extend the theoretical framework of Gagliardini et al. (2016) by introducing downside risk into the linear factor structure of returns as in Massacci et al. (2024). First, we propose a theoretical setting by assuming that the "disappointment" event that triggers the bad state of the world is known a priori, and we introduce a common structure of the factors, independent from the states considered. We then extend the framework by introducing an estimation procedure for the threshold that identifies the "disappointment" event, and we allow different factor structures for asset returns in good and bad states. We show how to consistently estimate the conditional risk premia of observable factors from the conditional model when asset pricing restrictions hold.

17:30
Estimation risk in conditional expectiles

ABSTRACT. We establish the consistency and asymptotic normality of a two-step estimator of conditional expectiles in the context of location-scale models. We first estimate the parameters of the conditional mean and variance by quasi-maximum likelihood and then compute the unconditional expectile of the innovations using the empirical quantiles of the standardized residuals. We show how replacing true innovations with standardized residuals affects the asymptotic variance of the expectile estimator. In addition, we obtain asymptotically valid bootstrap-based confidence intervals. Finally, our empirical analysis reveals that conditional expectiles are very interesting alternatives for assessing tail risk in cryptomarkets, relative to traditional quantile-based risk measures, such as value at risk and expected shortfall.
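
The second step has a compact fixed-point characterization. The sketch below computes the unconditional tau-expectile of (simulated) standardized residuals via asymmetric least squares; in the two-step location-scale setting, the conditional expectile is then mu_t + sigma_t times this value.

```python
import numpy as np

def expectile(x, tau, tol=1e-10, max_iter=1000):
    """tau-expectile via the asymmetric-least-squares fixed point:
    e = sum(w * x) / sum(w), with w = tau above e and (1 - tau) below."""
    e = x.mean()
    for _ in range(max_iter):
        w = np.where(x >= e, tau, 1.0 - tau)
        e_new = (w * x).sum() / w.sum()
        if abs(e_new - e) < tol:
            break
        e = e_new
    return e

resid = np.random.default_rng(5).standard_t(df=5, size=2000)
print(expectile(resid, tau=0.01))   # deep-tail expectile of the residuals
```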

17:55
Dynamic tail risk forecasting: what do realized skewness and kurtosis add?

ABSTRACT. This paper compares the accuracy of tail risk forecasts, focusing on including realized skewness and kurtosis in “additive” and “multiplicative” models. Utilizing a panel of 960 US stocks, we conduct diagnostic tests, employ scoring functions, and implement rolling window forecasting to evaluate the performance of Value at Risk (VaR) and Expected Shortfall (ES) forecasts. Additionally, we examine the impact of the window length on forecast accuracy. We propose model specifications that incorporate realized skewness and kurtosis for enhanced precision. Our findings provide insights into the importance of considering skewness and kurtosis in tail risk modeling, contributing to the existing literature and offering practical implications for risk practitioners and researchers.

16:40-18:20 Session 3D: Forecasting: Theory and Empirics I
16:40
Adaptive combinations of tail-risk forecasts

ABSTRACT. In order to meet the increasingly stringent global standards of banking management and regulation, several methods have been proposed in the literature for forecasting tail risk measures such as the Value-at-Risk (VaR) and Expected Shortfall (ES). However, regardless of the approach used, there are several sources of uncertainty, including model specifications, data-related issues and the estimation procedure, which can significantly affect the accuracy of VaR and ES measures. Aiming to mitigate the influence of these sources of uncertainty and improve the predictive performance of individual models, we propose novel forecast combination strategies based on the Model Confidence Set (MCS). In particular, consistent joint VaR and ES loss functions within the MCS framework are used to adaptively combine forecasts generated by a wide range of parametric, semi-parametric, and non-parametric models. Our results reveal that the proposed combined predictors provide a suitable alternative for forecasting risk measures, passing the usual backtests, entering the set of superior models of the MCS, and usually exhibiting lower standard deviations than other model specifications.

17:05
The Hedged Random Forest
PRESENTER: Michael Wolf

ABSTRACT. The random forest is one of the most popular and widely employed tools for supervised machine learning. It can be used for both classification and regression tasks; in this paper, the focus will be on regression only. In its standard form, the crux of the random forest is to use an equal-weighted ensemble of tree-based forecasts. Instead, we suggest a more general weighting scheme that borrows certain ideas from the related problem of financial portfolio selection and, in particular, allows for negative weights. Based on a benchmark collection of real-life data sets, we demonstrate the improved forecasting performance of our method not only relative to the standard random forest but also relative to two previous proposals for weighting the tree-based forecasts. It is noteworthy that our methodology is of a high-level nature and can also be applied to other forecast-combination problems, when forecasting methods are of arbitrary nature and not necessarily tree-based.
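
A minimal sketch of the portfolio-style weighting idea, with shrinkage and holdout choices that are ours rather than the paper's: estimate the covariance of per-tree forecast errors on a validation set and form minimum-variance weights, which may be negative.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=600, n_features=10, noise=5.0, random_state=0)
Xtr, Xva, ytr, yva = train_test_split(X, y, random_state=0)

rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(Xtr, ytr)
P = np.column_stack([t.predict(Xva) for t in rf.estimators_])  # per-tree forecasts

# Minimum-variance "portfolio" weights on tree forecast errors; unlike the
# equal-weighted forest, individual weights may be negative.
E = P - yva[:, None]
S = np.cov(E, rowvar=False) + 1e-3 * np.eye(P.shape[1])  # shrinkage for stability
w = np.linalg.solve(S, np.ones(P.shape[1]))
w /= w.sum()
hedged_pred = P @ w   # apply the same weights to per-tree forecasts on new data
```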

17:30
The judgmental strategy of professional forecasters

ABSTRACT. We introduce a new definition of forecasting coherence based on the Likelihood Principle and a “Scoring Structure” environment where a Forecast User interacts with a Forecast Producer and Reality, in order to detect strategic interaction in economic agents' forecasting biases. This mathematical object is necessary to identify and parametrize coherence in a feasible econometric model and give it a structural interpretation. Structural coherence is evaluated through a formal test that satisfies theoretical requirements in small samples. Three case studies illustrate the evidence of strategic judgment. The economic interpretation and the consequences of our approach in Central Banking are also discussed.

17:55
Nowcasting with Mixed Frequency Data Using Gaussian Processes

ABSTRACT. We develop Bayesian machine learning methods for mixed data sampling (MIDAS) regressions. This involves handling frequency mismatches and specifying functional relationships between many predictors and the dependent variable. We use Gaussian processes (GPs) and compress the input space with structured and unstructured MIDAS variants. This yields several versions of GP-MIDAS with distinct properties and implications, which we evaluate in short-horizon now- and forecasting exercises with both simulated data and data on quarterly US output growth and inflation in the GDP deflator. It turns out that our proposed framework leverages macroeconomic Big Data in a computationally efficient way and offers gains in predictive accuracy compared to other machine learning approaches along several dimensions.
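
A toy version of the unstructured variant, under our own simplifying assumptions (one monthly predictor, three months per quarter, sklearn's GP in place of the paper's Bayesian machinery): stack the intra-quarter observations into an input vector and fit a GP from inputs to the quarterly target.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(6)
m = rng.normal(size=(80, 3))                 # 80 quarters x 3 months of a predictor
y = m @ np.array([0.5, 0.3, 0.2]) + 0.1 * rng.normal(size=80)  # quarterly target

# "Unstructured MIDAS": the GP learns the intra-quarter weighting itself.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(m, y)
nowcast, sd = gp.predict(m[-1:], return_std=True)
```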

16:40-18:20 Session 3E: Applied Microeconomics I
16:40
The Local Job Multipliers of Green Re-industrialization

ABSTRACT. What are the job multipliers of green manufacturing re-industrialization? We tackle this question by combining green manufacturing production data with regional employment shares of 2-digit manufacturing industries. This results in a measure of green manufacturing penetration, which we relate to regional employment in manufacturing and non-manufacturing sectors in a long-difference model. To address endogeneity in regional green penetration, we use a shift-share instrument based on aggregate technological flows. Green penetration positively affects regional employment in manufacturing and non-manufacturing sectors. These results are homogeneous across regional skill levels and the technology intensity of manufacturing employment. Instead, the presence of STEM professions unveils heterogeneous patterns. Further, we extend these findings by applying an alternative estimation strategy that exploits large shocks to green penetration and by performing an empirical policy evaluation exercise assessing the effectiveness of the Just Transition Fund programme. Lastly, we construct a similar brown penetration measure and compare it with the green one. Brown penetration multipliers, albeit positive, are substantially smaller than those associated with green penetration. Regions with a baseline specialization in brown production still benefit from a green expansion.

17:05
Prenatal Sex Detection Technology and Mothers’ Labour Supply in India

ABSTRACT. We estimate the impact of prenatal sex detection technology (PSDT) on mothers’ labour supply in India. Our empirical strategy combines aggregate supply-driven changes in ultrasound availability over time with family-level variation in the incentive to sex-select due to first-born gender and local-level son preference. We find that the availability of PSDT had a significant negative impact on the labour supply of wealthy and educated mothers. For them, PSDT induced a substitution of girls with boys. The diminished need to save for daughters’ dowries explains the decline in labour supply in a model of fertility and labour supply decisions.

17:30
Work-from-Home Job Creation as a Health-Risk Mitigation Strategy: Firm-Level Evidence from the COVID-19 Pandemic
PRESENTER: Agata Maida

ABSTRACT. Adopting ``work from home'' (WFH) during a pandemic can help firms mitigate the health risks related to virus diffusion among their workers. However, WFH is likely to be especially productive in suitable jobs (i.e., ``WFH jobs''). In this paper, we hypothesize that during the COVID-19 crisis, firms' health risk perceptions were related to the local virus diffusion, and we test whether the latter induced a more extensive creation of WFH jobs. Using high-quality firm-level administrative data and a difference-in-differences strategy leveraging the considerable spatial heterogeneity in COVID-19 diffusion in Italy, our analysis demonstrates that a more intensive local exposure to the pandemic increased hires in WFH jobs. The effect on hires was short-term and only lasted one year. Yet, by mainly involving workers with permanent contracts, it is likely to have long-term consequences on firms' workforce composition.

17:55
The Effect of Temporary Employment on Labour Market Outcomes
PRESENTER: Enrico Rettore

ABSTRACT. We evaluate the causal effect of temporary contracts on subsequent labour market outcomes versus both a spell of unemployment and a spell of permanent employment over the period 2007-2016, using the panel of the Italian Labour Force Survey. We identify the causal effect by imposing that, conditional on a suitable set of observable characteristics, the treatment status (temporary job) is ignorable for the outcome, and we build a placebo test to validate the ignorability assumption. Then, we propose a strategy to estimate the remaining selection bias due to the omission of an additional (potential) confounder, which we observe jointly with the treatment status but not with the outcome. Results indicate that a spell of temporary work vs a spell of unemployment increases the probability of being employed after twelve months by 35 percentage points and the probability of having a permanent contract by 7.3 p.p.; individuals who had a temporary contract also work longer and earn more. Instead, workers who experience a spell of temporary work vs a spell of permanent work are less likely to be employed twelve months later (-5 p.p.) and to have a permanent contract (-40 p.p.); they earn less and are also less satisfied. The placebo test indicates that most of the selection bias is accounted for even when omitting the additional confounder, implying that our preferred estimates of the causal effects are close to unaffected by selection bias.

16:40-18:20 Session 3F: Macroeconometric Methods II
16:40
The Identification Problem for Linear Rational Expectations Models
PRESENTER: Majid Al Sadoon

ABSTRACT. We consider the problem of the identification of stationary solutions to linear rational expectations models from the second moments of observable data. Observational equivalence is characterized and necessary and sufficient conditions are provided for: (i) identification under affine restrictions, (ii) generic identification under affine restrictions of analytically parametrized models, and (iii) local identification under non-linear restrictions. The results strongly resemble the classical theory for VARMA models although significant points of departure are also documented.

17:05
Estimating Heterogeneous DSGE Models
PRESENTER: Stefano Grassi

ABSTRACT. This paper addresses the estimation of Heterogeneous Dynamic Stochastic General Equilibrium models based on the Mixture of Student-$t$ by Importance Sampling weighted Expectation-Maximization. The proposed method can handle target distributions that exhibit non-elliptical shapes, such as multimodality and skewness, and is also well suited for parallel computing. Furthermore, to avoid weight degeneracy, a simple robustification of the algorithm is introduced. The proposed method is compared in a Monte Carlo exercise with standard Markov Chain Monte Carlo and the recently proposed Sequential Monte Carlo method, computing posterior moments for two canonical heterogeneous models of increasing complexity: a one-asset New Keynesian model and a two-asset New Keynesian model. The three methods deliver similar results, with the proposed method being faster; it could therefore be a useful tool in the analysis of Heterogeneous Agent models.

17:30
Causality versus Serial Correlation: an Asymmetric Portmanteau Test

ABSTRACT. I study the problem of testing for noncausality in mean (one-sided conditional mean independence) between two multivariate time series within the class of testing procedures based on serial cross-correlation. Existing tests in this class either require parametrization of the joint process or are characterized under the null hypothesis of mutual independence. As a result, these tests may suffer from size distortions when misspecifying inverse causality, i.e., dependencies in the causal direction opposite to the one being tested. I propose a modified Portmanteau test statistic that incorporates a correction term to offset the influence of inverse causality, thereby eliminating the need to fully model the joint dynamics. I demonstrate that the proposed test statistic converges asymptotically to a standard normal distribution under the null hypothesis of noncausality in mean, resulting in correct asymptotic size. As an empirical application, I explore the statistical properties of my proposed test by studying three widely used measures of macroeconomic structural shocks, showing that the proposed test provides more reliable inference than the benchmark test.

17:55
The information matrix test for Markov switching autoregressive models with covariate-dependent transition probabilities
PRESENTER: Dante Amengual

ABSTRACT. The EM principle implies the moments underlying the information matrix test for multivariate Markov switching autoregressive models with covariate-dependent transition probabilities are the smoothed values of the moments tested if the latent Markov chain were observed. Thus, we identify components related to the conditional heteroskedasticity, skewness and kurtosis of the multivariate regression residuals for each of the regimes, the neglected multivariate heteroskedasticity of the generalised residuals for each of the columns of the transition matrix, and a final component that assesses the conditional independence of these generalised residuals and the regression residuals, their squares and cross-products given the observed variables.