ICEEE-7TH: SEVENTH ITALIAN CONGRESS OF ECONOMETRICS AND EMPIRICAL ECONOMICS
PROGRAM FOR FRIDAY, JANUARY 27TH

08:50-10:30 Session 8A: Risk evaluation
Location: Room IV
08:50
Tree based methods for classifying risky financial institutions

ABSTRACT. We propose a tree-based approach to identify groups of risky financial institutions. We use a synthetic indicator based on the information arising from a sample of pooled systemic risk measures. The composition and amplitude of the risky groups change over time, emphasizing the periods of high systemic risk stress. We also calculate the probability that a bank changes risk group over the next month and show that a bank belonging to the lowest or highest risk group has a high probability of remaining in that group.
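The month-to-month risk-group transition probabilities described above can be estimated as an empirical Markov transition matrix. A minimal sketch (the function name and setup are illustrative, not the authors' implementation):

```python
import numpy as np

def transition_matrix(labels):
    """Estimate a Markov transition matrix from a sequence of group labels.

    Entry (i, j) is the empirical probability of moving from group i to
    group j between consecutive periods."""
    labels = np.asarray(labels)
    k = labels.max() + 1
    counts = np.zeros((k, k))
    for a, b in zip(labels[:-1], labels[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    # rows with no observed transitions stay at zero
    return np.divide(counts, row_sums, out=np.zeros_like(counts),
                     where=row_sums > 0)
```

Applied to a bank's monthly group labels, the diagonal entries give the persistence probabilities the abstract refers to.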

09:15
Estimating Risk Premia Using Large Cross-Sections

ABSTRACT. Tens of thousands of stocks are traded every day in financial markets, providing an extremely rich information set to validate and estimate asset pricing models. At the same time, it is convenient to consider short time series, to avoid structural breaks and to mitigate the documented time variation of the distribution of stock returns. Based on these considerations, this paper presents a limiting theory for estimating and testing linear asset-pricing models when a very large number of assets $N$ is available together with a fixed, possibly very small, time-series dimension, applicable to both traded and non-traded factors. For this purpose, we focus on Shanken's (1992) estimator, which we show to exhibit many desirable properties. We demonstrate that: first, it is an OLS-based estimator that, unlike others, does not require preliminary estimation of the bias-adjustment; second, it converges to the true ex-post risk premia at rate $\sqrt{N}$; third, it has an asymptotically normal distribution; fourth, its limiting covariance matrix can be consistently estimated. Based on the pricing errors associated with the Shanken estimator, we propose a new test of the no-arbitrage asset pricing restriction and establish its asymptotic distribution (assuming that the restriction holds), which only requires the number of assets $N$ to diverge. Finally, we show how our results can be extended to deal with the more realistic case of unbalanced panels. The practical relevance of our findings is demonstrated using Monte Carlo simulations and an empirical application to asset pricing models with traded risk factors. Our analysis suggests that the market, size, and value factors are often priced in the cross-section of NYSE-AMEX-NASDAQ individual stock returns.

09:40
Asset Allocation Strategies Based on Penalized Quantile Regression

ABSTRACT. It is well known that the quantile regression model minimizes extreme portfolio risk whenever attention is placed on estimating the left quantiles of the response variable. We show that, by considering the entire conditional distribution of the dependent variable, it is possible to optimize different risk and performance indicators. In particular, we introduce a risk-adjusted profitability measure, useful in evaluating financial portfolios under a "cautiously optimistic" perspective, since the reward contribution is net of the most favorable outcomes. Moreover, as we consider large portfolios, we also cope with the dimensionality issue by introducing an l1-norm penalty on the asset weights.
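The core ingredient, an l1-penalized quantile (pinball) objective, can be sketched as follows. This is a toy illustration on assumed data, not the paper's estimator; the objective is piecewise linear, so at scale a linear-programming solver would be preferable to the generic optimizer used here:

```python
import numpy as np
from scipy.optimize import minimize

def pinball_loss(u, theta):
    # check (pinball) loss: rho_theta(u) = u * (theta - 1{u < 0})
    return np.mean(u * (theta - (u < 0).astype(float)))

def l1_penalized_qr(X, y, theta=0.5, lam=0.0):
    """Minimize mean pinball loss + lam * ||beta||_1 (illustrative sketch)."""
    obj = lambda b: pinball_loss(y - X @ b, theta) + lam * np.abs(b).sum()
    res = minimize(obj, np.zeros(X.shape[1]), method="Nelder-Mead",
                   options={"maxiter": 20000, "xatol": 1e-8, "fatol": 1e-10})
    return res.x
```

With theta = 0.5 and lam = 0 this reduces to median regression; small left-tail values of theta target extreme risk, and lam > 0 shrinks small asset weights toward zero.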

10:05
Conditional alphas and realized betas

ABSTRACT. This paper proposes a two-step procedure to back out the conditional alpha of a given stock using high-frequency data. We first estimate the realized factor loadings of the stocks, and then retrieve their conditional alphas by estimating the conditional expectation of their risk-adjusted returns. We start with the underlying continuous-time stochastic process that governs the dynamics of every stock price and then derive the conditions under which we may consistently estimate the daily factor loadings and the resulting conditional alphas. We also contribute empirically to the conditional CAPM literature by examining the main drivers of the conditional alphas of the S&P 100 index constituents from January 2001 to December 2008. In addition, to confirm whether these conditional alphas indeed relate to pricing errors, we assess the performance of both cross-sectional and time-series momentum strategies based on the conditional alpha estimates. The findings are very promising in that these strategies not only seem to perform well in both absolute and relative terms, but also exhibit little, if any, systematic exposure to the usual risk factors (namely, market, size, value and momentum portfolios).

08:50-10:30 Session 8B: Testing
Location: Room VI
08:50
A test for monotonicity of the investment-cash flow sensitivity

ABSTRACT. We propose a test for monotonicity of the investment-cash flow sensitivity with respect to the degree of financing constraints which is robust to sample separation. We find empirical evidence of a nonmonotonic relationship between the investment-cash flow sensitivity and the degree of financing constraints. Our empirical findings make clear that the conflicting findings in the literature concerning the shape of the investment-cash flow sensitivity are driven by its nonmonotonic behaviour. Moreover, given the nonmonotonicity result, we echo the sentiment of Kaplan and Zingales (1997) that researchers should proceed with extreme caution when interpreting investment-cash flow sensitivity coefficients.

09:15
Normality tests for latent variables

ABSTRACT. We exploit the rationale behind the Expectation Maximisation algorithm to derive simple to implement and interpret score tests of normality in all or a subset of the innovations to the latent variables in state space models against Generalised Hyperbolic alternatives, including symmetric and asymmetric Student t. We decompose our tests into third and fourth moment components, and obtain one-sided Likelihood Ratio analogues, whose asymptotic distribution we provide. We perform a Monte Carlo study of the finite sample size and power of our procedures and previous proposals. Finally, we illustrate our tests in an application to US aggregate real output measurement.

09:40
Wald tests when restrictions are locally singular

ABSTRACT. This paper provides an exhaustive characterization of the asymptotic properties of the standard Wald test statistic for testing restrictions given by polynomial functions (or restrictions asymptotically equivalent to polynomials) when the vector of parameter estimators converges in law, at some rate, to a nondegenerate distribution and the variance matrix estimator converges to a limit matrix. When singularity is possible, in addition to the well-known finite sample non-invariance there is also asymptotic non-invariance (non-pivotalness): the test may either under-reject or over-reject at the standard critical values and may even diverge under the null (not for a single restriction, but for two or more). All these situations can arise when testing restrictions proposed in the literature for examining the specification of ARMA models, for causality at different horizons, for determinants arising e.g. in models of covolatility, and in many other situations where singularity in the restrictions cannot be excluded. We demonstrate that the limit distribution (and its very existence) can be characterized by evaluating various determinants defined by the restrictions. This characterization makes it possible to adaptively identify the limit distribution from the estimators; we describe the algorithm that accomplishes this identification and allows consistent estimation of the correct asymptotic critical value. We also characterize and adaptively identify bounds on the distribution and on the asymptotic critical values.

10:05
Testing for Serial Correlation in Spatial Panels

ABSTRACT. We consider the issue of testing error persistence in spatial panels with individual heterogeneity. For random effects models, we review conditional Lagrange Multiplier tests from restricted models, and Likelihood Ratio or Wald tests via estimation of comprehensive models with correlation in space and time. We propose two ad hoc tests for serial correlation in fixed effects panels, based either on time-demeaning or on forward orthogonal deviations. The proposed tests can also be used under the RE assumption and are computationally simpler than their RE counterparts. We evaluate them through Monte Carlo simulations.

08:50-10:30 Session 8C: Firm dynamics and business cycle
Location: Room VII
08:50
Offshoring and firm overlap

ABSTRACT. We set up a model of offshoring with heterogeneous producers that captures two empirical regularities of German offshoring firms. There is selection of larger, more productive firms into offshoring. However, the selection is not sharp, and offshoring and non-offshoring firms coexist over a wide range of the revenue distribution. An overlap of offshoring and non-offshoring firms emerges in our model because, in contrast to textbook models of trade with heterogeneous producers, we allow firms to differ in two technology parameters, thereby decoupling the offshoring status of a firm from its revenues. In an empirical exercise, we employ firm-level data from Germany to estimate key parameters of the model and show that ignoring the overlap lowers the estimated gains from offshoring by almost 67 percent and exaggerates the importance of the extensive margin for explaining the evolution of German offshoring over the last 25 years.

09:15
Firm Dynamics and Employment Protection: Evidence from Sectoral Data

ABSTRACT. We analyse the impact of employment protection legislation (EPL) on firms' entry and exit rates for a sample of industries of thirteen countries from the most recent version of the OECD Structural and Business Statistics Database. Using a difference-in-difference identification strategy, we find that more stringent EPL is associated with lower entry and exit, particularly in industries characterized by higher job reallocation intensity. We also find that both collective and individual dismissal regulations reduce firms' entry and exit and that the negative effect of EPL is stronger in the case of small firms. An extensive robustness analysis confirms our findings.

08:50-10:30 Session 8D: Financial econometrics II
Location: Room VIII
08:50
The seasonal heterogeneous INGARCH model

ABSTRACT. In this paper we propose an accurate and fast-to-estimate forecasting model for discrete-valued time series with long memory and seasonality. The modelling is achieved with an integer-valued GARCH (INGARCH) process that features seasonality and heterogeneous autoregressive components (inspired by the HAR model of Corsi, 2009). As in the HAR case, the proposed model is not a long-memory process from a mathematical point of view; nevertheless, it produces memory patterns that are indistinguishable from those observed in empirical data. In fact, we show that our model has superior forecasting performance relative to a process which is formally long-memory (the LMACP of Groß-Klußmann and Hautsch, 2013). The model is applied to forecast the bid-ask spread. Given the prominent role of the bid-ask spread as a transaction cost for trading, a good forecasting model for this quantity is of great importance for many applications, in particular for optimal order execution algorithms. In the final part of the paper we propose a multivariate extension of our modelling framework, purposely conceived for the analysis of spillovers in spread dynamics among equity stocks.
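The HAR idea of mixing autoregressive components at daily, weekly and monthly horizons carries over to a count-data intensity; a simulation sketch with illustrative parameter values (not the paper's specification or seasonal terms):

```python
import numpy as np

def simulate_har_ingarch(T=500, omega=0.5, a_d=0.3, a_w=0.2, a_m=0.1, seed=0):
    """Simulate a HAR-style INGARCH process: counts y_t ~ Poisson(lambda_t),
    with lambda_t driven by the last count and by weekly (5-day) and
    monthly (22-day) averages of past counts."""
    rng = np.random.default_rng(seed)
    y = np.zeros(T)
    lam = np.zeros(T)
    # initialize the burn-in window at the unconditional mean level
    y[:22] = rng.poisson(omega / (1.0 - a_d - a_w - a_m), 22)
    for t in range(22, T):
        lam[t] = (omega + a_d * y[t - 1]
                  + a_w * y[t - 5:t].mean()
                  + a_m * y[t - 22:t].mean())
        y[t] = rng.poisson(lam[t])
    return y, lam
```

The slowly moving weekly and monthly averages generate the persistent, long-memory-like autocorrelation patterns the abstract refers to, even though the process is formally short-memory.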

09:15
New Insights into Financial Markets Efficiency: A Present Value Approach

ABSTRACT. We provide new evidence against the efficiency of financial markets at the firm level by exploiting the present-value relation between stock prices and dividends. Adopting a simple approach, which does not require assumptions on the discount factors of dividends, we show that under naive expectations the estimated discount factors were in line with those computed from the historical U.S. Treasury yield curve until the 1990s. Moreover, prices systematically diverge from fundamentals, and the behavior of stock mispricing is consistent with short-run momentum and long-run mean-reversion dynamics also under the assumption of rational expectations. Finally, regardless of the assumption on expectations, past fundamentals and other factors unrelated to fundamentals explain on average 20% and 5% of stock price variability, respectively, suggesting that agents are not forward looking since they pay too much attention to the historical behaviour of fundamentals.

09:40
Comparing multivariate volatility forecasts by direct and indirect approaches

ABSTRACT. The evaluation of multivariate volatility models can be done through a direct or an indirect approach. The former uses statistical loss functions (LFs) and a proxy providing consistent estimates of the unobserved volatility. The latter uses utility LFs or other instruments such as the Value-at-Risk (VaR) and its backtesting procedures. Existing studies commonly employ these procedures separately, focusing mostly on multivariate generalized autoregressive conditional heteroskedasticity (MGARCH) models. This work aims to investigate and compare the two approaches in the model selection context. An extensive Monte Carlo simulation experiment is carried out including MGARCH models, based on daily returns, and, extending the current literature, models that directly use the realized covariance obtained from intradaily returns. With reference to the direct approach, we empirically rank the set of competing models by means of four consistent statistical LFs and by deteriorating the quality of the volatility proxy. As regards the indirect approach, we use standard backtesting procedures to evaluate whether the number of VaR violations is acceptable and whether these violations are independently distributed over time.

10:05
Semiparametric Estimation of Multivariate GARCH Models

ABSTRACT. The paper introduces a new simple semiparametric estimator of the conditional variance covariance and correlation matrix (SP-DCC). While sharing a similar sequential approach to existing dynamic conditional correlation (DCC) methods, SP-DCC has the advantage of not requiring the direct parameterization of the conditional covariance or correlation processes, therefore also avoiding any assumption on their long-run target. In the proposed framework, conditional variances are estimated by univariate GARCH models, for actual and suitably transformed series, in the first step; the latter are then nonlinearly combined in the second step, according to basic properties of the covariance and correlation operator, to yield nonparametric estimates of the various conditional covariances and correlations. Moreover, in contrast to available DCC methods, SP-DCC allows for straightforward estimation also in the non-simultaneous case, i.e., for the estimation of conditional cross-covariances and correlations displaced at any time horizon of interest. A simple ex-post procedure, grounded on nonlinear shrinkage, is finally proposed to ensure well-behaved conditional covariance and correlation matrices. Due to its sequential implementation and scant computational burden, SP-DCC is very simple to apply and suitable for the modeling of vast sets of conditionally heteroskedastic time series.
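The abstract does not spell out the second-step transformation, but the standard polarization identity is the natural "basic property of the covariance operator" that recovers a covariance from variances of transformed series, each of which can be fitted by a univariate GARCH model. A sketch with sample (unconditional) moments:

```python
import numpy as np

def cov_via_polarization(x, y):
    """Recover Cov(x, y) from variances of the sum and difference series:
    Cov(x, y) = [Var(x + y) - Var(x - y)] / 4.
    In an SP-DCC-style scheme, Var(x + y) and Var(x - y) would each be
    replaced by a univariate conditional (GARCH) variance estimate."""
    return (np.var(x + y) - np.var(x - y)) / 4.0
```

The identity is exact for sample moments computed with a common normalization, so it matches `np.cov(..., bias=True)` up to floating-point error.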

08:50-10:30 Session 8E: Productivity and business cycle
Location: Room IX
08:50
Cross-Section Dependence and Latent Heterogeneity to Evaluate the Impact of Human Capital on Country Performance: a Robust Nonparametric Frontier Model

ABSTRACT. Human capital has been recognized as the most important force behind the economic growth of countries. However, the effect of this important growth factor on economic growth remains ambiguous due to endogeneity and latent heterogeneity. Using a dataset of 40 countries over 1970-2007, we estimate the global frontier and explore the channels through which human capital and time affect the production process and its components: the impact on the attainable production set (input-output space), and the impact on the distribution of efficiencies. We extend existing methodological tools - robust frontiers in nonparametric location-scale models - to examine these interrelationships. We use a flexible nonparametric two-step approach on conditional efficiencies to eliminate the dependence of production inputs/outputs on common factors. We emphasize the usefulness of “pre-whitened” inputs/outputs in obtaining more reliable measures of productivity and efficiency, so as to better investigate the impact of human capital on the catching-up productivity process. We then take into account unobserved heterogeneity and endogeneity in the analysis of the influence of human capital on the production process by extending the instrumental nonparametric approach of Simar, Vanhems and Van Keilegom (JoE 2015) to account also for cross-section and time dependence.

09:15
Natives and Migrants in Home Production: The Case of Germany

ABSTRACT. In this paper, we assess the impact of international migration, and the induced homecare service labour supply shock, on fertility decisions and labour supply of native females in Germany. Specifically, we consider individual data on native women from the German Socio-Economic Panel and merge them with data on the share of female immigrants and other regional labour market characteristics. We find that an increase in the share of female immigrants at the local level induces women to work longer hours and positively affects the probability of having a child. This effect strengthens for (medium-)skilled women and, among them, for women younger than 35 years of age. The negative change in household work attitude confirms the behavioural validity of our results.

09:40
State dependence and unobserved heterogeneity in a double hurdle model for remittances: evidence from immigrants to Germany

ABSTRACT. The increasing availability of panel datasets makes it possible to explore the persistence in remittance decisions as a result of intertemporal choices, possibly consistent with several motivations to remit. Building a dynamic model with longitudinal data poses the additional problem of dealing with permanent unobserved heterogeneity; the specific censored nature of international transfers must also be accounted for. We propose a dynamic, random-effects double hurdle model, extending the traditional setting to account for state dependence and unobserved heterogeneity. Empirical evidence, based on the GSOEP dataset, suggests that there is significant state dependence in remitting behaviour consistent with migrants’ intertemporal allocation of savings; transaction costs are likely to affect the temporal steadiness of transfers.

11:00-12:40 Session 9A: Estimation II
Location: Room IV
11:00
Is a matrix exponential specification suitable for the modeling of spatial correlation structures?

ABSTRACT. This paper investigates the adequacy of the matrix exponential spatial specification (MESS) as an alternative to the widely used spatial autoregressive model (SAR). We first analyze the partial and marginal covariance structures, finding similar behavior for the MESS and SAR models in particular cases. We then propose a new implementation of Bayesian parameter estimation for the MESS model with vague prior distributions, which is shown to be precise and computationally efficient, and whose predictive accuracy is comparable to that of the SAR model. Our further proposal of a model including spatial splines among the regressors increases the predictive accuracy of the matrix exponential specification with regard to the modeling of the covariance matrix.

11:25
Copula-Based Random Effects Models for Clustered Data

ABSTRACT. Sorting and spillovers can create correlation in individual outcomes. In this setup, standard discrete choice estimators cannot consistently estimate the probability of joint and conditional events, and alternative estimators can result in incoherent statistical models or intractable estimators. I propose a random effects estimator that models the dependence among the unobserved heterogeneity of individuals in the same cluster using a parametric copula. This estimator makes it possible to compute joint and conditional probabilities of the outcome variable, and is statistically coherent. I describe its properties, establishing its efficiency relative to standard random effects estimators, and propose a test of hypotheses for when the copula is misspecified. The likelihood function for each cluster is an integral whose dimensionality equals the size of the cluster, which may require high-dimensional numerical integration. To overcome the curse of dimensionality from which methods like Monte Carlo integration suffer, I propose an algorithm that works for Archimedean copulas. I illustrate this approach with an application to labor supply in married couples.
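A standard reason Archimedean copulas are tractable in high dimensions is their frailty (Marshall-Olkin) representation, which reduces the d-dimensional structure to a single latent variable. A sampling sketch for the Clayton family (illustrative; the abstract does not specify which Archimedean family or algorithm is used):

```python
import numpy as np

def sample_clayton(n, d, theta, seed=0):
    """Draw n samples from a d-dimensional Clayton (Archimedean) copula via
    the Marshall-Olkin frailty construction:
        V ~ Gamma(1/theta, 1),  E_i iid Exp(1),
        U_i = (1 + E_i / V)^(-1/theta),
    since (1 + t)^(-1/theta) is the Laplace transform of Gamma(1/theta)."""
    rng = np.random.default_rng(seed)
    v = rng.gamma(1.0 / theta, 1.0, size=(n, 1))
    e = rng.exponential(1.0, size=(n, d))
    return (1.0 + e / v) ** (-1.0 / theta)
```

Larger theta induces stronger positive dependence among the cluster members' uniforms, which can then be mapped to correlated random effects.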

11:50
On the Estimation of Structural Gravity Models

ABSTRACT. This paper reconsiders the Poisson pseudo maximum likelihood (PPML) estimation of structural gravity models that account for the general equilibrium constraints imposed by the system of trade resistances. It is demonstrated that the widely used dummy PPML does not yield the correct asymptotic distribution if the DGP of trade flows actually adheres to these constraints. In contrast, an iterated projection-based PPML approach, as proposed by Heyde and Morton (1993) and Falocci, Paniccià and Stanghellini (2009), is the appropriate estimator. This paper establishes the asymptotic distribution of the iterated PPML estimator for structural gravity models as well as that of the comparative static predictions and the implied percentage changes. Monte Carlo simulations provide encouraging results for medium-sized samples. Lastly, the proposed estimation procedure is illustrated by estimating the US-Canadian border effects as analyzed by Anderson and van Wincoop (2003).

12:15
Bootstrap inference under random distributional limits

ABSTRACT. Asymptotic bootstrap validity is usually understood and established as consistency of the distribution of a bootstrap statistic, conditional on the data, for the unconditional null limit distribution of a test statistic of interest. However, apart from possessing at most one unconditional limit distribution under a fixed asymptotic scheme, a test statistic in general may possess a host of conditional (random) limit distributions, depending on the choice of the conditioning sets. We discuss the appropriate probabilistic tools for establishing asymptotic bootstrap validity, in terms of asymptotic distributional uniformity of bootstrap p-values, in the case where the distribution of the bootstrap statistic conditional on the data consistently estimates a conditional null limit distribution of a test statistic, in a sense weaker than the usual weak convergence in probability. We then provide two general sufficient conditions for bootstrap validity in cases where weak convergence in probability fails. Finally, we argue that this is the appropriate framework for a correct asymptotic analysis of the fixed-regressor bootstrap in the context of Hansen (2000) and its numerous applications, and apply our general conditions to Hansen's problem, thus contributing to the correction of a long-standing misunderstanding in the econometric literature.

11:00-12:40 Session 9B: Choice model
Location: Room VI
11:00
Swine Flu, Class Attendance, and Exam Performance: Should we force students to go to class?

ABSTRACT. The goal of the paper is twofold. First, we explore the determinants of variation in the marginal propensity to skip class across students. Next, we investigate the effects of the choice to skip class on scholastic outcomes. We exploit exogenous variation from a natural experiment in Greece that relaxed the class attendance regulation of high school students to identify the effect of absences on scholastic outcomes across the ability distribution. In the school year 2009-2010, because of a swine flu outbreak in Europe, high school students were allowed to skip 30% more hours of class than in previous or following years, with no penalty. Using data on swine flu cases we provide evidence that the treatment affected only absences and not school performance directly. We use an instrumental variables approach to identify the intention-to-treat effect of the policy as well as a well-defined local average treatment effect of absences on grades. We find that the relaxed class attendance policy caused an increase in absences of roughly 10 hours. Our findings show a positive effect of absences on grades for subjects with a low cost of replicating class work outside the classroom. In subjects with a high out-of-class replication cost we find negative returns to absences in terms of end-of-the-year exam scores. Our results suggest that students who have the resources or the human capital accumulation to learn outside the classroom may perform worse when a strict attendance policy forces them to stay in class.

11:25
Hospital Choice in the NHS

ABSTRACT. In the NHS, hospital care is free of charge and is provided in publicly funded hospitals. There is a large literature on the characteristics of the hospital industry and its performance, especially in terms of quality of care and waiting times. One important question addressed in this literature is: how much market power do hospitals have? This typically involves considering hospital choice models and elasticity estimates. We study hospital choice in England for elective hip replacement over 2006-2009. The procedure is interesting since there is substantial choice. In this paper we explicitly take into account that for many elective procedures patients can use a private Independent Sector (IS). As we document below, the share of hip replacements performed in IS hospitals is actually substantial, and has been changing over time. Considering the presence of this non-NHS (outside) option clearly affects estimation results and policy implications. We estimate a micro-BLP model, and find that models of hospital choice which do not explicitly consider the outside option or do not control for endogeneity obtain seriously biased estimates of quantities used in health policy.

11:50
Intertemporal discrete choice

ABSTRACT. Choice probabilities of the multinomial logit depend on the scale of the value function. When applied to intertemporal choice, this implies a failure of stationarity, even though future values are discounted geometrically. As a consequence, patterns of choice following from the structure of the logit may be attributed to non-geometric discounting. We solve this problem by introducing the discounted Luce rule. It retains the flexibility of the logit while satisfying stationarity. Relaxations of stationarity give observable restrictions characterizing hyperbolic and quasi-hyperbolic discounting. In the estimation of time preference parameters from real data, the discounted Luce rule and the discounted logit produce substantially different results.
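The contrast the abstract draws, scale-dependence of the logit versus scale-invariance of the Luce rule, can be seen in two lines of code; function names are illustrative:

```python
import numpy as np

def logit_probs(v):
    """Multinomial logit (softmax): multiplying all values by a common
    discount factor changes the choice probabilities, hence the
    stationarity failure under geometric discounting."""
    e = np.exp(v - v.max())  # shift for numerical stability
    return e / e.sum()

def luce_probs(u):
    """Luce rule: probabilities proportional to positive values, hence
    invariant to a common multiplicative discount factor delta**t."""
    u = np.asarray(u, dtype=float)
    return u / u.sum()
```

Delaying every option by one period multiplies all (positive) values by the same delta; Luce probabilities are unchanged, while logit probabilities shift.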

12:15
Do free choice and control affect individual preferences for redistribution? Evidence from a sample of 74 countries

ABSTRACT. Why do some people believe that social mobility may result in a fair distribution of income? We hypothesize that people who enjoy higher levels of free choice and control over life outcomes are more likely to consider the income distribution process to be fair and, as a consequence, to be less supportive of redistribution. We test this claim using a triangular system in order to endogenize fairness in the process that determines income distribution. The empirical results support our hypothesis. Our findings represent a development in the empirical literature on the determinants of people’s preferences concerning income redistribution, in that until now the analysis has generally treated the level of fairness in social mobility as exogenous.

11:00-12:40 Session 9C: Factor models
Location: Room VII
11:00
Interpreting latent dynamic factors by threshold FAVAR model

ABSTRACT. This paper proposes a method, based on a threshold factor-augmented vector autoregression (FAVAR) model, to interpret factors which are otherwise difficult to assign economic meaning to. We observe how frequently the factor loadings are shrunk to zero when they fall below the estimated threshold in order to infer the economic relevance the factors carry. The results indicate that we can link the factors to particular economic activities, such as real activity or unemployment, without any prior specification of the data set. Exploiting the flexibility of FAVAR models in structural analysis, we examine impulse response functions of the factors and individual variables to a monetary policy shock. Our results show that the proposed method provides useful guidance for the interpretation of factors and shock transmission.

11:25
Unstable Diffusion Indexes: With an Application to Bond Risk Premia

ABSTRACT. This paper studies estimation and inference in diffusion indexes with structural instability. The factor model and the factor-augmented regression both experience a structural change, with different unknown break dates. In the factor model, we estimate factors and loadings by principal components and the break fraction by concentrated least squares; we obtain convergence rates of the estimators and introduce model selection criteria robust to the unknown break date. We consider least squares estimation of the factor-augmented regression and propose a break test. The empirical application uncovers potential instabilities in the linkages between bond risk premia and macroeconomic factors.

11:50
Evaluating Restricted Common Factor models for non-stationary data

ABSTRACT. We propose to evaluate exclusion or homogeneity restrictions on the loadings in non-stationary factor models on the basis of the number of factors estimated for the data partially de-factored under the null hypothesis, with the probability of rejecting a true null hypothesis estimated by the bootstrap. We show analytically that the proposed bootstrap procedure is asymptotically valid, and by simulation that it has good small sample properties.

12:15
Estimating Stable Latent Factor Models by Indirect Inference

ABSTRACT. Cross-sections of financial returns are characterized by common underlying factors and exhibit fat tails that may be captured by $\alpha$-stable distributions. This paper focuses on estimating factor models with independent latent factors and idiosyncratic noises featuring a multivariate $\alpha$-stable distribution constant over time (static factor models) or a time-varying conditional multivariate $\alpha$-stable distribution (GARCH factor models). Although the simulation of such a distribution is straightforward, the estimation of its parameters encounters difficulties. These difficulties are overcome in this paper by implementing the indirect inference estimation method with the multivariate Student's t as the auxiliary distribution.

11:00-12:40 Session 9D: Monetary policy
Location: Room VIII
11:00
Changing Monetary Regime: Does It Improve Macroeconomic Performance?

ABSTRACT. Inflation targeting and the Euro adoption represent the major monetary developments of recent decades. Their ultimate goal is to improve economic outcomes. Taylor (1979) argues that monetary policy efficiency (i.e. its influence on macroeconomic performance) can be evaluated on the basis of output and inflation variability, on which structural supply shocks have a detrimental effect that cannot be mitigated by the monetary authorities. We show that, once the heterogeneity of the supply shocks hitting the different economies under investigation is accounted for, inflation targeting has had a beneficial effect for countries which have adopted such a regime. The opposite conclusion can be drawn for the Euro monetary regime, where the adoption of the new monetary framework has had a negative impact on its efficiency relative to non-Euro countries.

11:25
Forecast uncertainty in the neighborhood of the effective lower bound: How much asymmetry should we expect?

ABSTRACT. The lower bound on interest rates has restricted the impact of conventional monetary policies over recent years and could continue to do so in the near future, with the decline in natural real rates not predicted to reverse any time soon. A binding lower bound on interest rates has consequences not only for point forecasts but also for the entire model forecast distribution. In this paper we investigate the ramifications of the lower bound constraint on the forecast distributions from DSGE models and the implications for risk and uncertainty. To that end we start out by making the case for regime-switching as a framework for imposing the lower bound constraint on interest rates in DSGE models. We then use the framework to investigate the implications of the lower bound constraint on the forecast distributions and try to answer the question of how much asymmetry we should expect when the lower bound binds. The results suggest that: i) a lower bound constraint need not in itself imply asymmetric fan charts, ii) the degree of asymmetry of fan charts depends on various factors such as the degree of interest rate smoothing and the degree of price rigidity, and iii) different approaches to imposing the lower bound yield different results for both the width of the fan charts and their asymmetry.

11:50
Macroeconomic responses to an independent monetary policy shock: a (more) agnostic identification procedure

ABSTRACT. This study investigates the effects of a monetary policy shock on real output and prices by means of a novel distribution-free nonrecursive identification scheme for structural vector autoregressions. Structural shocks are assumed to be mutually independent. The identification procedure is agnostic in the sense of Uhlig [2005], since the response of output to a monetary shock is left unrestricted. Moreover, assuming mutual independence of the shocks allows identification without imposing additional constraints derived from economic theory.

11:00-12:40 Session 9E: Session SIE - Measuring crime and politics in economics
Location: Room IX
11:00
The Legacy of Political History 1000-1800 for Attitudes Towards the State: Disaggregated Analysis for Italy

ABSTRACT. The need to build fiscal capacity in the territories that gained independence from the Holy Roman Empire after 1000 AD led to the emergence of political entities with more inclusive economic institutions. Long-term exposure to greater individual freedom (of economic and political initiative) and more productive public policies shaped the economic and political attitudes of the affected populations and their descendants, and these attitudes persist to this day. To test this hypothesis we reconstruct the political history of each location in Italy by building a yearly cell-level panel collecting information on all political entities ruling in the Italian territory over the period 1000-1800. The results document that in municipalities that belonged to these independent political entities tax evasion is today significantly lower, that the effect is stronger in places that first achieved independence than in places annexed later, and that it differs depending on the type of republican institution implemented (communal or maritime).

11:25
War of the Waves: Radio Propaganda, Violence and Political Polarization

ABSTRACT. Can counter-propaganda by a foreign democratic country help to overthrow an authoritarian regime? We analyze this question in the context of the Nazi-Fascist occupation of Italy during WWII by studying the effect of BBC radio counter-propaganda (Radio Londra) on the activities of partisan resistance fighters aimed at overthrowing the fascist regime. Using variation in monthly sunspot activity affecting the sky-wave propagation of BBC broadcasting towards Italy, we show that BBC radio had a significant impact on political violence. In particular, an increase in the signal-to-noise ratio of BBC radio in a given municipality-month was associated with an increase in the victims of Nazi-Fascist retaliations in response to partisan insurgents' attacks.

11:50
The Political Cost of Being Soft on Crime: Evidence from a Natural Experiment

ABSTRACT. We provide evidence on voters' response to crime policies. Our research design exploits a natural experiment arising from the Italian 2006 collective pardon bill, which suddenly released more than one third of the prison population. The design of the bill created idiosyncratic incentives to recidivate across pardoned individuals and municipalities. We show that municipalities where resident pardoned individuals' incentive to recidivate was higher experienced higher recidivism. At the same time, in these municipalities: i) newspapers were more likely to report crime news involving pardoned individuals; ii) voters held worse beliefs about the incumbent government's ability to control crime. Moreover, relative to the previous elections, the incumbent national government experienced a significantly worse electoral performance in the April 2008 elections compared to the opposition coalition. In terms of political cost, our estimates suggest that a one-standard-deviation increase in the incentive to recidivate (i.e., the random component in the effects of the policy) reduced the margin of victory of the incumbent national government by 3.3%. Overall, our findings indicate that voters hold incumbent politicians accountable by conditioning their vote on the observed effects of their policies.