ABSTRACT. An insidious form of market inefficiency, by which prices lose their informativeness and wealth is distributed arbitrarily, translates into V-shapes, that is, sudden changes in the sign of the price drift. We use this insight to develop a new tool for the detection of reverting drift, the V-statistic. We apply this tool to (i) quantify the extent of this kind of market inefficiency in the U.S. stock market during the Covid-19 pandemic; and (ii) show the harmful consequences of V-shapes for financial stability by estimating the large loss suffered by Italian taxpayers (0.45B euros) in May 2018, when a transient crash hit the secondary bond market during a Treasury auction.
Do jumps matter in Realized Volatility modeling and forecasting? Empirical evidence and a new model
ABSTRACT. Building on an extensive empirical analysis, I investigate the relevance of jumps and signed variations in predicting Realized Volatility. I show that properly accounting for intra-day volatility patterns and staleness substantially reduces the number of identified jumps. Realized Variance decompositions based on intra-day return size and sign improve the in-sample fit of the models commonly adopted in empirical studies. I also introduce a novel specification based on a more informative decomposition of Realized Volatility, which offers improvements over standard models. From a forecasting perspective, the empirical evidence I report shows that most models, irrespective of their flexibility, are statistically equivalent in many cases. This result is confirmed across different samples, liquidity levels, forecast horizons and transformations of the dependent and explanatory variables.
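As a rough illustration of the signed decompositions mentioned above, the following Python sketch splits daily realized variance into positive and negative semivariances after filtering large returns with a crude threshold rule; the filter, the threshold constant and the function names are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

def signed_rv_decomposition(intraday_returns, jump_threshold=4.0):
    """Split realized variance into positive/negative semivariances after
    removing returns flagged as jumps by a crude rule (|r| > threshold * robust scale)."""
    r = np.asarray(intraday_returns, dtype=float)
    scale = 1.4826 * np.median(np.abs(r - np.median(r)))   # robust volatility proxy
    continuous = r[np.abs(r) <= jump_threshold * scale]    # returns kept as "continuous"
    rv = np.sum(r ** 2)                                    # total realized variance
    rv_jump = rv - np.sum(continuous ** 2)                 # part attributed to jumps
    rv_pos = np.sum(continuous[continuous > 0] ** 2)       # positive semivariance
    rv_neg = np.sum(continuous[continuous < 0] ** 2)       # negative semivariance
    return {"RV": rv, "RV_jump": rv_jump, "RV_pos": rv_pos, "RV_neg": rv_neg}
```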
A Structural Model of Market Friction with Time-varying Volatility
ABSTRACT. We deal with the problem of extracting the volatility of a financial security when its prices are not frequently updated over time.
We propose a model of price formation in which the observed price varies only if the value of the information
signal is large enough to guarantee a profit in excess of transaction costs. Using transaction data only,
we extract: (i) the conditional volatility of the underlying security, which is thus cleansed of market frictions, and (ii) an estimate of transaction costs. We apply the model to a large dataset of intraday prices. The analysis reveals that, when correcting for transaction costs, the risk of illiquid securities is substantially different from what is predicted by traditional volatility models.
ABSTRACT. We study the properties of realized high-order moments under a data generating process accounting for key stylized features: infrequent discontinuities in unobserved equilibrium prices and staleness in observed prices, a phenomenon linked to volume dynamics. Our focus is on identification and pricing. In terms of identification, we show how the interplay between price discontinuities and price staleness will, in general, lead to biased and/or noisy high-order moment estimates. We also show how a combination of thresholding and corrections for staleness-induced biases can be deployed to extract reliable information about high-order continuous and discontinuous variation. Regarding pricing, the use of thresholding and de-biasing leads to ample evidence about the negative cross-sectional pricing of idiosyncratic price discontinuities at high frequency. We show that accounting for staleness is (1) important for the correct identification of high-order moments, (2) revealing about these moments' cross-sectional pricing and (3) informative about the pricing of illiquidity, for which staleness is a rich proxy.
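For readers unfamiliar with threshold-based jump identification, the sketch below separates continuous from discontinuous variation with a standard truncation rule and computes realized skewness and kurtosis on the truncated returns; it is a generic illustration and does not include the staleness corrections developed in the paper.

```python
import numpy as np

def thresholded_realized_moments(r, c=3.0):
    """Continuous vs. jump variation via a standard truncation rule:
    returns with |r_i| > c * sqrt(BV) * n**(-0.49) are classified as jumps."""
    r = np.asarray(r, dtype=float)
    n = r.size
    bv = (np.pi / 2) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))   # bipower variation
    cutoff = c * np.sqrt(bv) * n ** (-0.49)                     # truncation level
    cont = r[np.abs(r) <= cutoff]                               # "continuous" returns
    rv_cont = np.sum(cont ** 2)
    rskew = np.sqrt(n) * np.sum(cont ** 3) / rv_cont ** 1.5     # realized skewness
    rkurt = n * np.sum(cont ** 4) / rv_cont ** 2                # realized kurtosis
    rv_jump = np.sum(r ** 2) - rv_cont                          # discontinuous part
    return {"RV_cont": rv_cont, "RV_jump": rv_jump, "RSkew": rskew, "RKurt": rkurt}
```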
Nowcasting with Large Bayesian Vector Autoregressions
ABSTRACT. Monitoring economic conditions in real time, or nowcasting, is among the key tasks routinely performed by economists. Nowcasting entails some key challenges, which also characterize modern Big Data analytics, often referred to as the three "Vs": the large number of time series continuously released (Volume), the complexity of the data, covering various sectors of the economy and published asynchronously, with different frequencies and precision (Variety), and the need to incorporate new information within minutes of its release (Velocity).
In this paper, we explore alternative routes to bring Bayesian Vector Autoregressive
(BVAR) models up to these challenges. We find that BVARs are able to effectively handle the three Vs and produce, in real time, accurate probabilistic predictions of US economic activity and, in addition, a meaningful narrative by means of scenario analysis.
A Bayesian Model Averaging Analysis for Propensity Score Matching in Tax Rebate
ABSTRACT. Propensity Score Matching (PSM) is a popular approach to evaluating treatment effects in observational studies. While model selection for the PS estimation is often naive in practice, the choice of variables in this step is crucial, because the resulting treatment effect estimate depends heavily on it. We propose dealing with such model uncertainty through Bayesian Model Averaging (BMA) for the PS model, and we present an empirical application based on the 2014 Italian tax credit reform (the so-called “Renzi bonus”). We show that model uncertainty substantially affects the estimated treatment effects and how the proposed BMA-based estimator helps to drastically reduce it.
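A minimal sketch of the general idea of averaging matching estimates over several propensity score specifications; the BIC-based weights are a crude approximation to posterior model probabilities, the nearest-neighbour matching is deliberately simple, and the candidate covariate sets are hypothetical, so this is not the estimator proposed in the paper.

```python
import numpy as np
import statsmodels.api as sm

def att_nn_matching(y, d, ps):
    """ATT via 1-nearest-neighbour matching on the propensity score."""
    treated, controls = np.where(d == 1)[0], np.where(d == 0)[0]
    matched = controls[np.abs(ps[controls][None, :] - ps[treated][:, None]).argmin(axis=1)]
    return (y[treated] - y[matched]).mean()

def bma_att(y, d, X, candidate_sets):
    """Average the ATT over candidate PS models, weighting by BIC-based weights.
    candidate_sets: e.g. [[0, 1], [0, 1, 2], [0, 2, 3]] -- hypothetical covariate subsets."""
    atts, bics = [], []
    for cols in candidate_sets:
        logit = sm.Logit(d, sm.add_constant(X[:, cols])).fit(disp=0)   # PS model
        atts.append(att_nn_matching(y, d, logit.predict()))            # ATT for this model
        bics.append(logit.bic)
    w = np.exp(-0.5 * (np.array(bics) - min(bics)))                    # approximate model weights
    w /= w.sum()
    return float(np.dot(w, atts))
```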
ABSTRACT. This paper extends the horseshoe prior of Carvalho et al. (2010) to Bayesian quantile regression (HS-BQR) and provides a fast sampling algorithm that speeds up computation significantly in high dimensions. The performance of the HS-BQR is tested in large-scale Monte Carlo simulations and in a high-dimensional Growth-at-Risk (GaR) forecasting exercise for the U.S. The Monte Carlo design considers several sparsity structures (sparse, dense, block) and error structures (i.i.d. and heteroskedastic errors). Compared to alternative shrinkage priors, the proposed HS-BQR yields similar or better performance when evaluated in terms of coefficient bias and forecast error. We find that the HS-BQR is particularly potent in sparse designs and when estimating extreme quantiles. The simulations also highlight that, in order to identify quantile-specific location and scale effects for individual regressors in dense DGPs, a lot of data are necessary. In the GaR application, we forecast tail risks as well as complete forecast densities using the McCracken database. Quantile-specific and density calibration scoring functions show that the HS-BQR provides the best performance, especially at short and medium horizons. The ability to produce well-calibrated density forecasts and accurate downside risk measures in large data contexts makes the HS-BQR a promising tool for nowcasting applications and recession modelling in the face of the Covid-19 pandemic.
Sampling properties of the Bayesian posterior mean with an application to WALS estimation
ABSTRACT. Many statistical and econometric learning methods rely on Bayesian ideas, often applied or reinterpreted in a frequentist setting. Two leading examples are shrinkage estimators and model averaging estimators, such as weighted-average least squares (WALS). In many instances, the accuracy of these learning methods in repeated samples is assessed using the variance of the posterior distribution of the parameters of interest given the data. This may be permissible when the sample size is large because, under the conditions of the Bernstein--von Mises theorem, the posterior variance agrees asymptotically with the frequentist variance. In finite samples, however, things are less clear. In this paper we explore this issue by first considering the frequentist properties (bias and variance) of the posterior mean in the important case of the normal location model, which consists of a single observation on a univariate normal distribution with unknown mean and known variance. Based on these results, we derive new estimators of the frequentist bias and variance of the WALS estimator in finite samples. We then study the finite-sample performance of the proposed estimators by a Monte Carlo experiment with design derived from a real data application about the effect of abortion on crime rates.
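For concreteness, in the normal location model with a conjugate normal prior (an assumption made here purely for illustration), the posterior mean and its frequentist bias and variance can be written explicitly, which makes the gap between posterior variance and sampling variance visible:

% Sketch: single observation x ~ N(theta, sigma^2), prior theta ~ N(mu_0, tau^2)
\[
\hat\theta(x) \;=\; \mathbb{E}[\theta \mid x] \;=\; w\,x + (1-w)\,\mu_0,
\qquad w = \frac{\tau^2}{\tau^2 + \sigma^2},
\]
\[
\operatorname{Bias}(\hat\theta) = (1-w)(\mu_0 - \theta),
\qquad
\operatorname{Var}(\hat\theta) = w^2\sigma^2,
\qquad
\operatorname{Var}(\theta \mid x) = w\,\sigma^2 .
\]

Since \(w \le 1\), the posterior variance \(w\sigma^2\) is never smaller than the frequentist variance \(w^2\sigma^2\), yet it ignores the squared bias \((1-w)^2(\mu_0-\theta)^2\); this is the kind of finite-sample discrepancy the paper investigates.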
Testing the Adequacy of the Fixed Effects Estimator in the Presence of Cross-section Dependence
ABSTRACT. A large literature on modelling cross-section dependence in panels has been developed via interactive effects. One area of contention in this literature is the important hypothesis concerning whether the regressors are correlated with the factor loadings or not. Under the null hypothesis that the regressors are uncorrelated with the loadings, we can simply use the consistent and robust two-way fixed effects (FE) estimator without employing any complex econometrics such as the principal component (PC) or the common correlated effects estimators. Recently, Kapetanios, Serlenga, and Shin (2020) proposed a Hausman-type test which addresses this issue, but its implementation requires computing the PC estimator. In this paper we propose an LM-type test that addresses this drawback. Furthermore, we show that the Hausman-type test is sensitive to misspecification of the panel model, causing incorrect rejection of the null, while the LM-type test is invariant to such misspecification. Monte Carlo simulation results confirm the satisfactory size and power performance of the LM-type test even in small samples. Finally, we provide quite extensive empirical evidence in favour of no correlation between the regressors and the factor loadings in practice. In this situation, the FE estimator would provide a simple but robust estimation strategy even for cross-sectionally correlated panels.
The identification of time-invariant variables in a fixed effect framework
ABSTRACT. This paper proposes a new set of moment conditions for the identification
of the effects of time-invariant variables in a linear panel data framework
with fixed effects. We show that identification of time-invariant variables
can be achieved under the assumptions that (i) the correlation between the
individual effect and the regressors is constant over time, and (ii) the time-
invariant variables are correlated with the within variability of the time-
variant variables (e.g., their growth rate). A set of Monte Carlo experiments
is performed, and the proposed methodology is applied for the identification
of the role of appropriability conditions in fostering R&D efforts.
Spatial Technological Clubs across Europe: A Panel Data Model with Cross Sectional Dependence
ABSTRACT. The aim of this paper is to evaluate whether the European regions form regional clusters that differ from political borders. We analyse the interdependence of production efficiency and technology clusters across regions and estimate the degree of regional technological interdependence generated by spatial externalities. The net effect of these spatial externalities on the productivity of each region depends on its relative connectivity with its neighbors and on its closeness to the efficiency-frontier regions (those with the highest production levels). The more a given region is connected to its neighbors, the more it benefits from spatial externalities; the more efficient a region is, the more it profits from technological externalities.
By proposing a new approach to modelling spatial heterogeneity, we consider both strong and weak spatial dependence and model the TFP spatial clusters of European regions in a production-frontier panel data model with global factors. Our approach differs from others because we model heterogeneous spatial dependence in a stochastic frontier panel data model in a framework where we also control for strong cross-sectional dependence due to global factors (Mastromarco et al., 2013, 2016). We use data from the Cambridge Econometrics European Regional Database, which contains annual observations for the period 1980-2015 for NUTS3 EU-25 regions on employment (thousands of people), hours worked, gross fixed capital formation (millions of euros, 2005 prices) and gross value added (millions of euros, 2005 prices).
Partial effects estimation for fixed-effects logit panel data models
ABSTRACT. We propose a multiple-step procedure to estimate Average Partial Effects (APE) in fixed-effects panel logit models. Because the incidental parameters problem plagues the APEs through the inconsistent estimates of both the slope coefficients and the individual parameters, we reduce the bias by evaluating the APEs at a fixed-T consistent estimator of the slopes and at a bias-reduced estimator of the unobserved heterogeneity. The proposed estimator has bias of order O(1/T^2) as n goes to infinity and performs well in finite samples, even when n is much larger than T. We provide a real-data application based on the labor supply of married women.
Too Good is Bad? Exuberance Indicators and the Business Cycle
ABSTRACT. Following the recent revival of endogenous business cycle theories, which highlight the predictability of changes in business cycle phases, we provide a first comprehensive test of the associated ``too good is bad" hypothesis. The literature so far has convincingly shown that rapid credit growth predicts financial instability and is followed by a deeper recession. We employ the concept of ``exuberance indicators" and extend the list of these indicators beyond credit by considering several other domestic overheating and external imbalance indicators. In a panel of 25 countries, we show that our exuberance indicators - besides credit market exuberance - convey important information about the probability of a future recession, controlling for the classical recession predictors (term spread, stock market return, short-term interest rates) and sentiment indicators. In a causal analysis, we show that domestic credit supply and global financial shocks push exuberance indicators to hazardous values associated with a higher recession risk. Our counterfactual analysis indicates that, had global financial shocks been shut down during 2002--2006, the probability of the 2007--2009 recession would have declined on average by 11 percentage points.
ABSTRACT. We apply a new time series model – the Generalized Logistic Smooth Transition Autoregression – to macroeconomic and fiscal data for the U.S. economy. The model is capable of incorporating cyclical oscillations characterized by asymmetry in levels and duration (or dynamic asymmetry). A specific-to-general econometric strategy and two LM-type tests are implemented in order to test the null hypotheses that these variables are linear or symmetric. The evidence suggests that dynamic asymmetry is a reasonable hypothesis for the postwar data up to 2008 and that the fiscal multiplier computed from dynamically asymmetric specifications is radically different from the one obtained from an equivalent symmetric model, especially in the medium run. Public spending is largely effective in recession, both in the short and in the medium run. Finally, the improved flexibility of the transition function used here constitutes a new methodological challenge for multivariate analysis.
Modeling and Forecasting Macroeconomic Downside Risk
ABSTRACT. We investigate the relation between downside risk to the economy and the financial
markets within a fully parametric model. We characterize the complete predictive distribution of GDP growth employing a Skew-t distribution with time-varying location, scale, and shape, for which we model both secular trends and cyclical changes. Episodes of downside risk are characterized by increasing negative asymmetry, which emerges as a clear feature of the data. Negatively skewed predictive distributions arise ahead of and during recessions, and tend to be anticipated by a tightening of financial conditions. Indicators of excess leverage and household credit outstanding are found to be significant drivers of downside risk. Moreover, the Great Recession marks a significant shift in the unconditional distribution of GDP growth, which has featured a distinct negative skewness since then. The model delivers competitive out-of-sample (point and density) forecasts, improving upon standard benchmarks, especially due to financial conditions providing a strong signal of increasing downside risk.
Fiscal space and the size of the Fiscal Multiplier
ABSTRACT. This paper investigates the interaction between fiscal policy transmission and
fiscal sustainability, captured through the concept of fiscal space. In order to measure the evolution of fiscal space over time we propose four indicators, drawing from different concepts available in the literature. We use these indicators to define periods of ample and tight fiscal space. We then estimate the effects of government spending shocks in the United States according to the level of fiscal space, for the period 1929:Q1-2015:Q4. The main finding of the paper is that the fiscal multiplier is above one when fiscal space is ample, while it is below one when fiscal space is tight. Moreover, this difference is always significant. This result is very robust across different identification methods and samples.
ABSTRACT. Asset prices are stale. We define a measure of systematic (market-wide) staleness as the percentage of small price adjustments over multiple assets. A notion of idiosyncratic (asset-specific) staleness is also established. For both systematic and idiosyncratic staleness, we provide a limit theory based on joint asymptotics relying on increasingly-frequent observations over a fixed time span and an increasing number of assets. Using systematic and idiosyncratic staleness as moment conditions, we introduce novel structural estimates of market liquidity and funding liquidity based on transaction prices only. The estimates yield revealing information about the dynamics of the two notions of liquidity and their interaction.
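To fix ideas, a toy computation of the two staleness notions could proceed as below; treating a "small price adjustment" as a zero (or below-tolerance) price change and defining the systematic measure over intervals in which all assets are stale simultaneously are simplifying assumptions, not the paper's definitions or its asymptotic framework.

```python
import numpy as np

def staleness_measures(prices, tol=0.0):
    """prices: (T+1) x N panel of intraday prices (rows = times, cols = assets).
    Idiosyncratic staleness: per-asset share of 'small' price adjustments.
    Systematic staleness: share of intervals in which all assets are stale at once."""
    dp = np.abs(np.diff(np.asarray(prices, dtype=float), axis=0))
    stale = dp <= tol                        # T x N indicator of a stale interval
    idiosyncratic = stale.mean(axis=0)       # one share per asset
    systematic = stale.all(axis=1).mean()    # all assets simultaneously stale
    return idiosyncratic, systematic
```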
ABSTRACT. We suggest the Doubly Multiplicative Error class of models (DMEM) for modeling and forecasting realized volatility, which combines two components accommodating low- and high-frequency features in the data, respectively. We derive the theoretical properties of the Maximum Likelihood and Generalized Method of Moments estimators. Two such models are then proposed: the Component-MEM, which uses daily data for both components, and the MEM-MIDAS, which exploits the logic of MIxed-DAta Sampling (MIDAS). The empirical application involves the S&P 500, NASDAQ, FTSE 100 and Hang Seng indices: irrespective of the market, both DMEMs outperform the HAR and other relevant GARCH-type models.
Shrinkage for Gaussian and t Copulas in Ultra-High Dimensions
ABSTRACT. Copulas are a convenient framework for synthesizing joint distributions, particularly in higher dimensions. Currently, copula-based high-dimensional settings are used for as many as a few hundred variables and require large data samples for estimation to be precise. In this paper, we employ shrinkage techniques for large covariance matrices in the estimation of Gaussian and t copulas whose dimensionality goes well beyond what is typical in the literature. Specifically, we use the covariance matrix shrinkage machinery of Ledoit and Wolf to estimate correlation matrices of Gaussian and t copulas for up to thousands of variables, using sample sizes up to 20 times smaller. The simulation study shows that shrinkage estimation significantly outperforms traditional estimators, especially in high dimensions. We also apply this approach to the allocation of large portfolios.
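As an illustration of applying Ledoit-Wolf shrinkage to a copula correlation matrix, the sketch below maps each margin to normal scores and shrinks the covariance of the scores; the rank-based transform and the use of scikit-learn's LedoitWolf estimator are assumptions made for the example, not the paper's implementation.

```python
import numpy as np
from scipy.stats import norm, rankdata
from sklearn.covariance import LedoitWolf

def shrunk_gaussian_copula_corr(X):
    """Gaussian-copula correlation matrix with Ledoit-Wolf shrinkage.
    X: n x p data matrix. Steps: ranks -> normal scores -> shrunk covariance -> correlation."""
    X = np.asarray(X, dtype=float)
    n, _ = X.shape
    U = np.apply_along_axis(rankdata, 0, X) / (n + 1)   # pseudo-observations per margin
    Z = norm.ppf(U)                                     # normal scores
    S = LedoitWolf().fit(Z).covariance_                 # shrunk covariance of the scores
    d = np.sqrt(np.diag(S))
    return S / np.outer(d, d)                           # rescale to a correlation matrix
```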
LSTUR Regression Theory and the Instability of the Sample Correlation Coefficient Between Financial Return Indices
ABSTRACT. It is well known that the sample correlation coefficient between many financial return indices exhibits substantial variation over any reasonable sampling window. This stylized fact contradicts a unit root model for the underlying processes in levels, as the statistic converges in probability to a constant under this modeling scheme. In this paper we establish asymptotic theory for regression in local stochastic unit root (LSTUR) variables. An empirical application reveals that the new theory explains very well the instability, in both sign and scale, of the sample correlation coefficient between gold, oil and stock return price indices. In addition, we establish spurious regression theory for LSTUR variables, which generalizes the results known hitherto, as well as theory for balanced regression in this setting.
The Global Information Effect: Central Bank Information, International Spillovers, and Flight to Quality
ABSTRACT. In this paper, we provide evidence that when unexpected increases in the US policy rate are associated with increases in equity prices (defined as a positive central bank information shock), the US dollar depreciates. We argue that this phenomenon occurs because investors revise their assessment of the level of financial risk in the economy in response to Federal Reserve announcements. Downward revisions in the level of financial risk perceived by investors lead capital to flow towards emerging markets in pursuit of higher yields. Conversely, upward revisions in the level of financial risk perceived by investors lead capital to flow towards safe-haven currencies, causing an appreciation of the US dollar and safe-haven currencies vis-à-vis emerging market currencies. In support of this hypothesis, we provide evidence of large spillover effects onto global safe-haven currencies, risk premia, cross-border credit and portfolio flows, and, ultimately, onto global economic activity. We argue that these findings suggest the presence of a global information effect channel, whereby information released by the Federal Reserve drives international capital flows and affects economic activity in the rest of the world.
ABSTRACT. We investigate the effects of the Basel II and III accords on the cost and profit inefficiency of profit-oriented banks and mutual cooperative banks in Italy over the 1994-2015 period. We also consider the impact of regulation under different market structures. To measure banks' efficiency, a Stochastic Frontier approach is adopted. Moreover, we evaluate the impact of the introduction of Basel II and III by modelling the banks' inefficiency component. We show that the implementation of the Basel regulation had asymmetric effects on the Italian banking system. The evidence provided in this paper is somewhat mixed. Indeed, the general picture that emerges from our analysis is that Basel II and III worsened the profit inefficiency of the overall Italian banking system, though cost efficiency among mutual cooperative banks increased after the introduction of Basel II. Moreover, the impact of the regulation is found to be not statistically significant once we control for the degree of market competition.
The Upside Down: French banks under negative policy rates
ABSTRACT. Using confidential bank-firm level data for France, we show that the introduction of negative rates is associated with an increase in lending by banks with greater reliance on deposits, especially those with lower capital and larger shares of liquid and household deposits. Consistent with portfolio rebalancing, negative rates elicit a reallocation toward riskier and longer-term assets, as banks shrink their share of interbank liquidity and increase that of corporate loans and debt securities. These results suggest that negative rates encourage the banks most reliant on deposits to engage in riskier activities to restore profitability, and confirm that deposits play a key role in the transmission of monetary policy at rates below the zero lower bound.
ABSTRACT. We analyze the relationship between market structure and financial stability both theoretically and empirically, considering two types of agents in the credit market: profit-oriented banks (PBs) and mutual cooperative banks (MBs). The main theoretical finding is that, provided the soft information produced by MBs is effective in reducing credit risk, a less concentrated market structure reduces instability for both types of banks. The empirical evidence, based on the Italian context, reveals a U-shaped relationship between market concentration and bank instability: lower concentration reduces instability, but beyond a certain point higher concentration increases it.
Bargaining power and the Phillips curve: a micro-macro analysis
ABSTRACT. We use a general equilibrium model to show that a decrease in workers' bargaining power amplifies the relative contribution of adjustments along the extensive margin of labour utilization to the output gap. This mechanism reduces the cyclical movements of marginal cost (and inflation) relative to those of the output gap.
We show that the relationship between bargaining power and adjustments along the extensive margin (relative to the intensive one) is supported by micro data, relying on panel data from the Italian survey of industrial firms. The Bayesian estimation of the model on euro area aggregate data over the 1970-1990 and 1991-2016 samples confirms that the decline in workers' bargaining power has weakened the inflation-output gap relationship.
Micro level data for macro models: the distributional effects of monetary policy.
ABSTRACT. In this paper we investigate the effect of standard and non-standard monetary policy implemented by the ECB on income inequality in Italy.
We use, for the first time, micro-level survey data on Income and Living Conditions (EU-SILC, Istat) in a repeated cross-section setting to build measures of inequality and of the distribution of income over time, for total income and for subgroups of individuals. The identification strategy is based on monetary policy surprises estimated from the EA-MPD database for the Euro Area.
Using a battery of Local Projections, we evaluate the impact of monetary policy by comparing the impulse response functions of our inequality measures in different policy scenarios (pre- and post-QE). The main findings show that an expansionary unconventional monetary policy shock compressed inequality of disposable, labor and financial income more persistently than a conventional monetary shock. These effects are heterogeneous and seem to benefit mostly the bottom of the distribution. The impact on financial wealth is ambiguous, favoring wealthy households mainly in the short run. Overall, our evidence suggests that QE is associated with a decrease in inequality among Italian households.
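A bare-bones version of the local projection step referred to above might look as follows; the variable names, lag choices and HAC settings are placeholders rather than the specification adopted in the paper.

```python
import numpy as np
import statsmodels.api as sm

def local_projection_irf(y, shock, controls, horizons=12):
    """Jorda-style local projections: for each horizon h, regress y_{t+h} on the
    shock at t (plus controls) and collect the shock coefficient as the impulse response."""
    y, shock, controls = map(np.asarray, (y, shock, controls))
    irf = []
    for h in range(horizons + 1):
        yh = y[h:]                                              # outcome shifted h periods ahead
        X = sm.add_constant(np.column_stack([shock, controls])[: len(yh)])
        fit = sm.OLS(yh, X).fit(cov_type="HAC", cov_kwds={"maxlags": h + 1})
        irf.append(fit.params[1])                               # coefficient on the shock
    return np.array(irf)
```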
Labor Market Power and Between-Firm Wage (In)Equality
ABSTRACT. I study how labor market power affects between-firm wage differences using German manufacturing sector data from 1995 to 2016. Over time, firm labor market power (the difference between wages and marginal revenue products of labor (MRPL)) increasingly moderated rising between-firm wage inequality. This is because high-paying firms possess high and increasing labor market power and pay wages below their MRPL, whereas wages equal the MRPL in low-wage firms. Over time, large, high-wage, high-MRPL firms generate increasingly large labor market rents while being active in competitive product markets. This provides novel insights into why such “superstar firms” are profitable and successful.
ABSTRACT. We propose a large structural VAR model with an identification scheme based on sign restrictions to identify several structural shocks. We disentangle supply and demand shocks from labor market shocks and quantify their importance for economic fluctuations. We take advantage of data on the labor force participation rate to impose additional restrictions, gauge its responsiveness to shocks and identify an additional shock (the price mark-up shock) that enhances participation in the labor market. For our experiment, we use quarterly Italian data spanning the period 2000Q1-2018Q4 in an attempt to shed light on the ongoing debate among researchers and policymakers about the development of the Italian labor market. Our results show that labor market shocks are among the largest drivers of macroeconomic fluctuations. We find that the wage bargaining shock plays a central role in driving up macroeconomic volatility in the short run, while the relevance of labor supply shocks is pronounced in the medium and long run. The role of the mismatch shock in explaining macroeconomic dynamics is sizable but limited for the business cycle. The dominance of labor market shocks is amplified across horizons and various alternative specifications.
ABSTRACT. In this paper we investigate the relationship between firm-specific investor sentiment, measured by applying text analysis on news stories published by Thomson
Reuters, and merger and acquisition (M&A) deals announced by US listed companies
between 1997 and 2018. We find that a more positive investor sentiment increases the
probability that firms announce acquisitions; moreover, we investigate a number of
potential channels that could explain such a relationship. In this respect, we do not find
that the overvaluation hypothesis or the catering theory can account for
the impact of investor sentiment on acquisition announcements. Instead, by studying the short- and long-run stock market reaction to merger announcements and its
relationship with investor sentiment, we find a positive short-run correlation which is
reversed in the long-run. These results provide evidence for the overoptimism theory
of mergers, which states that in periods characterized by more optimistic investor
sentiment, managers are more inclined to pursue acquisitions and these are better
perceived by the stock market, even though they perform worse in the long-run.
A sentiment-based risk indicator for the Mexican financial sector
ABSTRACT. We apply sentiment analysis to Twitter messages in Spanish to build a Sentiment Risk Index for the financial sector in Mexico. Using a sample of tweets that covers the period 2006-2019, we classify the tweets considering whether they reflect a positive or negative shock on Mexican banks, or whether they are merely informative. We compare the performance of three classifiers: the first based on word polarities from a pre-defined dictionary, the second on a Support Vector Machine Classifier and the third on Neural Networks. We find that the Support
Vector Machine classifier has the best performance of the three we test. We also compare the proposed Sentiment Risk Index with existing indicators of financial stress based on quantitative variables. We find that this novel index captures the effect of sources of financial stress that are not explicitly reported in quantitative risk measures, such as financial frauds, failures in payment systems and money laundering. We also show that a shock in the Twitter Sentiment Index increases stock market volatility and foreign exchange rate volatility, having a significant effect on overall financial market risk, especially for the private sector.
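A minimal sketch of the SVM-based classification step described above, using TF-IDF features and a linear support vector classifier from scikit-learn; the example tweets and labels are invented placeholders, and the actual feature construction and tuning in the paper may differ.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Hypothetical labelled sample: tweet texts and labels in {"negative", "positive", "informative"}.
tweets = [
    "banco X reporta fallas en transferencias",   # negative shock
    "resultados solidos del banco Y",             # positive shock
    "nueva app del banco Z disponible",           # merely informative
]
labels = ["negative", "positive", "informative"]

clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=1),   # unigram/bigram TF-IDF features
    LinearSVC(C=1.0),                                # linear support vector classifier
)
clf.fit(tweets, labels)
print(clf.predict(["posible fraude detectado en banco Z"]))
```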
Do words hurt more than actions? The impact of trade tensions on financial markets
ABSTRACT. In this paper, we apply textual analysis and machine learning algorithms to the US President's tweets to construct an index capturing trade tensions between the US and China. Our indicator matches well-known events in the US-China trade dispute and is exogenous to developments in global financial markets. By means of local projection methods, we show that US markets are largely unaffected by rising trade tensions, with the exception of those firms that are more exposed to China, while the same shock negatively affects stock market indices in EMEs and China. Higher trade tensions also entail: i) an appreciation of the US dollar; ii) a depreciation of EME currencies; iii) muted changes in safe-haven currencies; iv) portfolio re-balancing between stocks and bonds in the EMEs.
We also show that trade tensions account for around 15% of the variance of Chinese stocks while their contribution is muted for US markets. These findings suggest that the US-China trade tensions are interpreted as a negative demand shock for the Chinese economy rather than as a global risk shock.
ABSTRACT. Investor sentiment is measured at both global and local levels as the common component of pricing errors investors make when valuing stocks. Investor sentiment and macroeconomic factors are jointly modelled within a hierarchical dynamic factor model allowing for time-varying parameters and stochastic volatility. We extend existing methods to enable estimation of the model with the prescribed hierarchy which permits cross-country analysis. Our approach allows us to control for macroeconomic conditions that may contaminate investor sentiment indices. We find that global investor sentiment is a key driving force behind domestic sentiment and global economic conditions.