# Publications

## Articles in journals

We focus on forecasting the probability that euro-area inflation will fall into one of three intervals by employing an ordered multinomial model augmented with macroeconomic variables. We directly forecast the probability that the expected euro area HICP price index inflation rate (12-month percent changes) over the next 12 and 24 months will be less than 1.5 percent, exceed 2 percent, or lie between these two values. The model includes many predictors, and we deal with the resulting dimensionality issues through an approach that mixes factor models with Bayesian shrinkage. Our results show that including macroeconomic variables improves the model’s forecast quality, especially at the longer horizon considered. The Deflationary Pressure Index coincides with the probability that inflation will be below 1.5 percent on average over the next 24 months, and it is useful as a policy monitoring tool.
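
A minimal sketch of how an ordered model of this kind maps a latent index into the three interval probabilities; the coefficients, cutpoints, and factor values below are illustrative placeholders, not estimates from the paper:

```python
import numpy as np

def ordered_logit_probs(index, cutpoints):
    """Probabilities of the three inflation intervals from an ordered-logit
    latent index and two cutpoints c1 < c2."""
    cdf = lambda z: 1.0 / (1.0 + np.exp(-z))   # logistic CDF
    c1, c2 = cutpoints
    p_low = cdf(c1 - index)                    # inflation < 1.5 percent
    p_mid = cdf(c2 - index) - p_low            # between 1.5 and 2 percent
    p_high = 1.0 - cdf(c2 - index)             # inflation > 2 percent
    return np.array([p_low, p_mid, p_high])

# Illustrative latent index built from two hypothetical macro factors
beta = np.array([0.8, -0.5])
factors = np.array([0.3, 1.1])
probs = ordered_logit_probs(factors @ beta, cutpoints=(-0.5, 0.5))
print(probs)                                   # the three probabilities sum to 1
```

In the paper the index would be driven by many predictors compressed via factors and Bayesian shrinkage; the sketch only shows the final mapping from index to interval probabilities.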

We map ECB policy communications onto yield curve changes and study the information flow on monetary policy decision dates. We find that different monetary policy measures exert effects on different segments of the interest rate term structure, with policy rate changes mostly influencing the short end of the curve and quantitative easing measures acting more on the long end. The impact of forward guidance policies, by contrast, reaches its peak at intermediate maturities. A very useful by-product of this work is the publicly available Euro Area Monetary Policy Event-Study Database (EA-MPD), containing intraday asset price changes.

We map ECB policy communication into yield curve changes and study the information flow on policy dates. A by-product is the publicly available Euro Area Monetary Policy Event-Study Database (EA-MPD), containing intraday asset price changes. We find that Policy Target, Forward Guidance and Quantitative Easing factors capture essentially all the variation in the yield curve, with different factors appearing in the windows covering the policy decision announcement and the press conference, and having time-varying variance shares. We study sovereign yields, exchange rates, stock prices, the persistence of effects and response asymmetries. Our methodology can be implemented for any policy-related event.

Catholic countries of Europe pose a demographic puzzle: fertility is unprecedentedly low (total fertility = 1.3) despite low female labor force participation. We model three channels of religious effects on the demand for children: changing norms, reduced market wages, and reduced costs of childrearing. We estimate their effects using new panel data on church attendance and clergy employment for thirteen European countries from 1960 to 2000, spanning the Second Vatican Council (1962-65). Catholic theology is uniform across countries, yet service provision varied considerably across countries and over time, especially before the Council, reflecting differences in Church provision of education, health, welfare and other social services. We use differential declines in service provision (measured by nuns per capita) to identify its effect on fertility, controlling for secular trends. The effects are large: 300 to 400 children per nun. Reduced religiosity (measured by church attendance) has no effect for Protestants, but predicts fertility decline for Catholics. The data suggest that service provision and religiosity complement each other, a finding consistent with preferential provision of services to church attendees. Nuns outperform priests in predicting fertility, suggesting that the childrearing cost channel dominates theology and norms.

The dynamic behavior of the term structure of interest rates is difficult to replicate with models, and even models with a proven track record of empirical performance have underperformed since the early 2000s. Survey expectations, on the other hand, can accurately predict yields, but they are typically not available for all maturities and/or forecast horizons. We show how survey expectations can be exploited to improve the accuracy of yield curve forecasts given by a base model. We do so by employing a flexible exponential tilting method that anchors the model forecasts to the survey expectations, and we develop a test to guide the choice of the anchoring points. The method implicitly incorporates into yield curve forecasts any information that survey participants have access to (such as information about the current state of the economy or forward-looking information contained in monetary policy announcements) without the need to model it explicitly. We document that anchoring delivers large and significant gains in forecast accuracy relative to the class of models that are widely adopted by financial and policy institutions for forecasting the term structure of interest rates.
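
A minimal sketch of the exponential tilting idea in its simplest scalar form: reweight equal-probability draws from a base forecast so that the tilted mean matches a survey expectation. The data, the survey value, and the one-dimensional setting are illustrative assumptions; the paper's method handles the full yield curve and a test for choosing anchoring points.

```python
import numpy as np
from scipy.optimize import brentq

def tilt_to_survey(draws, survey_mean):
    """Reweight equal-probability forecast draws by exponential tilting
    so that the tilted mean matches the survey expectation (scalar case)."""
    mu = draws.mean()                       # center for numerical stability

    def weights(lam):
        w = np.exp(lam * (draws - mu))
        return w / w.sum()

    # Solve for the tilting parameter that closes the gap to the survey mean
    lam = brentq(lambda l: weights(l) @ draws - survey_mean, -10.0, 10.0)
    return weights(lam)

rng = np.random.default_rng(0)
draws = rng.normal(2.0, 1.0, size=5000)     # hypothetical base-model yield forecasts, in percent
w = tilt_to_survey(draws, survey_mean=2.5)  # anchor the mean to a survey value
print(w @ draws)                            # matches 2.5 by construction
```

Among all reweightings that satisfy the moment constraint, the exponential tilt is the one closest to the equal-weight base forecast in Kullback-Leibler divergence, which is why the base model's shape is preserved as much as possible.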

In this paper we show that the results presented in the seminal paper by Yogo, A Consumption-Based Explanation of Expected Stock Returns, cannot be replicated. We find different estimates for the parameters, and we obtain values of the overidentification test statistics that are much larger than those in the original paper and that indicate rejection of the durable consumption asset pricing model. By careful inspection of Yogo’s replication files, we were able to track the inconsistency down to a coding bug. The rejection of the durable model is exemplified by its inability to simultaneously explain the risk-free rate and excess stock returns.

We consider Bayesian estimation of state space models when the measurement density is not available but estimating equations for the parameters of the measurement density are available from moment conditions. The most common applications are partial equilibrium models involving moment conditions that depend on dynamic latent variables (e.g., time-varying parameters, stochastic volatility) and dynamic general equilibrium models when moment equations from the first-order conditions are available but computing an accurate approximation to the measurement density is difficult.

In this paper we propose primitive conditions under which a projection of a conditional density onto a set defined by conditional moment restrictions exists and is unique. Moreover, we provide an analytic expression for the obtained projection. The range of applications where conditional density projections are used is wide. The derived results are potentially useful in a variety of areas, including semiparametric efficient estimation and optimal testing in (conditional) moment models, Bayesian prior determination and inference in semiparametric models, density forecasting, and simulation-based econometric analysis. Regarding existence, we propose three different combinations of assumptions that are all sufficient to show that the projection exists and is unique. The proposed conditions exhibit a clear trade-off between the restrictions put on the divergence between the conditional densities and those put on the moment function which defines the projection set. Depending on the nature of the application, the researcher can pick and choose which set of conditions to use. Our second set of results characterizes the projection. The expression for the projected density is new, though not surprising, given the previously obtained results for the unconditional case. The projection is characterized by the dual of the original projection problem. In establishing the strong duality, however, we work with a constraint qualification condition that is weaker than that used by Borwein and Lewis (1991a, 1992a, 1993) in their seminal work concerning the unconditional case.
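
As an illustrative sketch (our notation, not the paper's), in the Kullback-Leibler case the dual characterization takes the familiar exponential form: for a conditional moment restriction E[g(Y,X) | X] = 0, the projection of a base conditional density f onto the restricted set is

```latex
f^{*}(y \mid x) \;=\;
\frac{f(y \mid x)\,\exp\!\left\{\lambda(x)' g(y,x)\right\}}
     {\int f(u \mid x)\,\exp\!\left\{\lambda(x)' g(u,x)\right\}\,du}
```

where, for each x, the multiplier λ(x) is chosen so that the projected density satisfies the moment restriction. The conditional case differs from the unconditional one precisely in that the multiplier is a function of the conditioning variable rather than a constant.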

We study labor-market discrimination against homosexual individuals in Italy. We conduct a field experiment in two Italian cities, Rome and Milan, by sending "fake" CVs in response to real job ads. We find a strong penalty for homosexual males: about a 30% lower chance of being called back compared to a heterosexual male, and even more so if they are highly skilled. On the other hand, we find no penalty for homosexual females. We also find a beauty premium for females only, but this premium is much lower when the attractive woman is skilled.

We consider a method for producing multivariate density forecasts that satisfy moment restrictions implied by economic theory, such as Euler conditions. The method starts from a base forecast that might not satisfy the theoretical restrictions and forces it to satisfy the moment conditions using exponential tilting. Although exponential tilting has been considered before in a Bayesian context (Robertson et al. 2005), our main contributions are: (1) to adapt the method to a classical inferential context with out-of-sample evaluation objectives and parameter estimation uncertainty; and (2) to formally discuss the conditions under which the method delivers improvements in forecast accuracy. An empirical illustration which incorporates Euler conditions into forecasts produced by Bayesian vector autoregressions shows that the improvements in accuracy can be sizable and significant.

Census data show that since 1980 low-skill workers in the United States have been increasingly employed in the provision of non-tradeable, time-intensive services (such as food preparation and cleaning) that can be broadly thought of as substitutes for home production activities. Meanwhile, the wage gap between this sector and the rest of the economy has shrunk. If skilled workers, with their high opportunity cost of time, demand more of these time-intensive services, then wage gains at the top of the wage distribution (such as those observed in the last three decades) are expected to raise the consumption of these services, consistent with these stylized facts. Using both consumption expenditure data and city-level data on employment and wages of workers of different skills, we provide several pieces of evidence in favor of these demand shifts, and we argue that they provide a viable explanation for the growth in wages at the bottom quantiles observed in the last fifteen years.

Background: Rapid identification of eligible cord blood units (CBUs) for banking is an important issue in hematopoietic stem cell procurement. Distinct contents of CD34+ cells in a CBU can help identify grafts that may be banked for unrelated transplants or limited to family-directed or autologous use. Study design and methods: Considering thresholds of CD34+ cell content of 3 × 10^6, 2 × 10^6, and 1 × 10^6 CD34+ cells, we analyzed a consecutive series of 1309 CBUs. CBUs were collected for autologous banking without any volume-based preselection criteria. Predictors of distinct contents of CD34+ cells were assessed by receiver operating characteristic (ROC) curve analysis. Results: Median total nucleated cell (TNC) and CD34+ cell counts of the series were 6.97 × 10^8 (range, 0.36 × 10^8 to 34.9 × 10^8) and 1.47 × 10^6 (range, 0 to 20.56 × 10^6). Volumes ranged from 21 to 163 mL, with a median of 73.8 mL. For the CD34+ target of 1 × 10^6, the best predictor was TNC count with a threshold of 6.63 × 10^8; volume was less predictive, with a threshold of 68.1 mL. For CD34+ targets of 2 × 10^6 and 3 × 10^6, ROC curves confirmed the stronger predictive power of TNC over collected volume, with thresholds of 7.55 × 10^8 and 8.98 × 10^8. ROC analysis combining all predictors (TNC, volume, TNC^2, volume^2, mother's age, type of delivery, birthweight) gave worse results than TNC count alone. Conclusions: This analysis, carried out on a large, unrestricted CBU series, shows that TNC alone is the best predictor of distinct targets of hematopoietic potential, with the ability to identify CBUs potentially useful for unrelated recipients or limited to family-directed or autologous use.
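
A minimal sketch of the kind of ROC threshold analysis described above, run on synthetic data (the distributions, sample sizes, and resulting cutoff are invented stand-ins, not the study's figures): pick the TNC cutoff that maximizes Youden's J statistic for classifying units that reach a CD34+ target.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(1)
# Synthetic TNC counts (in units of 1e8) for CBUs that do / do not
# reach a hypothetical CD34+ target; the distributions are invented.
tnc_hit = rng.normal(8.5, 1.5, 400)     # units reaching the target
tnc_miss = rng.normal(6.0, 1.5, 600)    # units missing the target
y = np.r_[np.ones(400), np.zeros(600)]
tnc = np.r_[tnc_hit, tnc_miss]

fpr, tpr, thresholds = roc_curve(y, tnc)
best = np.argmax(tpr - fpr)             # Youden's J selects the cutoff
print(f"AUC = {roc_auc_score(y, tnc):.2f}, TNC threshold = {thresholds[best]:.2f}")
```

Comparing the AUC of TNC against that of volume (or of a combined score) on the same outcome is how one would reproduce the study's ranking of predictors.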

This paper studies the Minimum Divergence (MD) class of estimators for econometric models specified through moment restrictions. We show that MD estimators can be obtained as solutions to a computationally tractable optimization problem. This problem is similar to the one solved by the Generalized Empirical Likelihood estimators of Newey and Smith (2004), but it is equivalent to it only for a subclass of divergences. The MD framework provides a coherent testing theory: tests for overidentification and parametric restrictions in this framework can be interpreted as semiparametric versions of Pearson-type goodness of fit tests. The higher order properties of MD estimators are also studied and it is shown that MD estimators that have the same higher order bias as the Empirical Likelihood (EL) estimator also share the same higher order Mean Square Error and are all higher order efficient. We identify members of the MD class that are not only higher order efficient, but, unlike the EL estimator, well behaved when the moment restrictions are misspecified.

## Articles in books

The US labor market has become increasingly polarized in terms of both jobs and wages, and the routinization explanation for these trends is well established. Recent papers have found job polarization patterns in Europe as well, while little evidence is available for wages. The goal of the paper is to investigate the dynamics of unconditional and conditional (on technology) wages in Europe, using industry-level (EU KLEMS) data. As for unconditional wages, there are no wage polarization trends at work, as the wage structure is broadly constant over time. For conditional polarization, we investigate the impact of ICT intensity on wages and hours worked by three skill groups defined by education level. Our analysis does not provide evidence supporting the conditional polarization of wages, while we do detect job polarization trends.

## Working papers

This paper investigates the dynamics of the distribution of unconditional and conditional (on technology) wages in Europe, using both industry- and individual-level data for the period 1995-2007. We find that the unconditional wage distribution shows scant signs of polarization in Europe. The effect of technology, on the other hand, is more nuanced. At the industry level, technological changes have an effect on the polarization of jobs, but not on the polarization of wages. At the individual level, we use a counterfactual distributional analysis that accounts for the heterogeneity of tasks across occupations, and we find only mild evidence of wage polarization. Technology affects the lower and upper parts of the wage distribution in different ways, with service tasks affecting the lower quantiles and abstract tasks affecting the higher ones.

The contribution of generalized method of moments (Hansen and Singleton, 1982) was to allow frequentist inference regarding the parameters of a nonlinear structural model without having to solve the model. Provided there were no latent variables. The contribution of this paper is the same. With latent variables.

We use data from the Survey of Income and Program Participation covering the period 1989-2006 to investigate the impact that time limits on receipt of Temporary Assistance for Needy Families have on the outcomes of female-headed families, including welfare use, employment and living arrangements. The effects of time limits depend on the stock of remaining months of eligibility, which in turn depends on the state time limit and on the family’s welfare use since the policy was implemented. Since the latter is potentially endogenous to current outcomes, we form a prediction of remaining eligibility based on state rules and observable family characteristics. For families who are predicted to have hit the limit, we find evidence of enforcement of the policy, which causes monthly income from welfare to drop by an average of $250. This loss is not offset by increases in other income sources: not only is there no significant change in earnings (despite a sizable increase in the likelihood that the mother works), but income from other transfer programs (such as SSI and Food Stamps) also decreases, resulting in increasing rates of deep poverty among these families. Additional analyses suggest that doubling up is a way for families who have timed out of welfare to share housekeeping expenses.

Bayesian inference in moment condition models is difficult to implement. For these models, a posterior distribution cannot be calculated because the likelihood function has not been fully specified. In this paper, we obtain a class of likelihoods by formal Bayesian calculations that take into account the semiparametric nature of the problem. The likelihoods are derived by integrating out the nuisance parameters with respect to a maximum entropy tilted prior on the space of distributions. The result is a unification that uncovers a mapping between priors and likelihood functions. We show that there exist priors such that the likelihoods are closely connected to Generalized Empirical Likelihood (GEL) methods.