Giuseppe Ragusa, “Minimum divergence, generalized empirical likelihoods, and higher order expansions.” Econometric Reviews, 30(4):406–456, 2011.

doi:10.1080/07474938.2011.553541


This paper studies the Minimum Divergence (MD) class of estimators for econometric models specified through moment restrictions. We show that MD estimators can be obtained as solutions to a computationally tractable optimization problem. This problem is similar to the one solved by the Generalized Empirical Likelihood (GEL) estimators of Newey and Smith (2004), but it is equivalent to it only for a subclass of divergences. The MD framework provides a coherent testing theory: tests for overidentification and parametric restrictions can be interpreted as semiparametric versions of Pearson-type goodness-of-fit tests. We also study the higher order properties of MD estimators and show that MD estimators with the same higher order bias as the Empirical Likelihood (EL) estimator also share its higher order mean squared error, and hence are all higher order efficient. We identify members of the MD class that are not only higher order efficient but also, unlike the EL estimator, well behaved when the moment restrictions are misspecified.
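To fix ideas, here is a minimal sketch of the two optimization problems the abstract contrasts, written in generic notation rather than the paper's own: g is the moment function, γ a convex divergence, ρ the GEL carrier function, and π_1, …, π_n implied probability weights on the sample.

% Empirical MD problem: reweight the sample as little as possible
% (as measured by the divergence \gamma) while forcing the moment
% restrictions to hold exactly under the implied weights \pi_i.
\hat{\theta}_{\mathrm{MD}}
  = \arg\min_{\theta}\; \min_{\pi_1,\dots,\pi_n}
    \frac{1}{n}\sum_{i=1}^{n} \gamma(n\pi_i)
  \quad\text{s.t.}\quad
    \sum_{i=1}^{n} \pi_i\, g(x_i,\theta) = 0, \qquad
    \sum_{i=1}^{n} \pi_i = 1.

% GEL saddle-point problem of Newey and Smith (2004); for a subclass
% of divergences the two problems deliver the same estimator.
\hat{\theta}_{\mathrm{GEL}}
  = \arg\min_{\theta}\; \max_{\lambda}\;
    \sum_{i=1}^{n} \rho\bigl(\lambda' g(x_i,\theta)\bigr).

As standard pairings in this literature, γ(x) = −log x with ρ(v) = log(1 − v) recovers empirical likelihood, and γ(x) = x log x with ρ(v) = −exp(v) recovers exponential tilting; outside such dual pairs the MD problem remains well defined even when no equivalent GEL formulation exists.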



about

Giuseppe Ragusa teaches in the Department of Economics and Business and in the Business School at Luiss University. His research focuses primarily on econometrics.
