Λέσχη Φίλων Στατιστικής - GrStats forum
AUEB Stats Seminars 24/3/2021: Improved estimation of partially-specified models by I. Kosmidis (Univ. of Warwick) Forumgrstats

For registration problems and other information, contact: grstats.forum@gmail.com or grstats@stat-athens.aueb.gr

Posted by grstats (http://stat-athens.aueb.gr/~grstats/)

AUEB Stats Seminars 24/3/2021: Improved estimation of partially-specified models by I. Kosmidis (Univ. of Warwick)

Wed 24 Mar 2021 - 0:36




AUEB Stats Seminars

Improved estimation of partially-specified models

Ioannis Kosmidis,
Department of Statistics, University of Warwick, United Kingdom,
The Alan Turing Institute, London, United Kingdom
(Web: http://www.ikosmidis.com - Twitter: https://twitter.com/IKosmidis_)

Joint work with:
Nicola Lunardon, University of Milano-Bicocca, Milan, Italy

Event date:
Wednesday, March 24, 2021 - 11:30
Teams link: https://bit.ly/3opz2kK

ABSTRACT

Many popular methods for reducing estimation bias rely on an approximation of the bias function under the assumption that the model is correct and fully specified. Other bias-reduction methods, such as the bootstrap, the jackknife, and indirect inference, require fewer assumptions to operate but are typically computer-intensive, requiring repeated optimization. We present a novel framework for reducing estimation bias that:

i) can deliver estimators with smaller bias than reference estimators even for partially-specified models, as long as estimation is through unbiased estimating functions;

ii) always results in closed-form bias-reducing penalties to the objective function if estimation is through the maximization of one, as in maximum likelihood and maximum composite likelihood estimation;

iii) relies only on the estimating functions and/or the objective and their derivatives, greatly facilitating implementation for general modelling frameworks through numerical or automatic differentiation techniques and standard numerical optimization routines.

The bias-reducing penalized objectives closely relate to information criteria for model selection based on the Kullback-Leibler divergence, establishing, for the first time, a strong link between reduction of estimation bias and model selection. We also discuss the asymptotic efficiency properties of the new estimator, inference and model selection, and present illustrations in widely used, important modelling settings of varying complexity.
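To make the idea of a closed-form bias-reducing penalty concrete, here is a sketch of the classical Firth-type adjustment (not the paper's specific penalty, and a fully-specified toy model) for the exponential rate: maximize the log-likelihood plus one half the log-determinant of the Fisher information. All names and the choice of model are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(42)
n = 20
x = rng.exponential(scale=1.0, size=n)  # simulated sample, true rate = 1

def negloglik(lam):
    # Exponential log-likelihood: n*log(lam) - lam*sum(x)
    return -(n * np.log(lam) - lam * x.sum())

def neg_penalized(lam):
    # Firth-type penalty: add 0.5 * log det I(lam), with I(lam) = n / lam**2
    return negloglik(lam) - 0.5 * np.log(n / lam**2)

mle = minimize_scalar(negloglik, bounds=(1e-8, 50.0), method="bounded").x
br = minimize_scalar(neg_penalized, bounds=(1e-8, 50.0), method="bounded").x

# Closed forms for this model: MLE = n/sum(x); penalized maximizer = (n-1)/sum(x)
print(mle, br)
```

In this particular model the penalty has a closed form and the penalized maximizer (n-1)/sum(x) is exactly unbiased, whereas the MLE n/sum(x) has bias of order 1/n; only the objective and its derivatives are needed, matching the spirit of points ii) and iii) above.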

Related preprint:
http://arxiv.org/abs/2001.03786