TY - BOOK
AU - Pardo, Leandro
AB - This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from both theoretical and applied points of view, for a range of statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald’s statistics, likelihood ratio statistics, and Rao’s score statistics, share several optimal asymptotic properties but are highly non-robust under model misspecification in the presence of outlying observations. It is well known that a small deviation from the model’s underlying assumptions can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald test, for testing simple and composite null hypotheses in general parametric models, based on minimum divergence estimators.
DO - 10.3390/books978-3-03897-937-1
ID - OAPEN ID: 33643
KW - mixture index of fit
KW - Kullback-Leibler distance
KW - relative error estimation
KW - minimum divergence inference
KW - Neyman-Pearson test
KW - influence function
KW - consistency
KW - thematic quality assessment
KW - asymptotic normality
KW - Hellinger distance
KW - nonparametric test
KW - Bernstein-von Mises theorem
KW - maximum composite likelihood estimator
KW - 2-alternating capacities
KW - efficiency
KW - corrupted data
KW - statistical distance
KW - robustness
KW - log-linear models
KW - representation formula
KW - goodness-of-fit
KW - general linear model
KW - Wald-type test statistics
KW - Hölder divergence
KW - divergence
KW - logarithmic super divergence
KW - information geometry
KW - sparse
KW - robust estimation
KW - relative entropy
KW - minimum disparity methods
KW - MM algorithm
KW - local-polynomial regression
KW - association models
KW - total variation
KW - Bayesian nonparametric
KW - ordinal classification variables
KW - Wald test statistic
KW - Wald-type test
KW - composite hypotheses
KW - compressed data
KW - hypothesis testing
KW - Bayesian semi-parametric
KW - single index model
KW - indoor localization
KW - composite minimum density power divergence estimator
KW - quasi-likelihood
KW - Chernoff-Stein lemma
KW - composite likelihood
KW - asymptotic property
KW - Bregman divergence
KW - robust testing
KW - misspecified hypothesis and alternative
KW - least-favorable hypotheses
KW - location-scale family
KW - correlation models
KW - minimum penalized φ-divergence estimator
KW - non-quadratic distance
KW - robust
KW - semiparametric model
KW - divergence based testing
KW - measurement errors
KW - bootstrap distribution estimator
KW - generalized Rényi entropy
KW - minimum divergence methods
KW - generalized linear model
KW - φ-divergence
KW - Bregman information
KW - iterated limits
KW - centroid
KW - model assessment
KW - divergence measure
KW - model check
KW - two-sample test
KW - Wald statistic
L1 - https://mdpi.com/books/pdfview/book/1298
LA - English
LK - https://directory.doabooks.org/handle/20.500.12854/54566
PB - MDPI - Multidisciplinary Digital Publishing Institute
PY - 2019
SN - 9783038979364
SN - 9783038979371
TI - New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
ER -