Show simple item record

dc.contributor.author  Pardo, Leandro
dc.date.accessioned  2021-02-11T20:53:55Z
dc.date.available  2021-02-11T20:53:55Z
dc.date.issued  2019
dc.date.submitted  2019-06-26 08:44:06
dc.identifier  33643
dc.identifier.uri  https://directory.doabooks.org/handle/20.500.12854/54566
dc.description.abstract  This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from a theoretical and applied point of view, for different statistical problems, with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald's statistics, likelihood ratio statistics, and Rao's score statistics, share several optimal asymptotic properties but are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that a small deviation from the model's underlying assumptions can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald statistical test, for testing simple and composite null hypotheses for general parametric models, based on minimum divergence estimators.
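
An illustrative sketch of the construction named in the abstract (not part of the catalogue record): the classical Wald statistic for testing H0: theta = theta0 in a p-dimensional parametric model is a quadratic form in the maximum likelihood estimator, and the robust Wald-type version keeps the same form but plugs in a minimum divergence estimator. Assuming a minimum density power divergence estimator \hat{\theta}_\beta with asymptotic covariance matrix \Sigma_\beta (notation chosen here for illustration, not taken from the record):

  % Classical Wald statistic, MLE-based: asymptotically efficient but non-robust
  W_n = n \, (\hat{\theta} - \theta_0)^{\top} \, \Sigma(\hat{\theta})^{-1} \, (\hat{\theta} - \theta_0)
  % Robust Wald-type statistic: same quadratic form, minimum divergence estimator plugged in
  W_n^{\beta} = n \, (\hat{\theta}_\beta - \theta_0)^{\top} \, \Sigma_\beta(\hat{\theta}_\beta)^{-1} \, (\hat{\theta}_\beta - \theta_0),
  \qquad W_n^{\beta} \xrightarrow{d} \chi^2_p \quad \text{under } H_0,

where beta >= 0 tunes the efficiency/robustness trade-off and beta = 0 recovers the MLE-based classical test.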
dc.language  English
dc.subject  HM401-1281
dc.subject  HA1-4737
dc.subject  H1-99
dc.subject.classification  bic Book Industry Communication::J Society & social sciences::JF Society & culture: general::JFF Social issues & processes::JFFP Social interaction  en_US
dc.subject.other  n/a
dc.subject.other  mixture index of fit
dc.subject.other  Kullback-Leibler distance
dc.subject.other  relative error estimation
dc.subject.other  minimum divergence inference
dc.subject.other  Neyman-Pearson test
dc.subject.other  influence function
dc.subject.other  consistency
dc.subject.other  thematic quality assessment
dc.subject.other  asymptotic normality
dc.subject.other  Hellinger distance
dc.subject.other  nonparametric test
dc.subject.other  Bernstein-von Mises theorem
dc.subject.other  maximum composite likelihood estimator
dc.subject.other  2-alternating capacities
dc.subject.other  efficiency
dc.subject.other  corrupted data
dc.subject.other  statistical distance
dc.subject.other  robustness
dc.subject.other  log-linear models
dc.subject.other  representation formula
dc.subject.other  goodness-of-fit
dc.subject.other  general linear model
dc.subject.other  Wald-type test statistics
dc.subject.other  Hölder divergence
dc.subject.other  divergence
dc.subject.other  logarithmic super divergence
dc.subject.other  information geometry
dc.subject.other  sparse
dc.subject.other  robust estimation
dc.subject.other  relative entropy
dc.subject.other  minimum disparity methods
dc.subject.other  MM algorithm
dc.subject.other  local-polynomial regression
dc.subject.other  association models
dc.subject.other  total variation
dc.subject.other  Bayesian nonparametric
dc.subject.other  ordinal classification variables
dc.subject.other  Wald test statistic
dc.subject.other  Wald-type test
dc.subject.other  composite hypotheses
dc.subject.other  compressed data
dc.subject.other  hypothesis testing
dc.subject.other  Bayesian semi-parametric
dc.subject.other  single index model
dc.subject.other  indoor localization
dc.subject.other  composite minimum density power divergence estimator
dc.subject.other  quasi-likelihood
dc.subject.other  Chernoff-Stein lemma
dc.subject.other  composite likelihood
dc.subject.other  asymptotic property
dc.subject.other  Bregman divergence
dc.subject.other  robust testing
dc.subject.other  misspecified hypothesis and alternative
dc.subject.other  least-favorable hypotheses
dc.subject.other  location-scale family
dc.subject.other  correlation models
dc.subject.other  minimum penalized φ-divergence estimator
dc.subject.other  non-quadratic distance
dc.subject.other  robust
dc.subject.other  semiparametric model
dc.subject.other  divergence-based testing
dc.subject.other  measurement errors
dc.subject.other  bootstrap distribution estimator
dc.subject.other  generalized Rényi entropy
dc.subject.other  minimum divergence methods
dc.subject.other  generalized linear model
dc.subject.other  φ-divergence
dc.subject.other  Bregman information
dc.subject.other  iterated limits
dc.subject.other  centroid
dc.subject.other  model assessment
dc.subject.other  divergence measure
dc.subject.other  model check
dc.subject.other  two-sample test
dc.subject.other  Wald statistic
dc.title  New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
dc.type  book
oapen.identifier.doi  10.3390/books978-3-03897-937-1
oapen.relation.isPublishedBy  46cabcaa-dd94-4bfe-87b4-55023c1b36d0
oapen.relation.isbn  9783038979364
oapen.relation.isbn  9783038979371
oapen.pages  344
oapen.edition  1st


Files in this item


There are no files associated with this item.


Except where otherwise noted, this item's license is described as https://creativecommons.org/licenses/by-nc-nd/4.0/