We wrote this book to introduce graduate students and research workers in various scientific disciplines to the use of information-theoretic approaches in the analysis of empirical data. In its fully developed form, the information-theoretic approach allows inference based on more than one model (including estimates of unconditional precision); in its initial form, it is useful in selecting a "best" model and ranking the remaining models. We believe that often the critical issue in data analysis is the selection of a good approximating model that best represents the inference supported by the data (an estimated "best approximating model"). Information theory includes the well-known Kullback-Leibler "distance" between two models (actually, probability distributions), and this represents a fundamental quantity in science. In 1973, Hirotugu Akaike derived an estimator of the (relative) Kullback-Leibler distance based on Fisher's maximized log-likelihood. His measure, now called Akaike's information criterion (AIC), provided a new paradigm for model selection in the analysis of empirical data. His approach, with a fundamental link to information theory, is relatively simple and easy to use in practice, but it is little taught in statistics classes and far less understood in the applied sciences than should be the case. We do not accept the notion that there is a simple, "true model" in the biological sciences.
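The criterion mentioned above has the well-known form AIC = −2 ln(L̂) + 2k, where L̂ is the maximized likelihood and k is the number of estimated parameters; the model with the smallest AIC is the estimated "best approximating model". The sketch below is illustrative only (the data and the two competing normal models are invented for the example, not taken from the book):

```python
import numpy as np

def aic(log_likelihood, k):
    """Akaike's information criterion: AIC = -2 ln(L) + 2k,
    where log_likelihood is the maximized log-likelihood and
    k is the number of estimated parameters."""
    return -2.0 * log_likelihood + 2.0 * k

def normal_loglik(x, mu, sigma):
    """Log-likelihood of data x under a Normal(mu, sigma) model."""
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - (x - mu)**2 / (2 * sigma**2))

# Hypothetical data for illustration.
data = np.array([2.1, 1.9, 2.4, 2.2, 1.8, 2.3])

# Model 1: mean fixed at 0 (k = 1: only sigma estimated).
sigma1 = np.sqrt(np.mean(data**2))   # MLE of sigma when mu = 0
aic1 = aic(normal_loglik(data, 0.0, sigma1), 1)

# Model 2: mean estimated from the data (k = 2: mu and sigma estimated).
mu_hat = data.mean()
sigma2 = data.std()                  # MLE of sigma when mu = mu_hat
aic2 = aic(normal_loglik(data, mu_hat, sigma2), 2)

# Ranking the candidate models: smaller AIC indicates the
# estimated "best approximating model" for these data.
```

Here the second model pays a penalty of 2 for its extra parameter, but its far better fit to the data gives it the smaller AIC, illustrating the fit-versus-complexity trade-off the criterion formalizes.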