Seminar: "Entropy-driven statistical modelling" by Prof. Martin Schlather

Date: Friday 12 July 2024, 11.00AM - 12.00PM
Location: Room 101, 2-5 Kirkby Place, University of Plymouth, and on Zoom (https://plymouth.zoom.us/j/93438941274)
Local Group Meeting

Speaker: Prof. Martin Schlather (University of Mannheim)

Date: 11:00-12:00, 12 July 2024
Venue: room 101, 2-5 Kirkby Place and on Zoom (https://plymouth.zoom.us/j/93438941274)

Title: Entropy-driven statistical modelling

Abstract: Bartlett (1950) links Fisher's information to Shannon's entropy through a Taylor expansion of the log-likelihood. Akaike (1973) already stated in his famous paper that "since then in the theory of statistics [...] Fisher's information could not enjoy the prominent status of Shannon's entropy [...]". This dominant position is particularly surprising, as Shannon himself cautioned against the broader application of his concept. To date, alternative measures of entropy have been considered only rarely. In my talk, I generalize the notion of entropy from a strict modelling point of view, so that very natural measures of entropy, e.g., the variance, are included. The price to pay is an abstract, algebraic formulation of the entropy. A uniqueness result for the choice of the entropy measure in the univariate case throws new light on the importance of Fisher's information from an entropy point of view, here by means of the Cramér-Rao bound. If time permits, I will also sketch my modelling approach to the Kullback-Leibler divergence.
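As informal background for the abstract (a small illustration assumed here, not material from the talk itself): for a Gaussian model, the classical differential entropy is a monotone function of the variance, which hints at why the variance can serve as a natural entropy measure; and the Cramér-Rao bound ties the variance of an estimator to Fisher's information. A minimal NumPy sketch:

```python
import numpy as np

# Differential entropy (in nats) of N(mu, sigma2) is 0.5*log(2*pi*e*sigma2),
# a monotone increasing function of the variance sigma2 -- so, among Gaussians,
# ranking by variance and ranking by entropy coincide.
def gaussian_differential_entropy(sigma2):
    return 0.5 * np.log(2 * np.pi * np.e * sigma2)

# Cramer-Rao illustration for the Gaussian location model N(theta, sigma2)
# with n i.i.d. observations: the Fisher information is n/sigma2, and the
# sample mean attains the bound Var(mean) = sigma2/n.
rng = np.random.default_rng(0)
n, sigma2, reps = 50, 2.0, 20000
samples = rng.normal(loc=0.0, scale=np.sqrt(sigma2), size=(reps, n))
empirical_var = samples.mean(axis=1).var()   # Monte Carlo variance of the mean
cramer_rao_bound = sigma2 / n                # = 1 / Fisher information

print(gaussian_differential_entropy(1.0))    # about 1.419 nats
print(empirical_var, cramer_rao_bound)       # empirical variance near the bound
```

The entropy generalization discussed in the talk is abstract and algebraic; the Gaussian case above is only the simplest concrete instance.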
 
Speaker: Prof. Martin Schlather (University of Mannheim)

Contact: Malgorzata Wojtys (malgorzata.wojtys@plymouth.ac.uk)