RSSNI - April Talk on the 29th at 1pm BST
Jessica Doohan from the University of Limerick will speak on GAMs vs Neural Nets. The statistical group in Limerick has been exploring convergence between these modelling schemes while trying to give neural networks a more familiar feel and a statistical interpretation.
The talk will be held in room G/005 in the Maths and Physics Teaching Centre, QUB (see campus map). It will be a hybrid event with the speaker and audience in the room and the remote audience on-line on MS Teams (link).
Title: Comparing and Combining Generalised Additive Models and Neural Networks in Practice
Abstract:
Over the past two decades, advances in machine learning, particularly neural networks, have reshaped predictive modelling. While neural networks are often regarded as "black-box" algorithms, their universal approximation properties and depth-induced parsimony have made them highly effective predictors, albeit difficult to interpret. Much of this development has occurred outside the field of statistics, leading to differences in terminology and methodology. However, several works have highlighted their parallels with traditional statistical approaches, showing that multilayer perceptrons can be viewed as non-parametric generalisations of regression models. Despite these connections, comparative studies have largely focused on simple models such as linear regression, often within specific application domains; they have yet to compare more flexible statistical models with neural networks across a wide range of applications.
Generalised Additive Models (GAMs), which extend linear regression by allowing non-linear predictor effects while retaining interpretability, provide a more advanced benchmark. No previous review has examined how GAMs perform relative to neural networks across a range of applications. Our work addresses that gap through a systematic review of over 140 papers and 400 datasets. Beyond summarising comparisons, we analyse reported performance metrics using mixed-effects modelling. This allows us to investigate and quantify how characteristics such as sample size, number of predictors, application area, and neural network complexity are associated with differences in performance. For further details see https://doi.org/10.1016/j.eswa.2025.131082.
Motivated by these findings, we explore hybrid approaches combining the strengths of both model classes. In particular, we propose a statistical mixture regression model in which component-specific mean functions belong to different model classes. Specifically, each component mean may be (i) a linear regression model, (ii) a GAM, or (iii) a neural network. Estimation is carried out using the Expectation–Maximisation (EM) algorithm, treating component memberships as latent variables. This extends traditional mixture regression, where all components are typically assumed to share the same functional form. In practice, some observations may be explained by a simple, interpretable structure while others require a more flexible model; this formulation can accommodate both.
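To give a flavour of the idea, the following is a minimal, hypothetical sketch (not the speaker's implementation) of EM for a two-component mixture of regressions with heterogeneous mean functions: component 1 is constrained to be linear, while component 2 uses a polynomial basis as a simple stand-in for a GAM smooth or neural-network mean. All data and settings are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: roughly half the observations follow a linear trend,
# the rest a smooth non-linear one (hypothetical example data).
n = 400
x = rng.uniform(-3, 3, n)
z = rng.random(n) < 0.5
y = np.where(z, 1.0 + 2.0 * x, 3.0 * np.sin(2 * x)) + rng.normal(0, 0.3, n)

# Design matrices: component 1 is linear; component 2 uses a degree-5
# polynomial basis as a stand-in for a more flexible mean function.
X1 = np.column_stack([np.ones(n), x])
X2 = np.column_stack([x**k for k in range(6)])

def wls(X, y, w):
    """Weighted least squares: the M-step update for one component's mean."""
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)
    return beta

# Random soft assignments break the symmetry between components.
r = rng.random(n)
pi, sigma = 0.5, np.std(y)  # mixing weight and shared noise scale

for _ in range(100):
    # M-step: weighted fits per component, then mixing weight and scale.
    b1, b2 = wls(X1, y, r), wls(X2, y, 1 - r)
    pi = r.mean()
    resid2 = r * (y - X1 @ b1) ** 2 + (1 - r) * (y - X2 @ b2) ** 2
    sigma = np.sqrt(resid2.mean())
    # E-step: responsibilities from Gaussian densities (shared sigma,
    # so the normalising constants cancel).
    d1 = pi * np.exp(-0.5 * ((y - X1 @ b1) / sigma) ** 2)
    d2 = (1 - pi) * np.exp(-0.5 * ((y - X2 @ b2) / sigma) ** 2)
    r = d1 / (d1 + d2 + 1e-300)
```

Replacing the polynomial fit in `wls` with a penalised smoother or a small network trained on weighted data recovers the GAM and neural-network variants described above; only the component-wise M-step changes.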
All welcome!
Gilbert MacKenzie
Chair
Hannah Mitchell
Meetings Host
RSSNI
See previous talks here.
Read previous write-ups here.
Follow RSSNI on LinkedIn
gilbert.mackenzie.cbs@gmail.com
Book now