New evidence in understanding panel conditioning in longitudinal data

Date: Thursday 19 January 2023, 4.00PM
Location: Online
Section Group Meeting


Panel conditioning, or the mere-measurement effect, refers to how the experience of participating in data collection affects attitudes and behaviours. This can lead to biased estimates when repeated measurements are collected from the same individuals, and the phenomenon has therefore received considerable attention in the social sciences. In this event we will explore recent developments, with an emphasis on new forms of data and new methods for investigating this topic.
 
Timetable
16:00 – 16:25: Panel conditioning and measurement effects in novel types of data collection – Bella Struminskaya, Utrecht University

16:25 – 16:50: Satisficing Response Behavior Across Time: Assessing Negative Panel Conditioning Using an Experimental Design with Six Repetitions – Fabienne Kraemer, GESIS – Leibniz Institute for the Social Sciences

16:50 – 17:15: Participating in a panel survey changes respondents’ labour market behaviour – Ruben Bach, University of Mannheim

17:15 – 17:40: Understanding panel conditioning in web surveys using digital trace data – Florian Keusch, University of Mannheim

17:40 – 18:00: Panel discussion
 
Panel conditioning and measurement effects in novel types of data collection
Bella Struminskaya, Utrecht University

Abstract
Changing one’s behavior due to the mere fact of being measured has received much attention in the past. In the social sciences, measurement error specific to panel surveys, signifying changes in attitudes, behavior, knowledge, or reporting behavior due to participation in previous survey rounds, is called panel conditioning. Prior studies provide no consensus on the prevalence and magnitude of panel conditioning effects. In recent years, however, there has been a rise in randomized experiments and study designs specifically developed for studying panel conditioning, and recommendations for practice have been developed that can help gauge how much of a problem panel conditioning is and how to prevent or correct it. With novel sources of data such as apps, sensors, and wearables increasingly used in the social and health sciences, the problem of measurement reactivity may be especially aggravated by the fact that participants expect to receive feedback about their own behavior (the ‘quantified self’ paradigm), and most commercial apps provide it by default. While such effects are desirable in intervention studies, for social and behavioral scientists they are undesirable. In this talk, I will summarize the current state of knowledge on learning through participation in (social science) studies, discuss how traditional panel conditioning studies can inform those employing new data collection methods, and formulate a research agenda for future studies of measurement and feedback effects on participants in the context of apps, sensors, and wearables.
 
 
Satisficing Response Behavior Across Time: Assessing Negative Panel Conditioning Using an Experimental Design with Six Repetitions
Fabienne Kraemer, GESIS – Leibniz Institute for the Social Sciences, Mannheim, Germany

Abstract
Satisficing response behavior can be a threat to the quality of survey responses. Past research has provided broad empirical evidence on the existence of satisficing and its consequences for data quality; however, relatively little is known about the extent of satisficing over the course of a panel study and its impact on response quality in later waves. Drawing on panel conditioning research, we use question design experiments to investigate whether learning effects across waves of a panel study cause changes in the extent of satisficing and, if so, whether general survey experience (process learning) or familiarity with specific question content (content learning) accounts for those changes. We use data from a longitudinal survey experiment comprising six panel waves administered within a German non-probability online access panel. To investigate the underlying mechanism of possible learning effects, the experiment randomly assigned respondents to different frequencies of receiving identical question content over the six panel waves. Our results show the existence of satisficing in every panel wave, similar in magnitude to the extent of satisficing in the probability-based GESIS Panel that we use as a benchmark. However, we did not find changes in the extent of satisficing across panel waves, nor did we find moderation effects of the interval between waves, respondents’ cognitive ability, or motivation. Additional validity analyses showed that satisficing not only shifts the distribution of individual estimates by 15 percent or more but can also affect associations between variables.
 
 
Participating in a panel survey changes respondents’ labour market behaviour
Ruben Bach, University of Mannheim

Abstract
In this talk, I highlight methodological challenges in analysing panel conditioning effects. I demonstrate how to tackle these challenges using results from a study of unintended changes in respondents’ behaviour due to repeated panel survey participation. Using administrative data linked to a large panel survey, I analyse whether the survey brings about changes in respondents’ labour market behaviour. I estimate the causal effect of panel participation on the take-up of federal labour market programmes using instrumental variables. Results show that panel survey participation leads to an increase in respondents’ take-up of these measures. These results suggest that panel survey participation not only affects the reporting of behaviour, as previous studies have demonstrated, but can also alter respondents’ actual behaviour.
 
 
Understanding panel conditioning in web surveys using digital trace data
Florian Keusch, University of Mannheim

Abstract
Conditioning or mere-measurement effects have long been a concern in social science data collection. They are especially problematic in longitudinal surveys, where respondents can change their behaviours and attitudes because of repeated questioning. In this presentation, we show that digital trace data collected from PCs and smartphones allow us to investigate the effects of web survey participation on news and political media consumption. Using a non-probability panel of respondents in Germany, we combine the digital trace data with data from three online surveys about the federal election. While participation in the surveys does not influence online news and political media consumption overall, we find weak evidence that respondents with previously high media consumption are less likely to be influenced by completing the surveys than those with low media consumption.
 
 
Contact: Alexandru Cernat
 
 
Free for RSS Fellows 
£10 non-Fellows