Social Statistics Section / Data Ethics SIG meeting: GDPR and surveys

On Monday 14 January 2019, the Social Statistics Section and the Data Ethics special interest group held a meeting on how surveys are affected by the provisions of the General Data Protection Regulation (GDPR). Nick Moon of Moonlight Research chaired the meeting and introduced it by noting how GDPR had captured the zeitgeist, in the form of satire about protecting personal data from Santa and family members. More seriously, family members now seek advice about data sharing and privacy policies, but where do surveys sit within this?

Ross Young, head of data governance at the UK Statistics Authority (UKSA), presented features of recent guidance for the Government Statistical Service (GSS) on GDPR. He highlighted the focus on governance, accountability and transparency, striking a balance between profligacy and rigidity. Definitions and responsibilities are set out so that use of 'personal data' is proportionate, lawful and up to date, typically under the basis of a public interest task. Use of 'special category' data, which may be sensitive (e.g. health data), needs additional justification, such as the statistics or research public tasks of the GSS. While GDPR requires that people have control over their data, for this to be proportionate the UKSA focuses on transparency and activities such as targeted communication prior to a telephone interview. Statisticians are fundamentally not interested in the identities of individuals, except for the purposes of data linkage, for which there are now codes of practice under the Digital Economy Act (DEA).

Maria Sigala, from the Data & Infrastructure team at the Economic and Social Research Council (ESRC), began by describing how they had monitored the legislative process and lobbied collaboratively for research protections in GDPR. UK Research and Innovation (UKRI) policy views data as an asset, but arriving at a coherent policy on GDPR was a challenge. Ultimately, researchers need to understand the legal basis for their use of data, and Maria described a number of confusions and misconceptions. The CLOSER team identified confusion at some universities about consent and third-party data, but UKRI feels that the guidance covers everything, although it is unfamiliar, and is interested in hearing about researchers' experiences.

David Hand from Imperial College London offered a broader perspective on how legitimate personal data use has been swept up with questionable research practices, with surveys being collateral damage. He reiterated that the individuals in a survey are not the object of interest, and noted that 'survey' appears only once in the text of GDPR, as part of a definition of statistics. Some impersonal transaction data has no particular need to be retained, but some could have legitimate applications in improving service models, so, by intention, GDPR disrupts unjustified data hoarding. He closed by noting that privacy can protect officials as well as members of the public.

Discussion turned to outstanding questions, such as how we can realistically inform people of their rights, and the extent to which the public care. There is good practice from the Health Research Authority (medical authorities were among the first to publish guidance), but our understanding of respondents and the public lacks evaluation.

Scott Summers from the UK Data Service looked at GDPR within the context of other legal responsibilities. The longstanding duty of confidentiality lies in common law, data sharing for other projects needs ethical consent, and data deposit requires permission from the original source for copyright reasons. EU member states are responsible for their own definitions of public interest and research; for the UK these are set out in the Data Protection Act (DPA). Truly anonymised data will be rare in research uses, so although anonymity would take research outside the scope of GDPR, relying on it as a basis is not advised.

Debrah Harding from the Market Research Society began by noting that the fundamental principles of GDPR are already part of the professional practice of researchers. The new data protection legislation has been published in draft form, and the Privacy and Electronic Communications Regulations will be updated, potentially becoming stricter. Definitions are clear and need to be applied accurately: anonymous data needs tests to assure its non-disclosive status; researchers can be controllers or processors, or even both jointly; and public interest research is separate from scientific research. New joint SRA-MRS guidance, to be issued by June, is widely anticipated, as detailed private sector survey research guidance on GDPR is currently lacking.

A final discussion picked up some further points: sharing the outcomes of research instruments with participants needs to be specified at recruitment, and feedback from participants has shown that the sharing guidance was helpful. However, more sharing of experiences would be useful, and questions about how best to use the consent basis remain unclear.
