by David Mathieu and Jannie Møller-Hartley
In the autumn of 2018, European email inboxes overflowed with ‘your data is safe’ messages from all sorts of obscure companies and businesses we had once engaged with. This was a direct consequence of the GDPR, implemented in the European Union. As researchers, we were puzzled by how citizens might respond to such messages, and by what it meant that citizens in their daily lives were entangled with platforms and apps that collect, analyse, and make predictions based on their data – both as we move around the city using public transportation and as we engage with teachers in schools, or with our doctors via email consultation. Denmark, the context of this study, forms an interesting case because of its extremely high internet penetration, high degree of digitalisation, and high trust in public authorities.
We asked ourselves: how is trust in datafied media negotiated in our datafied lives? What does the use of media, our dependency on them and our entanglement with them mean for such negotiations? Our society relies heavily on media, and hence citizens’ perceptions of data must be understood through the prism of their relationship with media. Additionally, we wondered how the implementation of the GDPR – a framework meant to protect citizens’ data – might affect the negotiation of our datafied everyday lives.
We decided that the questions were so complex that we needed to engage qualitatively with all the nuances in citizens’ daily negotiations – negotiations they might not even be aware they are having, or that they might not recall in an interview situation. We opted for the focus group method, having different groups of citizens discuss these issues, and hoped that a set of prompts would bring some of these inner negotiations to the surface. During a cold and dark winter in Roskilde, we brought together people from different occupations and of different ages. We asked them to map all the platforms and apps they encounter and use on a daily basis, and to discuss how much they trust each of them and what rationales they draw on for doing so. Our method helped participants make explicit what we in the analysis call ‘heuristics of trust’.
We identified five sets of heuristics guiding citizens’ trust assessments: 1) characteristics of media organisations, 2) old media standards, 3) context of use and purpose, 4) experiences of datafication, and 5) understandings of datafication. Interestingly, the participants used their previous experiences to guide their understanding of how they could respond to the chilling effects or anxieties they experienced from having their data collected and their media use monitored. They considered which risks are associated with which types of data. In practice, this meant that they did not mind giving up data about their shoe size, film preferences or what they had recently bought on Amazon, but were, naturally, more worried about data on their health and their financial situation.
All in all, citizens are guided by partial, embedded ‘structures of perception’ and are enticed into trusting datafied media in the context of their everyday lives. They may be highly concerned about the datafication of the media they use, but they use them heavily nevertheless. In fact, they recognised that some of the media they use the most are also the ones they trust the least with their data. Rationally, we would expect a lack of trust to lead to an adjustment in media use, but our data show that this is not the case at all. Citizens’ assessment of trust is an ongoing, practical negotiation, weighed against the benefits they get from media and the entanglement of their everyday lives in media. We also found that trust is not necessarily improved by users having to read pages and pages of consent forms, cookie declarations, and the like. Guided by their immediate needs of use, adapting their previous understandings of datafication (as knowledge increases) on an ongoing basis, and positioning themselves towards an urge to take data privacy seriously, citizens arrive at trust assessments through a complex process affected by many things other than ‘just’ regulation or rational behaviour. This is a paradox that regulators, businesses, and scholars should recognise in the next generation of GDPR regulatory measures.