Light at the digital shadow – report of the expert group Big Data and Privacy


October 19th, 2016


The expert group Big Data and Privacy recently presented its report to the Minister of Economic Affairs. The PI.lab contributed actively to the report. Mireille Hildebrandt (Digital Security, Radboud University) and Ronald Leenes (TILT, Tilburg University) participated in the expert group. Marc van Lieshout, assisted by Somayeh Djafari (both Strategy & Policy, TNO), acted as secretary for the expert group. The expert group was chaired by Jeroen van den Hoven, professor of Applied Ethics at Delft University of Technology.


Participants were selected to cover academia, industry and relevant branch organisations: a broad selection with different backgrounds and interests. The minister's request, based on a request from Dutch Parliament, was to advise on reconciling privacy (as a fundamental right) with big data innovations, and to offer practical suggestions on how to cope with the apparent dilemma, also in the light of the adoption of the General Data Protection Regulation (GDPR).


The expert group accepts the reality of the GDPR. It unanimously concludes that there are no obvious reasons to challenge the GDPR as the legal framework covering big data innovations. As a consequence, it also holds that a change in perspective is needed regarding the role of fundamental rights such as privacy in the societal debate on innovation. Rather than perceiving privacy as a threat to innovation, the expert group unanimously adopts the position that striving to achieve both privacy and innovation should be seen as an enabler of innovation. This may imply that specific forms of innovation become prohibited because of their impact on fundamental rights, but other forms of innovation will be promoted, since these serve the interests of the innovator and the public at large simultaneously. As a third pillar, the expert group considers privacy and other fundamental rights to be relevant not only to individuals but also as a societal value. This implies that a society lacking adequate means for the promotion of privacy and other fundamental rights is in danger of losing legitimacy in the eyes of its citizens, and may foster distrust, with negative consequences for the economic viability of, in this case, big data innovations.


The expert group draws a clear parallel with the greening of industry. Starting in the eighties and nineties of the past century (with roots in the sixties and seventies), a clear transformation took place in the relevance of offering sustainable products and production methods: sustainability has turned from a differentiator into an indispensable feature of industry today. The expert group formulates the vision that the responsible processing of personal data will follow a similar track.


The slogan of the expert group can thus be summarized as the 'call for responsible processing of personal data'. During the past year, the expert group organized various meetings with a large variety of stakeholders. These meetings revealed that many parties are still at a very early stage of 'privacy maturity' and requested support in achieving a higher maturity level. The expert group does not deny that there is an obvious tension between today's data-driven innovations, especially in combination with advanced machine learning strategies and artificial intelligence tools, and the basic requirements for protecting persons with respect to the processing of their data. The legal obligation to determine a purpose for the processing of personal data, at the latest at the moment of collection, can be very problematic when data collected in one situation are exploited in another by means of machine learning algorithms. In the latter situation it may be unclear whether the analysis will reveal useful results and for what precise purposes the results will be used. The expert group concludes that, notwithstanding the typical problems these situations present, responsible processing implies a careful design of the data processing approach. In its recommendations the expert group makes an urgent call to the supervisory authorities to organize a regulatory sandbox that could help organisations explore opportunities for interesting new services while respecting the rights of data subjects.


The overview the expert group presents of tools and methods already available to support responsible processing of personal data covers organisational, technological and legal tools. Many tools exist, but they are neither well known nor well explored, and require further elaboration. Technological innovations are needed to offer alternatives to existing practices. These need to cover the entire breadth of data processing activities, including identity and access management techniques, novel encryption techniques and standardized tools covering privacy-by-design strategies and patterns, next to novel approaches for developing privacy-respecting data analytics. Clearly issues related to research performed within the PI.lab!


Finally, there is a need for further exploration of the potential implications of advanced data analytics for issues such as unfair treatment, discrimination, stigmatization and exclusion. Attention to this topic is growing. Algorithmic complexity and biased data samples may lead to undesired and unexpected outcomes that are hard to repair. The obligation to be able to explain the logic of a decision could freeze specific developments if it proves impossible to do so. The expert group calls for new research in this field.


The report is available (in Dutch only).


Marc van Lieshout