Cambridge Analytica and the concept of fairness by design


Just a few days ago the ICO published its report "Investigation into the use of data analytics in political campaigns: investigation update", which details Information Commissioner Elizabeth Denham's investigation into the widespread use of data analytics in electoral campaigns.

The report largely focuses on Facebook and Cambridge Analytica as targets of the investigation, following their failure to safeguard individuals' information, which allowed the data of an estimated 87 million users to be harvested without their express knowledge.

In considering the issue at hand, I am reminded of the EDPS Preliminary Opinion on privacy by design (Opinion 5/2018), which cites what we might consider an "ancient" but still extremely relevant text, Melvin Kranzberg's "Kranzberg's Laws" from 1986, the first of which reads: "Technology is neither good nor bad; nor is it neutral".

Indeed, the development of technology "is not subject to inherent determinism and […] it can be shaped", which leads us to develop an ethical discourse, something the EDPS itself has been exploring since founding the Ethics Advisory Group in 2015.

The relationship between ethics and data protection is well illustrated by the Cambridge Analytica case. The profiling system originally developed by the Psychometrics Centre of the University of Cambridge had the potential to yield valuable academic, societal and business insights into psychological targeting as a tool to influence behaviour, something that is undoubtedly important to understand. Such technology, and specifically its algorithms, also presents significant risks: it was effectively transformed into a means to influence democratic processes, undermining fundamental principles of democratic society that must be protected. Technology is indeed neither good nor bad, nor is it neutral, as Melvin Kranzberg wisely pointed out.

It is now possible to create an algorithm capable of predicting individuals' behaviour, and the logical response to this technology is improved regulation on the part of legislators and responsibility on the part of governments and businesses. The duty of lawmakers, politicians and digital businesses is to ensure that such powerful technologies are used fairly, so as to respect fundamental rights and freedoms and to preserve dignity in the digital society.

We are all familiar with the concepts of Data Protection by Design and by Default, laid down in Article 25 and Recital 78 of the GDPR, which encourage organisations to build technical and organisational measures into the design of their processing operations so as to safeguard privacy and provide data protection from the start (by design). In addition, such organisations must by default ensure that personal data is processed with the highest possible level of privacy protection: only necessary data is processed, data is not stored for longer than necessary, and only relevant authorised people have access to it (by default).
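To make the idea concrete, here is a minimal sketch of what "by design" and "by default" can look like in code. It is purely illustrative: the names (`ProcessingConfig`, `UserRecord`) are hypothetical and not drawn from any real framework or from the GDPR itself. The point is that privacy-protective options are the defaults, and minimisation and retention limits are part of the data model rather than an afterthought.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class ProcessingConfig:
    """Hypothetical processing settings: the most privacy-protective
    option applies unless the data subject actively chooses otherwise
    ("by default")."""
    profiling_enabled: bool = False             # opt-in, never opt-out
    share_with_third_parties: bool = False      # off unless explicitly granted
    retention: timedelta = timedelta(days=30)   # shortest period serving the purpose

@dataclass
class UserRecord:
    """Hypothetical data record with retention built into its design."""
    user_id: str
    consent_given_at: datetime
    config: ProcessingConfig = field(default_factory=ProcessingConfig)

    def is_expired(self, now: datetime) -> bool:
        # "By design": the retention check is part of the record itself,
        # not a compliance layer bolted on afterwards.
        return now - self.consent_given_at > self.config.retention
```

Under these assumptions, a record created without any explicit choices is already in its most protective state, and expired data can be purged by a routine that simply asks each record whether it is past its retention period.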

Data Protection by Design/Default is indeed one of the most effective ways to achieve full compliance with the fundamental data protection principles established in Article 5 of the GDPR. The time has now arrived, however, to go one step further, towards a concept of "fairness by design", where fairness relates to balanced and proportionate data processing. In line with this principle, organisations should take into account the interests and reasonable expectations of privacy of data subjects: the processing of personal data should not intrude unreasonably upon the privacy, autonomy and integrity of data subjects, and organisations should not exert pressure on data subjects to provide personal data.

Fairness goes beyond what is strictly prescribed by the law, taking into account the ethical dimension discussed above. Like Data Protection by Design, it should be built into the very design of data processing activities, whether products, services or applications, and – most importantly – the algorithms that underpin the data processing should be designed and developed in a way that is compatible with the concept of "fairness by design".

Fairness by design may be seen as a further specification of the principle of data protection by design, aimed at complementing the legal dimension of privacy and personal data protection with the ethical one, for the development of a healthy and democratic digital society.

Originally posted on Paolobalboni.eu
More blogs on Law Blogs Maastricht