Privacy-by-Design in ROXANNE technologies

Privacy is an important ethical, social, and legal issue. Yet, some infringement on a person’s privacy by law enforcement agencies (LEAs) might need to take place for effective criminal investigations; this is generally seen as socially acceptable and in compliance with human rights law when it is necessary, proportionate, and done according to legal rules. As such, there are limitations on LEAs processing data about people’s private lives that need to be respected, and the way we design processing technologies is part of that respect.

In the research and development of technologies, Privacy-by-Design involves considering privacy at each stage of decision-making so that it can be respected as far as practically possible. The concept was first developed by the then Information and Privacy Commissioner of Ontario, Dr. Ann Cavoukian, who suggested seven foundational principles. It has also found a legal manifestation in data protection by design and default under Art.25, GDPR, which is mirrored in Art.20 of the Law Enforcement Directive. Both of these approaches are useful for including privacy in technology design. However, they are not limited to technology and can also apply, for example, to including privacy when developing organisational policies.

A more concrete approach to Privacy-by-Design in technology research can be found in privacy design strategies, which can then be given more granularity as privacy patterns. For Hoepman (pg.197), privacy design strategies cover:

Data-orientated strategies:

  1. Minimise; limit the processing of personal data as much as possible.
  2. Separate; separate the processing of personal data as much as possible.
  3. Abstract; limit the detail with which personal data is processed as much as possible.
  4. Hide; protect personal data, or make it unlinkable or unobservable. Make sure it does not become public or known. (The data-orientated strategies are illustrated in the sketch after these lists.)

Organisational-orientated strategies:

  1. Inform; inform data subjects about the processing of their personal data in a timely and adequate manner.
  2. Control; provide data subjects with adequate control over the processing of their personal data.
  3. Enforce; commit to processing personal data in a privacy-friendly way, and adequately enforce this.
  4. Demonstrate; demonstrate you are processing personal data in a privacy-friendly way.
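
To make the data-orientated strategies more tangible, consider the following minimal sketch. It is purely illustrative and not taken from the ROXANNE platform; the field names, the record structure, and the key handling are all hypothetical. It shows how a raw call record might be minimised, abstracted, and hidden before analysis, with separation noted in the comments.

```python
import hashlib
import hmac
from datetime import datetime

# Hypothetical secret key for pseudonymisation; in practice it would be
# stored and managed away from the data itself (the 'separate' strategy).
PSEUDONYM_KEY = b"replace-with-securely-stored-key"

def pseudonymise(identifier: str) -> str:
    """'Hide': replace an identifier with a keyed hash so that records
    remain linkable for analysis without being directly attributable."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

def minimise_record(raw: dict) -> dict:
    """Apply the data-orientated strategies to one raw call record."""
    return {
        # 'Minimise': copy only the fields the analysis actually needs;
        # anything else in `raw` is simply never carried over.
        "caller": pseudonymise(raw["caller_number"]),  # 'Hide'
        "callee": pseudonymise(raw["callee_number"]),  # 'Hide'
        # 'Abstract': coarsen the timestamp to the hour rather than
        # keeping a second-by-second log of a person's activity.
        "hour": datetime.fromisoformat(raw["timestamp"]).strftime("%Y-%m-%d %H:00"),
        "duration_s": raw["duration_s"],
    }

raw_record = {
    "caller_number": "+41791234567",
    "callee_number": "+41797654321",
    "timestamp": "2021-06-01T14:23:51",
    "duration_s": 312,
    "cell_tower": "CH-ZH-0042",  # collected, but not needed here: dropped
}
print(minimise_record(raw_record))
```

As the following paragraphs discuss, how far each of these strategies can be pushed depends heavily on the context in which a technology will be used.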

Before applying these strategies to the ROXANNE technologies, and those like them, it is important to remember that the technologies being researched are intended to be useful for criminal investigations. These investigations frequently involve uncovering previously private information. This means that application of the above strategies needs to be adapted to the context in which the technologies are intended for use. We could imagine Privacy-by-Design being implemented in commercial systems with minimising the processing of personal data as a priority. For example, when buying tickets to an event online we are often asked for a phone number, email address, name, and postal address in addition to our payment information. Yet, as Hoepman describes (pg.150-152), the box office does not need most of this information, in the same way it is unlikely to collect it if you bought tickets in person. You could simply be provided with a downloadable eTicket after purchase, thereby minimising the amount of personal data processed by the box office.

Yet, minimisation could be problematic for LEA investigations, where a great deal of highly personal information might need to be collected in order to investigate a crime effectively. With technologies intended to be used in such investigations, Privacy-by-Design needs to be implemented in a way that allows LEAs to infringe on privacy during lawful and legitimate operations, with appropriate oversight and accountability. As such, Privacy-by-Design for LEA investigative technologies needs to be focussed on limiting privacy impacts where possible and practicable.

In a technology research project, it is tempting to focus on technological solutions to inappropriate privacy intrusions. For example, a key discussion that has taken place in ROXANNE is whether technologies that present risks of misuse and mass surveillance should be built in such a way that specific unwanted uses of the tools are made impossible through the design process (i.e., technologically managing the undesirable action away). Analysis of facial images in video files, for instance, could present privacy risks in addition to ethical concerns. One suggestion for minimising privacy impacts was that facial images could be searched only after the presence of a person of interest is indicated by other means (e.g., voice identification). Yet, this would make the tool sub-optimal and less useful for LEAs; it could even prevent unforeseen ethical uses of the technology in future. Partners, therefore, decided not to include this function, and instead chose other technological means to reduce privacy intrusions.

This example illustrates that the circumstances of a technology’s expected use are likely to alter how privacy design strategies are implemented. Indeed, as Koops and Leenes argue, coding privacy and data protection compliance into technologies is incredibly difficult due to the nuance and complexity of applying generalised rules to specific situations. This is especially the case for researching LEA technologies, where some privacy intrusions often need to be permitted, and coding different levels of privacy protection for different circumstances is likely to be even more difficult than with general systems. Therefore, organisational measures need to play a significant role so that overall compliance is achieved.

In ROXANNE, several privacy patterns are being implemented into the platform itself as a result of thinking about privacy design strategies. Out of this range of patterns and design features, let's focus on the organisational-orientated control strategy as an example. It is important to remember that highly sensitive information might be uncovered during an LEA investigation, and this should be given an appropriate level of protection and respect, bearing in mind that persons under investigation could be innocent. Therefore, preventing free access to data and analytical tools by investigators can be a key step in enacting that protection and respect, and this has been done in ROXANNE with technical partners agreeing to implement two key privacy patterns. First, the ‘Selective Access Control’ pattern will be implemented so that not all data is made available to all ROXANNE users all the time, as this might not be appropriate or necessary. Similarly, the ‘Enable/Disable Functions’ pattern will be implemented so that not all ROXANNE tools are available to all end-users by default. This protects privacy by preventing a potentially intrusive analysis of personal data from taking place where it is not needed, but allows access where required by the needs of a lawful investigation.
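
Purely as an illustration of how these two patterns interact (the actual ROXANNE implementation is not reproduced here, and every name below is hypothetical), access could be granted only when a user is cleared for both the specific case data and the specific analysis function:

```python
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    # 'Selective Access Control': the investigation cases this user has
    # been explicitly granted access to; no case data is visible by default.
    authorised_cases: set = field(default_factory=set)
    # 'Enable/Disable Functions': the analysis tools switched on for this
    # user; every other function is disabled by default.
    enabled_tools: set = field(default_factory=set)

def can_run(user: User, tool: str, case_id: str) -> bool:
    """A tool runs only if the user is cleared for BOTH the case data
    and the specific analysis function."""
    return case_id in user.authorised_cases and tool in user.enabled_tools

investigator = User(
    name="analyst_1",
    authorised_cases={"case-0042"},
    enabled_tools={"voice_identification"},  # face search deliberately not enabled
)

print(can_run(investigator, "voice_identification", "case-0042"))  # True
print(can_run(investigator, "face_search", "case-0042"))           # False: tool disabled
print(can_run(investigator, "voice_identification", "case-0099"))  # False: no case access
```

The design choice worth noting is the default: both patterns deny access unless it has been positively granted, so an intrusive analysis never happens merely because it is technically available.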

These patterns, along with other actions such as including logging mechanisms and restricting which organisations can access ROXANNE tools after the project to prevent them from falling into the hands of irresponsible persons or organisations, contribute to minimising the risks that the ROXANNE tools will be misused or employed in mass surveillance. First, by implementing privacy patterns, the use of ROXANNE for overly intrusive activities is made much more difficult. Second, secure logging mechanisms will leave a clear audit trail for any disciplinary or professional standards investigation, thus acting as a deterrent. Third, careful restrictions on how ROXANNE tools are brought to market in future should prevent people in countries and organisations with a poor human rights record from acquiring the ROXANNE tools.
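
Again as a sketch only (the ROXANNE logging design may well differ), one common way to make such an audit trail trustworthy is to hash-chain the log entries, so that any retrospective alteration breaks the chain and is detectable:

```python
import hashlib
import json
from datetime import datetime, timezone

audit_log = []

def log_action(user: str, tool: str, case_id: str) -> None:
    """Append an entry whose hash covers the previous entry's hash, so
    the whole trail becomes invalid if any earlier entry is altered."""
    prev_hash = audit_log[-1]["hash"] if audit_log else "0" * 64
    entry = {
        "time": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "tool": tool,
        "case": case_id,
        "prev": prev_hash,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    audit_log.append(entry)

def verify_log() -> bool:
    """Recompute the chain; returns False if any entry was tampered with."""
    prev_hash = "0" * 64
    for entry in audit_log:
        body = {k: v for k, v in entry.items() if k != "hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if entry["prev"] != prev_hash or recomputed != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True

log_action("analyst_1", "voice_identification", "case-0042")
log_action("analyst_1", "network_analysis", "case-0042")
print(verify_log())  # True; editing any logged field would make this False
```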

Overall then, a technology-focussed interpretation of Privacy-by-Design can, and does, contribute to LEA technology research by protecting the privacy of people who are under investigation, or whose data is incidentally captured in an investigation, as far as is practicable given the circumstances. But Privacy-by-Design in a narrow sense is not, and cannot be, an overarching solution. Organisational privacy strategies need to be implemented to complete the picture, and to take the additional weight of allowing socially and legally acceptable privacy intrusions to take place during lawful and legitimate LEA investigations. Further, implementation of privacy patterns makes nefarious use of tools like ROXANNE more difficult and less likely, thereby contributing to reducing the risks of misuse and mass surveillance. As a research project, the ROXANNE technologies will still need further development at the conclusion of the project before they are ready to use, and so we will make recommendations about the implementation of additional privacy patterns to enhance the privacy protections available in future iterations of the technologies.