Technologies for Conducting Privacy Compliance Assessments

October 17, 2016 by Daniel Dimov

Section 1. Introduction

Most jurisdictions worldwide have developed comprehensive privacy laws which impose a number of obligations on organizations collecting personal data. However, the process of ensuring compliance with privacy laws may be complex, lengthy, and costly. This is because such compliance work is usually undertaken by highly qualified experts, who need to examine in detail an organization's flow of personal data across its lifecycle, from collection to disposal. Therefore, there is a pressing social need for technological solutions which facilitate privacy compliance. The purpose of this article is to examine five such solutions, namely, Fujitsu's technology for assessing the risks related to de-anonymization of personal data (Section 2), the software application PrivacyPerfect for mapping the processing of personal data (Section 3), the web application Privacy Compliance Assessment (Section 4), the AvePoint Privacy Impact Assessment System (Section 5), and the Toolkit for Automated Privacy Policy Analysis (Section 6). At the end of this article, we provide concluding remarks (Section 7).

Section 2. Fujitsu's technology

In general, privacy laws regulate the collection and processing of personal data only. Personal data is usually defined as information which can be used for (1) the identification of an individual or (2) making him/her identifiable. Hence, by anonymizing personal data (i.e., irreversibly severing a data set from the identity of the individual to whom the data relates), an organization can exclude the anonymized data from the scope of most privacy laws. However, with the advent of modern data processing technologies which allow linkage of data originating from multiple sources, the value of anonymization as a method for ensuring privacy has come into question. Nevertheless, scientists generally agree that, as long as proper anonymization techniques are employed, the de-anonymization of personal data is a rather difficult task. A report published by the Information and Privacy Commissioner of Ontario noted that there are few cases of de-anonymization of properly anonymized personal data.

To stay on the safe side of privacy laws, organizations using anonymization techniques should ensure that the chance of de-anonymization is low. Normally, the assessment of the risk of de-anonymization requires lengthy investigations. However, a new technology developed by the Japanese ICT giant Fujitsu promises to revolutionize the way such risk assessments are conducted: it needs only three minutes to evaluate the de-anonymization risks related to the personal information of 10,000 people.

Fujitsu's technology has three main functions, namely, (1) extracting the attributes that should be assessed (e.g., gender, age, and address); (2) searching the examined data for combinations of attributes which can be used to identify individuals; and (3) quantifying the ease of identification. For example, if an organization de-anonymizes the personal data of an individual by deleting his/her name but keeping his/her height and postal code, Fujitsu's technology may indicate a high risk of de-anonymization because the person's height (215 cm) is unusual for the sparsely populated area to which the postal code refers.
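Although Fujitsu has not published its implementation, the second and third functions are closely related to the notion of k-anonymity: the fewer records that share a given combination of attribute values, the easier it is to single out an individual. The Python sketch below illustrates the idea; the sample records, attribute names, and the simple "smallest group size" metric are our own assumptions and are not part of Fujitsu's technology.

```python
from collections import Counter
from itertools import combinations

# Hypothetical records; attribute names and values are illustrative only.
records = [
    {"gender": "M", "age": 34, "postal_code": "1001", "height_cm": 178},
    {"gender": "F", "age": 29, "postal_code": "1001", "height_cm": 165},
    {"gender": "M", "age": 34, "postal_code": "9604", "height_cm": 215},
    {"gender": "M", "age": 34, "postal_code": "1001", "height_cm": 178},
]

QUASI_IDENTIFIERS = ["gender", "age", "postal_code", "height_cm"]

def smallest_group_sizes(records, attributes):
    """For every combination of attributes, return the size of the smallest
    group of records sharing the same values (the 'k' of k-anonymity).
    A value of 1 means the combination singles out at least one individual."""
    result = {}
    for r in range(1, len(attributes) + 1):
        for combo in combinations(attributes, r):
            groups = Counter(tuple(rec[attr] for attr in combo) for rec in records)
            result[combo] = min(groups.values())
    return result

# Report only the attribute combinations that single out an individual.
for combo, k in sorted(smallest_group_sizes(records, QUASI_IDENTIFIERS).items(),
                       key=lambda item: item[1]):
    if k == 1:
        print("High re-identification risk:", " + ".join(combo))
```

In this toy data set, height alone already singles out two individuals, which mirrors the 215 cm example above: an unusual value combined with a small population is enough to re-identify a record.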

It should be noted that Fujitsu's technology allows organizations to assess only the first of four commonly accepted criteria for determining the risks of de-anonymization. These four criteria are: (1) the difficulty of de-anonymization; (2) the measures aiming to prevent de-anonymization; (3) the motives and the capacity of the data recipient to de-anonymize the personal data; and (4) the impact of the privacy violation resulting from the de-anonymization.

Organizations willing to decrease the risks of de-anonymization should focus on the other three criteria as well. For instance, measures aiming to prevent de-anonymization can be included in a data sharing agreement which makes the data recipient contractually liable for attempts to de-anonymize the received data. The motives of the data recipient to de-anonymize the personal data may be assessed on the basis of the business activities carried out by the recipient (e.g., a broker of health data is likely to have motives to de-anonymize health-related data), whereas the capacity to de-anonymize personal data can be assessed on the basis of the recipient's financial resources and technical expertise. The impact of the privacy violations resulting from the de-anonymization can be assessed by the nature of the data and the number of individuals to whom it relates (e.g., an unauthorized disclosure of sensitive data may have a stronger impact than an unauthorized disclosure of non-sensitive data).
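To illustrate how the four criteria could be combined into a single figure, the following sketch computes a weighted risk score; the weights, the 0-to-1 scale, and the example scores are entirely illustrative assumptions rather than an established methodology.

```python
# Illustrative only: weights and scale are assumptions, not a standard.
# Each criterion is scored from 0 (low risk) to 1 (high risk).
CRITERIA_WEIGHTS = {
    "difficulty_of_deanonymization": 0.4,   # easier de-anonymization -> higher score
    "preventive_measures": 0.2,             # weaker contractual/technical measures -> higher score
    "recipient_motives_and_capacity": 0.2,
    "impact_of_violation": 0.2,
}

def overall_risk(scores):
    """Weighted average of the four criteria; caller supplies scores in [0, 1]."""
    return sum(CRITERIA_WEIGHTS[name] * scores[name] for name in CRITERIA_WEIGHTS)

example_scores = {
    "difficulty_of_deanonymization": 0.3,
    "preventive_measures": 0.5,
    "recipient_motives_and_capacity": 0.8,
    "impact_of_violation": 0.9,
}
print(f"Overall de-anonymization risk: {overall_risk(example_scores):.2f}")  # 0.56
```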

Section 3. PrivacyPerfect

PrivacyPerfect is a software application which provides its users with an overview of the personal data flows within their organizations. The overview is presented in the form of a graphical user interface. The interface shows various aspects of an organization's data processing practices, including, but not limited to, the types of data collected by each organizational unit (e.g., a sales department), the purposes for which each of those units uses the collected information (e.g., receiving payment transactions and sending newsletters), and the recipients of personal data (e.g., third-party vendors). By providing organizations with valuable insight into their data protection practices, PrivacyPerfect helps them meet privacy requirements, including requirements obliging them to be transparent about their handling of personal data.
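Conceptually, the register that PrivacyPerfect visualizes can be thought of as a list of processing activities, each linking an organizational unit to the data it collects, the purposes of the processing, and the recipients of the data. The sketch below shows such a structure in Python; the class, field names, and sample entry are hypothetical and do not reflect PrivacyPerfect's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingActivity:
    """One entry in a hypothetical register of processing activities."""
    organizational_unit: str
    data_categories: list
    purposes: list
    recipients: list = field(default_factory=list)

register = [
    ProcessingActivity(
        organizational_unit="Sales department",
        data_categories=["name", "email address", "payment details"],
        purposes=["receiving payment transactions", "sending newsletters"],
        recipients=["third-party payment processor"],
    ),
]

# A simple transparency report grouped by unit: which data go where, and why.
for activity in register:
    print(activity.organizational_unit)
    print("  data:      ", ", ".join(activity.data_categories))
    print("  purposes:  ", ", ".join(activity.purposes))
    print("  recipients:", ", ".join(activity.recipients) or "none")
```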

Section 4. The web application Privacy Compliance Assessment

The web application Privacy Compliance Assessment aims to facilitate compliance with the privacy laws of Mauritius. The application, which is licensed under the GNU General Public License (GPL), works in most web browsers. To evaluate a user's compliance with the privacy laws of Mauritius, the application asks a series of questions which can be answered by clicking on "Yes" or "No" buttons. The Data Protection Commissioner of Mauritius ("the Commissioner") reviewed the application multiple times and proposed changes aiming to ensure the accurate operation of the technology. The Commissioner described the application as "a valuable tool for businesses and governments which take privacy seriously." The application was freely downloadable from the website of the Commissioner until 2015, when it was removed for unknown reasons.
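The flow of the application can be pictured as a simple yes/no questionnaire in which every "No" answer flags a potential compliance gap. The sketch below mimics that flow on the command line; the questions are invented for illustration and are not taken from the Mauritian application.

```python
# Hypothetical questions; the real application's question set is derived from
# the Mauritian data protection legislation and is not reproduced here.
QUESTIONS = [
    "Do you inform individuals about the purposes for which their data are collected?",
    "Do you obtain consent before processing sensitive personal data?",
    "Do you delete personal data once the purpose of the processing is fulfilled?",
]

def run_assessment(questions):
    """Ask each question and collect the ones answered 'no' as potential gaps."""
    gaps = []
    for question in questions:
        answer = ""
        while answer not in ("yes", "no"):
            answer = input(f"{question} (yes/no): ").strip().lower()
        if answer == "no":
            gaps.append(question)
    return gaps

if __name__ == "__main__":
    gaps = run_assessment(QUESTIONS)
    print(f"Potential compliance gaps: {len(gaps)}")
    for question in gaps:
        print(" -", question)
```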

Section 5. AvePoint Privacy Impact Assessment System (APIA)

APIA is a free web application which is distributed by the International Association of Privacy Professionals (IAPP). The tool is designed for the following purposes: (1) analyzing how personal information is handled by organizations; (2) creating automatic privacy impact assessments; (3) generating organization-specific privacy impact assessment reports and sending them to privacy officers; and (4) facilitating the creation of security and vulnerability assessments.

The term "privacy impact assessment" can be defined as an assessment of a project aiming to identify the impact of the project on the privacy of individuals. Privacy impact assessments may help organizations to ensure their compliance with applicable privacy standards. This is because such assessments may help organizations to identify and avoid potential privacy violations.

To generate automatic privacy impact assessments through APIA, organizations need to create questionnaires containing questions related to their data protection practices. APIA allows organizations to create such questionnaires either (1) by selecting questions from a prepopulated bank of questions which are commonly used for the creation of privacy impact assessments or (2) by adding their own questions. In this regard, one of the companies using the services of APIA stated: "the questions available in APIA were very relevant and gave us a great framework to start our PIA [privacy impact assessment] process."
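The sketch below illustrates this workflow in Python: a questionnaire is assembled from a prepopulated bank and from custom questions, and a simple report is rendered from the answers. The question bank, function names, and report format are hypothetical and do not reflect APIA's actual interface.

```python
# A hypothetical prepopulated question bank; APIA's actual bank and interface differ.
QUESTION_BANK = {
    "Q1": "What categories of personal information does the project collect?",
    "Q2": "Is the personal information shared with third parties?",
    "Q3": "How long is the personal information retained?",
}

def build_questionnaire(selected_ids, custom_questions):
    """Combine questions picked from the bank with organization-specific ones."""
    questionnaire = [QUESTION_BANK[qid] for qid in selected_ids]
    questionnaire.extend(custom_questions)
    return questionnaire

def generate_report(project, answers):
    """Render a very simple privacy impact assessment report from question/answer pairs."""
    lines = [f"Privacy impact assessment: {project}", "=" * 40]
    lines += [f"{question}\n  Answer: {answer}" for question, answer in answers.items()]
    return "\n".join(lines)

questionnaire = build_questionnaire(
    ["Q1", "Q2"],
    ["Does the project transfer personal data outside the organization's country?"],
)
answers = {question: "To be completed by the project owner" for question in questionnaire}
print(generate_report("Customer portal", answers))
```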

Section 6. The Toolkit for Automated Privacy Policy Analysis (TAPPA)

TAPPA, a software application that allows its users to analyze privacy policies quickly, was developed by the CyLab Usable Privacy and Security Laboratory at Carnegie Mellon University through the IBM Open Collaborative Research Initiative on Privacy and Security Policy Management. TAPPA can be particularly helpful for organizations operating a large number of websites. Normally, the privacy compliance assessment of one privacy policy takes at least 30 minutes. Thus, an organization operating, for example, 300 websites would need to devote a minimum of 150 hours to the examination of its privacy policies. By using TAPPA, an organization can conduct such an examination within a couple of minutes. At present, the application is freely available to academic researchers.
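While the details of TAPPA's analysis are beyond the scope of this article, the batch idea can be illustrated with a few lines of Python that run the same set of checks against many policies at once. The keyword-based checks and sample policies below are crude stand-ins for illustration only, not TAPPA's actual analysis.

```python
import re

# Stand-in checks; a real analysis of policy content is far richer than keyword matching.
CHECKS = {
    "mentions data sharing": re.compile(r"\b(share|disclose)\b", re.IGNORECASE),
    "mentions retention":    re.compile(r"\b(retain|retention|store)\b", re.IGNORECASE),
    "mentions opt-out":      re.compile(r"\bopt[- ]?out\b", re.IGNORECASE),
}

def analyze_policy(text):
    """Return which checks a single privacy policy satisfies."""
    return {name: bool(pattern.search(text)) for name, pattern in CHECKS.items()}

def analyze_policies(policies):
    """Analyze many policies at once, e.g. one per website an organization operates."""
    return {site: analyze_policy(text) for site, text in policies.items()}

# Hypothetical policies keyed by website.
policies = {
    "shop.example.com": "We may share your data with partners and retain it for 2 years.",
    "blog.example.com": "You can opt out of our newsletter at any time.",
}
for site, result in analyze_policies(policies).items():
    missing = [name for name, ok in result.items() if not ok]
    print(site, "->", "OK" if not missing else "missing: " + ", ".join(missing))
```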

Section 7. Concluding remarks

This article discussed five examples of current technological solutions that can facilitate the process of ensuring privacy compliance in different ways. While Fujitsu's technology is designed to help organizations comply with privacy rules related to the de-anonymization of personal information, PrivacyPerfect provides entities processing personal data with an understanding of their data protection practices, which is necessary for meeting the requirements of applicable privacy laws. The web application Privacy Compliance Assessment enables its users to quickly and inexpensively check their compliance with the privacy laws of Mauritius. Individuals and organizations can deploy APIA for conducting automatic privacy impact assessments and TAPPA for promptly assessing the privacy compliance of a large number of privacy policies.

In the future, we can expect the appearance of more solutions for conducting privacy compliance assessments. There are two reasons for this. First, the present manual approaches are expensive and time-consuming. For instance, an organization operating at an international level may need to hire privacy experts located in multiple countries, and such specialists will bill numerous hours to ensure the cross-border privacy compliance of complex data processing schemes. Technological solutions seem to offer a fast and affordable route out of the labyrinth of privacy legislation. Second, there is a tendency for governments to complicate privacy laws. A typical example of this tendency is the EU General Data Protection Regulation, which will become applicable on 25 May 2018. The regulation adds new obligations for organizations processing personal data, including (1) obligations to conduct privacy impact assessments for high-risk processing and (2) obligations to implement data protection by design and by default. Technological solutions promise to transform the complicated legalese of privacy laws into easy-to-follow practical steps.

References

  1. 'AvePoint Privacy Impact Assessment System,' AvePoint. Available at http://www.avepoint.com/assets/pdf/Fast_Facts_AvePoint_Privacy_Impact_Assessment.pdf.
  2. Cavoukian, A., El Emam, K., 'Dispelling the myths surrounding de-identification: Anonymization remains a strong tool for protecting privacy.' Information and Privacy Commissioner of Ontario, Canada, 2011. Available at https://www.ipc.on.ca/images/Resources/anonymization.pdf.
  3. Determann, L., 'Determann's Field Guide to International Data Privacy Law Compliance,' Edward Elgar Publishing, 2012.
  4. 'Fujitsu Develops Novel Technology to Automatically Assess Personal Data Privacy Risks,' Fujitsu Laboratories, 19 July 2016. Available at http://www.fujitsu.com/global/about/resources/news/press-releases/2016/0719-01.html.
  5. 'Guide to undertaking privacy impact assessments,' The Office of the Australian Information Commissioner. Available at https://www.oaic.gov.au/agencies-and-organisations/guides/guide-to-undertaking-privacy-impact-assessments.
  6. Herold, R., Beaver, K., 'The Practical Guide to HIPAA Privacy and Security Compliance, Second Edition,' CRC Press, 2014.
  7. 'H3 Solutions Gives Customers Confidence in Cloud Services by Conducting Privacy Impact Assessments with the AvePoint Privacy Impact Assessment (APIA) System', AvePoint. Available at http://www.avepoint.com/assets/pdf/case_study/Case_Study_H3_Solutions.pdf.
  8. 'Linux Meetup: An Introduction to Flask by Avinash Meetoo,' Lugm.org, 26 September 2015. Available at http://lugm.org/2015/09/.
  9. Mather, T., Kumaraswamy, S., Latif, S., 'Cloud Security and Privacy: An Enterprise Perspective on Risks and Compliance,' O'Reilly Media, Inc., 2009.
  10. 'Privacy Perfect.' Available at https://privacyperfect.com.
  11. 'Privacy Perfect: improved handling of personal data.' Available at https://privacyperfect.com/wp-content/uploads/2016/02/Productinfo_PP_UK.pdf.
  12. 'Reform of EU data protection rules,' European Commission, 2 August 2016. Available at http://ec.europa.eu/justice/data-protection/reform/index_en.htm.
  13. Sookun, I., 'Developers Conference 2015 at Voilà Hotel, Day 1,' Hacklog, 23 April 2015. Available at https://hacklog.mu/developers-conference-2015-at-voila-hotel-day-1/.
  14. Sookun, I., 'Privacy Compliance Assessment App taken down from the Data Protection Office website,' Hacklog, 29 December 2015. Available at https://hacklog.mu/privacy-compliance-assessment-app-taken-down-from-data-protection-office-website/.
  15. 'The EU General Data Protection Regulation,' Allen & Overy, 2016. Available at http://www.allenovery.com/SiteCollectionDocuments/Radical changes to European data protection legislation.pdf.
  16. 'Toolkit for Automated Privacy Policy Analysis,' CyLab Usable Privacy and Security Laboratory. Available at http://cups.cs.cmu.edu/tappa/.

Co-Author

Rasa Juzenaite works as a project manager at an IT legal consultancy firm in Belgium. She holds a Master's degree in cultural studies with a focus on digital humanities, social media, and digitization. She is interested in the cultural aspects of the current digital environment.