"JDPP is a leading peer-reviewed journal that addresses the global concerns of data protection and privacy with cutting edge insights from the thought leaders in judiciary, industry and academia."
Volume 5 (2022-23)
Each volume of Journal of Data Protection & Privacy consists of four 100-page issues.
The articles published in Volume 5 are listed below.
Volume 5 Number 4
Special Issue: The Internet of Things
-
Editorial: Navigating the future of the Internet of Things and balancing innovation and privacy
Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy -
Practice papers
Benefits and disadvantages of IoT for government, business and consumer sectors
Ioannis Giokaris, Lecturer, Frederick University
The Internet of Things, or IoT, is the growing network of interconnected devices that are able to communicate with each other and with other internet-enabled devices and systems. This can include a wide range of devices, from smart home appliances and security systems to industrial equipment and wearable technology. IoT is important because it allows for the seamless exchange of data and information between devices, enabling them to work together in new and innovative ways. This can lead to a number of benefits, such as improved efficiency, cost savings and enhanced functionality. Nevertheless, the excessive use of IoT applications may have some drawbacks for users. This paper aims to examine the benefits, but also the disadvantages, of IoT applications in the government, business and consumer sectors.
Keywords: Internet of Things; governments; business; consumers; technology; data protection -
Key considerations for companies in the rapid adoption of IoT technologies and their impact on privacy and the data protection of consumers
Joanna Antoniewska, Privacy and Data Protection Specialist, Dentons
The Internet of Things (IoT) is an electronics-based ecosystem in which electronic devices are interconnected and capable of communicating with one another over the internet, and the technology has experienced an increase in usage around the world in recent years. Although the benefits of IoT technology are many, there have also been many ethical and privacy-related issues and risks that have emerged over the past decades that can be linked to it. Activists and academics have been actively trying to identify, address and tackle the issues and risks that IoT technology poses to the everyday consumer. This has not always been an easy task, given the complex nature of the technology itself, its constantly evolving purposes and its extraterritorial reach. This paper will focus on four privacy and data protection topics that have a major impact on the rights and freedoms of the everyday IoT technology user, or the consumer. These include the processing of geolocation data, the risks of personal data breaches within the IoT environment, the issue of securing personal data during and following cross-border transfers, and the risks stemming from data localisation requirements. In discussing these issues, practical solutions to tackle them will also be presented in more detail.
Keywords: Internet of Things; sensitive data; geolocation data; risk; data localisation; cross-border transfers -
Protecting patient confidentiality in the Internet of Medical Things through confidential computing
Richard Searle, Vice President of Confidential Computing and Prabhanjan Gururaj, Solutions Engineering Manager, Fortanix
The Internet of Medical Things (IoMT) provides a network of distributed devices that generate a wealth of data for clinicians and medical researchers. The global COVID-19 pandemic has demonstrated the benefits that IoMT data has brought about for remote medical services and clinical diagnosis. While the security of remote IoMT devices is an established area of concern, enforcing the privacy of the data that they both generate and process requires a data-first approach to network design. How can a distributed IoMT network simultaneously ensure the integrity of distributed devices and maintain the privacy and confidentiality of protected healthcare information (PHI)? In this position paper, we outline the issues that must be addressed by manufacturers of IoMT devices and those responsible for the system architectures that process gathered healthcare and contextual data. We consider how the nascent technology of confidential computing addresses the dual requirements of systemic security and data confidentiality, and we provide a conceptual architecture based on current developments within the field. Our analysis of the practical considerations associated with IoMT deployment reveals a fundamental requirement for a data-first approach to security that is governed by patient consent and zero-trust principles.
Keywords: IoMT; data security; confidential computing; privacy; consent -
Research papers
Enable the metaverse and smart society with trustworthy and sustainable ‘things’
Abhik Chaudhuri, Domain Consultant — Digital Transformation Governance, Tata Consultancy Services and Ambuj Anand, Assistant Professor of Information Systems and Business Analytics, Indian Institute of Management Ranchi
The Internet of Things (IoT) is a promising technological advancement that offers several benefits to society and may be effective in addressing sustainability challenges. New digital business models are using the power of information to replace traditional products with innovative solutions and services leveraging IoT technology. The benefits of metaverse applications, smart services in cities with IoT and digital twins are apparent. To realise the true potential of this technology, security and privacy concerns need to be effectively addressed. Ethically aligned design of autonomous and intelligent systems with IoT components is a necessity for human well-being and to develop trust in these applications and services. In addition to self-regulation, a structured and well-defined policy for technology governance of IoT deployments is necessary to establish trustworthy smart services. This paper discusses the potential of IoT applications in digital twins, the futuristic metaverse and smart cities. It highlights how IoT implementations are addressing the UN's sustainable development goals (SDGs), the emerging security and privacy concerns with IoT, and how emerging global standards and policies are useful in addressing IoT's trustworthiness-related challenges for creating a sustainable, trusted metaverse and smart society.
Keywords: Internet of Things; digital twin; metaverse; smart city; IoT security; privacy; data protection; trustworthiness; algorithmic accountability; sustainable development goals; standards; policy -
Implementing privacy and data confidentiality within the framework of the Internet of Things
Ayush Goel, PhD Research Scholar and Gurudev Sahil, Assistant Professor at School of Law, CHRIST University
The notion of the Internet of Things (IoT) foresees the pervasive interconnection and cooperation of intelligent things throughout the current and future worldwide web infrastructure. As such, the IoT is simply the next logical step in the expansion of the web into the real world, ushering in a plethora of unique services that will enhance people's lives, give rise to entirely new economic sectors and smarten up the physical infrastructure upon which we rely, including buildings, cities and transportation networks. As smart devices permit widespread information collection and tracking, the IoT will not be able to reach its full potential if the vision for the IoT is not implemented appropriately. These helpful characteristics are countered by concerns over confidentiality, which have, to date, hindered the viability of IoT aspirations. In the face of widespread surveillance, the management of private information and the development of tools to limit or evade pervasive monitoring and analysis are two examples of the new difficulties brought about by such dangers. This paper considers the privacy concerns raised by the Internet of Things in depth.
Keywords: Internet of Things; artificial intelligence; data protection; confidentiality of data; GDPR; security of data; blockchain -
Privacy and security concerns: A systematic review of older adults’ perceptions surrounding the use of technology
Thora Knight, PhD Candidate, Xiaojun Yuan, Associate Professor and DeeDee M. Bennett Gayle, Associate Professor, College of Emergency Preparedness, Homeland Security, and Cybersecurity at the University at Albany, State University of New York
The COVID-19 pandemic catalysed the adoption of technology in every facet of our lives. With growing reliance on technology, it is vital to ensure that efforts to promote or enhance older adults' use, adoption or interaction with technology remain salient. Using a systematic review and a search of five databases bounded over a five-year period, this review summarises the state of the literature on older adults' interactions and perceptions of technology, emphasising the interplay between privacy and types of technology. The antecedent-privacy concern-outcome (APCO) model was used to frame the findings, and type of technology was proposed as an associated privacy-related research factor. The review reveals that older adults' privacy perceptions align with the principles underlying the APCO model, including their awareness, experiences and demographic differences, and that their willingness to use a particular type of technology as a privacy outcome spotlights contextual relationships that call for further exploration.
Keywords: older adults; elderly; privacy and security; technology; APCO; technology use -
IoT legislation’s loopholes: The governmental blessing for statutory surveillance?
Dr Paweł Kuch, University of Zurich
Little has been discussed in the context of the material scope of privacy and data protection legislation. The rapid development and progress in the design of internet-connected wearables, appliances or movables — the Internet of Things (IoT) — and biotechnology and bionics that seek to connect the human body with computers — the Internet of Body (IoB) — and Artificial Intelligence (AI) in general, should bring more focus to the actual value of existing laws. Are the available legal tools sufficient against private and public actors? Are they enforceable and indeed being enforced? Do citizens trust their governments and vice versa? Or are we at the threshold of a social credit scoring dystopia? These questions arise when reading current legislation and observing its real-life application. The picture is grim — in many places, privacy and data protection laws are non-existent or insufficient. In the USA, the focus is more on developing new technologies than on securing people's rights. In the European Union, the situation is seemingly better. Still, the facade is full of cracks when taking a closer look. The loopholes can be found at every level: in member states' constitutions, and in the treaties, charters, regulations and directives. It is time to rethink the current approach before the pretence of democracy kills fundamental freedoms.
Keywords: AI; bionics; biotechnology; democracy; IoB; IoT; data protection; law; privacy; rights and freedoms -
Book review
Regulatory Insights on Artificial Intelligence: Research for Policy
Reviewed by Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy
Volume 5 Number 3
-
Editorial: Elon Musk’s cost-cutting at Twitter raises fresh data protection concerns and puts the social media platform on a collision course with regulators on both sides of the Atlantic
Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy -
Comment
Who are you on Web 3.0?
Lothar Determann, Partner, Baker McKenzie -
Practice papers
African Union’s Data Policy Framework and Data Protection in Africa
Kinfe Yilma, Assistant Professor of Law, Addis Ababa University
The African Union (AU) has in recent years launched a number of initiatives with the goal of enabling member states to benefit from the emerging digital economy. One such initiative is the AU Data Policy Framework endorsed by the Executive Council in February 2022. This paper considers what value the Data Policy Framework adds to the protection of data privacy in Africa. It finds that the Data Policy Framework offers little that would further data protection systems at the continental, subcontinental and national levels. With far more progressive international best practices already available, member states are likely to ignore the Policy as they ignored preceding standards like the Malabo Convention.
Keywords: data protection, data privacy, data policy, regional economic communities, African Union, Africa -
The AI Act in light of the EU Digital Agenda: A critical approach
Konstantinos Kouroupis, Assistant Professor of EU and Data Rights Law, Department of Law, Frederick University
This paper deals with the issue of the impact of artificial intelligence (AI) on our modern societies as well as on fundamental rights and freedoms, mainly those of privacy and security. AI constitutes one of the most essential pillars of the EU Digital Agenda. Despite its use in almost every field of private and public life there is a significant gap regarding its legal regulation. Subsequently, in relation to the proposed AI Act, which corresponds to the first binding legal instrument on AI anywhere, a key consideration arises regarding the governing of AI. Following a brief presentation of the nature and content of the proposed rules, this paper focuses on two major matters which give rise to serious concerns and demand further investigation: those of facial recognition and AI liability. Even though the draft AI Act encompasses provisions relating to these issues, there are still significant points which need to be clarified, such as the potential use of biometric identification tools by private organisations or the liability for AI decisions or actions. Through a comparative analysis of the existing European legal framework covering data protection as well as via the study of significant rulings of European courts and national competent authorities on data protection, this paper aims to demonstrate how to balance the rights, freedoms and interests of the individual against the challenges imposed by the ongoing technological evolutions. Moreover, an original theory is put forward in the area of AI liability which ideally keeps pace with the latest evolutions marked by the proposal of an AI Liability Directive. In addition, an effective mechanism is suggested for the lawful use of facial recognition systems. Therefore, the goal of the study is to indicate fruitful solutions of AI governance in order to build a trustworthy and productive technological environment, with respect to the consolidation of digital privacy and the deployment of modern technological tools.
Keywords: privacy, security, EU Digital Agenda, AI Act, facial recognition, liability, data protection -
Deconstructing the regulatory impact of the US CLOUD Act: An optimal regulatory approach to ensuring access to data in the cloud?
Nick Roudev, Managing Associate, Linklaters and Lori Baker, Director of Data Protection, Dubai International Financial Center
The following discussion sets out a framework based on optimal regulatory theory and how to practically implement smart, agile cloud policy in a way that supports business and national/international concerns. The emergence of regulations such as the CLOUD Act, when viewed against the impact of the General Data Protection Regulation (GDPR) and related case law, may contribute to the conflicts that the data protection authorities are experiencing, moving from a ‘blanket’ solution attitude towards a more ‘optimal’ approach. While certain regions and countries are exploring and even opting for data localisation policies, there are a fair number of regulators of privacy, technology, etc., demonstrating an evolving understanding of the adverse economic impact of a ‘no-data-outside-the-country’ mentality. Data can be properly protected, not despite leaving a country’s borders, but because of it.
Keywords: CLOUD Act, localisation, regulatory theory, optimal, MLAT, storage, cloud -
UK data protection and digital information bill explained
Steve Wilkinson, Freelance Data Protection Officer
It has been more than four years since the EU General Data Protection Regulation (GDPR) was applied to the UK, as supplemented by the Data Protection Act (DPA) 2018. Since then, the UK has left the EU. Having reflected on the advantages and disadvantages of the current framework, the government identified and consulted on several areas where it considered improvements could be made that would benefit those who process personal data while retaining high data protection standards. Any revised legislation may have impacts on the UK’s adequacy as well as trading abilities throughout the EU. This article will review the current status of the bill together with a review relating to the impact of the proposed legislation within the UK.
Keywords: data protection and digital information, UK adequacy, data protection, risk assessment, UK GDPR, senior responsible individual, biometrics and surveillance, cookies -
Observing 2021–2 data breach decisions of the Irish Data Protection Commission
Marie C. Daly, Special Counsel, Covington & Burling
The Irish Data Protection Commission (DPC) regulates many of the top global technology companies and as such its decisions have a significant impact on the companies and on the many users of their platforms. This article examines a number of recent data breach decisions of the DPC and finds them forensic, focused, reasoned and formulaic in approach. The decisions deal with key General Data Protection Regulation (GDPR) provisions, notably on requirements for data breach notification and communication with data subjects. In a change of strategy earlier this year, the DPC no longer offers guidance to controllers dealing with a breach, as was its previous practice. Decisions such as these are likely to help fill that vacuum.
Keywords: data breach, breach notification, DPC, data subjects -
Privacy nutrition labels, app store and the GDPR: Unintended consequences?
Miloš Novović, Associate Professor of Law, BI Norwegian Business School
In an effort to increase the transparency of personal data processing carried out via applications listed on their mobile store, Apple recently announced the launch of privacy nutrition labels (PNLs). Aimed at informing users about an application’s use of data, these card-like labels are prominently visible on each application’s App Store page. This paper explores whether such disclosures made via PNLs can help data controllers fulfil their duty of transparency under the EU General Data Protection Regulation (GDPR). It establishes that the PNLs, in their current, highly standardised fashion, cannot convey the mandatory obligations required by the GDPR. Added to this, they cannot adequately supplement existing privacy policies, either — as they neither serve an adequate role as a ‘first layer’ of a privacy notice, nor help communicate information more efficiently. However, the paper finds that the PNLs might serve another purpose: enhancing data controllers’ internal compliance routines. PNLs, even with their current limitations, can bring tangible improvements to cross-functional communication, third-party sharing awareness, records of processing accuracy, adherence to the data protection principles and adequate resource assignment. The overall conclusion of the paper, counterintuitive as it might appear, is that PNLs should be viewed as an organisational measure-enhancing mechanism rather than a transparency tool.
Keywords: privacy labels, Apple App Store, transparency, data processing notice, software development, compliance -
Technical controls that protect data when in use and prevent misuse
Magali Feys, Founder, AContrario.law, IP, IT & Data Protection Lawyer, et al.
Global data processing flowing across geographic borders and increasing risks of external data breach and misuse beyond lawful purposes require careful evaluation of technical controls that prevent privacy violations before they occur. This paper details the specific requirements for, and certain benefits from, implementing technical controls satisfying the heightened requirements for statutory pseudonymisation as defined in the General Data Protection Regulation (GDPR) in the context of (i) surveillance-proof processing, (ii) lawfulness of processing, (iii) more secure processing and (iv) data supply chain defensibility. The interconnectedness of these issues is presented within the confluence of conflicting interests among four different groups: governments, courts, enforcement agencies and non-governmental organisations (NGOs).
Keywords: pseudonymisation, international data transfer, cloud, data breach, analytics, artificial intelligence (AI), machine learning (ML) -
Right to be forgotten in case of search engines: Emerging trends in India as compared to the EU
Indranath Gupta, Professor of Law and Dean of Research and Paarth Naithani, Assistant Lecturer and Research Fellow, O.P. Jindal Global University, Jindal Global Law School
In India, the right to be forgotten (RTBF) is relatively new and has been discussed in different courts. A timely discussion concerning RTBF in India is necessary as several judgments are beginning to shape its dimensions. Further, India is considering enacting comprehensive data protection legislation. Comparing the developments in India to the rich and long-standing jurisprudence on RTBF in the European Union (EU) can help shape the discourse in India. RTBF has been established and exercised in the EU for almost a decade. In fact, the EU has had a right to erasure since 1995. Thus, this paper examines how India and the EU have handled RTBF. The paper considers a data fiduciary in India (or a data controller in the EU), namely search engines. The paper compares India with the EU and suggests the way ahead for RTBF in India. It reflects on the fact that the implementation of RTBF would depend on the nature of data fiduciaries and their services.
Keywords: Data Protection Bill, 2021, European Union, General Data Protection Regulation, India, right to erasure, right to be forgotten -
Book review
Taming the Algorithm. The Right Not to Be Subject to an Automated Decision in the General Data Protection Regulation
Reviewed by Prof Dr Chris Bellamy, Member, Editorial Board Journal of Data Protection & Privacy
Volume 5 Number 2
-
Editorial: Can blockchain technology protect organisations against the escalating threat of personal data and cyber security breaches?
Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy -
Practice papers
The former Indian DPB, California’s CCPA and the European GDPR: A comparative analysis
Mathew Chacko, Head of the Technology, Media & Telecommunications practice group and Shambhavi Mishra, Associate, Data Protection, Privacy and Cybersecurity practice, Spice Route Legal, India
This paper intends to critically analyse the major substantive areas of divergence between India’s latest version of their draft data protection law (the Data Protection Bill, 2021), the European General Data Protection Regulation and the California Consumer Privacy Act, 2018. The paper further identifies aspects of the General Data Protection Regulation and the California Consumer Privacy Act, 2018 that must be adopted in the Indian context from a business perspective.
Keywords: GDPR, CCPA, Data Protection Bill, 2021 -
A decade after the Personal Data Protection Act 2010 (PDPA): Compliance of communications companies with the notice and choice principle
Ali Alibeigi, Faculty of Law, Abu Bakar Munir, Holder of Tun Ismail Ali Chair, Faculty of Law, Malaysia and Adeleh Asemi, Faculty of Computer Science and IT, University of Malaya
The massive and rapid advancements in the fields of information and communications technology, and especially the internet, have increased both the value of and threats to the information privacy of individuals. The Malaysian Personal Data Protection Act 2010 (PDPA) was a governmental endeavour to protect the information privacy of citizens. However, the Act’s output and the level of compliance by data users remain ambiguous. This qualitative study, using document analysis, aimed to find out to what extent communications companies comply with the Act. Hence, the privacy policies of these companies were evaluated in line with the requirements of the Act. The results indicated that more or less all samples failed to satisfy the PDPA requirements. The solutions provided by this research can be used as practical guidelines to draft a Standard Privacy Policy. The suggestions would also benefit the Personal Data Protection Commissioner in performing his duties and functions.
Keywords: data protection officer, data user, Malaysia, PDPA, personal data, privacy -
New directions for data governance in health data? Examining the role of anonymisation and pseudonymisation
Anna Aurora Wennäkoski, Senior Specialist, Data Business Unit, Finnish Ministry of Transport and Communications
Data governance can be considered a commonly shared priority for many organisations. This paper examines the roles of anonymisation and pseudonymisation as privacy-enhancing technologies (‘PETs’) as part of data governance, focusing on the health and medical sector. Ultimately, it asks whether organisations should reframe their data governance to favour anonymisation or pseudonymisation respectively. In answer, the paper concludes that while both anonymisation and pseudonymisation appear to be important technical tools, they ought to be understood as part of the wider frame of data governance, where no stand-alone tool seems to suffice. Rather, a wider outlook that also includes organisational and legal measures is needed. Regarding the latter, guidance seems to proliferate, and in it, reasonable protective measures, along with risk-based approaches, appear common.
Keywords: data governance, health data, data protection law -
Transparent communication under Article 12 of the GDPR: Advocating a standardised approach for universal understandability
Indranath Gupta, Professor of Law, Jindal Global Law School and Dean of Research, O.P. Jindal Global University and Paarth Naithani, Academic Tutor and Teaching & Research for Intellectual Pursuit (TRIP) Fellow, Jindal Global Law School, O.P. Jindal Global University
This paper suggests the way forward for the transparency requirement under the GDPR for all data subjects in the context of two recent developments in 2021. The first was the decision of the Dutch DPA against TikTok, and the second was the release of the UK Children’s Code by the ICO. The paper positions understanding of terms of use and privacy policy as an essential attribute in the overall intelligible, clear and plain language requirement under the GDPR. It indicates that a standardised approach can guarantee that data subjects understand the information data controllers share. This approach will have a script with standardised and universal tools. Such an approach would overcome the limitations of a particular language, including the variable perception of ‘privacy’ among individuals.
Keywords: transparency, intelligible, clear and plain language, standardised approach, GDPR -
Research papers
Artificial intelligence and automated decision making: The new frontier of privacy challenges and opportunities
Joseph Srouji, Avocat à la cour and Founding Partner, Srouji Avocats and Stefano Bellè, Graduate law student at Université Paris-Panthéon-Assas
This paper addresses the privacy component of broader artificial intelligence (AI) ethical considerations. We begin with an overview of the regulatory landscape, or lack thereof, and then call out the specific provisions of EU data protection law applicable to AI while focusing on examples of country-specific approaches, including some recent regulatory action. This regulatory action is particularly insightful since it identifies the key challenges that companies face, or will eventually face, when adopting AI-based solutions. These challenges include how to anticipate and prevent bias in automated decision making (ADM) and how to provide transparency to data subjects, despite the complexity of machine learning processes, while protecting business secrets and know-how.
Keywords: artificial intelligence, AI, data protection, data privacy, machine learning, regulations, European Union, GDPR, Artificial Intelligence Act, automated decision making, digital ethics, enforcement -
Worthy of trust: Protecting minority privacy in diversity reporting
Matthew Bellringer, Meaningbit Ltd
This paper highlights potential risks to privacy in diversity monitoring and reporting. It explores the reputational drivers behind reporting, and the specific challenges presented when reporting on ‘hidden’ or ‘invisible’ aspects of diversity. It raises operational considerations that come about as a result of these challenges and suggests that an integrated approach aimed at increased trust among all stakeholders is likely to yield the greatest business benefits from reporting activities.
Keywords: environmental, social and governance (ESG), diversity, equity and inclusion (DEI), GDPR, hidden minorities, reporting, employee data, sensitive data, accidental disclosure, disability, sexual orientation, gender -
The UK’s Online Safety Bill: The day we took a stand against serious online harms or the day we lost our freedoms to platforms and the state?
Alexander Dittel, Partner in Technology, Wedlake Bell
This paper discusses the UK’s Online Safety Bill, which is intended to protect vulnerable individuals online, although at the risk of promoting surveillance techniques and mandating proactive content removal by platforms. It analyses how the Bill, a very ambitious project, tries to safeguard vulnerable people through means which could be easily abused, and asks whether the risk of abuse that could affect everyone is worth the protection of a minority of online users. Recently demonstrated authoritarian approaches to solving the COVID-19 crisis make this concern palpable. The paper concludes by saying that once we take a path, it will be difficult to walk it back.
Keywords: online harms, Online Safety Bill, lawful but harmful, user-generated content, content monitoring, user, monitoring, cyber offences -
Book review
Determann’s Field Guide to Data Privacy Law: International Corporate Compliance
Reviewed by Ardi Kolah
Volume 5 Number 1
-
Editorial: Why International Holocaust Remembrance Day is more relevant than ever in a world where privacy and data protection are increasingly rare
Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy -
Research papers
China’s PIPL and DSL: Is China following the EU’s approach to data protection?
Ziwen Tan, LL.M. Candidate, China University of Political Science and Law and Channing Zhang, Privacy Protection Manager, Kingnet Network
On 20th August, 2021, China, a market with over one billion consumers, passed the Personal Information Protection Law (PIPL), effective as of 1st November, 2021. The PIPL aims to provide individuals with a comprehensive set of data protection rights and will unquestionably impact how businesses ensure compliance in the upcoming years. Not surprisingly, like many other jurisdictions, the PIPL resembles the General Data Protection Regulation (GDPR) to a great extent, but it also diverges from the GDPR in many regards. Therefore, many companies, having already built a GDPR-compliant programme, still face the challenge of demonstrating compliance with the PIPL. Also noteworthy is the recently enacted Data Security Law (DSL), one of the few laws in the world (if not the only one) to target data security specifically. The DSL not only supplements the PIPL, but also has a distinctive aim regarding national security, which exposes companies to obligations beyond those imposed by the PIPL. Furthermore, there is no denying that there are still some ambiguous parts of the PIPL and the DSL. However, China is enacting other laws, industry-specific regulations, guidelines and standards to complement the application of the two laws, which are also worth attention. This paper discusses the material differences between the GDPR and the PIPL (as well as the DSL when applicable) and creates a roadmap to achieve compliance with the PIPL and the DSL.
Keywords: Personal Information Protection Law, Data Security Law, China, data protection, privacy -
Teleology: The missing piece to solving the GDPR puzzle
Paweł Kuch, Attorney-at-Law, University of Zurich
The Court of Justice of the European Union (CJEU) plays an essential role as the supreme interpreter of primary and secondary European law. Since its inception, the CJEU has adopted four methods of interpretation — grammatical, contextual, historical and teleological — and has often used all of them to clarify a provision in question. For various reasons, the teleological method is often avoided or misunderstood. Regardless of personal opinions, however, the teleological method’s significant impact on the legal interpretation of European Union (EU) legislation must be acknowledged. Rightly or not, the majority of interested parties perceive the provisions of the General Data Protection Regulation (GDPR) as ambiguous. The EU’s 24 official languages hinder the use of the grammatical method of interpretation as the primary and absolute one. The contextual and historical methods are often inconclusive. Consequently, the teleological method of legal interpretation, focusing on the goals and objectives of the legislation, allows the CJEU to adjudicate coherently with the whole EU legal system and its objectives. Recognising that the goals and objectives are part of the GDPR complements the grammatical, contextual and historical interpretation methods and helps see the regulation in its intended light: a legal framework for personal data processing that respects everyone’s fundamental right to personal data protection, aiming to balance it with other fundamental rights. This paper aims to draw attention to the often forgotten or only reluctantly applied teleological interpretation method in implementing the GDPR.
Keywords: Teleology, GDPR, data protection, EU legislation -
On the advent of environmental, social and governance reporting and its intersection with privacy
Martijn ten Bloemendal, Global Privacy Counsel, AbbVie
An emerging area of regulation in the form of environmental, social and governance (ESG) reporting is aimed at long-term sustainability and addressing the challenges of climate change and social inequality. This paper explores how ESG reporting intersects in interesting ways with well-established privacy principles under the European Union (EU) General Data Protection Regulation. The paper analyses these privacy implications in the context of the first formal ESG law of comprehensive scope — the EU Sustainable Finance Disclosure Regulation — as well as existing global ESG disclosure standards. In particular, the ‘S’ of ESG encompasses issues in the employment context relating to gender equality and diversity and inclusion (D&I), thereby implicating sensitive personal data and complicating the collection of such data. In addition, the paper considers whether there is, or should be, a ‘P’ of privacy incorporated in ESG and examines the potential development of measurable privacy metrics for this purpose.
Keywords: ESG, GDPR, Sustainable Finance Disclosure Regulation (SFDR), Sustainability Accounting Standards Board (SASB), Global Reporting Initiative (GRI), gender, race, ethnicity, diversity and inclusion, sensitive data, employee data -
Practice papers
Enforcing the right to be forgotten as a human right
Saheed Alabi, Director of Legal Research, Alidson Global Network
This paper examines the application of the right to be forgotten as a human right by analysing the provisions of the GDPR and the jurisprudence of the Court of Justice of the European Union (CJEU) and the European Court of Human Rights (ECtHR). It also considers, in general terms, the implications of Brexit — the United Kingdom’s exit from the European Union — for the application of the GDPR.
Keywords: Right to be Forgotten, Human Rights, GDPR, Court of Justice of the European Union (CJEU), European Court of Human Rights (ECtHR), Brexit -
The new EU Standard Contractual Clauses as a type of appropriate safeguard in the international transfer of personal data
Anna Popowicz-Pazdej, Privacy Lawyer, Dentons, CIPP/E and doctorate researcher, University of Wroclaw
Personal data flow more freely across borders and represent one of the most significant forces behind globalisation. Although this process is inevitable, it has to respect human rights, especially the right to data protection. Among the most commonly used transfer tools under the General Data Protection Regulation are standard contractual clauses, which are used for transfers of personal data to third countries in the absence of an adequacy decision. In June 2021 the European Commission adopted the new Standard Contractual Clauses, which differ in multiple respects — scope, structure and substance — from the previous versions. This change reflects new requirements arising from technological development and the Schrems II judgment. Above all, it was an opportunity to articulate expressly all data protection requirements for international data transfers and, in particular, to capture previously unaddressed relationships between data controllers and data processors by way of a modular approach. This paper summarises the functionality of the new standard contractual clauses in light of the content and rationale of international transfers of personal data and provides an in-depth overview of their scope and structure. The ultimate aim is to reach a compromise that both ascertains a sufficient level of data protection and avoids creating unnecessary obstacles to cross-border data flows. Hence, the paper also attempts to answer whether these standard contractual clauses will stand the test of time, especially bearing in mind some burdensome obligations.
Keywords: GDPR, standard contractual clauses, international data transfer, appropriate safeguards, transfer adequacy -
Getting connected: Providing IT services to the German healthcare sector subject to ecclesiastical data protection law
Tamara Bukatz, Data protection and privacy consultant
Due to the pandemic, digitisation is advancing at a quicker pace in Germany, especially in the healthcare sector. Many institutions in the German healthcare sector are owned by religious associations or church bodies. Software providers considering entering into a contractual relationship, in particular with hospitals, may find themselves faced with the ecclesiastical data protection laws of the different churches in Germany, as well as the corresponding system of ecclesiastical courts and supervisory authorities. European countries generally have one supervisory authority for data protection. Germany, however, has not one but a myriad of supervisory authorities on both the secular and the ecclesiastical level, which are explored in this paper. This paper aims to give businesses interested in the opportunities arising from digitisation an overview of what it entails to conclude a contract for processing the personal data of German entities in the healthcare and related sectors, and points out the differences between the General Data Protection Regulation (GDPR) and Catholic and Evangelical data protection law.
Keywords: Article 91 GDPR, Evangelical and Catholic data protection laws, system of secular and ecclesiastical data protection supervisory authorities in Germany, digitisation of the healthcare sector -
GDPR Glasnost: Spain’s AEPD raises the transparency bar and sanctions two banks
Philipp Fischer, Partner, Banking & Finance/Data Protection Department, Oberson Abels and Julien Levis, Head of Data Privacy at an International Group
This paper is a commentary on two recent decisions issued by the Spanish data protection authority (DPA): the AEPD (Agencia Española de Protección de Datos). Both decisions — issued one month apart — set out similar grounds and grievances, primarily arising from the alleged lack of clarity in the two banks’ privacy notifications to their clients, as well as in the consent-collection process and in the formulation of their legitimate interest in processing personal data. These two decisions, combined with one issued just a couple of months earlier by the French DPA (CNIL [Commission Nationale de l’Informatique et des Libertés]), appear to signal a new trend: one towards heightened scrutiny of the details of the data protection documentation set forth by data controllers. Sanctions issued over the General Data Protection Regulation’s (GDPR) first two years of implementation had largely focused on penalising manifest disregard for the GDPR (primarily in the form of a lack of appropriate technical and organisational measures or the absence of a lawful basis for personal data processing). In each of the three decisions, the data controller was a bank (Banco Bilbao Vizcaya Argentaria, SA [BBVA] and CaixaBank in the two AEPD decisions under review, Carrefour Banque in the CNIL decision previously commented on by the co-authors). In the two Spanish decisions, the fines issued were, respectively, €5m against BBVA and €6m against CaixaBank. Privacy professionals in the banking sector will need to factor in these regulatory developments and reassess the formulation of their privacy notifications. The industry has thus been invited to reassess its duty of privacy information from a new, more rigorous perspective. What degree of detail regarding the specifics of the data processing do regulators expect in a privacy notice? How should data controllers structure the collection of data subject consent to ensure it may constitute a legitimate basis for data processing?
What are the elements they need to demonstrate to validly invoke a legitimate interest in the data processing? The two recent AEPD decisions under review set a high bar. While the two decisions are primarily remarkable in their substantive motivation (I), we will also highlight some particularly interesting procedural developments (II).
Keywords: GDPR, duty of information, consent, legitimate interest, impartiality, due process -
Book review
Data Protection Around the World: Privacy Laws in Action
Reviewed by Dr Jacob Kornbeck, Policy Officer, European Commission, Youth Unit -
Book review
Privacy is Hard and Seven Other Myths: Achieving Privacy Through Careful Design
Reviewed by Ardi Kolah