"Journal of Payments Strategy and Systems provides a host of useful, actionable and informative articles and papers that demonstrate the extraordinary opportunities for improving the payments systems. These are written by subject matter experts: corporate practitioners; consultants; bankers; vendors and scholars. The variety of the topics and the points of view make this a must read, even for those who think they know all there is to know about payments."
Volume 7 (2024-25)
Each volume of Journal of Data Protection & Privacy consists of four 100-page issues published both in print and online.
The articles published in Volume 7 are listed below.
Volume 7 Number 3
Editorial
The revolt on regulation
Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy -
Practice Papers
Ethics and privacy in AI regulation: Navigating challenges and strategies for compliance
Marta Dunphy-Moriel, Privacy Lawyer/Data Protection Officer, Dunphy-Moriel Legal Services, and Laura Berton, Partner, Kepler Wolf
A new summer of artificial intelligence (AI) started a year ago, promising tantalising technical development and efficiencies of scale, while in parallel the Internet is flooded with advice, notes and analysis of AI’s impact and risks. Although the potential use of AI is promising and could help solve very real human challenges, the risks and societal impact are real too. With AI infiltrating all areas of life, such as online platforms, work, healthcare, social services and the justice system, it is essential that it is developed within key safety parameters. Furthermore, it is no secret that for AI to be effective it needs to process vast quantities of data, which is at odds with the General Data Protection Regulation (GDPR) principle of data minimisation. Businesses are repeatedly told to mitigate such risks to fundamental rights, privacy, discrimination, biases, etc. with stringent privacy and AI governance, all within an ethical framework and in compliance with existing legislation. Amid this bombardment of information, this paper seeks to provide practical guidelines to comply with existing privacy regulation while implementing safe and trustworthy AI. The first part considers compliance with the GDPR while developing or using AI, while the second part provides practical recommendations in relation to the implementation of an ethical AI framework.
Keywords: ethics; privacy; AI; artificial intelligence; AI regulation; AI ethics; data ethics; compliance -
Federated learning in healthcare: Addressing AI challenges and operational realities under the GDPR
Federico Vota, UK DPO | Ireland and ANZ Data Protection Manager, Dedalus UK and Ireland, Francesca Pediconi, Privacy Specialist, and Alessandro Liscio, Data Scientist, Dedalus Italia S.p.A.
The fundamental characteristic of machine learning (ML) algorithms is their ability to learn to solve problems autonomously, based solely on the data provided to them; to do so, ML models require a huge amount of data (ie training data) to learn how to solve the problems they are presented with. When talking about the training of artificial intelligence (AI) algorithms, especially for healthcare use, the matter of personal data included in the training datasets cannot be ignored. This includes data belonging to ‘special categories’ (including ‘health data’), which require more robust measures than those put in place for processing of so-called ‘common data’ in order to be processed under the General Data Protection Regulation (GDPR). Among those in the healthcare sector who develop or use AI, this issue is highly relevant. Federated learning is a cooperative ML technique capable of exploiting the knowledge stored in multiple datasets without the need to pool them together, offering an innovative approach to privacy-preserving AI model training. In particular, in the healthcare sector, this technique can be used to enable multiple organisations to collaborate without sharing sensitive patient data. The aim of this paper is to showcase how federated learning integrates seamlessly with existing healthcare IT systems to address privacy concerns by presenting use cases that offer a concrete perspective on federated learning’s potential and operational challenges.
Keywords: federated learning; healthcare data privacy; GDPR; privacy-preserving machine learning -
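The abstract’s core idea, training locally and sharing only model parameters, can be made concrete with a short federated averaging sketch. This is a minimal illustration assuming two hypothetical sites, a plain linear model and simple weighted averaging of weights; it is not drawn from the authors’ system.
```python
# Minimal federated averaging sketch (illustrative only, not the authors' system).
# Each "site" keeps its patient data local; only model weights are exchanged.
import numpy as np

rng = np.random.default_rng(0)

def local_train(weights, X, y, lr=0.1, epochs=20):
    """Train a linear model on one site's local data and return updated weights."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # gradient of the squared-error loss
        w -= lr * grad
    return w

# Hypothetical sites: each holds its own (features, outcome) data, never pooled.
sites = [
    (rng.normal(size=(100, 3)), rng.normal(size=100)),  # e.g. hospital A
    (rng.normal(size=(80, 3)),  rng.normal(size=80)),   # e.g. hospital B
]

global_w = np.zeros(3)
for _ in range(5):                           # a few federation rounds
    local_ws = [local_train(global_w, X, y) for X, y in sites]
    sizes = np.array([len(y) for _, y in sites])
    # Server aggregates: weighted average of local weights (FedAvg-style).
    global_w = np.average(local_ws, axis=0, weights=sizes)

print("aggregated model weights:", global_w)
```
A production healthcare deployment of the kind the paper discusses would add secure aggregation, access controls and GDPR documentation on top of this bare loop.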
Mitigating AI risks: A comparative analysis of Data Protection Impact Assessments under GDPR and KVKK
Arzu Galandarli, Legal Counsel, World Medicine İlaç
This paper critically examines the Data Protection Impact Assessment (DPIA) frameworks under the European Union’s (EU) General Data Protection Regulation (GDPR) and Turkey’s Personal Data Protection Law (KVKK), with a particular focus on mitigating the risks posed by artificial intelligence (AI) technologies. It identifies significant gaps and challenges within each framework, especially regarding AI-specific risks such as data inference, re-identification and algorithmic bias. By analysing the regulatory landscapes and enforcement practices in key jurisdictions including Germany, France and Ireland, the paper draws lessons that could strengthen KVKK’s ability to address emerging AI-related challenges. The study adopts a comparative approach, detailing the similarities and differences between GDPR and KVKK in their application of DPIAs, their approaches to cross-border data transfers and their regulatory strategies for automated decision-making systems. The research highlights practical challenges faced by organisations, including balancing innovation with compliance, managing cross-border data flows and conducting effective risk assessments for high-risk data processing activities involving AI. Key findings include the need for Turkey’s KVKK to develop explicit AI-focused regulatory guidance, introduce mandatory DPIAs for high-risk activities and enhance transparency and accountability mechanisms. The paper also identifies best practices such as adopting privacy by design and default, leveraging technical measures such as federated learning and differential privacy, and engaging proactively with supervisory authorities to align with global standards. The paper concludes with actionable recommendations for policy makers and practitioners to harmonise KVKK with GDPR, improve cross-border data protection and foster trust in AI systems while maintaining innovation. These insights aim to provide a roadmap for building a robust data protection framework that addresses both current and future challenges posed by AI technologies.
Keywords: Data Protection Impact Assessments; GDPR; KVKK; artificial intelligence; privacy by design; algorithmic transparency -
Research Papers
Revisiting personal data: Ownership theories and comparative legal perspectives from Europe, Indonesia and the United States
Diah Pawestri Maharani, Assistant Professor, Afifah Kusumadara, Professor, Hanif Nur Widhiyanti, Associate Professor, and Reka Dewantara, Associate Professor, Universitas Brawijaya
The growing importance of personal data in the digital era has sparked global debates on whether it should be treated as property or a fundamental right. Different jurisdictions adopt varying approaches to personal data ownership, resulting in significant legal, regulatory and operational challenges. The US, through the California Consumer Privacy Act (CCPA), treats personal data as a tradable asset, allowing businesses to monetise it with limited consumer rights. In contrast, the European Union (EU) General Data Protection Regulation (GDPR) frames personal data as an inalienable right, prioritising individual control and privacy. Indonesia’s Personal Data Protection Law (PDP Law) takes a hybrid approach, recognising personal data as a fundamental right while permitting regulated cross-border transfers. This study employs a qualitative and comparative legal analysis to examine the implications of these differing approaches. It explores property ownership theories, such as first occupancy, labour, utility, libertarian and personality theories, to assess their applicability to personal data. The findings suggest that traditional property concepts are insufficient to address the complexities of personal data, as it is inherently tied to individual identity and autonomy. Instead of being commodified, personal data requires robust legal protections to safeguard privacy and individual rights. The study highlights key challenges, including regulatory fragmentation, compliance complexities and consumer protection disparities. It underscores the need for greater harmonisation of data protection laws and stronger international cooperation to balance economic interests with the fundamental right to privacy. The insights provided aim to inform policy makers, businesses and legal practitioners in developing ethical and effective data governance frameworks.
Keywords: data protection; data ownership theories; GDPR; CCPA; Indonesia PDP Law -
Personal data monetisation model in India: Reimagining the contours of the Digital Personal Data Protection Act, 2023 through a market-oriented approach analysis
Pushpit Singh, TMT Lawyer, Bengaluru, and Silvia Tomy Simon, BA LLB Student
In the digital age, ensuring data privacy has become increasingly difficult as businesses leverage personal data for profit, often at the expense of individual autonomy. This challenge is exacerbated in India by the Digital Personal Data Protection (DPDP) Act, 2023, which adopts a market-oriented approach, prioritising data processing and economic interests over robust privacy safeguards. The Act deepens existing asymmetries in bargaining power, allowing businesses to exploit the commercial value of personal data while individuals receive no direct benefit. In a society where privacy awareness is limited and financial vulnerability is widespread, this imbalance further exposes individuals to the risks of exploitation and surveillance. To address these concerns, this paper proposes a personal data monetisation model designed to strike a balance between commercial interests and consumer rights. By enabling individuals to license their personal data to businesses in exchange for monetary compensation, the model seeks to restore autonomy and ensure that the economic value of data accrues to its rightful owners. The operationalisation of this model, however, raises critical questions, including how to determine data value, protect privacy in a surveillance-driven economy and address the risks of centralising consent management. This paper critically examines these challenges and explores pathways to implement a framework that aligns economic empowerment with the preservation of privacy as a fundamental right, fostering a more equitable digital ecosystem.
Keywords: personal data; Digital Personal Data Protection Act, 2023; monetisation; India
Volume 7 Number 2
Editorial
The first 25 years of the 21st century: A privacy odyssey; and what is next?
Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy -
Practice Papers
Securing informational privacy in India’s IoT governance: Looking through the lens of FASTag
Anupriya, Lecturer and Assistant Dean (Clinical Legal Education), and Krishna Deo Singh Chauhan, Associate Professor, Jindal Global Law School, O.P. Jindal Global University
As Internet of Things (IoT) technologies become increasingly integral to digital governance initiatives in India, this paper addresses the critical challenge of ensuring that their implementation conforms to the constitutional right to informational privacy established in the Puttaswamy judgment. Using India’s FASTag electronic toll collection system as a case study, the paper develops a comprehensive framework for evaluating and safeguarding privacy rights in state-deployed IoT applications. Through detailed analysis of FASTag’s implementation, which involves extensive data collection, multiple stakeholder access and reported vulnerabilities including recent data breaches, the study demonstrates how IoT applications can inadvertently compromise informational privacy rights. The paper then applies the threefold test from Puttaswamy (presence of law, legitimate objective, proportionality) to evaluate FASTag’s constitutional validity. Despite various legal instruments including the Digital Personal Data Protection Act 2023, NETC Procedural Guidelines and other rules, the analysis reveals significant gaps in meeting the ‘quality of law’ standards. While FASTag demonstrates rational connection to legitimate state objectives through improved efficiency and transparency, evidenced by reduced wait times and fuel savings, it fails the necessity requirement due to inadequate privacy impact assessments. The proportionality stricto sensu evaluation further identifies concerns about unclear scope definition and insufficient data protection measures affecting thousands of users. Building on this analysis, the paper develops a practical framework for privacy-conscious IoT deployment in public services. Drawing on both the Indian legal position and privacy and data protection measures concerning electronic toll collection and IoT systems under European Union (EU) and Californian law, it proposes specific safeguards that can strengthen the implementation of FASTag and similar IoT systems, and their alignment with the right to privacy. In the final section, it elaborates upon two specific safeguards and discusses at length the significance, challenges and details of their implementation. These safeguards are privacy impact assessments and data localisation. This framework’s significance lies in its comprehensive approach to balancing technological innovation with privacy protection, not only providing actionable measures for future IoT implementations, but also discussing the alignment of such measures with the fundamental right to privacy at the highest level. This contribution is particularly valuable as India continues to expand its digital infrastructure, ensuring that technological advancement does not come at the cost of constitutional rights.
Keywords: Internet of Things (IoT); informational privacy; data protection; safeguards; electronic toll collection; privacy rights; proportionality test; privacy impact assessments -
Credit scoring under the GDPR: Insights from the CJEU’s SCHUFA case
Laroussi Chemlali, Associate Professor, Ajman University, Ajman, and Leila Benseddik, Assistant Professor, Canadian University Dubai
The Court of Justice of the European Union’s (CJEU) SCHUFA case revealed significant insights into the complex relationship between credit scoring processes, data protection regulations and the emerging artificial intelligence (AI) governance framework. This paper offers a thorough analysis of the court ruling in the SCHUFA case, focusing on the question of whether credit scoring processes qualify as automated decision making (ADM) under the General Data Protection Regulation (GDPR). The paper starts by defining credit scoring and its importance in financial decision making, followed by the concerns associated with it. The analysis then shifts to focus on credit scoring systems in light of the GDPR and the new European Union Artificial Intelligence Act (EU AI Act), before proceeding to the main facts of the case along with the decision of the Court. After discussing the potential implications of the CJEU’s decision on credit information agencies and other industries relying on ADM, the paper highlights the importance of considering robust measures to mitigate the risks associated with ADM.
Keywords: automated decision making (ADM); AI; SCHUFA case; credit scoring; GDPR; EU AI Act; creditworthiness assessment -
Exploring the integration of privacy-enhancing technologies in Taiwan’s artificial intelligence and data protection frameworks
Hayung Sbeyan, PhD student, National Tsing Hua University; System Analyst, Ministry of Digital Affairs
This paper explores the integration of privacy-enhancing technologies (PETs) within Taiwan’s artificial intelligence (AI) and data protection frameworks, focusing on the Taiwan AI Fundamental Act and the Personal Data Protection Act. In response to the Constitutional Court’s directive for improvements in data protection, the Ministry of Digital Affairs proposed PET guidelines to align technology with regulatory frameworks. PETs, such as differential privacy, homomorphic encryption and secure multiparty computation, are highlighted as innovative solutions that balance data privacy with usability. Despite their potential, PETs face significant technical, legal and economic challenges. Implementing these technologies is particularly complex in sensitive sectors such as finance and healthcare, where data security is paramount. Additionally, the legal ambiguity surrounding the application of PETs creates a risk of inconsistent interpretation and enforcement, especially across different industries. While some degree of ambiguity is necessary for technological flexibility, the author argues that clear legal frameworks and technical guidelines must be provided to support the practical implementation of PETs. The government must actively promote education and cross-sector collaboration to ensure the consistent application and successful adoption of these technologies. Although the PET guidelines could reduce the effort required to implement PETs, applying them may itself be a challenge for decision makers. Thus, anyone using the suggested techniques should have a basic knowledge of these technologies in order to gain maximum value from the guidelines. On the other hand, if researchers, especially legal researchers seeking legal compliance, had a deeper understanding of what PETs mean and of the results of implementing them, this might help reduce the need for data protection requirements in practice. Ultimately, the author concludes that balancing privacy protection with technological innovation is vital to ensuring the long-term success of PETs in Taiwan’s evolving data governance landscape.
Keywords: regulatory compliance for PETs in Taiwan; privacy enhancing technology; Taiwan AI Act; Taiwan Personal Data Protection Act -
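Of the PETs the paper names, differential privacy lends itself to the shortest illustration. The sketch below releases a noisy count via the Laplace mechanism; the epsilon value and toy dataset are assumptions made for illustration only and are not prescribed by the Taiwanese guidelines discussed in the paper.
```python
# Minimal differential-privacy sketch (Laplace mechanism), for illustration only.
import numpy as np

rng = np.random.default_rng(42)

def dp_count(records, predicate, epsilon=1.0):
    """Release a noisy count satisfying epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one person changes
    the count by at most 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical records: ages of individuals in a sensitive dataset.
ages = [34, 41, 29, 52, 47, 38, 61, 45]
print("noisy count of people over 40:", dp_count(ages, lambda a: a > 40, epsilon=0.5))
```
Smaller epsilon values add more noise and hence stronger privacy at the cost of accuracy, which is the usability trade-off the abstract alludes to.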
The cookie conundrum: Balancing privacy, compliance and user experience and the quest for strategic GDPR-compliant user privacy
Noémie Weinbaum, Managing Director, PS Expertise, and Roy Kamp, Legal Director, Central and Northern Europe, UKG
The digital landscape has witnessed a significant transformation since the introduction of cookies in the mid-1990s, evolving from simple user tracking mechanisms to complex tools integral to online user experiences and targeted advertising. This evolution, however, has not come without consequences; the proliferation of cookies has raised substantial concerns regarding user privacy and data security, prompting the development of regulatory frameworks such as the General Data Protection Regulation (GDPR) and the ePrivacy Directive. This paper undertakes a critical analysis of the intricate intersection between cookies, the ePrivacy Directive and the GDPR, with a particular focus on the IAB Belgium ruling. This landmark case has catalysed significant changes in consent practices, reshaping the digital advertising ecosystem and compelling businesses to reassess their data protection strategies. Notably, the ruling reinforces the primacy of consent under GDPR for cookie deployment, particularly in the context of personalised advertising. The decision also brings into stark relief the unresolved tension between consent-based models and the use of legitimate interest as an alternative legal basis for data processing. While the IAB Belgium ruling firmly aligns with the GDPR’s stringent consent requirements, the European Court of Justice’s (ECJ) subsequent rulings on legitimate interest introduce a potential divergence. For example, in the Koninklijke Nederlandse Lawn Tennisbond (KNLTB) case, the court recognised commercial legitimate interest as a lawful basis for processing data, yet this recognition did not extend to cookies, which are central to behavioural advertising and commercial profiling. The recent European Data Protection Board (EDPB) guidelines further complicate this regulatory landscape, as they emphasise the need for legitimate interest assessments but offer limited insight into how this legal basis should apply to cookies. This confluence of judicial and regulatory decisions underscores the ongoing challenges in harmonising legitimate interest with cookie-related data processing, calling for a more cohesive regulatory framework. As organisations navigate this complex regulatory environment, the insights provided in this paper aim to serve as a valuable resource for understanding the evolving dynamics of cookie compliance and the broader implications for data protection in the digital age. The paper ultimately seeks to inform stakeholders of the pressing need for accountability and user-centric approaches in the realm of digital privacy.
Keywords: cookies; ePrivacy; GDPR; LGPD; PIPEDA; DPDPA; CCPA; IAB; Planet49; data protection; privacy enhancing technologies; legitimate interest; consent -
Research Paper
Application of data protection laws with a proposal for a flexible regime for humanitarian organisations
Maria Beatriz Torquato Rego, Lawyer and Master’s student in Privacy, Cybersecurity and Data Management, Maastricht University
Humanitarian organisations often operate in emergency contexts where strict compliance with data protection laws, such as the General Data Protection Regulation (GDPR), can pose significant practical challenges. This paper explores the need for a differentiated data protection regime tailored to the realities of humanitarian crises, balancing efficiency and the fundamental rights of data subjects. By analysing key European Court of Justice cases, including Schrems II (C-311/18), Nowak (C-434/16) and Pankki S (C-579/21), the paper highlights the importance of adapting core GDPR principles to crisis situations. It also examines the integration of human rights principles, emphasising the protection of dignity and autonomy during emergencies. Furthermore, it addresses regulatory challenges, proposing proactive engagement with authorities to ensure accountability and trust. Practical solutions are proposed such as simplified Data Protection Impact Assessments (DPIAs), the use of pseudonymisation, data minimisation and standardised Memorandums of Understanding (MOUs) to replace complex contractual requirements. These measures aim to ensure compliance while enabling rapid and effective responses in emergencies. The paper concludes by calling for the development of a flexible regulatory framework that integrates data protection into the operational needs of humanitarian organisations without compromising ethical and legal standards.
Keywords: GDPR; data protection; differentiated regime; humanitarian crises; humanitarian organisations -
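Two of the practical measures proposed in the abstract, pseudonymisation and data minimisation, can be sketched in a few lines. The field names, key handling and beneficiary record below are illustrative assumptions, not the paper’s recommended implementation.
```python
# Minimal pseudonymisation / data-minimisation sketch (illustrative only).
import hmac
import hashlib

SECRET_KEY = b"replace-with-a-securely-stored-key"  # assumption: held separately from the data

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a one-way keyed pseudonym (HMAC-SHA256).

    Re-identification requires both the key and the original identifier.
    """
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

def minimise(record: dict, needed_fields: set) -> dict:
    """Keep only the fields required for the humanitarian task at hand."""
    return {k: v for k, v in record.items() if k in needed_fields}

# Hypothetical beneficiary record
record = {
    "name": "Amina K.",
    "phone": "+123456789",
    "camp": "Sector 4",
    "medical_need": "insulin",
}

safe_record = minimise(record, needed_fields={"camp", "medical_need"})
safe_record["beneficiary_id"] = pseudonymise(record["name"])
print(safe_record)
```
In an emergency response, a shared record of this kind could be exchanged between partner organisations under a standardised MOU while the identifying key stays with the data controller.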
Book Reviews
Digital empires: The global battle to regulate technology by Anu Bradford
Reviewed by Dr Jacob Kornbeck, Brussels, Belgium
AI and international human rights law by Michał Balcerzak and Julia Kapelańska-Pręgowska (eds)
The ethics of privacy and surveillance by Carissa Véliz
Data Act: An introduction by Moritz Hennemann, Gordian Konstantin Ebner, Benedikt Karsten, Gregor Lienemann, and Marie Wienroeder
Reviewed by Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy
Volume 7 Number 1
Editorial
The multifaceted challenges and opportunities inherent in data protection and privacy regulation
Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy -
Practice Papers
Recognising personal data as a digital asset in Dubai
Michael Clark, Data Scholar and Industry Advisor, and Lori Baker, Member of Editorial Board, JDPP
Data is misunderstood and misused as the commodity it could be. It is humankind's greatest asset; in short, it is potential. Add to this the rapid development of powerful technology, from smartphones to wearables, and the world becomes smaller. People feel more connected, yet personal data feels further from their control. Technology has largely dominated the perspective of how the future is viewed and shaped, and while data has never been elevated as the driving force behind technology, it is undeniably the heartbeat of (digital) economies globally. Data privacy law and regulation, often seen as the remaining hope for supporting the rights of data owners, has become more fragmented and difficult to implement, and with the emergence of the power of processing personal data via autonomous systems such as generative artificial intelligence (AI), it is reaching a pivotal moment. As virtual and physical worlds merge, autonomous processing of data becomes more prevalent and dominant, and people will seek agency while also desiring the trust to express themselves freely, without the fear of compromising their data and identity. This is perhaps the moment where a collective change in the commonly accepted model of data is needed, to view it instead as a multidimensional, identifiable and ownable thing. Certain countries such as China and the UAE are providing a basis for developing this concept further. The general discussion herein provides the foundations to conclude that Dubai is one of the few cities in the world that can and does change the way the use and ethical processing of personal data is considered, particularly as an asset to the data subject themselves.
Keywords: personal data; digital asset; digital economy; data analytics -
Bridging compliance and innovation: A comparative analysis of the EU AI Act and GDPR for enhanced organisational strategy
Sean Musch, CEO/Founder, Michael Charles Borrelli, Director, AI & Partners, and Charles Kerrigan, Partner, CMS
This paper conducts a comparative analysis of the GDPR and the EU AI Act, focusing on their approaches to innovation, compliance and risk management. It examines how the GDPR's data protection framework intersects with the AI Act's broader ethical considerations, highlighting their complementary roles in fostering responsible technology use. Key findings reveal that while both regulations aim to protect individuals and promote ethical practices, harmonising these frameworks is crucial for effective compliance, despite inherent differences between the two. The paper underscores the need for integrated strategies and adaptive policy-making, in a global context, to navigate the complex regulatory landscape, ensuring both innovation and accountability in AI development.
Keywords: GDPR; EU AI Act; data protection; ethical AI; innovation; compliance; risk management; regulatory harmonisation; transparency; bias detection -
Changes to the Federal Trade Commission (FTC) Health Breach Notification Rule close some gaps but add some ambiguity
Trinity Car, Managing Counsel, Privacy, Syneos Health, and Brad Rostolsky, Shareholder, Greenberg Traurig
On 26th April, 2024, the Federal Trade Commission (FTC) issued a final rule amending the 2009 Health Breach Notification Rule (HBNR). The primary aim of the Final Rule is to close gaps between the preceding version of the FTC's breach notification rule and the protections offered by the breach notification regulations under the Health Insurance Portability and Accountability Act of 1996 (HIPAA). The FTC focused on the personal data regularly processed by direct-to-consumer Health Apps, which represent a growing segment of the healthcare industry not regulated by HIPAA. This paper provides an in-depth analysis of the changes introduced by the Final Rule, the implications for businesses not regulated by HIPAA, and the potential operational ripple effects for many businesses now regulated under the Final Rule. It also discusses the updated individual notification obligations and the need for impacted individuals to be made aware of potential risks while balancing issues related to notice fatigue.
Keywords: Health Breach Notification Rule; Federal Trade Commission; personal health records; HIPAA; data privacy; mobile health apps -
Research Papers
Medical privacy: Aligning the need to breach patient confidentiality with data protection in the public interest
Andrew Harvey, Director of Information Governance, Cyber and Compliance/Data Protection Officer, Graphnet Health Ltd
This paper takes an overview of case law, legislation and professional guidance to assess when it may be acceptable for medical practitioners to breach patient confidentiality and data protection law in the public interest. It looks at the implications of making such decisions in both a positive and negative light: what happens if confidentiality is breached in the public interest, but also, on occasion, the implications if it is not. The paper synthesises the often contradictory considerations of the Data Protection Act 2018 and UK General Data Protection Regulation with the wider implications of breaching the common law duty of confidentiality and the professional guidance offered by the likes of the British Medical Association and General Medical Council, reaching right back to the Hippocratic Oath. In doing so, it creates a framework in which it is acceptable in many circumstances to breach patient confidentiality while demonstrating that due care and attention are required to ensure the appropriate decisions are made.
Keywords: case law; confidentiality; consent; direct care; health care; public health; public interest -
Cross-border flow of personal data (digital trade) ought to have data protection
Vandana Gyanchandani, Lecturer, Jindal Global University, NCR Delhi
The paper provides three specific arguments in support of the two key claims to promote an interface between data protection and digital trade law. It engages in the current academic debate among scholars to understand the role of digital trade law in coordinating the regulatory thicket of national data protection regulations (NDPRs) among states. In pursuance, it proposes a rebuttal to the critique that digital trade law is fundamentally ill-suited to engage in data protection policy debates. The paper argues that data protection and digital trade law cannot remain in separate silos as they are both fundamentally intertwined with the governance of cross-border flow of personal data. Data protection issues should form an indispensable consideration in the context of digital trade liberalisation and vice versa. The paper concludes that the standards regime in international trade law can be considered as a blueprint for the necessary regulatory interface between data protection and digital trade. The paper consists of five main sections. This introduction is the first section. The second section titled ‘Interconnected structural blocks of a data protection regulation in general’ provides the general structural elements of a data protection regulation and how the data protection principles and practices combine to actualise the mechanisms which govern the cross-border flow of personal data in a jurisdiction. It highlights that the structural elements of a data protection regulation are interconnected, which necessitates policy coherence between data protection and digital trade law. The third section titled ‘Three arguments against and in favour of an interface between data protection and digital trade law’ provides an outline of the critiques by Irion, Kaminski and Yakovleva to the proposals by Chander and Schwartz to promote a legal interface between data protection and digital trade law. Notably, it provides a rebuttal to the critiques by supporting the proposals by Chander and Schwartz. It supports the proposal for an international agreement on data privacy among states in the future which can bring coherence in the governance of cross-border flow of personal data. The fourth section titled ‘Future interface between data protection and digital trade law’ underscores the need for a self-standing agreement on data privacy in the context of international trade law. This is due to the fact that traditional trade law approaches need readjustment to cohesively tackle the realities of the digital economy, especially data protection issues. In pursuance, it proposes that the trade standards regime, ie the Technical Barriers to Trade (TBT) and Sanitary and Phytosanitary (SPS) Agreements of the World Trade Organization (WTO), provides a unique blueprint to envision a self-standing legal agreement and forum on data protection concerns as they relate to the cross-border flow of personal data in international trade law. The section briefly highlights the relevance of the WTO trade standards regime as a blueprint for the future international data privacy agreement in international trade law. The fifth section concludes the paper by raising two key challenges for policy coherence between data protection and digital trade law: (a) progressive coordination and (b) a reasonable legal interface between the two regimes in both theory and practice.
Keywords: data protection; data privacy; digital trade; WTO; cross-border flow of personal data; adequacy decision; Joint Statement Initiative on E-commerce -
Key data protection and cybersecurity considerations in the mergers and acquisitions context through the lens of regulatory and judicial enforcement
Farrhah Khan, Senior Privacy Counsel, Johannesburg
With mergers and acquisitions being an integral part of the commercial landscape, the vast amounts of personal data implicit in such transactions cannot be overstated. It has become increasingly apparent, particularly given the advent and evolution of data privacy laws across the world, that it is crucial to incorporate key data protection and cybersecurity assessments into the due diligence process to identify and mitigate potential data protection and cybersecurity risks. Where companies fail to do so, the implications are often severe and extend to both exposure to enforcement risk and reputational damage. This paper will examine the status of the current mergers and acquisitions market and why it is necessary for data protection and cybersecurity considerations to be at the forefront of such transactions; thereafter, the risks implicit in neglecting to incorporate the necessary mechanisms and compliance checks into the due diligence process will be assessed. The focus of this paper will then turn to considering relevant regulatory and judicial enforcement actions to assess the precedent that exists for the view that failing to consider data protection and cybersecurity matters ultimately poses a significant commercial and compliance risk to both the acquiring company and the target company. Finally, this paper will conclude with a review of various strategies available to companies to mitigate such commercial and compliance risk from the perspective of safeguarding against undue post-acquisition liability.
Keywords: mergers and acquisitions; due diligence; cybersecurity; data protection; enforcement; liability -
Caught in the whirlwind of market power: The impact of WhatsApp’s 2021 update on users’ privacy in India
Nikita Shah, Assistant Professor of Law at Institute of Law, Nirma University
WhatsApp's entrenched market power has culminated in Indians being obliged to give up their privacy. This paper reports on an empirical study of WhatsApp to prove that it has been able to infringe consumers' privacy because of its market power, which has become insidious due to network effects, consumer inertia and asymmetry of information. Throughout this study, the author has not tried to quantify privacy or develop parameters of privacy, since the responses would have been flawed by subjectivity. Instead, the author has conducted this quantitative study to prove that consumers value privacy on paper but not when faced with the counteraction of free services. The researcher used questionnaires to understand the behaviour of Indian users towards privacy. The respondents were chosen based on a stratified sampling method, and analysis was done using descriptive statistics to quantitatively summarise the challenges faced by the respondents in switching away from WhatsApp. The study concluded that users cannot exercise constraints due to the privacy paradox, consumer inertia and asymmetry of information. WhatsApp has the largest consumer base in India, and hence, its pervasiveness is considered extensively in this paper.
Keywords: asymmetry of information; WhatsApp; consumer inertia; Indian Competition Act; market power; privacy paradox; privacy -
Book Reviews
An Advanced Introduction to U.S. Data Privacy Law by Ari Ezra Waldman
Technology and Security for Lawyers and Other Professionals: The Basics and Beyond by W. Kuan Hon
Research Handbook on EU Internet Law (Second Edition) Edited by Andrej Savin and Jan Trzaskowski
Reviewed by Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy