Balancing Privacy and Progress in Big Data Governance across Jurisdictions
AUTHORS
Emily Robertson, Department of Geography and Environment, London School of Economics and Political Science (LSE), United Kingdom
Daniel Clarke, Department of Geography and Environment, London School of Economics and Political Science (LSE), United Kingdom
Jonathan Hughes, School of Business and Management, University of Manchester, United Kingdom
Sarah Whitfield, School of Business and Management, University of Manchester, United Kingdom
ABSTRACT
The exponential growth of big data analytics has reshaped sectors ranging from healthcare to urban planning, while intensifying debates on how to safeguard individual privacy without constraining technological innovation. This study conducts a comparative analysis of three major legal frameworks—the European Union’s General Data Protection Regulation, the California Consumer Privacy Act, and Singapore’s Personal Data Protection Act—to evaluate how each balances data protection with innovation imperatives. Using a mixed-methods approach that combines doctrinal analysis of statutory provisions, quantitative assessment of enforcement outcomes, and qualitative insights from semi-structured interviews with data protection officers, the research examines core principles including consent, accountability, transparency, and data minimization, as well as their implications for fields such as artificial intelligence and smart-city development. The findings highlight a shared emphasis on empowering data subjects and strengthening organizational responsibility, while also revealing notable divergences in consent mechanisms, penalty structures, and approaches to emerging technologies. Building on these insights, the paper proposes a harmonized governance model featuring interoperable consent standards, tiered enforcement proportional to organizational scale and potential harm, international data-sharing agreements, and regulatory sandboxes to encourage privacy-enhancing technologies. The study concludes that achieving a sustainable balance between privacy and progress requires adaptive legal instruments, continuous stakeholder engagement, and robust international collaboration to ensure that data-driven innovation advances without undermining fundamental rights.
KEYWORDS
Big data, Data privacy, Governance, GDPR, CCPA, PDPA, Cross-jurisdictional comparison, Harmonization
REFERENCES
[1] A. R. Lee, D. Koo, I. K. Kim, et al., “Identifying facilitators of and barriers to the adoption of dynamic consent in digital health ecosystems: A scoping review,” BMC Medical Ethics, vol. 24, Article 107, (2023). DOI: 10.1186/s12910-023-00988-9
[2] M. I. Khalid, M. Ahmed, and J. Kim, “Enhancing data protection in dynamic consent management systems: Formalizing privacy and security definitions with differential privacy, decentralization, and zero-knowledge proofs,” Sensors, vol. 23, no. 17, Article 7604, (2023). DOI: 10.3390/s23177604
[3] S. Lim and J. Oh, “Navigating privacy: A global comparative analysis of data protection laws,” IET Information Security, vol. 2025, no. 1, Article 5536763, (2024). DOI: 10.1049/ise2/5536763
[4] R. D. Garcia, G. Ramachandran, K. Dunnett, R. Jurdak, C. Ranieri, B. Krishnamachari, and J. Ueyama, “A survey of blockchain-based privacy applications: An analysis of consent management and self-sovereign identity approaches,” arXiv preprint, (2024). Available: https://arxiv.org/abs/2411.16404
[5] N. S. Munung, C. Staunton, O. Mazibuko, et al., “Data protection legislation in Africa and pathways for enhancing compliance in big data health research,” Health Research Policy and Systems, vol. 22, Article 145, (2024). DOI: 10.1186/s12961-024-01230-7
[6] OECD, “Regulatory sandboxes in artificial intelligence,” OECD Digital Economy Papers, (2023). Available: https://www.oecd.org/en/publications/2023/07/regulatory-sandboxes-in-artificial-intelligence_a44aae4f.html
[7] T. Moraes, “Regulatory sandboxes for trustworthy artificial intelligence – global and Latin American experiences,” International Review of Law, Computers & Technology, vol. 39, no. 1, pp. 55–74, (2024). DOI: 10.1080/13600869.2024.2351674
[8] J. Duncan, “Data protection beyond data rights: Governing data production through collective intermediaries,” Internet Policy Review, vol. 12, no. 3, (2023). DOI: 10.14763/2023.3.1722
[9] A. J. Andreotta, N. Kirkham, and M. Rizzi, “AI, big data, and the future of consent,” AI & Society, vol. 37, pp. 1715–1728, (2022). DOI: 10.1007/s00146-021-01262-5
[10] R. Sonani and L. Prayas, “Machine learning-driven convergence analysis in multijurisdictional compliance using BERT and K-means clustering,” arXiv preprint, (2025). DOI: 10.6084/m9.figshare.28259810
[11] B. M. V. Bernardo, H. S. Mamede, J. M. P. Barroso, and V. M. P. D. Dos Santos, “Data governance & quality management—innovation and breakthroughs across different fields,” Journal of Innovation & Knowledge, vol. 9, no. 4, Article 100598, (2024). DOI: 10.1016/j.jik.2024.100598
[12] A. Lavorgna and P. Ugwudike, “The datafication revolution in criminal justice: An empirical exploration of frames portraying data-driven technologies for crime prevention and control,” Big Data & Society, (2021). DOI: 10.1177/20539517211049670
[13] S. Singler and O. Babalola, “Digital colonialism beyond surveillance capitalism? Coloniality of knowledge in Nigeria's emerging privacy rights legislation and border surveillance practices,” Social & Legal Studies, (2025). DOI: 10.1177/09646639241287022
[14] L. A. Bygrave, “The ‘Strasbourg effect’ on data protection in light of the ‘Brussels effect’: Logic, mechanics and prospects,” Computer Law & Security Review, vol. 40, Article 105460, (2021). DOI: 10.1016/j.clsr.2020.105460
[15] X. Gao and X. Chen, “Understanding the evolution of transatlantic data privacy regimes: Ideas, interests, and institutions,” in Proceedings of the European Interdisciplinary Cybersecurity Conference (EICC 2024), Xanthi, Greece, June 5–6, 2024. ACM, New York, NY, USA, 13 pages. DOI: 10.1145/3655693.3655720
[16] R. Walters, L. Trakman, and B. Zeller, Data Protection Law: A Comparative Analysis of Asia Pacific and European Approaches. Springer, (2019). UNSW Law Research Paper No. 19-78. Available: https://ssrn.com/abstract=3463731
[17] C. Gasimova, “Privacy and transparency in an AI-driven world: Does algorithmic transparency fit on data privacy under GDPR?” SSRN, (2023). Available: https://ssrn.com/abstract=4482889
[18] M. Nouwens, I. Liccardi, M. Veale, D. Karger, and L. Kagal, “Dark patterns after the GDPR: Scraping consent pop-ups and demonstrating their influence,” in Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI 2020), (2020). DOI: 10.1145/3313831.3376321
[19] B. Aysolmaz, R. Müller, and D. Meacham, “The public perceptions of algorithmic decision-making systems: Results from a large-scale survey,” Telematics and Informatics, vol. 79, Article 101954, (2023). DOI: 10.1016/j.tele.2023.101954
[20] K. Demetzou, “Data protection impact assessment: A tool for accountability and the unclarified concept of ‘high risk’ in the General Data Protection Regulation,” Computer Law & Security Review, vol. 35, no. 6, Article 105342, (2019). DOI: 10.1016/j.clsr.2019.105342
[21] I. Sivan-Sevilla, “Varieties of enforcement strategies post-GDPR: A fuzzy-set qualitative comparative analysis (fsQCA) across data protection authorities,” Journal of European Public Policy, vol. 31, no. 2, pp. 552–585, (2022). DOI: 10.1080/13501763.2022.2147578
[22] W. N. Price and I. G. Cohen, “Privacy in the age of medical big data,” Nature Medicine, vol. 25, no. 1, pp. 37–43, (2019). DOI: 10.1038/s41591-018-0272-7
[23] P. Vijayagopal, B. Jain, and S. Ayinippully Viswanathan, “Regulations and Fintech: A comparative study of the developed and developing countries,” Journal of Risk and Financial Management, vol. 17, no. 8, Article 324, (2024). DOI: 10.3390/jrfm17080324
[24] Y. Lim, J. Edelenbos, and A. Gianoli, “What is the impact of smart city development? Empirical evidence from a smart city impact index,” Urban Governance, vol. 4, no. 1, pp. 47–55, (2024). DOI: 10.1016/j.ugj.2023.11.003
[25] J. Srouji and T. Mechler, “How privacy-enhancing technologies are transforming privacy by design and default: Perspectives for today and tomorrow,” Journal of Data Protection & Privacy, vol. 3, no. 3, (2020). DOI: 10.69554/XPTR8215
[26] A. Alaassar, A. Mention, and T. H. Aas, “Exploring how social interactions influence regulators and innovators: The case of regulatory sandboxes,” Technological Forecasting and Social Change, vol. 160, Article 120257, (2020). DOI: 10.1016/j.techfore.2020.120257
[27] A. Bradford, The Brussels Effect: How the European Union Rules the World. Oxford University Press, New York, (2020). DOI: 10.1093/oso/9780190088583.001.0001
[28] V. Lehdonvirta, B. Wú, and Z. Hawkins, “Weaponised interdependence in a bipolar world: How economic forces and security interests shape the global reach of US and Chinese cloud data centres,” Review of International Political Economy, pp. 1–26, (2025). DOI: 10.1080/09692290.2025.2489077
[29] L. Belli, W. B. Gaspar, and S. Singh Jaswant, “Data sovereignty and data transfers as fundamental elements of digital transformation: Lessons from the BRICS countries,” Computer Law & Security Review, vol. 54, Article 106017, (2024). DOI: 10.1016/j.clsr.2024.106017
[30] M. Ko, M. Jin, C. Wang, and R. Jia, “Practical membership inference attacks against large-scale multi-modal models: A pilot study,” in Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV 2023), Paris, France, 2023, pp. 4848–4858. DOI: 10.1109/ICCV51070.2023.00449
[31] J. Flemings, M. Razaviyayn, and M. Annavaram, “Differentially private next-token prediction of large language models,” arXiv preprint, (2024). Available: https://arxiv.org/abs/2403.15638
[32] N. Carlini, J. Hayes, M. Nasr, M. Jagielski, V. Sehwag, F. Tramèr, B. Balle, D. Ippolito, and E. Wallace, “Extracting training data from diffusion models,” arXiv preprint, (2023). Available: https://arxiv.org/abs/2301.13188
[33] Z. Shao, H. Liu, J. Mu, and N. Z. Gong, “Enhancing prompt injection attacks to LLMs via poisoning alignment,” arXiv preprint, (2024). DOI: 10.1145/3733799.3762963
[34] L. Ahmad, S. Agarwal, M. Lampe, and P. Mishkin, “OpenAI's approach to external red teaming for AI models and systems,” arXiv preprint, (2025). Available: https://arxiv.org/abs/2503.16431
[35] Information Commissioner’s Office, “Guidance on AI and data protection,” (2024). Available: https://ico.org.uk
[36] X. Li, R. Zmigrod, Z. Ma, X. Liu, and X. Zhu, “Fine-tuning language models with differential privacy through adaptive noise allocation,” in Findings of the Association for Computational Linguistics: EMNLP 2024, Miami, Florida, USA, pp. 8368–8375. Association for Computational Linguistics, (2024)
[37] A. Greenberg, “This prompt can make an AI chatbot identify and extract personal details from your chats,” Wired, Nov. 2024. Available: https://www.wired.com/story/ai-imprompter-malware-llm