The outlook for data privacy is largely shaped by the torrent of data generated by data analytics models, artificial intelligence systems, and users engaging with digital experiences. We can expect to see increasing technology disruptions and transformations that will shape our personal digital footprint with each passing year.
Looking ahead to 2023 and 2024, businesses can anticipate these top privacy trends that will shape the industry:
- Increased regulations and compliance oversight
- Growing complexity in the privacy management of data analytics and AI
- Increased reliance on third parties
- Identity theft, breaches, and attack surface
Here’s a closer look at each of them.
Trend 1: Increased regulations and compliance oversight
Growing consumer anxiety over data privacy, fueled by high-profile data breaches and the poor accountability of big tech companies, is driving the enactment of stronger privacy regulations.
Furthermore, the upcoming EU AI Act is set to add another layer of requirements to privacy concerns, in addition to the already stringent privacy laws such as the EU’s GDPR. The act includes stricter regulations for AI systems, including privacy and data protection requirements, and provides tough enforcement measures for violations. Companies utilizing AI systems regulated under GDPR will need to prepare to comply with the EU AI Act.
As more countries follow suit with similar legislation, often adding their own twist on 'data sovereignty', businesses worldwide are being forced to prioritize data privacy compliance.
According to a Gartner prediction on data privacy, by year-end 2024, 75% of the world's population will have its personal data covered under modern privacy regulations. This is a new reality for the United States, which has largely put data privacy regulation on the backburner and sorely lacks a comprehensive federal privacy law. The current patchwork of sectoral and state privacy regulations lacks the teeth to enforce FTC-level violations.
In recent months, a number of proposed laws failed to garner support or lost bipartisan momentum.
As things stand, California is the GDPR of the land. Although not as comprehensive as the GDPR, California's privacy law underwent an update under the CPRA, which introduced new rights for Californians and enabled stronger enforcement measures. Virginia and Colorado have also enacted privacy laws that relate to some degree to the CCPA and take effect in 2023. Recently, the International Association of Privacy Professionals (IAPP) released its state privacy legislation tracker.
In the coming year and beyond, we can expect to see heightened enforcement of existing data privacy laws, such as the EU’s GDPR and California’s CCPA, as well as the implementation of new regulations, including Brazil’s LGPD and China’s PIPL.
Recent changes in the FTC leadership have led to the office taking on a stricter role in enforcing privacy and cybersecurity-related crimes and violations. Earlier this year, the agency announced the creation of a new Office of Technology to strengthen its ability to keep pace with technological challenges in the digital marketplace and support law enforcement investigations and actions.
Trend 2: Growing complexity in the privacy management of data analytics and AI
As companies continue to adopt AI systems, managing the privacy of the data involved has become increasingly complex. AI systems often process vast amounts of personal information, and their algorithms can be opaque and difficult to understand.
AI systems are socio-technical systems with embedded human traits and values. The data lifecycle within the context of AIOps is complex in nature involving:
- Data inputs, with various stages and sublayers of data management used for sourcing, training, testing, validation, and evaluation.
- Data pipelines, covering the production activities that carry data inputs into an operational model.
- Outcomes and learning experiences within an adaptive AI environment, including the real-world results of an AI system such as predictions, recommendations, post-market and continuous monitoring, and possibly adverse-event reporting systems.
All of these activities translate into discrete datasets of varying complexity and sensitivity.
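To make the idea concrete, the lifecycle stages above can be modeled as a simple inventory of datasets tagged by stage and sensitivity, which can then be screened for privacy review. This is a minimal illustrative sketch, not a prescribed framework; the stage names, sensitivity labels, and `requires_privacy_review` rule are all assumptions for the example.

```python
from dataclasses import dataclass
from enum import Enum

class Sensitivity(Enum):
    LOW = 1
    MODERATE = 2
    HIGH = 3

@dataclass
class LifecycleDataset:
    name: str                    # e.g. "training", "validation", "monitoring"
    stage: str                   # "input", "pipeline", or "outcome"
    contains_personal_data: bool
    sensitivity: Sensitivity

def requires_privacy_review(ds: LifecycleDataset) -> bool:
    # Hypothetical screening rule: any dataset holding personal data,
    # or any high-sensitivity dataset, is flagged for review.
    return ds.contains_personal_data or ds.sensitivity is Sensitivity.HIGH

datasets = [
    LifecycleDataset("training", "input", True, Sensitivity.HIGH),
    LifecycleDataset("evaluation", "input", False, Sensitivity.LOW),
    LifecycleDataset("monitoring", "outcome", True, Sensitivity.MODERATE),
]

flagged = [d.name for d in datasets if requires_privacy_review(d)]
```

In practice, an inventory like this would feed into privacy risk assessments and the technical and organizational controls discussed below, with the screening rule set by the applicable regulatory obligations rather than hard-coded.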
The complexities of data associated with AI, coupled with the technological environment in which it operates, require a cross-disciplinary and multi-layered approach to ensure that necessary safeguards are in place to protect personal data. Safeguards may include technical and organizational controls, privacy enhancing techniques and privacy risk assessment measures to ensure privacy hygiene and compliance with applicable regulatory obligations.
Trend 3: Increased reliance on third parties
In today’s disruptive technology market, businesses are increasingly relying on a wide range of third parties beyond the traditional service provider framework to better utilize resources and reduce costs. Given this breadth of third parties, new market trends are emerging in the management of contracts, data sharing practices, and partnerships for control and oversight of personal data and sensitive personal data.
As a result of this multi-faceted reliance on third parties, it is becoming increasingly challenging for businesses to ensure the privacy of their systems and the data they manage, as well as to meet the regulatory requirements and expectations associated with these activities.
Furthermore, the pandemic exposed a number of weaknesses in third party management: a lack of resiliency in maintaining business operations and complexities in managing a disparate number of third parties. Data sharing between third parties is a risk that requires oversight, due diligence, and updated frameworks to match the risk.
Moreover, in specific sectors such as financial services, stricter regulations via amendments have been introduced to ensure compliance with third party privacy requirements.
Trend 4: Identity theft, breaches, and attack surface
There is no prize for guessing: hackers' capabilities continue to grow. Security capabilities and measures have matured, but threats and their sophistication advance in equal proportion, a never-ending cat and mouse game.
The emerging digital ecosystems are disruptive and complex, and the acceleration of technological advancement shows no sign of slowing. The new wave of technologies in IoT, mixed reality, AI models, and the emergence of the metaverse and ChatGPT has brought its own share of threats and vulnerabilities waiting to be exploited. In short, over the last few years we have seen a steady rise in everything from high-profile data breaches to never-ending petty identity theft, and one can only expect this to grow at an exponential rate to match exponential technologies.
In conclusion, data privacy will continue to trend as a top concern for individuals, organizations, investors, and governments alike, as reliance on disruptive technology and data-driven decisions continues to grow. The growth in the use of AI systems, particularly new models such as generative AI, will require a cross-disciplinary, multi-layered approach to ensure adequate safeguards are in place. As emerging technologies continue to push the boundaries of what is possible, it is crucial that we take a proactive approach to protecting personal data from identified and emerging threats.
As executives navigate the uncertain future of data privacy, they need a heightened understanding of the interplay between disruptive technologies, the complexities of data management, and an ongoing stream of regulatory requirements. From defining privacy governance and programs to identifying a foundational, repeatable model that spans regulatory obligations and responsible data use, we use purpose-fit approaches embedded with deep data and privacy knowledge to move you toward a future-proofed privacy practice.