2025 saw significant new privacy legislation passed, enforcement actions imposed, and a legal landscape working to close the gap left by rapid technological leaps. Here is a forecast of what to look out for in 2026 and a look back at some of the most significant developments from the past year.
2026: What to Watch
1) DPO Appointments and Enforcement of Israel’s New Amendment 13
In Israel, 2026 will bring with it efforts to comply with the requirements of Amendment 13 to the Privacy Protection Law, 1981 and the regulator’s efforts to enforce them. Amendment 13 expands enforcement powers and adds a suite of new compliance obligations. One particularly significant obligation is the new requirement for entities that meet certain criteria to appoint a Data Protection Officer (DPO).
The Privacy Protection Authority’s (PPA) draft guidance clarifies, among other things: the types of organizations obligated to appoint a DPO, which include public bodies, data brokers, and organizations that conduct ongoing and systematic monitoring of individuals or large-scale processing of sensitive data as part of their core activities; the qualifications a competent DPO should hold, such as in-depth knowledge of privacy law, a sound understanding of technology, and familiarity with the organization in which they operate; and the DPO’s specific rights and obligations. We expect the PPA to assess not only whether a DPO has been appointed but also whether the DPO is properly resourced and included in the organization’s privacy governance. Organizations found to violate these requirements may face significant fines and/or private rights of action, and we expect such enforcement actions to increase over the next year.
As the initial grace period for enforcement has recently passed, we expect this to be a focus of enforcement in the coming year. The PPA, at times, looks to the guidance, opinions, and actions of EU regulators when charting its own course of action. EU regulators recently imposed significant fines for DPO conflicts of interest and lack of independence, as we previously covered in detail in our article on DPO conflicts of interest, highlighting that appointing a DPO is not enough if the role is not structured correctly. Appointing a qualified DPO and equipping them with the required tools can greatly reduce an organization’s regulatory risk.
In 2026, as many Israeli companies appoint DPOs for the first time, the importance of qualified DPO services will become increasingly apparent. To meet this growing need, Arnon is proud to announce the launch of its DPO services. As privacy law professionals who work closely with hi-tech companies to implement privacy-by-design standards, our team members are uniquely positioned to assume the role of DPO for our clients. Contact us for more information about our DPO services.
Please see related updates from our 2025 Quarterly Reviews below.
2) Expanding Data Regulation and Guidance (AI and More)
While some requirements and prohibitions of the EU AI Act already apply, the requirements relating to most high-risk AI systems will enter into force on August 2nd, 2026. Additionally, several US states have either enacted or are in the process of passing legislation regulating the use of AI. Beyond increased AI governance, we see a similar trend toward broader data governance, including through the new EU Data Act (in force since September 2025), which will also influence industry compliance practices. The Data Act aims to give users of IoT (Internet of Things) and connected products greater control over their data. As we noted in our related client update, the Data Act also reaches beyond the sphere of smart products and includes requirements for a broad range of cloud service providers, which encompasses many SaaS and similar services. The Data Act requires that providers of certain such services make transitioning between competitors smoother. As stakeholders ready their organizations for compliance with the list of complex requirements coming into effect, we expect to see more guidance published, both from private and public sources, as practical procedures are developed and gaps in the legislation arise.
Please see related updates from our 2025 Quarterly Reviews below.
3) Data Legislation’s Narrowing Scope
Several developments of 2025 indicate a potential policy movement in the EU towards simplifying data-related regulation and adopting a more “pro-innovation” approach. The European Commission’s Digital Omnibus proposal signals a willingness to revisit foundational GDPR concepts, including by narrowing the concept of “personal data”. In parallel, the CJEU’s 2025 decision in EDPS v. SRB (C-413/23) clarified the circumstances in which pseudonymized data may fall outside the definition of personal data. That said, the actual impact of these initiatives remains difficult to assess given the number of independent EU bodies involved and the fact that their positions do not always align. Meanwhile, we are seeing increased enforcement in practice, including toward Israeli entities that have received inquiries from supervisory authorities and have faced private actions on a range of matters. For example, the CNIL issued a €1 million fine against MOBIUS SOLUTIONS LTD., an Israeli company, for GDPR violations, underscoring that non-European entities may still be subject to supervisory authority action even if they are not registered in the EU.
Please see related updates from our 2025 Quarterly Reviews below.
4) Children’s Safety Online
Several U.S. states passed laws in 2025 requiring age verification, parental consent mechanisms, and design constraints calibrated specifically for children and teens. Australia enacted a ban on social media for children under 16. Most of these new laws will enter into effect in 2026, and businesses operating online platforms should anticipate that age verification and other enhanced privacy controls for children will become more common and apply to a wider range of contexts in the new year.
Please see related updates from our 2025 Quarterly Reviews below.
5) Cookies
Many Israeli companies have interpreted Amendment 13 as requiring websites to display cookie pop-ups, even though Amendment 13 does not specifically address cookies. Because the Privacy Protection Authority (PPA) has only addressed cookie consent in a limited context (payment applications) and Israeli case law on this issue remains limited, we expect Israeli courts and the regulator to look to European guidance and case law (including recent decisions on cookie banner design and the need for a clear “Reject All” option) when assessing what constitutes valid consent in practice.
Please see related updates from our 2025 Quarterly Reviews below.
Key Related Updates from Our 2025 Quarterly Reviews
1) DPO Obligations and Amendment 13 to the Israeli Privacy Protection Law, 1981
Amendment 13 to the Privacy Protection Law, 1981 Takes Effect
On August 14th, 2025, Amendment 13 to the Privacy Protection Law, 1981 (“PPL”) went into effect. This amendment represents a significant overhaul of the law, modernizing it and bringing it into closer alignment with the privacy frameworks of other countries, most notably the EU’s GDPR. The key changes under the amended PPL and in recent publications by the Privacy Protection Authority (“PPA”) include expanded enforcement powers now granted to the PPA, obligations on Boards of Directors of certain companies to oversee the implementation of data protection policies, obligations on certain companies to appoint a “Data Protection Officer”, changes to the notification obligations of data controllers, including expanded privacy notice obligations, changes to the database registration requirements, and new criminal and civil liabilities. The PPA has released a detailed overview of the changes to help companies navigate the new legal landscape.
PPA Issues Draft Guidance Regarding the new DPO Requirement under Amendment 13
The Israeli Privacy Protection Authority (“Authority”) has published draft guidance interpreting the new requirements for appointing Data Protection Officers (“DPOs”) under Amendment 13 to the Privacy Protection Law. The guidance clarifies which organizations are obligated to appoint a DPO (i.e., public bodies, data brokers, entities engaged in systematic and ongoing monitoring, and entities processing large volumes of sensitive data), as well as the qualifications and expertise required for the DPO role. For more information on the draft guidance, see our article here.
Austrian Supervisory Authority Issues Fine on Conflict of Interest for DPO
The Austrian supervisory authority, the DSB, fined a controller €5,000 for appointing its managing director as its Data Protection Officer, breaching Article 38(6) of the GDPR, which requires that the DPO be free from conflicts of interest while performing their duties. In explaining its decision, the DSB stated that a conflict of interest can arise if the DPO is unable to allocate sufficient time to the responsibilities of the role due to other obligations; in this case, the company, a diagnostic laboratory, processed a significant amount of health data during the Covid-19 pandemic. Furthermore, a DPO cannot generally be entrusted with determining the means and purposes of processing, as this is exactly what the DPO is supposed to independently monitor.
Croatian DPA Issues Fines for DPO Conflicts of Interest and GDPR Compliance Failures
The Croatian Personal Data Protection Agency (“AZOP”) has recently imposed fines in two cases involving conflicts of interest in the appointment of DPOs. In one case, AZOP fined a company €12,000 for appointing its procurator (a person granted the right to conclude contracts and undertake legal actions in the name of the company) as DPO, finding that the procurator’s significant decision-making powers created a conflict of interest. In another case, AZOP fined a business information publisher €40,000 for appointing a director as DPO, as well as for other GDPR violations, concluding that the director’s role compromised the DPO’s independence.
2) Expanding Data Regulation and Guidance (AI and More)
EDPB Publishes Guidance on the Use of Personal Data when Developing AI Models
The European Data Protection Board (EDPB)’s Opinion focuses on anonymity, legitimate interest, and unlawful data processing. Anonymity is discussed because truly anonymous data falls outside the scope of the GDPR; accordingly, if an AI model can be considered anonymous, its subsequent use would not constitute processing of personal data. According to the Opinion, an AI model may be considered anonymous if both: (1) the chance of extracting personal data about individuals used to develop the model is very low, and (2) the chance of extracting such personal data through queries is also very low. When a supervisory authority looks to assess an AI system’s anonymity, the Opinion recommends a non-exhaustive list of criteria, which includes evaluating the selection of training data sources, data minimization strategies used throughout the system’s development, the implementation of measures to prevent the model from revealing personal data through its outputs, and the regular testing of the model against known attacks to ensure it remains secure. The Opinion also provides criteria for legitimate interest assessments and highlights the consequences of processing personal data without a legal basis.
National Security Agencies Published Guidance on Securing Data in AI Systems
Leading national security and cybersecurity agencies published guidance outlining essential best practices for protecting data used in AI and machine learning systems. The guidance emphasizes the importance of securing data throughout the AI lifecycle through measures such as data provenance tracking, encryption, digital signatures, secure storage, and robust access controls. The guidance also highlights key risks, including threats from compromised data supply chains, malicious data manipulation, and data drift, and provides practical mitigation strategies to help organizations safeguard sensitive and mission-critical information while maintaining the reliability and integrity of AI-driven outcomes.
Texas Legislature Passes New AI Governance Act
The Texas Responsible Artificial Intelligence Governance Act takes effect on January 1, 2026. The Act applies broadly to developers and deployers of AI systems, requiring, in certain cases, entities using an AI system to disclose such use to the recipients of their services. The Act also prohibits the development or use of AI systems for harmful or discriminatory purposes and imposes specific obligations on both private and public sectors. It establishes enforcement powers for the Texas Attorney General, introduces fines for violations, and creates a Texas AI council and a regulatory sandbox to support responsible AI innovation.
Israeli Privacy Protection Authority Releases Draft Guidelines on AI Systems
The Privacy Protection Authority (“Authority”) has released draft guidelines to clarify the intersection between privacy laws and AI systems. In these guidelines, the Authority clarifies that the Israeli Privacy Protection Law applies to AI systems and underscores the necessity of a legal basis for processing personal data through such systems, including information that the AI systems derive from personal data. The Authority further elaborates on the standards and requirements for obtaining informed consent and asserts that data scraping also requires the data subject’s consent. Additionally, the draft guidelines emphasize the need for robust corporate governance led by senior management. The Authority also addresses the right to amend personal data in the context of AI, expressing its intention to prioritize enforcement of the rights to amend and access personal data. With respect to data security, the Authority notes that most AI-based databases will be classified as requiring medium or high statutory security levels and stresses the significance of adhering to the principle of data minimization.
EU Data Act Enforcement Begins
The EU Data Act came fully into effect on September 12, 2025. The Data Act aims to give users more control over the data generated by their Internet-of-Things technologies and related services. The Act complements but differs from the GDPR by focusing not on personal data protection but on all types of data, including non-personal and industrial data. The new Act gives users greater rights to access and share their data, requires fair data sharing between businesses, grants public authorities access to data for certain public interests, and protects smaller companies from unfair contract terms. The key impact of the EU Data Act is in providing users with powerful switching rights, as well as de facto rights to termination at will.
CNIL Issues Several Guidelines on the Interplay of GDPR and AI Development
The French data protection authority (“CNIL”) has issued several guidelines, each clarifying different aspects of how the GDPR applies to AI development. CNIL issued guidance on how to qualify AI system providers as controllers, joint controllers, or processors when creating training datasets with personal data. The guidelines provide helpful examples to illustrate cases that fall into each of the above categories.
CNIL also published additional guidelines on the use of legitimate interest as a legal basis for AI development. Following the EDPB’s recognition of legitimate interest as an acceptable legal basis for processing personal data when building an AI model, CNIL released new guidelines clarifying that legitimate interest can be used as a legal basis for processing personal data in AI projects, provided that strong safeguards are implemented. The recommendations also offer practical criteria and examples, including specifically for web scraping, to help organizations assess when legitimate interest is appropriate.
Finally, CNIL finalized a series of practical guidance sheets for assessing and implementing GDPR compliance in the design of an AI system. The sheets cover the steps required to ensure GDPR compliance in detail and include guidance on such topics as ‘determining the applicable legal regime’, ‘defining a purpose’, and ‘determining the role of the entity (controller/processor)’.
3) Possible Changes to the GDPR’s Scope
EU Proposes Reduction of GDPR Record-Keeping Requirements
The European Commission has proposed simplifying GDPR record-keeping obligations as part of its broader regulatory streamlining efforts, aiming to reduce administrative burdens for businesses. The proposal would revise Article 30(5) GDPR, which offers an exemption from keeping a record of processing activities (“ROPA”). The exemption would be extended from organizations with fewer than 250 employees to those with fewer than 750 employees. Article 30(5) also includes exceptions to the exemption, which would likewise be revised such that only high-risk processing requires such SMEs to maintain a ROPA.
EU Digital Omnibus Proposal Seeks to Redefine Key GDPR Concepts
In November, the European Commission introduced the Digital Omnibus proposal – a series of amendments to the GDPR that would substantially alter the regulation’s scope and operation. For example, the proposal narrows the definition of “personal data”. Under the current interpretation, “personal data” includes any data that, when linked with other datasets, could identify an individual, even if the likelihood of such a link occurring is remote, such as where the second dataset is held by a separate entity. The proposed revision clarifies that data would be considered “personal” only when an entity has both the data and the means to identify the individual therein. This would render much of what is currently understood as personal data non-personal, and its processing would fall outside the scope of the GDPR. The proposal also eases certain restrictions on processing data for research and development purposes, including express permission to use personal data to train AI models under certain conditions, and limits data subject rights and controllers’ transparency obligations in certain cases. If passed, even with only a selection of the proposed revisions, the impact on privacy compliance would be significant.
CJEU and Regulators Provide Clarification on Pseudonymization and Personal Data
On September 4, 2025, the Court of Justice of the European Union (CJEU) ruled in EDPS v. SRB (C‑413/23), expanding the cases in which pseudonymized data may not constitute personal data. The case concerned the transmission by the Single Resolution Board (SRB), an EU regulatory agency, of pseudonymized comments from shareholders and creditors of Banco Popular Español, S.A. regarding the bank’s bankruptcy proceedings to Deloitte and other third parties. The Court found that since the SRB held the key to reverse the pseudonymization, the data was personal with respect to the SRB. However, the Court further ruled that pseudonymized data would not constitute personal data where the entity in question cannot reasonably reidentify the data subjects on its own, and suggested that the pseudonymized data would not constitute personal data for Deloitte and other third-party recipients without access to the pseudonymization key. Previously, pseudonymized data was considered personal data (with all the safeguards of the GDPR applicable to it) even where an entity did not have the means to reidentify data subjects itself, as long as it could, in theory, procure those means from another entity. This ruling aligns with the EU Digital Omnibus’s revised definition of personal data and effectively narrows the definition of personal data under the GDPR.
Following the CJEU judgment, the Danish Data Protection Authority clarified that in controller–processor relationships, where the controller holds the key to pseudonymized data that it sends to the processor (without such key), the data is still considered personal insofar as the processor is concerned. The nature of the data must be assessed from the controller’s perspective: if the controller can re-identify individuals, the data remains personal, regardless of the processor’s inability to identify data subjects on its own. The DPA explained this is because processors act solely under the controller’s instructions and their processing of the data is seen as an extension of the controller itself. Any processing for the processor’s own purposes would require a separate legal basis.
CNIL Asserts Jurisdiction on Non-European Processor and Slams It with €1 Million Fine
On December 11th, CNIL, the French data protection supervisory authority, issued a €1 million fine to a processor based outside of the EU for GDPR violations. The processor provided advertising services to the controller, which is based in France, and used the controller’s personal data to improve its own services without the controller’s approval. Then, after the agreement with the controller was terminated, the processor failed to delete the controller’s personal data as it was contractually obligated to do. This personal data was later compromised in a data breach suffered by the processor and became available on the dark web. The processor also did not keep a record of processing activities (ROPA), as required under the GDPR. In fining the processor, CNIL asserted jurisdiction on the basis that the processing of the controller’s personal data constituted “monitoring individuals’ behavior” in the EU, such that the GDPR applies. This case serves as a reminder that non-European entities are not protected from supervisory authority actions simply because they are not registered in the EU.
4) Children’s Safety Online
Several States Advance Age-Appropriate Design and Age Verification Laws
Recent legislative activity in the U.S. reflects a growing trend toward age-appropriate design and stricter age verification requirements for online services and app stores. Vermont has passed its Age-Appropriate Design Code Act, pending the governor’s approval, which will require businesses to implement strong privacy protections for minors, limit data collection, and provide clear privacy tools. Texas has enacted a law mandating age verification and parental consent for app downloads and purchases by minors and is considering further restrictions on social media access for those under 18. In Louisiana, a bill nearing final passage would require both app stores and developers to verify users’ ages and obtain parental consent for minors. Nebraska has also adopted an Age-Appropriate Online Design Code Act aimed at enhancing parental control and limiting targeted engagement of children by tech companies.
California Expands Digital Safety and Accountability Framework
California has enacted a series of online safety measures aimed at protecting minors by extending oversight of artificial intelligence and online content access. Among others, California enacted Assembly Bill 1043, which will require “operating system providers” to verify users’ ages and transmit this information to app developers, who must use it to comply with applicable laws, and Senate Bill 243, which addresses “companion chatbots”. Senate Bill 243 requires entities that make companion chatbots available, among other things, to clearly disclose when users are interacting with AI, to implement and publish protocols to prevent the production of content relating to suicide, suicidal ideation, or self-harm, and to prevent the production of sexually explicit visual content.
5) Cookies
German Court Requires Clear “Reject All” Option on Cookie Banners
The Hanover Administrative Court ruled that website operators must provide a clearly visible “Reject All” button on cookie consent banners whenever an “Accept All” option is offered. This decision arose from a case against a media company whose banner design made it difficult for users to refuse cookies and failed to provide clear information about consent and third-party data processing.
The above content is a summary provided for informational purposes only and does not constitute legal advice. It should not be relied upon without obtaining further professional legal counsel.
