EU

CNIL Issues Several Guidelines on the Interplay of GDPR and AI Development

The French data protection authority (“CNIL”) has issued several guidelines, each clarifying a different aspect of how the GDPR applies to AI development. CNIL issued guidance on how to qualify AI system providers as controllers, joint controllers, or processors when creating training datasets with personal data. The guidelines provide helpful examples to illustrate cases that fall into each of these categories.

CNIL also published additional guidelines on the use of legitimate interest as a legal basis for AI development. Following the EDPB’s recognition of legitimate interest as an acceptable legal basis for processing personal data when building an AI model, CNIL released new guidelines clarifying that legitimate interest can serve as a legal basis for processing personal data in AI projects, provided that strong safeguards are implemented. The guidelines offer examples of how several guarantees – such as increased transparency, exclusion of certain types of data, and ensuring data subjects’ rights – can be adapted to different types of AI systems. The recommendations also offer practical criteria and examples, including specifically for web scraping, to help organizations assess when legitimate interest is appropriate.

Finally, CNIL finalized a series of practical guidance sheets for assessing and implementing GDPR compliance in the design of an AI system. The sheets cover in detail the steps required to ensure GDPR compliance and include guidance on topics such as ‘determining the applicable legal regime’, ‘defining a purpose’, and ‘determining the role of the entity (controller/processor)’.

UK

New UK Data (Use and Access) Act 2025 Updates Data Protection Law

The ICO published guidance regarding the new Data (Use and Access) Act 2025 (“DUAA”), which updates UK data protection law. The new act clarifies when personal data can be used for scientific research, including commercial research, and allows for ‘broad consent’ to research areas. It permits the re-use of personal data for scientific research without sending out individual privacy notices where doing so would be disproportionately burdensome, provided rights are protected and information is published online. The DUAA also allows organizations to use a wider range of lawful bases for significant automated decision-making and permits certain analytics and functionality cookies without user consent. Additional changes include a new ‘recognized legitimate interests’ lawful basis, which permits specific processing activities without the usual balancing test. Further changes permit certain re-uses of personal data – such as archiving in the public interest – without further compatibility assessments, and provide clearer guidelines for subject access requests and international data transfers.

Israel

Amendment 13 to the Privacy Protection Law, 1981 Takes Effect

On August 14th, 2025, Amendment 13 to the Privacy Protection Law, 1981 (“PPL”) went into effect. This amendment represents a significant overhaul of the law, modernizing it and bringing it into closer alignment with the privacy frameworks of other countries, most notably the EU’s GDPR. The key changes under the amended PPL and in recent publications by the Privacy Protection Authority (“PPA”) include expanded enforcement powers granted to the PPA, obligations on Boards of Directors of certain companies to oversee the implementation of data protection policies, obligations on certain companies to appoint a “Data Protection Officer”, changes to the notification obligations of data controllers (including expanded privacy notice obligations), changes to the database registration requirements, and new criminal and civil liabilities. The PPA has released a detailed overview of the changes to help companies navigate the new legal landscape.

PPA Issues Draft Guidance Regarding the new DPO Requirement under Amendment 13

The Israeli Privacy Protection Authority (“Authority”) has published draft guidance interpreting the new requirements for appointing Data Protection Officers (“DPOs”) under Amendment 13 to the Privacy Protection Law. The guidance clarifies which organizations are obligated to appoint a DPO (i.e., public bodies, data brokers, entities engaged in systematic and ongoing monitoring, and entities processing large volumes of sensitive data), as well as the qualifications and expertise required for the DPO role. Public comments are open until September 23rd, 2025. The Authority will defer enforcement on DPO appointments until October 31, 2025, with limited extensions for organizations showing significant progress, but affected entities are expected to complete appointments promptly. For more information on the draft guidance, see our article here.

Court Scrutinizes Meta’s Consent Practices Under Privacy Law

On July 9, 2025, the Central District Court in Lod certified a class action against Meta Platforms, Inc., over its use of users’ names and profile photos in advertisements. Under the Protection of Privacy Law, 1981, use of a person’s name, image, or likeness for profit is prohibited without consent. The court clarified that while such consent may be implied, users must still be clearly informed of how their rights may be affected. In this case, Meta asked users to approve the terms of service generally; although the terms did give more detail about the nature of Meta’s processing, because the consent was bound only to that general approval, the court deemed it was not informed consent.

The plaintiffs also claimed that the term should be deemed an “oppressive” clause under the Uniform Contracts Law, 1982 (“UCL”) and should be voided. This framework is intended to protect individuals from being bound by provisions they had no real opportunity to negotiate or fully understand. Even though users technically had the ability to opt out of this processing through their settings, the court ruled that for the opt-out to render the term non-oppressive, users had to be clearly informed how to opt out, not merely retain the technical ability to do so. For more details on this case, see our summary article here.

This case highlights the importance of full transparency, both when relying on implied consent (the data subject must be clearly informed of what they are consenting to) and when seeking to prevent clauses from being deemed “oppressive” under the UCL.


The above content is a summary provided for informational purposes only and does not constitute legal advice. It should not be relied upon without obtaining further professional legal counsel.
