Quarterly Privacy Law Update | Q2 2024

July 4, 2024

We are pleased to share our quarterly privacy update, which covers recent regulations, case law, and other privacy developments.

Contact us for more information and a personalized consultation.


France’s data protection authority, CNIL, fined HUBSIDE.Store (“Company”) €525,000 for unlawfully using personal data purchased from data brokers for direct marketing without valid consent. The data brokers used misleading consent forms, invalidating the consent. CNIL found that, although the Company contractually obligated the data brokers to obtain proper consent, it did not actually verify that such consent had been properly obtained, nor did it meet its obligation to inform individuals as required by Article 14 of the GDPR. CNIL has recently issued several decisions addressing the use of personal data purchased from data brokers, and companies relying on such data should therefore exercise special caution to ensure GDPR compliance.

The Dutch Data Protection Authority (DPA) released guidelines on data scraping, emphasizing that most data scraping practices are generally not compliant with the GDPR. Specifically, the DPA limits the use of legitimate interest as a legal basis for data scraping to interests grounded in law, as opposed to purely commercial interests. The DPA also recommends considering certain factors when determining the necessity of data scraping, such as the sources used and the duration of the scraping, and advises evaluating: (i) whether the data subject themselves published the data, or whether it was published by a third party; (ii) the data subject’s expectations relating to the data processing; and (iii) the relevant website terms. Please see here for the full memo.

The EDPB issued an opinion, following a request by the Dutch, Norwegian, and Hamburg Data Protection Authorities, emphasizing the need for valid consent in “pay-or-okay” models used by major online platforms for behavioral advertising. The EDPB found that, in most cases, pay-or-okay models do not comply with the GDPR: because these models incentivize users to give consent, the consent is not freely given and therefore is not valid under the GDPR. The EDPB urged platforms to offer an additional equivalent alternative that involves neither behavioral advertising nor a fee, such as a free version that shows standard, non-behavioral ads.

CNIL issued a formal notice to a company following a complaint about the collection of unnecessary personal data during its recruitment process. CNIL clarified that the principle of data minimization in recruitment only permits the collection of information that is directly relevant to the job or to evaluating professional skills, to prevent any implication of unlawful discrimination based on age, family status, or similar factors. Employers should limit the information requested to that which identifies the best-suited candidates and confirms their skills and qualifications. Additional data (such as a candidate’s marital status or social security number) should not be collected during the recruitment process, but only after the candidate has been hired. CNIL emphasized that, during recruitment, companies should not collect information about a candidate’s family members, birthplace, specific nationality, or past salary details.

The Austrian Supreme Administrative Court clarified the scope of the prohibition on automated decision-making under Article 22 of the GDPR. The case in question involved the Austrian Public Employment Service (“AMS”), which facilitates the (re)integration of employment seekers into the professional world. AMS employed an AI system, the “AMS-Algorithm”, to categorize job seekers based on their likelihood of (re)integrating into the labor market. The services provided by AMS were subsequently tailored based on such categorization.

Initially, the Austrian DPA determined that using the AMS-Algorithm constituted a prohibited use of automated decision-making under the GDPR, which prohibits the processing of personal data without consent where such processing will have a legal or similarly significant effect on the data subject. Following an appeal of the DPA’s decision, the Supreme Administrative Court upheld the DPA’s view. It held that, even if the AMS-Algorithm’s immediate outcome did not have a direct legal or similarly significant effect, the categorization itself could be considered as such, since the subsequent handling by AMS personnel relies heavily on this categorization.


The Israeli Privacy Protection Authority has announced the release of a bill amending the Class Actions Law for public consultation. The bill aims to enable class action suits for privacy violations, including: (i) for various breaches under the Israeli Privacy Protection Law; (ii) for claims related to the management or possession of a database requiring registration; and (iii) for violations of the Data Security Regulations that result in significant security incidents affecting the data of over 1,000 individuals.

The Israeli Privacy Protection Authority released guidelines (“Guidelines”) in April regarding the use of open source software in connection with databases that contain personal data. According to the Guidelines, open source software may contain certain vulnerabilities or limitations which may compromise data security and could lead to a data breach. Specifically, the Guidelines address the following issues:

  • Documentation. Under the Protection of Privacy Regulations (Data Security) – 2017 (“Regulations”), database owners are required to maintain certain documents that provide details about the computer systems that hold personal data. The Guidelines specify that such documents must detail the open-source software included in or used with such systems, and the applicable license terms.
  • Maintenance. The Regulations require database owners to properly maintain the systems holding personal data and ensure regular software updates of such systems. As such, the Guidelines expressly prohibit the use of open-source software that is not supported or maintained.
  • Public Networks. The Guidelines require database owners to ensure that their databases are not connected to public networks that are not protected by sufficient security measures. Specifically, they must take steps to confirm that any open-source software in their systems is free from malware.
  • Service Providers. Prior to engaging an outsourced service provider, database owners must evaluate the risks posed by the use of open-source software by that service provider.
  • Risk Assessment. Database owners must evaluate the risks posed by open-source software prior to developing or licensing systems based thereon.
  • Open-Source Program Officer. Entities should appoint an Open-Source Program Officer responsible for ensuring the ongoing maintenance and security of open-source software used by the entity. 

Please see the full memo here.


  • New US State Privacy Legislation.

In recent months, various US states have advanced their privacy legislation, reflecting a growing commitment to safeguarding personal data. The Kentucky Consumer Data Protection Act, which comes into force on January 1, 2026, incorporates an applicability threshold typical of US state privacy laws. The Nebraska Data Privacy Act, coming into effect on January 1, 2025, resembles the Texas law and does not apply to small businesses. The Minnesota Consumer Data Privacy Act, which takes effect on July 31, 2025, aligns with other state privacy laws and targets businesses operating within Minnesota or offering products or services to Minnesota residents. Finally, Maryland enacted the Maryland Online Data Privacy Act, entering into effect on October 1, 2025, which also includes familiar applicability thresholds.

On April 7, 2024, members of the House Committee on Energy and Commerce and the Senate Committee on Commerce, Science, and Transportation introduced the American Privacy Rights Act (the “Act”), which would consolidate various state privacy laws into a unified federal framework. The current draft of the Act encompasses several key provisions, including the following:

  • Scope. The Act applies to “covered entities”, meaning entities that determine the purposes and means of processing covered data, including businesses under the jurisdiction of the US Federal Trade Commission, common carriers, and nonprofit organizations. It explicitly excludes small businesses, which are more specifically defined in the Act itself.
  • Covered Data. The Act is applicable to the processing of information that identifies, is linked to, or is reasonably linkable to an individual, including in combination with other information (“Covered Data”). The term “individual” means a natural person residing in the US. Various types of data, including de-identified data, publicly-available data and employee data, do not constitute Covered Data.
  • Categorization. The Act classifies covered entities into specific groups, such as Large Data Holders, Data Brokers, and Covered High-Impact Social Media Companies. Entities falling under these categories are subject to more stringent obligations than other covered entities.
  • Obligations. Covered entities must fulfill several obligations. These include the adoption of data minimization practices, ensuring transparency, providing certain opt-out rights (including related to targeted advertising), and obligations related to the processing of covered data by algorithms.
  • Relationship to Other Laws. Subject to certain exceptions, the Act takes precedence over non-sectoral state privacy laws.

Colorado and Maryland both passed laws to improve the protection of children’s personal data. The Colorado law, SB 41, applies to any entity that controls consumer personal data and conducts business in Colorado, or markets products or services to Colorado residents, irrespective of revenue. It also sets out various requirements intended to avoid harm to minors and better protect their privacy rights. The Maryland Kids Code sets out several responsibilities for covered entities, including: (i) a requirement that covered entities offering online products likely to be accessed by children conduct a Data Protection Impact Assessment; (ii) a prohibition on processing children’s personal data in a manner that does not align with the best interests of the child; and (iii) a prohibition on profiling children, unless the profiling is in the child’s best interest.

A California court dismissed a lawsuit filed by X (formerly Twitter) against Bright Data, an Israeli business intelligence company, over accusations of unlawful data scraping. X alleged that Bright Data scraped and sold content from X’s platform without authorization. However, the court ruled that Bright Data’s actions neither violated any contractual agreement with X nor infringed upon X’s rights. Further supporting the legality of the scraping, the decision noted that content generated by X users is publicly accessible and is owned by the users themselves, who retain their rights to the content, rather than by X.


This publication is provided as a service to our clients and colleagues, with the explicit clarification that each specific case requires individual examination and written advice.
The information presented here is general in nature and is not intended to address the unique circumstances of any individual or entity. Although we strive to provide accurate and up-to-date information, we cannot guarantee its accuracy as of the day it is received, nor that it will remain accurate in the future. Do not act on the information presented here without appropriate professional advice based on a comprehensive and thorough examination of the specific situation.

Want to know more?
Contact us

Shiri Menache

Head of Marketing and Business Development

Matan Bar-Nir

Press Officer, OH! PR