Technology

Brazil’s LGPD Poised to Take Effect in a Matter of Days

Brazil’s long-anticipated data protection law, the Lei Geral de Proteção de Dados Pessoais (“General Personal Data Protection Law” or “LGPD”), now appears positioned to take effect in a matter of days. Ever since the law was originally passed in August 2018, its implementation and enforcement timelines have been in flux. In a sudden turn of events last week, however, back-to-back votes by each house of Brazil’s National Congress put the substantive provisions of the LGPD on track to take effect within days, upon approval by Brazil’s president. The LGPD’s administrative fines and sanctions provisions remain scheduled to take effect in August 2021. READ MORE

CA Businesses Poised to Have CCPA Compliance Deadline Extended for B2B and Employee Data

The California legislature has passed AB 1281, sending it to the Governor’s desk for signature, and, given the absence of legislative opposition, the bill appears well positioned to be signed into law. AB 1281 extends by one year the expiration date of the business-to-business (“B2B”) and employee-related exemptions under the California Consumer Privacy Act (“CCPA”) (previously discussed here). If signed into law, it will give California businesses at least one more year to fold employee and B2B data into their existing CCPA compliance programs, a welcome reprieve for California employers facing a resurgence of coronavirus cases in workplaces around the State. READ MORE

German Supervisory Authority Publishes First Substantive Guidance on International Data Transfers in the Post-Schrems II Era

On 16 July 2020, the European Court of Justice (“CJEU”) published its decision invalidating the EU-U.S. Privacy Shield and setting out enhanced requirements for using the so-called Standard Contractual Clauses for Processors (Decision 2010/87/EU – “SCCs”) (judgment C-311/18 – “Schrems II”). See our previous blog on the Schrems II decision for further details. Shortly thereafter, the European Data Protection Board (“EDPB”) adopted FAQs (see our follow-up blog post), which mainly focused on how to conduct the required risk assessment in connection with the SCCs. READ MORE

AI Update: EU High-Level Expert Group Publishes Requirements for Trustworthy AI and European Commission Unveils Plans for AI Regulation

Assessment List for Trustworthy Artificial Intelligence

On July 17, 2020, the EU High-Level Expert Group on Artificial Intelligence (“AI HLEG”) presented its final Assessment List for Trustworthy Artificial Intelligence (“ALTAI”), a self-evaluation tool designed to help companies identify AI-related risks, minimize them, and determine what active measures to take. READ MORE

Face-off on Use of Biometric Technology in the UK

In one of the world’s first test cases on the legality of automated facial recognition and biometric technology, the English Court of Appeal handed down judgment in R (Bridges) v CC South Wales on 11 August 2020. The court found that the South Wales Police Force’s use of this technology violated privacy, equality and data protection laws. READ MORE

How to Comply with International Transfer Requirements – An Overview of Regulatory Guidance on the “Schrems II” Decision

EDPB and data protection authorities’ views and statements on the “Schrems II” decision by the CJEU

On 16 July 2020, the European Court of Justice (“CJEU”) handed down a decision invalidating the EU-U.S. Privacy Shield and calling into question the Standard Contractual Clauses (“SCCs”) (judgment C-311/18 – “Schrems II”). The shockwaves of the decision were felt worldwide, and companies are now scrambling to make sense of the sometimes conflicting guidance published by various EU supervisory authorities. READ MORE

Pending U.S. Supreme Court Cases May Restrict FTC’s Pursuit of Monetary Relief in Privacy and Cybersecurity Matters

Earlier this month, the U.S. Supreme Court agreed to hear a pair of cases that give it the opportunity to severely restrict the Federal Trade Commission’s (“FTC’s”) authority to obtain equitable money relief in consumer protection enforcement actions, including privacy and cybersecurity matters. Under Section 13(b) of the FTC Act, the FTC is empowered in certain circumstances to bring actions in federal court to seek temporary restraining orders and injunctions for violations of the Act. In two consolidated cases, FTC v. Credit Bureau Center, LLC and AMG Capital Management, LLC v. FTC, the Supreme Court will now consider whether, as the FTC claims, this provision also authorizes the agency to seek equitable money relief for such violations, even though the provision makes no mention of money relief. The decision will have broad implications because the FTC has relied on Section 13(b) to seek monetary relief in such actions, and a ruling against the agency could substantially alter its approach to privacy and cybersecurity enforcement.

The FTC’s privacy and cybersecurity enforcement actions typically rely on Section 5 of the FTC Act, which prohibits unfair or deceptive trade practices. The FTC takes the position that a failure to implement “reasonable” cybersecurity or privacy practices can constitute an “unfair” practice, and that making false or misleading statements about such practices can be a “deceptive” trade practice under the statute.

The FTC can enforce Section 5 in two ways. First, it can rely on its traditional administrative enforcement authority, which allows the FTC to initiate an administrative proceeding to issue an order to “cease and desist” violations of Section 5, but only provides for monetary relief in limited circumstances. Second, in certain situations the FTC can sue directly in federal court under Section 13(b) of the FTC Act. Although Section 13(b) authorizes only “injunctions,” the FTC often brings cases under this section in federal court seeking monetary relief under equitable doctrines such as restitution, disgorgement and rescission of contracts.

Until recently, courts universally accepted the FTC’s expansive view that its authority under Section 13(b) to obtain “injunctions” enables it to seek equitable monetary relief. But that has begun to change. In Credit Bureau, the Seventh Circuit rejected the FTC’s position that Section 13(b) authorizes monetary relief, on the ground that an implied equitable monetary remedy would be incompatible with the FTC Act’s express remedial scheme. Most notably, the court observed that the FTC Act has two detailed remedial provisions expressly authorizing equitable money relief if the FTC follows certain procedures; the FTC’s broad reading of Section 13(b) would allow the agency to circumvent these conditions, contrary to the intent of Congress. In AMG Capital Management, although the Ninth Circuit considered itself bound to follow its prior precedent allowing the FTC to obtain money relief under Section 13(b), two of the three panel members joined a special concurrence arguing that this position is “no longer tenable.” And a decision from the Third Circuit last year, while not addressing whether the FTC is barred from pursuing money relief under Section 13(b), held that to pursue such relief the FTC must, at a minimum, allege facts plausibly suggesting that the company “is violating, or is about to violate,” the law.

If the Supreme Court restricts or eliminates the FTC’s pursuit of equitable money relief under Section 13(b), its decision would represent a significant setback for the FTC’s recent attempts to expand its remedial authority in privacy and cybersecurity cases, among others. In June 2018, medical laboratory LabMD obtained the first-ever court decision overturning an FTC cybersecurity enforcement action, convincing the Eleventh Circuit that an FTC cease-and-desist order imposing injunctive relief requiring LabMD to implement “reasonable” data security was impermissibly vague. (The team directing that effort – led by Doug Meal and Michelle Visser – joined Orrick in January 2019.) In the wake of LabMD, the FTC’s new Chairman, Joseph Simons, stated that he was “very nervous” that the agency lacked the remedial authority it needed to deter allegedly insufficient data security practices and that, among other things, the FTC was exploring whether it has additional untapped authority it could use in this space. The FTC has followed through on that promise in the ensuing years, pursuing a wide range of additional remedies, including equitable money relief. An adverse ruling by the Supreme Court could strike a severe blow to the FTC’s efforts on this front.

Such a ruling is entirely possible. Just last month, in Liu v. SEC, the Supreme Court recognized limits on the disgorgement power of the Securities and Exchange Commission, holding that the remedy must not exceed a wrongdoer’s net profits and must be awarded for victims. However, unlike the FTC Act, the Securities Exchange Act specifically authorizes the SEC to seek “equitable relief.” Therefore, the consolidated AMG and Credit Bureau cases afford the Supreme Court an opportunity to recognize even greater restrictions on the FTC’s authority to obtain equitable money relief under Section 13(b) – or, as the Seventh Circuit did in Credit Bureau, to reject such authority altogether.

While in the short term such a ruling may reduce the monetary risks of FTC privacy and cybersecurity enforcement for companies collecting personal information, it could serve as a catalyst for a legislative proposal that would provide the FTC significant new authority to police privacy and security violations and assess civil penalties.

To discuss these cases in more detail, or for advice on the FTC’s privacy and cybersecurity enforcement program more generally, please feel free to contact any member of our privacy & cybersecurity team, which has extensive experience in this area.

The Supreme Court Is Poised to Take On the TCPA

On July 6, 2020, the United States Supreme Court issued its ruling in Barr v. American Ass’n of Political Consultants, a case in which the plaintiffs challenged, on First Amendment grounds, the government-debt collection exception to the Telephone Consumer Protection Act’s (“TCPA”) ban on “robocalls” to cell phones, and sought to have the entire robocall-regulating statute invalidated.[1] The Court agreed with the plaintiffs—political and nonprofit organizations that wanted to make political robocalls to cell phones—that the exception unconstitutionally favors government-debt collection speech over political and other speech in violation of the First Amendment. However, instead of nullifying the entire set of robocall restrictions found at 47 U.S.C. § 227(b)(1)(A)(iii), as plaintiffs sought, the Court found the government-debt collection exception severable and invalidated only that portion of the statute, leaving the general robocall restrictions in place.

In its July 6 decision, the Supreme Court seemed to endorse the need for a broad ban on “robocalls.” The Court referred back to the context in which the TCPA was enacted in 1991, characterizing it as a time when “more than 300,000 solicitors called more than 18 million Americans every day.”[2] According to the Court, “[t]he Act responded to a torrent of vociferous consumer complaints about intrusive robocalls.”[3] The Court’s July 6 decision shifts the universe of acceptable practices back to a pre-2015 framework, prior to the enactment of the government-debt collection exception.

Later in the same week, on July 9, the Supreme Court granted certiorari in another case taking issue with the TCPA’s robocall provision, Facebook, Inc. v. Duguid. In that case, the Supreme Court will address what qualifies as an automatic telephone dialing system (“ATDS”), an issue that has been brewing in the courts, with materially different interpretations across several circuits.[4] The Facebook decision should have significant implications for the scope of the robocall restrictions.

Enacted in 1991, 47 U.S.C. § 227(b)(1)(A)(iii) of the TCPA prohibits a caller from using an ATDS to call a cell phone and prohibits calls using an artificial or prerecorded voice, unless the caller has obtained prior express consent. The TCPA defines an ATDS as “equipment which has the capacity to store or produce telephone numbers to be called, using a random or sequential number generator; and to dial such numbers.”[5] This definition, and the FCC’s expansive interpretation of it, has been the subject of intense litigation. The proper scope of the ATDS definition is a high-stakes question: the provision imposes strict liability, with statutory damages of $500 per violation—trebled to $1,500 per violation if the violation is deemed willful or knowing.[6] A company found to have used a telephone system that qualifies as an ATDS to call cell phones without prior consent can find itself subject to millions (or even billions) of dollars in damages.
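To put those figures in perspective (an illustration only, using the statutory amounts above): a campaign of one million autodialed calls to cell phones made without prior express consent would carry statutory exposure of at least $500 million, and as much as $1.5 billion if the violations were deemed willful or knowing.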

In 2015, the FCC issued a Declaratory Ruling setting forth its interpretation of the ATDS definition. According to the FCC, an ATDS includes “dialing equipment [that] has the capacity to store or produce, and dial random or sequential numbers [without human intervention] … even if it is not presently used for that purpose, including when the caller is calling a set list of consumers.”[7] The Declaratory Ruling explicitly stated that “the capacity of an autodialer is not limited to its current configuration but also includes its potential functionalities.”[8] This interpretation drastically broadened the scope of equipment implicated by the Act to potentially include almost all technology that is capable of being upgraded with software to permit automated dialing.

In 2018, the D.C. Circuit in ACA International v. Federal Communications Commission struck down the FCC’s 2015 interpretation of an ATDS, holding that it “offered no meaningful guidance to affected parties” on whether their equipment was covered by the TCPA restrictions.[9] The Court noted that the FCC’s interpretation was so expansive that it could lead to unreasonable outcomes such as conventional smartphones being considered covered equipment.[10] The opinion was most critical of the potential future capacity aspect of the FCC’s interpretation, explaining that “[i]t cannot be the case that every uninvited communication from a smartphone infringes federal law, and that nearly every American is a TCPA-violator-in-waiting, if not a violator-in-fact.”[11] With the D.C. Circuit’s invalidation of the FCC’s 2015 interpretation, the courts have been left to interpret the provision based on the plain language of the statute.

Courts have disagreed on the critical issue of the functions a device must have the capacity to perform in order to qualify as an ATDS. In its 2018 decision in Marks v. Crunch, the Ninth Circuit succinctly stated that “[t]he question is whether, in order to be an ATDS, a device must dial numbers generated by a random or sequential number generator or if a device can be an ATDS if it merely dials numbers from a stored list.”[12] The Ninth Circuit answered that question with an expansive interpretation, holding that “the statutory definition of ATDS includes a device that stores telephone numbers to be called, whether or not those numbers have been generated by a random or sequential number generator.”[13] The Ninth Circuit’s interpretation potentially means that any telephone system with the capacity to automatically dial a stored list of telephone numbers without human intervention qualifies as an ATDS. The Second Circuit recently adopted an interpretation similar to that of the Ninth Circuit in Marks.[14]

The Third, Seventh and Eleventh Circuits adopted starkly different interpretations of the ATDS definition based on a plain reading of the statutory language. In Gadelhak v. AT&T, for example, the Seventh Circuit held that “the capacity to generate random or sequential numbers is necessary to the statutory definition,” expressly rejecting the Ninth Circuit’s reading of the statute in Marks.[15] The Third and Eleventh Circuits adopted a similar approach in Dominguez v. Yahoo and Glasser v. Hilton, respectively.[16]

The Supreme Court’s decision in Facebook v. Duguid will likely resolve this circuit split once and for all and provide litigants with a uniform interpretation of what constitutes an ATDS under the Act. The adoption of a narrow interpretation would likely result in a dramatic decrease in TCPA litigation, because fewer dialing systems would qualify as an ATDS: most modern telephone systems do not generate random or sequential telephone numbers for dialing. A broad interpretation, by contrast, may result in an influx of litigation, particularly in circuits such as the Third, Seventh and Eleventh, where recent rulings had limited such cases and led serial litigators to file suit elsewhere.


[1] Barr v. Am. Ass’n of Political Consultants, Inc., No. 19-631, 2020 WL 3633780 (U.S. July 6, 2020).

[2] Id. at *3.

[3] Id.

[4] The Supreme Court granted certiorari on question 2 of the petitioner’s brief, which reads: “Whether the definition of ATDS in the TCPA encompasses any device that can ‘store’ and ‘automatically dial’ telephone numbers, even if the device does not ‘us[e] a random or sequential number generator.’” Facebook, Inc. v. Duguid, No. 19-511.

[5] 47 U.S.C. § 227(a)(1)(A)-(B).

[6] 47 U.S.C. § 227(b)(3).

[7] In the Matter of Rules & Regulations Implementing the Tel. Consumer Prot. Act of 1991, 30 F.C.C. Rcd. 7961 (2015).

[8] Id.

[9] ACA Int’l v. Fed. Commc’ns Comm’n, 885 F.3d 687, 701 (D.C. Cir. 2018).

[10] Id. at 692.

[11] Id. at 698.

[12] Marks v. Crunch San Diego, LLC, 904 F.3d 1041, 1050 (9th Cir. 2018), cert. dismissed, 139 S. Ct. 1289, 203 L. Ed. 2d 300 (2019).

[13] Id. at 1043.

[14] See Duran v. La Boom Disco, Inc., 955 F.3d 279, 280 (2d Cir. 2020).

[15] Gadelhak v. AT&T Servs., Inc., 950 F.3d 458, 469 (7th Cir. 2020).

[16] Dominguez on Behalf of Himself v. Yahoo, Inc., 894 F.3d 116, 117 (3d Cir. 2018); Glasser v. Hilton Grand Vacations Co., LLC, 948 F.3d 1301, 1304 (11th Cir. 2020).

Highest Administrative Court in France Upholds Google’s €50 Million Fine

On January 21, 2019, the CNIL (the French data protection authority) fined Google €50 million under the General Data Protection Regulation (the “GDPR”) for its failure to (1) provide notice in an easily accessible form, using clear language, when users configured their Android mobile devices, and (2) obtain users’ consent to process personal data for ad personalization purposes. The CNIL’s enforcement action and the resulting fine arose out of complaints filed by two not-for-profit associations, None of Your Business and La Quadrature du Net. It was the first significant fine imposed by the CNIL under the GDPR and remains one of the highest to date. In determining the amount, the CNIL considered that the violations related to essential principles under the GDPR (transparency and consent) and were continuing, as well as the importance of the Android operating system in France and the fact that the privacy notice presented to users covered a number of processing operations. Google appealed the decision. READ MORE

French Court Annuls Parts of the CNIL’s Cookie Guidelines

On June 19, 2020, the Conseil d’Etat, the highest administrative court in France, annulled in part the cookie guidelines issued by the CNIL (the French data protection authority). The court ruled that the CNIL did not have the power to prohibit “cookie walls” (i.e., the practice of blocking access to a site or app for users who do not consent to the use of cookies) in the guidelines. READ MORE