EU Council Approves Voluntary Chat Scanning Compromise in Child Abuse Regulation

Key Points

  • EU Council adopts a compromise that makes chat scanning for CSAM optional for messaging services.
  • The voluntary framework still allows mandatory scanning for platforms deemed “high‑risk.”
  • Regulation includes privacy‑focused age‑verification requirements for child users.
  • European Commission can review the law every three years, potentially expanding scanning scope.
  • Privacy advocates warn the compromise could enable future mass surveillance and censorship.
  • Tech and security groups call for strong parliamentary opposition to mandatory scanning.
  • The agreement now moves to trilogue negotiations with an expected final adoption next year.

Background

After several years of debate, the EU Council reached an agreement on the Child Sexual Abuse Regulation, commonly nicknamed “Chat Control.” The compromise replaces the mandatory scanning requirement with an optional one, leaving it to messaging platforms to decide whether to scan user communications for child sexual abuse material (CSAM). The shift was presented as a victory for encryption because it avoids a blanket backdoor that could undermine end‑to‑end security.

Key Provisions

The new text allows providers to decide whether to implement full‑scale chat scanning. However, the regulation retains a clause that could compel scanning for services classified as “high‑risk.” It also grants the European Commission the authority to review the law every three years, opening the possibility of broader scanning in the future. Additionally, the law introduces age‑verification measures intended to reliably identify child users, with a stated emphasis on privacy‑preserving methods.

Privacy Concerns

Privacy advocates argue that even a voluntary framework poses significant risks. Critics note that the term “voluntary” does not eliminate the potential for mass surveillance, especially if the Commission later mandates scanning for certain platforms. They also highlight the difficulty of implementing effective, privacy‑friendly age verification, citing concerns that such systems could be bypassed or lead to increased use of privacy tools like VPNs. The regulation’s provisions on website blocking further raise alarms about possible censorship and the erosion of digital rights.

Industry Reactions

Reactions from the tech and security sectors are mixed. Some view the optional scanning as a positive step that protects encrypted communications, while others warn that the remaining clauses could lay the groundwork for future surveillance. VPN providers and digital‑rights organizations have called for strong parliamentary resistance to any mandatory scanning, ID‑verification, or content‑blocking measures. The Internet Society’s government affairs director described the compromise as a “positive step forward for the security of communications of European residents,” yet emphasized the need for close monitoring during the upcoming trilogue talks.

Next Steps

The agreement now proceeds to trilogue negotiations among the European Parliament, the Council, and the European Commission. Stakeholders expect intensive discussions to shape the final text, with some fearing that the April deadline for adoption may be too tight to resolve contentious issues. Observers will watch closely to see whether the final regulation balances child‑protection goals with the preservation of privacy, encryption, and digital freedom.

Source: techradar.com