What is "Chat Control" – and Where Are We Today?

An update on the EU regulation proposal and its impact on the adult industry.

What is "Chat Control"?

It is a proposal for an EU regulation (the Child Sexual Abuse Regulation, commonly called "Chat Control") that would impose risk assessments, detection and reporting orders for CSAM, and removal obligations on providers of communication and content-hosting services (chat, e-mail, cloud storage, social networks). The most controversial version also envisages scanning private messages, potentially even before they are encrypted (so-called client-side scanning).

In September 2025, member states were again divided over the latest version; opponents warn that it threatens encryption and enables mass surveillance, so no agreement has been reached. The European Parliament has likewise stressed that any solution must protect privacy and encryption.

The Slovenian Information Commissioner has repeatedly warned that general and indiscriminate scanning of private communications is not acceptable.

What This Means for the Adult Industry (Practically)

Even though the regulation targets child sexual abuse exclusively, what matters for our sector is how providers will implement their obligations. The biggest impacts:

• Private messages & bookings: WhatsApp, Signal, Telegram, e-mail, or DMs used for bookings could become subject to automatic scanning once a detection order is issued. The danger: false positives from the algorithms (e.g., a photo whose subject's age is misjudged), leading to account blocks or erroneous reports.

• Overblocking and the "chilling effect": To limit legal risk, providers would rather delete content "to be safe" – including completely legal, consensual adult content – to avoid sanctions.

• Stricter KYC and traceability: Platforms (OF, cam portals, hosting) will likely tighten identity/age verification and data retention further. Consequences: more bureaucracy for creators, higher compliance costs, and more frequent temporary account suspensions on mere "suspicion".

• Encryption and security: If client-side scanning were mandated, it would weaken end-to-end encryption and increase the risk of abuse or leakage of private data (e.g., private galleries, conversations with clients).
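To make the scanning concern above concrete: client-side scanning proposals typically rely on perceptual hashing, where an image is reduced to a short fingerprint and compared against a database of known fingerprints. The toy sketch below (nothing here resembles a real scanner; the tiny "images" and the matching logic are invented for illustration) shows why visually different content can still produce identical fingerprints, which is exactly where false positives come from:

```python
# Toy perceptual hash ("average hash"): each pixel becomes one bit,
# depending on whether it is brighter than the image's average.
def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return "".join("1" if p > avg else "0" for p in flat)

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return sum(x != y for x, y in zip(a, b))

# Two different "images": the second has a noticeably brighter pixel,
# yet both reduce to the same fingerprint.
img_a = [[10, 200], [30, 220]]
img_b = [[10, 200], [90, 220]]

h_a, h_b = average_hash(img_a), average_hash(img_b)
print(hamming(h_a, h_b))  # → 0: distinct images, identical fingerprints
```

In a real deployment the database would hold fingerprints of confirmed abuse material, but this same collision mechanism is what can flag lawful, consensual adult content.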

Specifically for Slovenia

Since it is a regulation, it would – if adopted – apply directly in Slovenia as well. The Information Commissioner (IP RS) will play a key role in interpreting the proportionality of the measures and in supervising providers, and has already publicly warned against mass surveillance.

In practice, we could see: stricter policies from Slovenian/EU chat and hosting providers, more account blocks due to "cautious" moderation, and more identity checks and retention of proof of age/consent required of creators.

What Workers/Creators Can Do Now

1) Separate channels: Where possible, handle logistics (schedules, locations, payments) over text-only channels without sending images. Send photos only when necessary, with a simple watermark and without your face if it is not needed.

2) Consent & age – everything in writing: Keep signed consent statements and proof of age for all collaborators/shoots in a secure vault (offline, or with strong encryption if stored in the cloud).
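A minimal sketch of such a vault, using the third-party Python `cryptography` package (its Fernet recipe, authenticated symmetric encryption). The document contents and the key-storage advice in the comments are illustrative assumptions, not a prescribed workflow:

```python
from cryptography.fernet import Fernet

# Generate once and store offline (e.g., in a password manager),
# separately from the encrypted files themselves.
key = Fernet.generate_key()
vault = Fernet(key)

# Hypothetical example document; in practice, read the scanned file's bytes.
document = b"Signed model release + proof of age, shoot 2025-09-12"

token = vault.encrypt(document)   # ciphertext is safe to back up anywhere
restored = vault.decrypt(token)   # recoverable only with the key
```

Anyone holding only the ciphertext (a cloud provider, a scanner, an attacker) learns nothing; losing the key, however, means losing the documents, so back the key up deliberately.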

3) "Red team" your profiles: Review what an algorithm might misread (titles, hashtags, descriptive words) and reduce the likelihood of false positives.
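One way to run such a self-check is a small script over your own titles and bios. The word list below is an invented illustration, not any platform's actual blocklist:

```python
# Hypothetical example terms an automated moderator might misread.
RISKY_TERMS = {"teen", "young", "school", "barely"}

def flag_terms(text):
    """Return the risky terms found in a piece of profile text."""
    words = {w.strip(".,!?#").lower() for w in text.split()}
    return sorted(words & RISKY_TERMS)

bio = "School uniform cosplay, barely 5 feet tall"
print(flag_terms(bio))  # → ['barely', 'school']
```

Flagged words are not forbidden; the point is to know in advance which phrasing is most likely to trip a classifier so you can decide whether the wording is worth the risk.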

4) Secure tools and settings: Enable two-factor authentication, manage backups carefully, and keep business accounts separate from private ones. If a platform offers end-to-end encryption (E2EE), leave it on.
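Authenticator-app codes are worth preferring over SMS, and it helps to know why they are safe: they are derived locally from a shared secret and the clock (the RFC 6238 TOTP standard), so nothing sensitive travels on each login. A standard-library sketch of the derivation:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, step=30, digits=6):
    """Compute an RFC 6238 TOTP code (SHA-1, the common default)."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if t is None else t) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    number = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(number % 10 ** digits).zfill(digits)

# The RFC 6238 test secret ("12345678901234567890", base32-encoded):
SECRET = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(SECRET, t=59))  # → "287082" (matches the RFC test vector)
```

Because both sides compute the same code from the secret and the current 30-second window, the code works offline and expires almost immediately, unlike an SMS code that can be intercepted via SIM swapping.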

5) Financial diversification: Do not rely on a single channel (one platform, one payment processor). Have a backup payout route.

6) Monitor regulatory developments: The situation changes from month to month.