I code and do art things. Check https://cloudy.horse64.org/ for the person behind this content. Many of my projects are at https://codeberg.org/ell1e.

  • 7 Posts
  • 68 Comments
Joined 4 months ago
Cake day: July 16th, 2025


  • It doesn’t seem to be voluntary at all, from what I can tell from the draft:

    “Upon that notification, the provider shall, in cooperation with the EU Centre pursuant to Article 50(1a), take the necessary measures to effectively contribute to the development of the relevant technologies to mitigate the risk of child sexual abuse identified on their services. […]”

    “In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take all reasonable measures to mitigate the risk of their services being misused for such abuse […]”

    These quotes sound mandatory, not voluntary. And let’s look at what these referenced technologies are:

    “In order to facilitate the providers’ voluntary activities under Regulation (EU) 2021/1232 compliance with the detection obligations, the EU Centre should make available to providers detection technologies […]”

    “The EU Centre should provide reliable information on which activities can reasonably be considered to constitute online child sexual abuse, so as to enable the detection […] Therefore, the EU Centre should generate accurate and reliable indicators,[…] These indicators should allow technologies to detect the dissemination of either the same material (known material) or of different new child sexual abuse material (new material), […]”

    Oops, it sounds again like mandatory scanning.

    Source: https://cdn.netzpolitik.org/wp-upload/2025/11/2025-11-06_Council_Presidency_LEWP_CSA-R_Presidency-compromise-texts_14092.pdf

    The new draft seems to try harder to look less mandatory, but it still reads as mandatory to me. Feel free to correct me if somebody can show that I’m wrong.


  • https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=OJ%3AL_202402847

    Supply in the course of a commercial activity might be characterised not only by charging a price for a product with digital elements, but also by charging a price for technical support services where this does not serve only the recuperation of actual costs, by an intention to monetise, for instance by providing a software platform through which the manufacturer monetises other services, by requiring as a condition for use the processing of personal data for reasons other than exclusively for improving the security, compatibility or interoperability of the software, or by accepting donations exceeding the costs associated with the design, development and provision of a product with digital elements

    TL;DR: just accepting donations can apparently already be a problem. But IANAL.


  • As far as I understand, the license doesn’t matter at all for the EU regulation, other than that “non-free” software is treated even worse.

    Generally, if you give something away for free, you can’t be claimed to be the owner.

    The CRA, from what I can tell, applies even to software given away for free, sadly. I’m not a lawyer, though. But you can perhaps see why people don’t trust the EU.


  • I admit it’s a complex topic, but if you read the post in detail, it should answer your questions. The “owner” is typically the maintainer; if in doubt, that’s the person with repository write access. And the EU can apparently require more or less whatever it wants to be maintained, not that I understand the exact details. The point was that the regulation doesn’t seem to avoid FOSS fallout well.