• EU : CSAM detection to become mandatory

    • This topic has 2 replies, 2 voices, and was last updated 1 year ago.


    Today, the Commission is proposing new EU legislation to prevent and combat child sexual abuse online. With 85 million pictures and videos depicting child sexual abuse reported worldwide in 2021 alone, and many more going unreported, child sexual abuse is pervasive. The COVID-19 pandemic has exacerbated the issue, with the Internet Watch Foundation noting a 64% increase in reports of confirmed child sexual abuse in 2021 compared to the previous year. The current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children and, in any case, will no longer be possible once the interim solution currently in place expires. Up to 95% of all reports of child sexual abuse received in 2020 came from one company, despite clear evidence that the problem does not only exist on one platform.

    To effectively address the misuse of online services for the purposes of child sexual abuse, clear rules are needed, with robust conditions and safeguards. The proposed rules will oblige providers to detect, report and remove child sexual abuse material on their services. Providers will need to assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards…

    Apple has added Child Safety in iMessage to iOS 15, watchOS, and macOS:

    The Messages app includes tools to warn children when receiving or sending photos that contain nudity. These features are not enabled by default. If parents opt in, these warnings will be turned on for the child accounts in their Family Sharing plan.

    When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. Similar protections are available if a child attempts to send photos that contain nudity. In both cases, children are given the option to message someone they trust for help if they choose…

    Nothing is sent out.

    • This topic was modified 1 year ago by Alex5723.
    • #2446159

      The EU Wants Big Tech to Scan Your Private Chats for Child Abuse

      Europe’s proposed child protection laws could undermine end-to-end encryption for billions of people.

      All of your WhatsApp photos, iMessage texts, and Snapchat videos could be scanned to check for child sexual abuse images and videos under newly proposed European rules. The plans, experts warn, may undermine the end-to-end encryption that protects billions of messages sent every day and hamper people’s online privacy.

      Under the plans, tech companies—ranging from web hosting services to messaging platforms—can be ordered to “detect” both new and previously discovered CSAM, as well as potential instances of “grooming.” The detection could take place in chat messages, files uploaded to online services, or on websites that host abusive material…

      • #2447138

        Big Brother at it again.
        Whilst child abuse is despicable, so is over-surveillance.
        Best find the source and shut it down.


        Just because you don't know where you are going doesn't mean any road will get you there.