Dick Durbin, the Democratic majority whip in the US Senate, wants to step up the fight against images of child sexual abuse. This week he introduced a bill in Congress, the Strengthening Transparency and Obligation to Protect Children Suffering from Abuse and Mistreatment Act. The Democrats’ Stop CSAM Act aims to remove “Child Sexual Abuse Material” (CSAM) from the Internet more effectively. Online service providers would be criminally liable for “knowingly hosting or storing” abusive content or “knowingly promoting or facilitating the sexual exploitation of children, including the creation of CSAM, on their platforms.”

The initiative does not go as far as the chat control proposals in the EU or the Online Safety Bill in Great Britain, which are intended to force providers to scan private messages even in end-to-end encrypted services such as WhatsApp, Signal and Threema. But the US bill opens the door to civil lawsuits against service providers for negligently “encouraging or facilitating” acts related to the exploitation of children, “hosting or storing child pornography,” or “making available” such material to third parties. The US civil rights organization Electronic Frontier Foundation (EFF) complains that this wording is so broad and vague that it also undermines end-to-end encryption.

New criminal and civil claims against providers based on broad terms and low standards “will undermine digital security for all internet users,” the EFF warns. Since the distribution of depictions of abuse is already prohibited by law, the Stop CSAM Act could be interpreted to also cover “passive behavior such as the mere provision of an encrypted app.” Providers of cryptographically secured services who receive a warning would have “knowledge” of a potential criminal offense within the meaning of criminal law, even though they cannot review such a takedown request or act on it. Plaintiffs’ lawyers are likely to argue that simply offering an encrypted service that can be used to store any image – not necessarily CSAM – encourages the sharing of illegal content.

Under current US law, service providers who have actual knowledge of potential depictions of abuse on their platforms must report them to the National Center for Missing and Exploited Children (NCMEC), which then forwards relevant investigation reports to law enforcement agencies, for example in Europe. The Stop CSAM Act goes even further: it is to apply to “interactive computer services,” a category that includes personal messaging and email apps, social media platforms such as Facebook, Instagram and Twitter, cloud storage providers and many other Internet intermediaries and online service providers.

The new civil cause of action also carves out an exception to Section 230 of the Communications Decency Act (CDA), which is considered the fundamental norm for online freedom of expression. Section 230 offers providers partial immunity for user-generated content. If providers could in the future be sued for “facilitating” the sexual exploitation of children simply because they provide a platform for sharing content, this would endanger “freedom of expression on the Internet,” according to the EFF. In addition, a “notice-and-takedown” system is to be introduced, overseen by a newly created board for the protection of children online. It could oblige providers to remove or disable content before an independent authority or court has ruled on it.


(tiw)
