Senate approves bill to protect minors on the Internet

WASHINGTON — The United States Senate on Tuesday overwhelmingly approved legislation designed to protect children from dangerous online content, in what would be Congress's first major effort in decades to hold technology companies more accountable for the harm they cause.

The bill, approved by a vote of 91 to 3, was pushed by parents of children who died by suicide after being bullied online or were otherwise harmed by online content. The law would require companies to take reasonable measures to prevent harm on online platforms frequently used by minors, obliging them to exercise a "duty of care" and to generally set the most secure default settings possible.

The House has not yet acted on the bill, but House Speaker Mike Johnson said he is "committed to working toward consensus." Supporters hope the Senate's strong vote will prompt the House to act before the end of the congressional session in January.

The goal of the legislation is to allow children, teens and parents to “take back control of their online lives,” said Connecticut Democratic Sen. Richard Blumenthal, who co-wrote the bill with Republican Sen. Marsha Blackburn of Tennessee. Blumenthal said the message to Big Tech is that “we will no longer trust you to make decisions for us.”

Privacy laws

The bill would be the first major package of tech regulatory rules in years, and could pave the way for other bills that strengthen online privacy laws or set parameters for the growing artificial intelligence industry, among other issues. While both parties have long supported the idea that big tech companies should be subject to greater government scrutiny, there has been little consensus on how that should be done. This year, Congress passed legislation that would force TikTok, the China-based social media company, to sell itself or face a ban, but that law targets only one company.

If the bill becomes law, companies would be required to mitigate harm to children, including bullying and violence, encouragement of suicide, eating disorders, substance abuse, sexual exploitation and advertisements for illegal products such as narcotics, tobacco or alcohol.

To do so, social media platforms would have to give children options to protect their information, disable addictive product features and stop making personalized algorithmic recommendations. They would also be required to limit other users’ communication with children and limit features that “increase, maintain or extend use” of the platform, such as auto-playing videos or on-platform rewards.

Source: With information from AP
