According to experts, children and young people need better protection from drug-glorifying content on Instagram, TikTok and YouTube. All three providers require users to be at least 13 years old, according to a study published on Monday by the state media authorities on behalf of the Commission for Youth Media Protection.

However, there is no reliable age check at registration, meaning that dangerous content can also be accessed by significantly younger users.

For the study, around 160 offerings from high-reach influencers on the three platforms were examined for violations of youth media protection rules. The selection focused on offerings aimed at a German-speaking audience that featured content on addictive substances and intoxicants, especially alcohol and cannabis. In around 60 percent of the offerings, the examiners found content that glorified or trivialized drug use. Instagram led with 39 cases, ahead of YouTube (32) and TikTok (24).

By their own account, the state media authorities have already initiated the first proceedings on suspected cases and reported offerings from unidentified users to the platforms. The responses showed strong interest in bringing the offerings into legal compliance, they said. Instagram's parent company Meta has already blocked the affected offerings, TikTok has deleted a large amount of content, and YouTube has moved numerous videos to its 18+ section or blocked them for users in Germany.

The Commission for Youth Media Protection and the Federal Government's Drug Commissioner, Burkhard Blienert, called on the platforms, as well as those who post content on them, to take more responsibility for protecting young people in the media. (KNA)