In December, a new law came into force requiring some of the world’s most popular social media sites, including Instagram and Facebook, to prevent Australians under the age of 16 from opening accounts on their platforms.

The ban, closely watched worldwide, was justified by campaigners and the government as necessary to protect children from harmful content and algorithms.

Companies including Meta said they agree that more needs to be done to keep young people safe online. However, they continue to advocate for alternative measures, and their concerns about the ban are shared by some experts.

In a blog update, Meta said: “We are calling on the Australian government to work constructively with industry to find a better path — rather than blanket bans — such as encouraging the entire sector to deliver safe, privacy-protective, age-appropriate online experiences.”

The company said that in its first week of complying with the new law, it blocked 330,639 accounts on Instagram, 173,497 on Facebook, and 39,916 on Threads.

Meta argued that age verification should be carried out at the app-store level, which it said would reduce the compliance burden on both regulators and the apps themselves, and that exceptions should be made where parents give their consent.

“This is the only way to provide consistent, industry-wide protection for young people regardless of which apps they use, and to avoid the vicious cycle of chasing new apps that young people would turn to in order to circumvent the social media ban.”

From the U.S. state of Florida to the European Union, governments are experimenting with limits on children’s use of social media. Australia, however, is the first jurisdiction to set the age limit at 16 and to reject exceptions based on parental consent, making its law among the strictest in the world.
