NY State Law Will Require Minors to Get Parental Consent to Use Certain Apps

Social media platforms may have moderation policies in place that prevent users from posting content that is too graphic or harmful, but plenty of such content still slips through the cracks. It goes without saying that this material is bad for young and impressionable minds, and New York State might have a solution.

Teen on Social Media (Photo: Getty Images)

NY State's SAFE for Kids Act

Lawmakers, along with Governor Kathy Hochul, have joined efforts to present the new Stop Addictive Feeds Exploitation (SAFE) for Kids Act, which aims to protect minors from algorithmic feeds that allegedly track and prey on young people.

Under the Act, consent from a parent or guardian will be required before kids can access algorithm-based feeds on social media apps like TikTok, YouTube, and Instagram. The requirement will apply to all accounts held by users under 18 years old.

According to Engadget, studies have been cited to support the claim that these platforms can be harmful. Social media is contributing to a "national youth mental health crisis that is harming children's wellbeing and safety," said NY State Attorney General Letitia James.

She added that young people in New York are struggling with "record levels" of anxiety and depression, and that social media companies that use addictive features to keep minors on their platforms longer are to blame.

The restrictions to be implemented include limits on screen time, wherein parents or guardians can set a maximum number of hours of usage, as well as the ability to restrict access and notifications for the apps between 12 AM and 6 AM.

In addition, young users will no longer be allowed to view content from people they don't follow unless a parent or guardian lets them. However, the policy is not foolproof, since minors can simply follow the account they want to view. Lawmakers may amend this later on.

If the bill does not face heavy opposition from the affected social media platforms, it might become law by 2024. Social networking sites that violate the rules in the Act could face consequences such as paying up to $5,000 in damages.


NY Child Data Protection Act

Another act aims to protect children from social media systems that collect user data for targeted advertising. Its restrictions will apply to practices such as processing information that relates to, describes, or is linked to a user under 18 years old.

If passed into law, the act would bar platforms from collecting or retaining minors' personal data unless it is needed to provide an online product and the collection, processing, retention, or sale is limited to that purpose. According to The National Law Review, the social media companies will also have to design a feature that alerts child users.

The alert, which informs users that their data is being collected and retained, should be presented in a manner its target reader can understand, meaning the platforms should not use complex language that might confuse younger users.

