New York is advancing new rules to curb addictive social media use among minors, addressing age verification and parental consent, a step towards safeguarding youth mental health.
*New York's proposed regulations to limit social media addiction among minors are now open for public comment. Image: CH*
New York, USA — September 16, 2025:
In an effort to combat the rising concerns over addictive social media features affecting minors, New York's Attorney General Letitia James has unveiled proposed regulations aimed at enforcing the SAFE for Kids Act, a state law passed last year. The regulations mark a bold step toward protecting the mental health of children and teens, addressing social media addiction that many experts believe is linked to growing anxiety and depression in young people.
The Stop Addictive Feeds Exploitation (SAFE) for Kids Act seeks to regulate social media content aimed at users under the age of 18, specifically prohibiting platforms from showing algorithmically curated content without parental consent. The law also restricts notifications between 12 a.m. and 6 a.m., addressing concerns about late-night screen time disrupting sleep and overall well-being. While these measures aim to reduce harmful digital exposure, they also raise questions about how much regulation is appropriate without compromising user privacy and freedom.
The new regulations propose that companies must verify the age of their users and seek explicit parental consent for content that includes algorithm-driven feeds or overnight notifications. Platforms are given a choice of age verification methods, such as photo uploads or cross-referencing email addresses, but they must ensure that these practices respect privacy standards. The guidelines are designed to make it harder for minors to bypass age restrictions and access features that may contribute to addictive behaviors.
The push for these new regulations comes at a time when youth mental health issues, including increased rates of anxiety, depression, and loneliness, are reaching alarming levels. Research links these issues to social media usage, with personalized feeds fueling excessive screen time and intensifying feelings of comparison, isolation, and stress.
"Children and teenagers are facing alarming levels of mental health issues, driven in part by addictive social media features," said Attorney General Letitia James, highlighting the law’s intention to reduce the digital harm that many young people experience.
While many see these measures as crucial for protecting minors, the SAFE Act has faced resistance from digital rights advocates, who argue that the regulations may undermine privacy and free expression. Critics have raised concerns about the potential misuse of age-verification systems and the broader implications for data security and privacy.
Moreover, similar laws in other U.S. states have faced legal challenges, suggesting that these measures may not be as straightforward to implement as they appear. Over 20 states have enacted or proposed age-verification laws for social media, but many are entangled in lawsuits that question their constitutionality and the impact on free speech.
As the debate continues, many question whether the proposed regulations can protect young users without regulatory overreach. Instagram and other platforms have already rolled out voluntary age-verification systems, and while the Attorney General's office has acknowledged these efforts, it maintains they fall short of fully addressing the problem.
Once finalized, the regulations will give social media companies 180 days to comply, setting a clear timeline for the tech industry. But questions remain about the real-world impact: will these rules meaningfully reduce social media addiction among young people, or will they face significant pushback from tech companies and civil liberties groups?
The rules are currently open for a 60-day public comment period, allowing for feedback from stakeholders, including parents, educators, tech companies, and privacy advocates. The Attorney General’s office will likely refine the guidelines based on the feedback received before finalizing them.
The outcome of these regulations could set a precedent for other states and countries considering similar measures. New York's bold move signals a growing recognition of the dangers posed by social media platforms but also raises important questions about the balance between protecting youth and preserving digital rights. As the digital landscape continues to evolve, these regulations may serve as a pivotal moment in the ongoing effort to safeguard young people in the age of social media.