Across the globe, lawmakers are rethinking the role social media plays in the lives of children and teens. From Europe to the United States, countries and states are rolling out new laws aimed at limiting access, delaying exposure, or requiring parental consent for minors to use popular platforms. These efforts are rooted in growing concerns about mental health, privacy, and online safety.
Many of the laws aimed at protecting kids from social media risks have faced significant roadblocks in the courts, often due to challenges from NetChoice, a group representing big tech companies like Meta and Google.
NetChoice argues that these state laws often violate the First Amendment by restricting free speech and placing heavy burdens on platforms. They also contend that some regulations conflict with federal law, like the Communications Decency Act, which shields online platforms from certain legal liabilities.
Because of these challenges, several state laws haven’t taken effect, leaving lawmakers and tech companies in ongoing disputes over how to keep kids safe while respecting the law.
So what exactly is being banned? When do these laws take effect? Are they working? Here’s a breakdown of some of the key legislation in place or in progress.
Countries and States Restricting Social Media for Kids
| Region | Age | Law Details | Effective Date |
|---|---|---|---|
| Australia | Under 16 | Online Safety Amendment (Social Media Minimum Age) Act: Platforms must actively block users under 16 from creating or keeping accounts and face fines up to AUD $50 million per violation for failing to verify age, with exemptions for health, education, and private messaging services. | Passed December 2024; the minimum-age obligation takes effect December 10, 2025. |
| France | Under 15 | Law No. 2023-566 (Digital Majority Law): Platforms must verify age and get parental consent for users under 15, inform families about digital risks, allow parents to suspend accounts, monitor screen time, and follow ARCOM standards — violations can cost up to 1% of global revenue. | July 2023 |
| Arkansas | Under 16 | SB 611: Platforms must verify users under 16, get parental consent, limit addictive features, ban notifications to minors between 10 p.m. and 6 a.m., provide parental controls, and face fines plus parent lawsuits for violations. | April 22, 2025 (signed). The revised law (SB 611 / Act 900) replaced the original 2023 version, which was permanently blocked by a federal court. It is currently in effect but faces a new federal lawsuit filed by NetChoice on June 27, 2025. |
| Connecticut | Under 18 | SB 1295: Platforms must create an online safety center, combat cyberbullying, block unsolicited adult messages, limit addictive design features, submit risk plans to the Attorney General, and address broader risks to minors’ health from data use. | July 1, 2026 |
| Florida | Under 14 | HB 3: Bans social media accounts for children under 14, requires parental consent for ages 14–15, enforces strict age verification, allows parents to request account deletion, and imposes fines up to $50,000 per violation. | January 1, 2025. Currently blocked amid ongoing federal lawsuits by NetChoice, which argues the law violates the First Amendment and privacy rights. |
| Georgia | Under 16 | SB 351: Requires platforms to verify user age, obtain parental consent for users under 16, limit data collection and ads for minors, provide content moderation info to parents, and imposes fines up to $10,000 for violations involving harmful content after a 90-day cure period. | July 1, 2025. Temporarily blocked by a federal court citing free-speech concerns. |
| Louisiana | Under 18 | Act 656: Prohibits targeted advertising to, and the sale of sensitive data of, minors under 18. Applies to large platforms with over 1 million users. Platforms are encouraged to estimate user age to avoid liability, but age verification and parental consent are not required. | July 1, 2025 |
| New York | Under 18 | Stop Addictive Feeds Exploitation (SAFE) for Kids Act: Bans algorithm-driven feeds for users under 18 without parental consent, limits related notifications from 12 a.m. to 6 a.m., requires age checks, and prohibits platforms from downgrading service for users who opt out. | June 2026 |
| Ohio | Under 16 | Ohio Rev. Code Ann. 1349.09: Requires parental consent for users under 16 and provides parents with content moderation info. Violations can result in fines up to $10,000 per day per child. | January 15, 2024 (scheduled). Permanently blocked in April 2025 after a federal court ruled the law unconstitutional under the First Amendment. |
| Tennessee | Under 18 | HB 1891: Requires platforms to verify all users’ ages, get parental consent for minors under 18, ban storage of verification data, offer parental controls, and imposes fines up to $1,000 per violation. | January 1, 2025. Ongoing litigation is challenging the law on constitutional grounds, but no injunctions or stays have been issued, so the law remains active. |
| Texas | Under 18 | HB 186: Would require platforms to verify users are 18+ before account creation, ban minors from having accounts without parental approval, allow parents to request account deletion, and mandate deletion of age verification data after use. | Would have taken effect September 1, 2025, but the bill passed the Texas House and then missed the deadline for Senate approval in May 2025, so it failed. |
| Utah | Under 18 | SB 194: Requires platforms to verify ages accurately, set minors’ profiles to private, limit data use, offer parental controls with verified consent, protect minors’ data, and imposes fines up to $2,500 per violation. HB 464: Repeals Utah’s old law and lets minors or parents sue for mental health harm from addictive algorithm-driven social media, unless companies show they have limited harmful features or obtained parental consent. | October 1, 2024. Enforcement of both laws is currently blocked due to ongoing constitutional challenges. |
What All This Means for Parents
While the legal landscape continues to shift, it’s encouraging to see lawmakers starting to recognize the risks social media can pose to young people and working toward meaningful protections.
Whether these laws will make a measurable difference remains to be seen. In many cases, even with age restrictions in place, teens can still find ways around them. And in some cases, bans can push kids toward using more secretive or less-regulated corners of the internet.
That’s why no law or platform policy can replace the influence of informed, engaged parenting.
If your child is online, it’s worth starting with open, ongoing conversations. Ask what apps they use and why. Sit down together and scroll through their feeds. Talk about the content that lifts them up, and the content that brings them down. Help them recognize the impact social media has on their moods, self-image, and time.
Beyond conversation, setting clear expectations around usage — like screen-free bedrooms, tech curfews, or breaks during homework — can create structure that protects your child’s well-being without isolating them from their peers. Instead of focusing on what’s forbidden, focus on what’s healthy and sustainable.
And finally, stay informed. Know what laws are active in your state or country. Understand what features platforms are rolling out next. You don’t need to be a tech expert, but staying a step ahead helps you guide your child with confidence.
In the long run, a strong relationship, honest communication, and clear boundaries will have more impact than any age-gate ever could.
For parents wanting a safer tech option, devices like Gabb phones — without social media or internet browsers — offer kids a way to stay connected while avoiding online risks. It’s a useful tool alongside conversations and boundaries as families navigate the digital world.
Have you seen these laws impacting your family? Do you have questions about how to navigate social media with your kids? Drop a comment below or share your story. We’re all learning together, and your insight might help another parent.