Suffolk campaigner says surge in support for social media rules shows “pent-up demand for change”

The government is consulting on further measures to prepare children for the future in an age of rapid technological change.

Daisy Greenwell, co-founder of Smartphone Free Childhood
Author: Jasmine Oak | Published 6th Mar 2026

A Suffolk campaigner calling for tighter restrictions on smartphones and social media for children has welcomed what she described as a strong public response to a new government consultation on online safety.

Daisy Greenwell, from Woodbridge and co-founder of the campaign group Smartphone Free Childhood, said the reaction showed many families feel the current approach to regulating technology used by children is not working.

The UK Government is currently consulting on potential new rules designed to make the online world safer for young people. The consultation is exploring measures including possible minimum age limits for social media; restrictions on features designed to encourage excessive use, such as autoplay and infinite scrolling; and stronger parental controls.


"Families have been struggling with how addictive this tech is, how harmful it is to their kids"

Speaking about the early response to the consultation, Ms Greenwell said:

“Yeah, it was really, really exciting. I think we didn't realise quite how far it would spread.

“But I think it's a reflection of the kind of pent-up desire from families everywhere for this situation to get sorted out and for someone to do something.”

She said parents have increasingly raised concerns about the impact of technology and social media on children.

“For years and years, this has been building. Families have been struggling with how addictive this tech is, how harmful it is to their kids, and it feels like no one's doing anything to sort it out,” she said.

“So I think it was a reflection of the desire for change.”

Calls for stronger protections

The government consultation, titled Growing up in the online world, is examining how digital technology affects children’s wellbeing and what additional measures may be needed beyond existing legislation, such as the Online Safety Act 2023.

Possible proposals include setting a minimum age for social media use, improving age verification systems, limiting features that encourage excessive screen time and strengthening guidance on mobile phone use in schools.

Ms Greenwell said she believes technology companies should be required to prove their platforms are safe for children before they are allowed to use them.

“The current kind of model of regulation is that the platforms get access to our children first and then regulation tries to catch up later, and we're saying we need to flip that,” she said.

“Instead of regulating after harm happens, we need to require safety before they access kids.”

She added that previous attempts to improve online safety had not fully addressed the problems faced by young users.

“We've tried incremental reform over the last decade and trying as a nation to kind of make the online world safer for kids, and that's just not really working,” she said.

“Nobody's thinking, ‘oh it's all better’. Kids are still experiencing, seeing and having very bad times on these platforms.”

Consultation open until May

The government consultation is open to responses from the public, parents, children and organisations until 26 May 2026.

Officials say the aim is to better understand how digital technology affects children and how families can be supported to manage its use safely.

The government has said it will review the responses and publish its findings later this year.

We have contacted the government and social media companies for a response.

Social media response

A spokesperson for Meta has said:

  • We want the same thing as lawmakers: safe, positive online experiences for young people.
  • We believe parents should decide which apps their teens use, so we support laws that empower parents to approve teen app downloads.
  • We’ll comply with social media bans, but with teens using ~40 apps weekly, targeting a handful of companies won’t keep them safe.
  • Governments considering bans should be careful not to push teens toward less safe, unregulated sites, or logged-out experiences that bypass important protections - like the default safeguards we offer in Instagram’s Teen Accounts.
  • It is also notable that a group of UK children's charities and NGOs have come out and said they oppose a ban, believing it would be ineffective and not the right way forward.

Meta’s approach to teen safety:

"Meta have spent over a decade developing more than 50 tools, features, and resources to provide teens and their parents with age-appropriate and positive experiences on our apps. This includes Teen Accounts, which we launched last year across Instagram and is now rolling out across Facebook and Messenger. These integrity features shape our compliance with the UK OSA.

"Teen Accounts have built-in protections that limit who can contact them and the content they see, and also provide new ways for teens to explore their interests. Teens are automatically placed into Teen Accounts, and teens under 16 will need a parent’s permission to change any of these settings to be less strict. These settings include:

"Private accounts: All teens that sign up to Instagram will be placed in private accounts, meaning that only approved followers can see and engage with their content, and see their followers and following lists. Teens under 16 will need a parent’s permission to change this. Teen Accounts are also automatically set to only allow tags and mentions by people the teen follows.

"DM protections: Teens are defaulted into stricter default messaging settings, meaning they can only be messaged by people they follow or are already connected to - including other teens. Teens under 16 can only change these settings with a parent’s permission. This builds on other existing protections from unwanted contact from adults, where we restrict adults over 18 from starting private chats with teens they're not connected to on Instagram and Messenger - and this can’t be turned off.

"Sensitive content controls: With Instagram Teen Accounts, teens will be placed into the strictest setting of our sensitive content control, so they’re even less likely to be recommended sensitive content, and in many cases we hide this content altogether from teens, even if it’s shared by someone they follow. While no technology is perfect, we’ve put additional safeguards in place to help make sure teens are only seeing content that’s appropriate for them - including when they first create an account. Teen Accounts are also automatically set to hide potentially offensive comments and message requests with the strictest Hidden Words setting.

"Supervision on Instagram: Supervision is a set of tools and insights that parents and guardians can use to help support their teens under 18 on Instagram. Supervision is optional, and both the parent and the teen must agree to participate. It can be removed at any time by either person. The other person will be notified if you remove supervision.

"Time Limits: With Teen Accounts, teens are reminded to leave Instagram after 60 minutes spent on the app every day, and between 10pm and 7am Teen Accounts are automatically set to sleep mode. If supervision is enabled, parents and guardians can also set time limits for their teens.

"We know that inappropriate content continues to be top of mind for parents, and just announced that from now on, Instagram Teen Accounts will be guided by PG-13 movie ratings by default. This means that teens will see content on Instagram that’s similar to what they’d see in a PG-13 movie. Teens under 18 will be automatically placed into an updated 13+ setting, and they won’t be able to opt out without a parent’s permission."

Hear all the latest news from across the UK on the hour, every hour, on Greatest Hits Radio on DAB, smart speaker, at greatesthitsradio.co.uk, and on the Rayo app.