Silicon Valleys Journal

The Three Things Legislators and Tech Companies Must Do Right Now to Protect Kids Online

By Kate Doerksen, Cofounder & CEO, Sage Haven

April 7, 2026

Every day, millions of children between the ages of 8 and 15 go online to learn, to play, and to connect with friends. And every day, the systems designed to protect them fall further behind the technology they are using. We have spent years debating screen time and making marginal improvements to parental controls. We have passed a handful of well-intentioned laws. And yet the tools available to parents remain inadequate and confusing, the platforms remain largely unaccountable, and children remain exposed to risks that would be unacceptable in any other environment.

The research is no longer ambiguous. Jonathan Haidt’s The Anxious Generation laid out the case clearly, and the legislative momentum that has followed — from Australia’s under-16 social media ban to New York’s algorithm restrictions to Utah’s interoperability law — shows that policymakers are finally catching up. But momentum is not enough. What’s needed is a coherent, legally durable, and technically viable framework. 

These are the three things that tech companies and legislators need to do now. We don’t need more studies or pilots; we need action.

1. Mandate Age and Parent Verification 

The current system of age verification for most sites and apps is still a joke. A child clicks a button confirming they are 13 or older, and the platform moves on. Studies show the average age at which U.S. kids first see pornography is 12, with 15% exposed at age 10 or younger and some seeing it as young as 7 or 8. Millions of children under 13 have social media accounts in direct violation of existing law and platform policy.

Everyone knows we need age verification, but big tech is stuck in that Spider-Man meme where everyone is pointing at someone else to do it: apps point at the app stores, and the app stores point at the apps. I prefer an app-store-level solution because it would streamline the experience, so parents and kids don’t have to verify age separately for every app. That said, I also believe all app builders need to add age verification technology into their products as soon as possible, without waiting for legislative action.

There are new AI technologies that can estimate age from facial images and activity. I’d love to see it mandated that this be done by third parties who have no vested interest in “rounding up” on the age and who have strong commitments to security and to deleting user data after the estimation. I also believe there need to be auditing requirements for apps of a certain size to ensure they aren’t supporting underage users.

The second critical step is that all apps that have users under 13 also need to do parent verification. This needs to happen even when the age estimate is close to 13, since the technology is not perfectly accurate. Parent verification ensures COPPA compliance and is key to protecting kids online.
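To make the “close to 13” rule concrete, here is a minimal sketch of the decision logic, assuming a hypothetical third-party estimator that returns an age and a confidence score. The names, the two-year margin, and the confidence cutoff are all illustrative choices, not figures from any statute or real vendor:

```python
from dataclasses import dataclass

COPPA_AGE = 13              # under-13 users trigger COPPA obligations
ESTIMATE_MARGIN_YEARS = 2   # illustrative: estimators are imprecise, so err upward
MIN_CONFIDENCE = 0.8        # illustrative cutoff for trusting the estimate at all

@dataclass
class AgeEstimate:
    """Result returned by a hypothetical third-party age estimator."""
    estimated_age: float
    confidence: float  # 0.0 to 1.0

def requires_parent_verification(estimate: AgeEstimate) -> bool:
    """Require parent verification whenever the user could plausibly be
    under 13 -- i.e. the estimate is low-confidence or within the margin."""
    if estimate.confidence < MIN_CONFIDENCE:
        return True  # can't trust the estimate; treat as potentially underage
    return estimate.estimated_age < COPPA_AGE + ESTIMATE_MARGIN_YEARS
```

Under these assumptions, even a confidently estimated 14-year-old still triggers parent verification, because 14 sits inside the error margin around the age-13 line.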

Additionally, I’d love to see legislators and technology companies mandate parent verification requirements that extend up to age 16 for all apps with anything potentially dangerous to kids, including:

– Any user-generated content (including messages, voice calling, uploaded images, social media, etc.)

– Any potentially age-inappropriate content (including all AI platforms, streaming services with content rated 13+, etc.)

– Any addictive or manipulative design features (including autoplay, gamified engagement like streaks, etc.). Jonathan Haidt’s work on the Childhood Index outlines these features in detail. Additionally, the UK’s Age Appropriate Design Code established the legal principle that platforms must design differently for child users.

Technology is not the barrier; the mandate is, and it needs to come from both legislators and technology leaders.

2. Require Interoperability That Gives Families More Options

Today’s dominant tech platforms created “closed loop” systems that limit parents’ choices. Take messaging, for example. If every teen is on Snapchat, which can only be used to talk to other Snapchat users, it’s harder to keep your kids off it when they just want to communicate with friends. This is emblematic of a bigger structural problem: parents are stuck with limited options, low visibility, and lackluster parental controls from key platforms that have absolutely no incentive to compete on safety and every incentive to do just enough to avoid legislation.

The European Union has shown what a mandate looks like in practice. Under the Digital Markets Act, designated “gatekeeper” platforms are required to make their messaging services interoperable. This means that users on one platform can communicate with users on another without both parties being locked into the same app. The goal was consumer choice and competition. The child safety implications are just as significant: when messaging is interoperable, parents are no longer forced to accept whatever parental controls a single platform decides to offer. Safety features become portable. Accountability becomes comparative. And it is technically possible to do this without major sacrifices to security.

That principle needs to extend to the United States, and it needs to extend explicitly to the platforms where children are actually communicating. This includes ensuring new messaging apps have access to Rich Communication Services so they can better compete with iMessage and Google Messages. It also means requiring closed-loop messaging apps like Snapchat and WhatsApp to expose APIs that allow parents, safety tools, and competing services to interact with them.
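As an illustration of what “exposing an API” could mean in practice, here is a hedged sketch of a parental-oversight interface. Every name below is hypothetical — no platform exposes anything like this today — and a real standard would also need authentication, consent flows, and audit logging:

```python
from typing import Protocol

class ParentalOversightAPI(Protocol):
    """Hypothetical baseline interface a closed-loop messaging platform
    could be required to expose to verified parents and safety tools."""

    def list_contacts(self, child_account_id: str) -> list[str]:
        """Return the identifiers the child account can exchange messages with."""
        ...

    def get_safety_settings(self, child_account_id: str) -> dict[str, bool]:
        """Return the account's current safety configuration."""
        ...

    def set_safety_setting(self, child_account_id: str, name: str, value: bool) -> None:
        """Let a verified parent change a safety setting directly."""
        ...

# A toy in-memory implementation, just to show the contract is satisfiable.
class ToyPlatform:
    def __init__(self) -> None:
        self._contacts: dict[str, list[str]] = {"kid-1": ["friend-a", "friend-b"]}
        self._settings: dict[str, dict[str, bool]] = {
            "kid-1": {"disappearing_messages": True}
        }

    def list_contacts(self, child_account_id: str) -> list[str]:
        return list(self._contacts.get(child_account_id, []))

    def get_safety_settings(self, child_account_id: str) -> dict[str, bool]:
        return dict(self._settings.get(child_account_id, {}))

    def set_safety_setting(self, child_account_id: str, name: str, value: bool) -> None:
        self._settings.setdefault(child_account_id, {})[name] = value
```

Because the contract is platform-neutral, a parental-control tool written against it would work the same way on any platform that implements it — which is exactly the portability and comparative accountability the paragraph above argues for.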

Snapchat remains one of the most widely used messaging platforms among tweens and teens, and its core design — disappearing messages, minimal parental visibility, and lots of harmful and even dangerous design features — makes it among the most difficult for parents to meaningfully monitor. Snapchat’s Family Center allows parents to see who their child is talking to, but not what is being said, and only if the child consents to the connection. On a platform specifically known for ephemeral content, that is not oversight. It is the appearance of oversight.  

The EU has required interoperability for WhatsApp in Europe (which Meta is rolling out as slowly as humanly possible), but WhatsApp in the U.S. remains a closed-loop app. It will stay that way until U.S. legislators mandate interoperability.

Mandating interoperability would change the power dynamic completely. Families could migrate to platforms with stronger safety architectures without their children losing access to their existing social networks. New entrants with genuinely child-safe design could compete on a level playing field rather than being locked out by network effects. Utah passed an interoperability law in 2025. The federal ACCESS Act would extend this nationally. Both should be models for a child-safety-specific interoperability standard that goes further, requiring not just messaging compatibility but parental-oversight API access as a baseline condition of operating a platform used by minors.

I’m a broken record here, but the technology to do this well already exists. The only remaining barrier is political.

3. Ensure There Is Real Accountability

Passing laws without enforcement mechanisms is not child safety policy. It’s theater. Real accountability means three things: penalties that hurt, audits that are independent, and liability that cannot be signed away.

Right now, the financial penalties available to regulators for child safety violations are a rounding error for companies generating tens of billions in annual revenue. The FTC fined Facebook $5 billion in 2019. While this was a record-setting figure that made headlines, it represented only 5-6 weeks of revenue. And when the news broke, Facebook’s share price rose by nearly 2%, adding about $10 billion to its market value — more than double the fine itself. When the cost of violating the rules is already priced into the business model, the rules are not functioning as rules. Penalties need to scale with platform revenue in a way that creates genuine deterrence, not a predictable cost of doing business.
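The arithmetic behind “a rounding error” is easy to check. The sketch below expresses a fine in weeks of company revenue and contrasts it with a revenue-pegged penalty (the EU’s Digital Services Act, for instance, caps fines at 6% of global annual turnover). The $50 billion revenue figure in the example is illustrative, not Facebook’s actual 2019 number:

```python
def fine_in_weeks_of_revenue(fine_usd: float, annual_revenue_usd: float) -> float:
    """Express a one-off penalty as weeks of the company's revenue --
    a rough gauge of how much deterrence it actually carries."""
    return fine_usd / (annual_revenue_usd / 52)

def revenue_pegged_fine(annual_revenue_usd: float, fraction: float) -> float:
    """A penalty that scales with revenue, DSA-style (e.g. fraction=0.06)."""
    return annual_revenue_usd * fraction

# Illustrative: a $5B fine against $50B/year in revenue is about 5.2 weeks of sales.
weeks = fine_in_weeks_of_revenue(5e9, 50e9)
```

On these assumed numbers, the flat fine costs roughly a month of revenue, while a 6% revenue-pegged penalty recurs and grows with the business — which is the structural difference between a deterrent and a line item.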

Independent audits are equally critical. Platforms currently self-report on their safety performance. They define the metrics, conduct the assessments, and publish the results. This is not accountability — it is marketing. The UK’s Online Safety Act has begun to establish a model in which an independent regulator can compel platforms to submit to third-party safety audits. The EU’s Digital Services Act includes similar provisions for very large platforms. The United States needs an equivalent: a federal standard requiring regular, independent, and publicly disclosed audits of the safety systems platforms claim to have in place for minor users.

Finally, liability. Technology platforms have long sheltered behind Section 230 of the Communications Decency Act, which provides broad immunity from lawsuits related to user-generated content. That protection made sense in 1996, when the internet was young and the concern was that platforms would be crushed by liability for things their users said. It makes considerably less sense applied to algorithmically curated feeds, AI-generated recommendations, and design systems engineered to maximize the time children spend on them. Courts and legislators are beginning to draw this distinction, but they need to move faster. Platforms should not be immune from liability for design choices that foreseeably harm children, regardless of whether the harm was delivered through user-generated content.

The courtroom is now where the accountability conversation is playing out in real time and the stakes could not be higher. A landmark bellwether trial is currently underway in Los Angeles Superior Court, in which Meta and Google’s YouTube stand accused of deliberately designing their platforms to addict children. The plaintiff, identified as Kaley, began using social media at age 10 and claims her compulsive use (driven by addictive design features) led to depression, self-harm, and a stolen adolescence. TikTok and Snapchat both settled before trial. Mark Zuckerberg testified before a jury for the first time, as internal documents presented in court appeared to confirm that Meta knew preteens were using its apps, aimed to maximize time spent scrolling, and disregarded expert advice on how to make its platforms safer. The case has been designated a bellwether trial — meaning its outcome could shape how more than 1,500 similar lawsuits are resolved.

It’s Time to Act

We are not at the beginning of this crisis. We are well into it. The legislative tools exist. The legal pathways have been tested and, in many cases, upheld. The research consensus is as strong as it has ever been. 

This generation of kids deserves age and parent verification that actually works to keep them safe from harmful content and design. This generation of parents deserves the better options that mandated interoperability will make available. We all deserve technology that is less addictive and less harmful to our mental health and well-being.

None of this requires dismantling the internet. It requires holding an enormously profitable industry to the same standard we hold every other industry that interacts with children. We cannot wait for another decade of debate. It is time to act.

© 2025 Silicon Valleys Journal.
