Every generation has its own public health crisis. For my parents, it was tobacco. For mine, it was opioids and school shootings. And for our children—it’s the algorithm.
We have quietly allowed the mental health of an entire generation to be tested in real time by systems designed not to protect them, but to addict them. Social media platforms optimize for attention, not well-being. They reward outrage, not empathy. And they mine the insecurities of youth to feed an economy that sells validation by the scroll.
So when Virginia passed Senate Bill 854, which caps social media use for minors under 16 at one hour per day per platform unless a parent gives verifiable consent otherwise, it felt like a long-overdue line in the sand.
No, the bill isn’t perfect. It doesn’t address every structural issue, and it doesn’t dismantle the deeply embedded profit motives of social media companies. But perfection is the enemy of progress. And progress, in this moment, looks like a state finally standing up and saying: “Not our kids. Not like this.”
Virginia’s action should be a signal to every state—and to Washington—that it’s time to stop playing defense. We need a national strategy to confront the digital harm facing our youth, with smart, enforceable policies that balance innovation with responsibility.
We already regulate what children can watch, what they can consume, and even how late they can work. Why should the platforms that influence their self-worth, shape their worldviews, and dictate their daily rhythms be any different?
Social media isn’t inherently evil. But it’s been allowed to evolve without guardrails—without any of the accountability we demand from other industries that interact with children. We don’t let companies market cigarettes to kids. Why are we letting them market anxiety, comparison, and distraction?
We need common sense limits. We need clear age verification. We need transparency. We need parental controls that work. And yes, we need legislation—not just from states doing their part, but from Congress with the courage to act in the best interest of the next generation, not Big Tech.
The same is true for the growing power of AI and robotics. These tools are not neutral. They are extensions of human intent and corporate incentive. And without thoughtful legislation, they will be adopted faster than our values can adapt. We must move quickly to create ethical frameworks and usage boundaries before the future becomes unrecognizable.
Because this isn’t just about social media. This is about reclaiming the health, attention, and future of our children. This is about choosing human dignity over digital addiction. And it starts with the courage to act, even when the solutions are imperfect.
I applaud Virginia for acting. I hope more states follow. But the real work is still ahead. We need to create a world where childhood isn’t optimized for engagement—but for joy, growth, and presence.
The algorithm won’t save us. But we still can.
If this episode stirred something in you—if you’re a parent, a policymaker, or simply someone who cares about the future we’re handing to the next generation—I’d love for you to follow or subscribe on Substack, Apple Podcasts, or Spotify.
What Virginia did is just one step. The real test is whether the rest of us are willing to act. Whether we’ll keep letting Big Tech write the rules—or draw a better line ourselves.
I’ll be back soon with more.
Until then—be bold, be present, and protect what matters most.