Relationally Responsible Tech: Designing a Digital Future That Puts People First
Relationally responsible tech is the opposite of "junk tech": technology designed with children's and families' well-being in mind to foster relational intelligence rather than isolation.

What if, instead of asking what technology can do, we started by asking who we want to become?
In an age where “junk tech” floods our lives—designed to extract attention rather than deepen connection—it’s time to flip the script. We need technology that strengthens, rather than severs, our most vital bonds. What the world needs now is relationally responsible tech: technology grounded in the well-being of children and families, and designed to cultivate relational intelligence, not isolation.
Relationally responsible tech starts from a simple but radical premise: human relationships are not a nice-to-have—they are the infrastructure of flourishing. When designed with care, technology can act as a bridge, not a barrier, to deeper connection. When it isn’t, it becomes what we now see all too often: isolating, addictive, and indifferent to human development.
As Julia Freeland Fisher and the Christensen Institute’s recent research on AI and guidance reveals, today’s edtech and advising tools rarely treat relationships as an outcome to be measured or prioritized. Instead, we see a widening gap between convenience and connection. The tools we use to “support” students often sideline the very thing they need most: human guidance, belonging, and care.
But there are signs of hope. Fisher’s study highlights a powerful possibility: AI could free up educators and advisors to take on more relational roles—coaching, mentoring, connecting. Getting Smart’s recent Kentucky’s Learning Ecosystem in 2040: A Day in the Life offers a picture of what this relationship might look like. The lines between human and bot may blur, but the need for real connection remains sharp. Some innovators are beginning to design for this future, finding creative ways to scale human connection, not just content delivery.
How Might We Scale Human Connection?
Two standout examples come from the Stanford Accelerator for Learning. The Autism Glass Project, led by Dr. Dennis Wall and Nick Haber, uses wearable AI-powered glasses to help children on the autism spectrum interpret facial expressions and social cues in real time. The aim isn’t to “fix” neurodivergence, but to offer tools that empower children to engage more confidently in relationships—building connection, not dependency. By supporting social communication through augmented feedback, the technology becomes a relational ally, not a replacement.
Another Stanford initiative, FIND (Filming Interactions to Nurture Development), developed by Dr. Phil Fisher, uses brief video clips and AI-assisted analysis to coach caregivers in responsive parenting. It helps parents see the micro-moments where they are connecting positively with their child—and how to build on them. This is AI at its most humane: not judging or scoring, but encouraging presence, attunement, and love.
How Might We Learn From Others?
We can take a cue from an unlikely group of tech critics: the Amish. As Cal Newport and Kevin Kelly have observed, the Amish approach technology by starting with what they value—community, family, continuity—and then working backwards. They adopt technologies only if they serve those values. It’s not the gadget they resist—it’s the mindset that idolizes progress at the expense of people.
This discernment is something our society desperately needs. We’ve let convenience lead, and connection lag behind.
The good news? We can course-correct.
Organizations like Digital Promise and the Stanford d.school are beginning to lead the way, with product certifications that reward edtech tools designed with learner variability—and equity—in mind. The Stanford d.school’s Edtech Remix centers equity in edtech through community learning. Digital Promise’s new certification for Prioritizing Racial Equity in AI Design, developed in partnership with the Edtech Equity Project, is a model for how tech can align with values. Soapbox Labs, which builds voice-based AI with youth at the center, was the first recipient and just won its second award. Voice, after all, is deeply relational. It signals presence, responsiveness, and attention.
The Shift from Extractive to Expressive
Relational technology isn’t just about what it does—it’s about what it says to the people on the other end. It signals: you matter. I’m listening. I’m here.
That’s why we need a shift from extractive to expressive tech, from tools that compete with our relationships to tools that complete them.
The state of Washington, among others, has also taken thoughtful steps toward human-centered AI, offering early guidance for public agencies and schools. This is an important start, but we must go further: we need a movement for relational technology, rooted in child development science and relational ethics. We must build a coalition of designers, developers, educators, policymakers, and families who begin not with the question, “What’s possible?” but “What’s precious?”
And it starts with us.
As one Amish workshop owner told researcher Howard Rheingold, their fear isn’t the tech itself, but the creeping belief that progress is always beneficial, that individualism trumps community, that the goal is to get ahead—not together, but alone.
Relationally responsible tech invites us to resist that story. To start instead with love, belonging, and connection. To ask, before adopting the next device, platform, or bot: Does this help us hold each other closer—or push us further apart?
The future of technology—and of our children’s well-being—depends on how we answer.