How to Be Kind in a World That’s Always Monitoring You

If we want to live in a more ethical data-driven world, we’re going to have to manage it ourselves, write Aram Sinnreich and Jesse Gilbert.

Apr 3, 2025 - 12:11

If you’re reading this article on a phone, a tablet, or a laptop, the odds are very high that you’ve currently got a camera pointed at your face, along with a bunch of other tech—microphones, GPS, biometric sensors—keeping track of you and your surroundings. Not to mention the other “smart” devices featuring Amazon’s Alexa or Apple’s Siri that may be lurking somewhere close by.

All of those devices are collecting data about you—and about the people and things you interact with—and sending it over the internet to… well, it’s hard to say. As researchers of data, industry, and culture, we have learned that once information like this gets collected, there’s no telling who’s going to end up with it, or what such people might do with it. And they’re collecting more and more data with every passing day. For instance, in March, Amazon announced that Echo users will no longer be able to prevent their voice recordings from being sent automatically to Amazon’s “cloud” service for processing. That means the millions of people who own such devices will have to trust Amazon’s cybersecurity—and its good intentions—to keep those recordings safe for the rest of their lives.

Read More: Schools Let AI Spy on Kids Who May Be Considering Suicide. But at What Cost?

Most of us are okay with this, most of the time. Partly because data is invisible, and we don’t have to think about it too much. Partly because we enjoy the convenience of having easy access to the internet and the many services it offers. And partly because we don’t have much of a choice; to escape data surveillance in the 2020s, you’d basically need to live like a hermit.

But more and more, we hear stories about how sharing data can put us, and the people we love, at risk without our knowledge or consent. Women who use period tracker apps are in danger of arrest under new laws that criminalize abortion. Prosecutors have called on Amazon to hand over Alexa recordings as evidence in murder trials. Ancestry websites are revealing uncomfortable family secrets, with roughly 5% of customers discovering siblings they’d never known about.

Even worse, sometimes the data is wrong, which can make it even more dangerous. Just ask the member of a Pennsylvania synagogue who was falsely identified as Donald Trump’s would-be assassin after anti-Semitic conspiracy theorists used a Facebook photo to smear him. Or Helyeh Doutaghi, the Yale professor who was recently placed on leave after an AI-generated news article falsely tied her to a terrorist organization.

None of us opted into this situation, no matter how many times we’ve clicked “accept” at the end of a 50-page Terms of Service (TOS) agreement we didn’t read. Yet each of us bears some responsibility for the consequences—not merely because we invited these technologies into our lives, or because we might own some tech shares in our retirement accounts. Rather, we bear responsibility because we have a basic duty of care to one another—to live ethically and responsibly as members of society. Because data has pervaded every aspect of our lives, from our bodies to our intimate relationships to our professional spheres to our civic institutions, we must take data into account even when we can’t see it, and even when we don’t know what its consequences will be, or who might be at risk.

It’s not enough to abide by the lengthy TOS we didn’t read. Tech companies aren’t looking out for us; they’re looking after their bottom lines (their TOS make that abundantly clear, if you bother to read them). And the U.S. government isn’t about to help, either. Even though the European Union put basic data privacy protections into place in 2018 with the GDPR, Congress has yet to pass a single bill limiting the widespread collection and use of our personal data in America.

Unfortunately, if we want to live in a safer, more ethical data-driven world, we’re going to have to manage it ourselves. And because most of us lack the political power, economic might, or technical expertise to change the system we live in, we’re going to have to do it from the ground up, through our shared cultures and social norms. We need to start collectively practicing “data kindness.”

What is data kindness? As its name suggests, it’s just regular old human kindness, but taking into account the invisible webs of data that now surround every action and interaction. Thankfully, this is something we’re good at. Historically, when people have been confronted with new circumstances, they have found ways to integrate them into their practices of kindness towards one another.

Think about the drastic rise in food allergies among children in the U.S., which has more than doubled in prevalence since the 1990s. Back then, it was rare to consider allergies when arranging meals at an event, or inviting someone to your home. Now, it’s standard practice. 

This change didn’t happen of its own accord. First, there was reliable evidence of a widespread problem. Second, there was a public discussion about the problem and a critical mass of people willing to do something about it. Then, people began to include dietary considerations in their practices of kindness, and the idea spread even further. The government eventually caught on and passed the Food Allergen Labeling and Consumer Protection Act in 2004, and soon thereafter businesses began to follow suit (because it affected their bottom line).

Data kindness can follow the same kind of path. We already have reliable evidence that unchecked data surveillance causes widespread problems, and articles like this are part of the public discussion about the problem. Now, the burden is on all of us to do something about it. 

What does this mean in practice? While we don’t have all the answers, we have a few suggestions about how everyone can begin integrating data kindness into their own lives.

First of all, when you invite someone into your home, remember that they may be more vulnerable than you are to certain kinds of data surveillance. If you have an Alexa or Siri device listening in to your conversations, let guests know as they enter, and offer to unplug it. If you have a video doorbell device like Ring, alert your guests before they show up on your doorstep.

Second, remember that data surveillance means more than just devices. If you’re planning on uploading a DNA sample to an ancestry website, consider consulting your blood relatives before making the decision. They may have secrets they’d rather not have disclosed for them, and you may have secrets they’d rather not hear about.

Third, when sharing selfies and videos on social media, look closely before you post. Who else is included in those images? Every face, even those in the background, will be immediately subjected to facial recognition algorithms, which may end up exposing the people you’ve recorded to unintended consequences. If you’re sharing a textual post, think about the emotions you’re putting out into the world. Facebook might want you to enrage your friends, because outrage drives engagement and engagement makes Facebook money. But do you really want to spread bad vibes?

Of course, this is only a tiny handful of ways in which we can live kinder, more ethical lives in a datafied society. Our suggestions may not work for your life, and our values might not match your own. But everybody has some definition of kindness, which means everyone can practice data kindness in their own way. If enough of us start doing so, all of those little changes in routine can add up to big changes for a better world.