Creating boundaries with emotional data privacy
Do you have a user policy? Should you?
This article is cross-posted from my weekly newsletter, The Sunday Soother, a newsletter about modern spirituality and useful tips for creating more meaning in your life that goes out every Sunday morning. To get more content about how to infuse your life with thoughtfulness, reflection, and meaning, subscribe here. I am also a holistic personal development coach. You can learn more about working with me here.
Friday morning, I was huffing and puffing up Franklin Street, Northeast, in D.C.'s Brookland neighborhood, sweaty and disheveled as I neared the end of my 5-mile run. Mostly I'm pretty absorbed in just trying to get my body to inch forward on these runs, unaware of the world around me, but that morning I flicked my head up, wiped my brow, and looked directly into the camera riding atop an Apple Maps car idling at the intersection next to me.
Great, I thought. My sweaty and struggling figure is going to be captured for posterity for anybody who is digging around in the Northeast section of a D.C. map. I wasn’t even wearing my “cute” running outfit. And, most annoying of all, I don’t get to say anything about it.
But that's life these days. Personally, I'm not very vigilant about my data. I've resigned myself to the fact that most companies know more about my activities and shopping habits than my blood relations do.
I thought more about my laziness around the electronic data trails I create through the world when I recently caught up with a friend who works at a location data company and is well-versed in data privacy. He'd come into my workplace to do a presentation, and afterwards, I was struck: Society, huge tech companies, the law, and likely some of us — we've put enormous thought into the way we want our data to be used. We know what's legal and what's not, what's pushing it, what's crossing the line, what's useful, where it's stored, how to access it. All for data like where we're going, what we're watching, what we're buying, who we're talking to.
All that, and somehow, we as a society have never (to the best of my knowledge) done anything even remotely close to the same for something that, in a way, is much more sacred — our emotions.
Here we are, often handing away our hearts to people we’ve known for just a few weeks, or trusting somebody who hasn’t revealed any of themselves to us with our private thoughts, with revered insights, with truths we may not have told anybody else.
It can be beautiful to be truly open to somebody like that — one of life’s greatest pleasures and intimacies.
And yet. There's no recourse if those gems are broken, shared, exposed. There are no guidelines for knowing when it's safe — well, safe-ish, because there are no guarantees in life — to truly give your emotional being over to another person. And what happens when somebody you trusted with all of your being evolves into something else? What then?
The only actual defined concept I could think of that we have today is therapy, where a therapist is legally and ethically bound to keep private what you share in that room. But what about the hundreds, thousands of other people you interact with over the course of a life? Dozens of whom may come to know things about you that are so intense, so private, so beautiful, so shameful, so you, and only you?
What I'm talking about here, of course, is emotional boundaries with regard to learning whom to trust with our innermost self and vulnerabilities. Emotional boundaries: an extremely common concept that is crucial to our well-being — yet one that is never truly taught.
As I thought more about this, and about my recent conversation with my data privacy friend, I wondered: What if we modeled our emotional boundaries and intimacies on some of the same ways that society and the law today handle data collection around citizens? What would that look like? Would it be useful? Would it be draconian? Would it be liberating? Would it help with safely sharing pain, beauty, and love if we thought about it more, well, like a corporation: intentionally, with an eye toward our own investments?
Could we create, in essence, our own user policy around our emotional data: who got access to it, and where and when we gave of it? And even if it didn't really work out, or wasn't a true match between the ways the two spheres operate, at least we might be dedicating more intention to our emotional data, and more intention is never a bad thing, in my opinion.
So I emailed my friend. He agreed to do a brief Q&A on this. Our conversation is below.
Data privacy friend (hereafter referred to as DPF): I thought about your emotional data privacy question over the weekend — I think it’s a really interesting question and an interesting time to ponder it. The process of creating etiquette for the online world is fraught and contentious — I’m thinking of the norms surrounding quoting public tweets in widely read media, where some people are outraged that reporters do it and others (including me) are like “welp that was public buddy, what can I tell you.”
The other thing that came to mind was some sci-fi I read by Hannu Rajaniemi (what do you know, he had a sci-fi op-ed in the NYT the weekend before last about privacy!). In The Quantum Thief he sets up a society of humans with brain implants on Mars and the concept of “gevulot” (from the Hebrew for “boundary”). It’s like an extra sense for these people, and they can attach it to the data they emit to control how it is used and stored (even within conscious thought, since everyone’s minds are at least part-digital). To let people remember them, or not, or see them in public spaces, or not. You extend greater gevulot in more intimate relationships, and might dial it up or down depending on your environment and mood. It’s fascinating.
Catherine: GEVULOT!!! I MUST READ THIS BOOK. That is exactly what I'm talking about! And setting aside the fact that we can't really pull that off here on earth, how do we create something for ourselves resembling it?
DPF: I don’t have a 101 to offer, but I do think I can provide some food for thought from ongoing and recent regulatory fights:
- Lawmakers are considering opt-in versus opt-out for data sharing. [My company] does opt-out for anonymized data — stuff that is useful, and powers our business and offerings, but which can’t be connected back to an individual and we think poses little or no privacy risk. And we do opt-in for more sensitive collection, like use cases where a user specifically wants or needs their movements to be recorded.
- You might look at the guarantees offered by the CCPA (which goes into effect at the start of next year) and the GDPR. Here's a chart comparing them. The first page or two isn't super-relevant, but scroll down to the rights that are extended by each law — to delete data, opt out of transfer, amend data, and be protected from some classes of decisions based on the data, among others.
- Finally, the debate about the EU’s Right to be Forgotten is relevant, though separate from the above measures. People tend not to care much about corporations, so this angle doesn’t get discussed that much, but to me it brings up interesting questions about how much control we deserve to have over how others think about us, make decisions about us, or remember us. We have a right to privacy, but others also have a right to know and think whatever they care to. This gets back to the gevulot stuff. I think it’s interesting to consider what the hard & fast rules ought to be, and how they differ from what etiquette and norms ought to be.
Catherine: A question as I think about this more — I think what I'm really talking about, of course, are emotional boundaries, and how you grant people access to your emotional self once you believe them to be worthy of it and trustworthy with it.
But in actual data privacy, can you overcorrect? Like, in life, you don't want to grant NO ONE access to your emotional self — that's just called wall-building/keeping others out. So I'm wondering if there is a parallel for that in data gathering/data privacy, if that makes sense…
DPF: Yeah, there is definitely such a thing as a destructive level of privacy. Obviously this is all tainted by [my company’s] interest in making a buck, so take it for what it’s worth. But I mentioned that we collect anonymous data from our users — we can’t connect it back to individuals but the split-up segments do represent places where individuals traveled. You can imagine people not wanting to contribute that data — it doesn’t take much battery or bandwidth but it does take some, so there’s that issue if nothing else. But if they did that then we couldn’t give even halfway-decent ETAs and we’d send people into traffic jams all the time. So we think the anonymized data is a reasonable bargain to offer.
Sending your data flying around does carry some risk — when everything works right the systems and employees handling it are trained and screened or just not looking at its specific contents anyway, but you increase the attack surface area, as they say, by having data in more places. So there’s some risk. But again, probably an okay tradeoff.
I think there are a lot of these sorts of tradeoffs, and in other areas as well. To me it parallels a bunch of conversations about culture and intellectual property, too: how tight should copyright be? Is it okay to embed a photo you copied from the web in a tweet? A print article? Is it okay to photograph people on the street? Without permission? What about with their faces obscured? Is it different if it’s done for Google Streetview? Are native peoples right to try to control how their ideas are used in culture? What about other ethnic groups? Which ones?
I don’t think there’s a bright line for any of these, just a bunch of compromises to muddle through in order to make life livable for individuals but also not ask society to grind to a halt in its deference to them.
Catherine: Huh. *Brain explodes*