Ep 112: Consequences of Your Teen’s Digital Footprint

Andy:
Sharenthood. What the heck is that, and why is it important? Why is it so important that you went and wrote this book, which, I mean, is very well researched? You spent a lot of time on it; there's an extensive notes section in here. This is not something you just kind of put together willy-nilly, off the cuff. So how did you get interested in this topic? What is it and why does it matter?

Leah:
I got interested in the concept of sharenthood, or sharenting, because I became a mom right around the same time that I joined the Berkman Klein Center for Internet and Society at Harvard University, as part of their research team that focuses on youth and media issues, which includes privacy. I was initially researching schools and school privacy, which is extremely important, but I began to realize, Andy, as I was looking at what schools were doing and the way laws related to schools were changing, that I was going home from my work and posting on Facebook myself, or looking at posts in my Facebook newsfeed. And I'm thinking, gosh, we parents, as well as grandparents, aunts, uncles, and other trusted adults in kids' lives, are putting a whole lot of information out there that is way more private, and in some cases more embarrassing, than what the schools are putting out in most instances.

Leah:
So then I got curious. The term sharenting (or sharent, or sharenthood), which I recently saw was added to the dictionary this year, is defined as the ways in which parents post about kids on social media. That is the narrower and more common definition. The way I define it in my book is broader: I define sharenting as all the ways that not just parents, but also grandparents, teachers, aunts, uncles, coaches, neighbors, et cetera, transmit children's private information on digital technologies. Posting on Facebook or other social media is far and away the most obvious example.

Leah:
But Andy, anytime that we are giving our kids an educational app, or using an Alexa or Siri, or putting a tracker on them, or even giving them a watch that measures their steps, if that device or service is acquiring, storing, transmitting, analyzing, or resharing their private information because of a choice that we as the adults in their lives made, then we are, in my book, literally and metaphorically, engaging in sharenting. And it matters because it is ubiquitous. When I started researching this topic, it was actually a little bit hard to know what search terms to look up, because sharenting was just something that, well, gosh, it's just a thing that people do. It's not really a topic, and it's certainly not a concern. So why even give it a name? But because it's so ubiquitous, it was time to really drill down into it.

Andy:
And you point out that this can start even before birth. We can post a picture of the ultrasound: hey, wow, excited, it's a boy, check it out. Or the birth certificate: hey, wow, look, baby just born, really exciting. And it's totally natural. People have been excitedly telling all of their friends and family members about their newborns for a long, long time. But what's different now is that you're not just telling your friends about it. You're also sharing that data with other third parties that might be able to use it.

Andy:
And you point out here that even something as simple as that can be problematic, because a lot of websites use private information like your birthday or your city of birth to verify your identity. So the more that information starts just kind of leaking out and getting posted, the harder it is for you to maintain your own privacy later on. These things might seem benign, or might seem like just a fun little thing to post, and I guess we don't often think about the consequences. Reading through this book really got me thinking about how, for a lot of those things that seem on the surface like not a big deal, there's more beneath the surface.

Leah:
I agree with that Andy and there can be more beneath the surface in a number of ways. First and foremost, when we as parents or teachers or grandparents or other adults, are sharing this kind of information about our kids, bottom line is that we are likely not asking them for permission. And-

Andy:
Especially if it's their ultrasound.

Leah:
Right, and I was just about to say, you took the words right out of my mouth. We can't ask them for permission in many contexts. And even if we could, and I certainly encourage parents to have that conversation with their kids in age-appropriate, developmentally responsible ways, they really may not understand in any meaningful way what they're agreeing to. So first and foremost, we are sharing information that is not completely ours. And look, I'm a mom of two kids. They're five and nine. I certainly think that my husband and I, as their parents, have the responsibility and the authority to make the major life decisions that shape their upbringing. No question. But I also think it's different when it comes to sharing information about them that we're not required to share. It's not as if the school is saying we won't enroll your child unless you go on Facebook and share a Halloween picture or say what their favorite candy is.

Leah:
This isn't like going to the doctor and giving a date of birth so you can get a medical record. If my husband and I make a choice to share that, it is information that is unlikely to ever really disappear, because nothing disappears from the internet. It's information that may seem innocuous now, when we are thinking about our kids as they are in a static moment in time. But first, we may not actually have a comprehensive sense of just how innocuous it really is.

Leah:
And if you think about communities where a parent might be, again, in the most well-meaning way, let's say sharing a Halloween costume picture with their social network on social media. Well, if their child is being bullied or is feeling vulnerable and another parent's child sees that picture or screenshots it or shares it, that's something that can make the child feel embarrassed. And also, Andy, we very likely aren't thinking about how that information might play out over time. So something that feels very innocuous or even cute to us when we are thinking about our kids as they are right now, may feel very different to them when they are old enough to go online for themselves and be like, hey mom, why did you tell the world I wet my bed until I was seven?

Andy:
I really wish I could have kind of kept that secret.

Leah:
Yeah, exactly. And again, I do think that almost all parents are coming from a place of good intention or, at worst, maybe just being a little bit careless. No one is trying to make their kids feel embarrassed. No one's trying to set their kids up for a tough time in adolescence. In fact, we're really trying to do the opposite. We're trying to say, look how proud we are, or, gosh, we're having a tough time in our house right now, let's try to get help. And also, we may be trying to validate ourselves or seek reassurance for our own parenting choices, which is something that's important.

Andy:
Is this normal? Are other people having the same problem too? Because it feels good when other people say, oh yeah, same thing, my kid's eight and still wetting the bed, or whatever. There's something to that, and then you kind of go, ah, okay, good to know. So there is value in that, I think. And that is, I think, also what's hard about getting to the teenage years, because in general parents seem to want to share less as their kids get older, because then it becomes more obvious that, okay, they're developing their own identity.

Andy:
And maybe it's not really my place to be sharing with my friends about, oh, hey, I just had a big fight with my daughter last night, and she just started her period and I don't know how to talk about it. Maybe that's not something to share. But especially when kids are younger, we don't necessarily even think about that, I think because they're not quite a person yet; they don't quite have that identity starting to form. And it's easy to just say stuff, just put it out there.

Leah:
Absolutely. And I think the other thing to keep in mind about sharing information that seems innocuous or even heartwarming is that it can fall into the wrong hands. And I'm not trying to be alarmist at all; this is by no means, in any way, shape, or form, the majority of interactions online. But if you think about it, if you're sharing information like exactly where you and your family live, or even what your child's favorite candy is or what they're scared of, you are putting out information about their physical whereabouts. You're putting out information about their likes, their dislikes, and things that can be manipulated or even abused in the wrong hands. So that's a big concern. Another concern is that we don't have good transparency when it comes to what tech companies are doing with the information we put into them.

Leah:
I know Apple is coming out with a nutrition-style label, but we don't have across-the-board, easy-to-read, consistent nutrition-style labeling for the devices and the services that we're using. So when we as parents click I accept, or continue, or whatever it is, so that we can use an app or a device, we really don't know what kind of bargain we're getting into, even if we do try to read the fine print. And I'm a law nerd, so I do read it. Good luck trying to understand as a consumer what it means to say we collect information from you, including the following types. It's never, almost never-

Andy:
Which may be used for any of the following purposes.

Leah:
Exactly. You tend not to get a definitive list of information types or a definitive list of purposes.

Andy:
Well, because as a company, it's like, our needs are going to evolve over the coming years, so we want to write this thing as broadly as possible. We don't even know what we might be able to do with this data in five years. So we want to write our user agreement in such a way that, as our technological capabilities evolve over the coming years, we'll be able to evolve with them and still be able to use this data down the line.

Leah:
Totally. And look, from a perspective that is innovation focused and vendor focused, that makes total sense. And if I were counsel for one of those companies, I would be telling them the same thing.

Andy:
I would write it the same way. Yes. As broad as possible.

Leah:
But where I come at it from, as a parent and then as a researcher, is thinking about multi-stakeholder interests, right? If you're the head of the startup or you're counsel to that tech company, of course your obligation at that point is: how do you maximize flexibility in the data that you are acquiring? Because your needs may evolve, your opportunities may evolve.

Leah:
But when we're taking a multi-stakeholder perspective and thinking about what is best for our kids and our families, and even more broadly for society, it is not to have a wild west approach to personal data. I'm going to come out and say it, and of course it's not going to be every single person, but I think the majority of us don't want to live in a society where what opportunities you have, be it an educational opportunity, a career opportunity, or a credit product, can be shaped by things that you did or that happened to you from even before you were born, aggregated, analyzed, and acted upon by companies over which you have no insight and certainly no control.

Leah:
When you think about the kind of autonomy and room for individual liberty and individual trajectory that characterizes the United States (and many other countries, but I'm focusing on the US for the moment), those are the values that underlie our liberal democracy. So I do think that all of us, including ultimately the tech companies and the lawyers writing the agreements on their behalf, should be really concerned about this from an ethical perspective.

Andy:
I think it's not a question of if. This is going to happen; this is already happening. This is what machine learning does really, really well: collect a huge data set and predict outcomes, and the bigger the data set gets, the better it gets at predicting. We can collect a bunch of data, as you point out: maybe your kids are using learning apps at school, maybe they're playing games and there's data about their reaction times and how good they are, things just in games that they're playing on their iPad, whatever. Well, all of this data is cheap. It's not that hard for someone to gather it all together, put it into a model, and start predicting how well they're going to do in college, whether they'd be good at this kind of career, whether they'd be good at that kind of job.

Andy:
And as soon as you get some kind of validity, where you can actually demonstrate that above and beyond their SAT scores, above and beyond their high school grades, above and beyond the essay that they write to get into your college, this data can give you a better idea of who is going to succeed at your college, that's worth so much to a college. They're going to buy that data. It's just going to happen. Employers, too: who should I hire? It costs a lot of money to train a new employee. It takes a year and a half for a new hire to really get to the point where they're productive. If I'm a company and I'm going to invest in hiring a new employee, it might be worth spending $10,000 to hire a data company that's been collecting data on everybody for their entire lives and can tell me, out of these 10 applicants, which one has the best potential.

Andy:
That just makes economic sense, and it's going to happen. And the data that seems innocuous, that your kids are putting out there, that you're putting out there with little posts that seem like not a big deal, pictures from your family vacation, all of that stuff, we need to think a lot more deeply about before we just put it out there, I think. Because we have no idea what the capabilities are going to be in 20 years, when your kids become adults and are out in the world, and we have no idea how this data might be used against them, or to help them, or anything. And it's better to be cautious and to think about it and to be aware. Not that we have to disconnect from the internet and live in a cave in the backwoods somewhere, but I think we need to be a lot more conscious.

Leah:
I am a hundred percent with you. And what you said, Andy, made me think of two things. The first is your excellent point that we don't know what is going to happen with this in five, 10, 15 years. No sooner did Sharenthood come out last fall from MIT Press than I had to start keeping a list, and you can see it here, I'll show you on the screen. This is the copy that was used for talks, with my handwritten notes. I started keeping a list on the inside front cover of all the things that happened after the book went to press that I wished I had been able to include. They'll be in a sequel. But one of the big ones was a prominent story the New York Times ran last fall; I think it was actually a cover story in the hard copy edition.

Leah:
And the headline was "How Photos of Your Kids Are Powering Surveillance Technology." It was a wonderful piece of investigative journalism that looked at the ways in which photos that parents had posted of children, as well as other photos on social media, had been used to train surveillance technology. And the journalist asked, who could have possibly predicted that a snapshot of a toddler in 2005 would contribute a decade and a half later to the development of bleeding-edge surveillance technology? And the answer is that all of us paying attention right now in 2020, we don't have a crystal ball, we cannot predict exactly what the uses will be, but we should be, to sound lawyerly, on notice-

Andy:
It will be used for something.

Leah:
Exactly, Andy. And then the other thing your point made me think of is that there's an awesome book by Cathy O'Neil, Weapons of Math Destruction. It's just a wonderful book. One of the things Cathy talks about in it is the hiring industry, so the kinds of tools that you were talking about, right, tools to help employers predict who is going to be a good fit, and in whom they should invest resources and training. And she found that in 2016, which is now several years ago, it was already a $500 million industry, the products and services to help assess fit, and that 60 to 70% of adult applicants for jobs in the US were taking some sort of employment fit assessment, largely online at this point. So then to say, okay, when our kids' cohort is coming into the job market, they are going to have so much more of a data trail that can just be folded in. It's the wave of the future, right?

Andy:
Add it to the picture. It's another thing to consider when you're hiring people. Not that it's going to be the only thing necessarily, but the more accurate it gets, honestly, I mean, it might get to the point where that's all you need to know, because it's more accurate than anything that you could possibly get from doing an interview with somebody. It might get to the point where you don't even need to have an interview, because people can lie during interviews. People can act like they're really good at stuff when they're not, but the data doesn't lie. We can look back at all these apps you used when you were a kid and see how you really spent your time in high school, we can see what your actual propensities are for all kinds of different things, who knows? And we can just say, hey, this person, I think, makes a lot more sense than that person.

Leah:
And we have to be really vigilant, because you're right that you can get certain objective realities from data that you can't get from more subjective interactions like interviews. But we also have to be very on guard that the ways the products into which the data is being entered are set up, and the ways in which data is being extracted and then analyzed and shared, are free from built-in bias, because very often they're not. And so there's the more glass-half-full view of the use of data in hiring or other contexts, which says, gosh, it can create a more complete, more well-rounded, more objective picture. So let's say that I don't interview well, but it turns out, when you look at the data, I am just a whiz when it comes to spreadsheets. I'm not actually, but I have colleagues who are, so it all works out. The data can tell you that, and then I get the opportunity that I would not otherwise have gotten.

Andy:
Right, it's a two-sided coin, because if you hear this interview and you go into, like, wow, alert mode, I need to stop my kid from using all apps and from doing the learning things at school and from everything, because that data can be used against them, well, that's not necessarily the point either. Because if I'm an employer 20 years from now looking at who to hire, and your kid has no data trail whatsoever, that doesn't really look too good to me either. I'm going to go with the one where I can look at the data and say, yeah, it shows this person's going to be a good fit with 99% accuracy. You're hired.

Andy:
So yeah, it's a very, very deep issue. There's not a simple solution, I think, which makes it hard; there are a lot of sides to consider. But the point is that we need to be considering them, and, as you point out, we need to be including our kids in that conversation and talking to them about all of this too.

Leah:
Absolutely. And it's interesting, more and more schools are teaching some form of digital literacy or digital citizenship: how you discern fact from misinformation or disinformation, and how you engage online. And the digital world is one that many parents, certainly Gen X and above, didn't grow up in. I'm the tail end of Gen X, so I'll out myself and say it's certainly true for me: I did not have email until I went to college. I did not have a cell phone until I graduated from college. I had a very funny, somewhat disheartening exchange with one of my law students a few years ago. She saw a whiteboard up on my wall, and I made some comment about how, yeah, you know, now I use those for to-do lists or planning, but in college, of course, it was where we all left each other notes.

Leah:
And she said, oh, you used a whiteboard for something that wasn't decoration? And I was like, yeah, how else were we supposed to find each other? You would walk by someone's room, and if they weren't there, you'd leave a note like, "in the dining hall," right? You weren't texting. We weren't even really IM'ing, because, as I said, I'm the tail end of Gen X. And so I do look back on that and think, how did we even know where we all were?

Leah:
But maybe we didn't, I don't know. Maybe there's still someone wandering around the dining hall wondering where everyone is. But we as parents don't always have as much digital sophistication as our children do, and we certainly haven't had the formal education in it. So it's a weird area. I'm not saying that we should give our kids veto power over the decisions we as adults make about digital life, because if I did that, my nine-year-old would get to play Fortnite and not go to school, and neither of those is going to happen right now. But you know, Andy, we do have to be listening to and including our kids, and also modeling for them that even in situations where there's no right or wrong answer, we are having mindful, values-based, respectful discussions and internal deliberations about it.

Andy:
Yeah. We're being very mindful and thinking about it before we just click accept, I agree, download now.

Creators and Guests

Andy Earle
Host of the Talking to Teens Podcast and founder of Write It Great

Leah A. Plunkett
Guest. Teaching & writing on #sharenting #digitallife #privacy | access to justice @Harvard_Law @bkcharvard | Author of Sharenthood @MITPress | former @UNHLaw prof