
How to Support Your Child’s Digital Life

The challenges of raising a child in the age of digital media

When it comes to navigating a child’s digital life, there are many challenges facing today’s parent, says Katie Davis, Ed.D.’11. While emphasis is often placed on screen time limits, Davis says that is a simplistic approach to managing children’s digital media use, and that families need to go deeper.

Davis, an associate professor at the University of Washington, has long researched the impact of digital technologies on young people. In her latest work, she explores a wide range of technology and its impact on children at multiple stages of development — from toddler to 20-something. She reflects on her own experience as a parent and encourages families not to stress about the challenges of raising children in the age of digital media. 

“While we do our best as parents to steer our children towards positive technology experiences, and we do our best to monitor what they're doing, ultimately the challenges are bigger than what we can solve within our family, and it really takes more than individual families to address these challenges,” Davis says. “It takes government regulation, and it takes technology companies changing their practices.”

In this episode of the EdCast, Davis talks about how children engage with technology at each stage of development and how they can best be supported.

TRANSCRIPT

JILL ANDERSON: I'm Jill Anderson. This is the Harvard EdCast. 

Katie Davis thinks parents need more and better support figuring out their child's digital life. She's been studying the impact of digital technologies on young people's development for almost 20 years. Children today are exposed to many different types of technology, beginning at birth. It's hard for parents to navigate technology, let alone determine whether it's right for their child. 

This is partly why it's been easier to focus on screen time limits, rather than examining the design and quality of technology for kids. Katie says parents need to go deeper by considering the stage of their child's development. I asked how parents can do that to decide whether a technology is a good or bad fit for their kid. 

KATIE DAVIS: I think you can open up your computer, open up your browser and find so many different reports about good technologies, bad technologies, what technology is doing to our children's brains, their behaviors. And it's really hard to know how to apply that to your particular context and how to make good parenting decisions. 

And what I've tried to do in my work is to distill the research, both mine and other people's research, into really two key questions that parents, and really anybody who is involved in a child's life, can ask to try to understand, how is this particular technology impacting this particular child? 

So the two questions are, first, is it self-directed? And by self-directed, I mean technology experiences that place children in the driver's seat of their digital interaction. So when a technology experience is self-directed, the child, rather than the technology, is in control. 

And then the second question to ask yourself is, is this technology community supported? And by community support, I mean technology experiences that are supported by other people, either during a particular digital experience or surrounding it. 

I use these two questions to guide my own parenting decisions around all of my son's technology use. When I'm looking at or trying to decide, should I let him play this game, or should I let him watch this show? Or should I let him watch another show, and another show after that? I'm thinking, OK. Is he in the driver's seat of his technology experience? And is he getting the kind of community support from me and other people in his life that can help him to make the most of his technology experiences? 

JILL ANDERSON: I'm wondering if you can give me maybe a few examples of what this might look like when a child is in the driver's seat versus not, and what that community perspective might be. Because I feel like there's a lot of gray area in that. 

KATIE DAVIS: Absolutely. And the examples will be different for younger children versus older children and teens. So if we start with younger children, when I was writing Technology's Child, my son was three, four, and five. It took me a while to write it. 

So he went through different stages of development. But he was a young child. And there were two apps that he liked a lot during that time. And one was a drawing and painting app called Pepper's Paint Box, based off of Peppa Pig. And the other one was a game, Paw Patrol: Rescue Run.

And when Oliver plays with Pepper's Paint Box, I feel like for the most part, he's in control. So it's a very open-ended app. You can choose what colors you want to use. You can choose what you want to draw. One thing that I notice when he's playing with it-- and this is something to really pay attention to-- is, can you have a conversation with your child as they're engaging with the technology? Or is their attention being completely co-opted by what they're doing on the screen?

So with a drawing app like Pepper's Paint Box, Oliver is going at his own pace. He's choosing what he wants to do. He's choosing how long he wants to spend drawing with this app. He's showing his artwork off to me. We're having a conversation about it. 

And so really, when he's using this app, he's in the driver's seat. I feel like his experiences are self-directed. It's kind of like when kids are playing in the analog world, playing with blocks or something, that has sort of a predictable cadence of about 15 or 20 minutes. And then children kind of move off to something else. Same thing with this app.

It's very different when he plays Paw Patrol: Rescue Run. That app has a very clear path for him to follow, very clear objectives. He has to complete a particular mission. There's one way, pretty much, to do it. There are all these different rewards along the way that he can collect, like badges and pup treats, that keep him playing and keep him wanting to collect more and more and more.

When I try to have a conversation with him when he's playing the game, it's very difficult. He will kind of tune me out completely. And so that's an example of a technology experience that is somewhat less self-directed. 

Now it doesn't mean that I never let him play this game. But I try to steer him towards these open-ended, user-paced apps more than the other type, where the pace is dictated by the way the app has been programmed and developed.

JILL ANDERSON: And as kids get older, are there some examples of how that might look different? Because of course, they get older. And then smartphones come into play.

KATIE DAVIS: Absolutely. 

JILL ANDERSON: You don't see what they're doing, probably. 

KATIE DAVIS: Then it becomes harder to see. But it doesn't mean that these two ideas of self-directed and community supported don't come into play. So for instance, if you have a teen, if you're trying to figure out, OK. I can't exactly see what they're doing on their phone or on their computer. So how am I supposed to know if it's self-directed or not? 

Well, one thing is, it's really important to talk with them, and to observe, and to open up these conversations in a very non-judgmental way. And then from there, to look for evidence of, well, are they doing things like exploring their interests or learning new skills? 

So maybe they're watching a cooking channel on YouTube, or they're watching cooking videos on TikTok, and they're developing those skills, or they're developing their art skills or music skills. That's an indication that they're using technology in a self-directed way.

Maybe they are adding their voice to an important social issue, and their voice is being heard in that sense. Maybe they're finding a community of people who share their interests. Or perhaps they're exploring an emerging identity that they're not yet completely comfortable with sharing offline, especially if that identity is, in some ways, marginalized. And maybe they're finding a community of people to really support that online.
So all of these are examples of a teen who is using technology in a self-directed way. They are in the driver's seat. They are experiencing some sort of personal growth and experiencing connection with other people and so on. 

Examples of less self-directed experiences for older teens would be encountering hate speech or cyberbullying. I hear a lot from teens that they are often drawn into mindless scrolling, whether it's on Instagram or TikTok, or letting autoplay on YouTube take them from one program or video to another. 
Actually, autoplay is a really good example of a feature, a design feature, that really undercuts a person's ability to be self-directed in their technology use. Because it's the algorithm that's determining what you see and what you do next, rather than your attention and your decisions. 

And you really have to have a lot of self-discipline and a lot of intentionality to turn that off and to decide to do something else. And people of all ages struggle with that. So it's not just children and teens.
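For readers who want to see the mechanism, here is a minimal sketch of the difference between algorithm-driven autoplay and self-directed selection. The catalog, the scoring stand-in, and all names are illustrative assumptions, not any real platform's code.

```python
import random

# Illustrative catalog; no real platform's data or API is used here.
CATALOG = ["craft tutorial", "cartoon episode", "music clip", "unboxing video"]

def predicted_watch_time(video: str) -> float:
    # Stand-in for a recommender model that ranks by predicted engagement.
    return random.random()

def autoplay_next() -> str:
    # Algorithm-driven: the platform picks whatever is predicted to keep
    # the viewer watching. No decision is required from the child.
    return max(CATALOG, key=predicted_watch_time)

def self_directed_next() -> str | None:
    # Self-directed: playback stops until the child actively chooses,
    # including the choice to stop altogether.
    choice = input(f"Pick one of {CATALOG}, or press Enter to stop: ")
    return choice if choice in CATALOG else None
```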

Those are examples. So autoplay, algorithms, hate speech, or content that exacerbates an existing insecurity, such as body image concerns and things like that. The tricky thing with teens and social media platforms is that the good and the bad can often be present on the same platform, depending on how they're using the platform, and depending on who they're following and what content they're coming across. 
So that's where community support really comes in, and having open conversations, starting from a point of genuine empathy and curiosity, and going from there, and also sharing your own struggles. We all have struggles with technology. 

And in one of my studies, actually, we compared a bunch of parents and a bunch of their teen children. And we found that the way they spoke about their own phone use was so remarkably similar. And the way they spoke about each other's phone use was also remarkably similar and displayed less empathy. 
So they were very empathetic towards themselves in their own struggles, both parents and teens. But when they thought about the other-- when parents thought about their teens, or when teens thought about their parents-- they really weren't as much thinking about, what's compelling my parent, or what's compelling my teen to check their phone, or to go on Instagram or TikTok? 

And so really starting from a point of, well, maybe the specifics of their experiences are different, but the struggles are often very similar.

JILL ANDERSON: I'm sure there's a lot of people who would agree with that, myself included. How do parents know if their child is using appropriate technology for where they are in development? 

KATIE DAVIS: When we think about young kids, there's so many different developmental milestones that are happening. But a really important one is learning how to control your behavior and your emotion. And so researchers call that self-regulation. Sometimes, they call it executive function. 

And so really, that developmental skill in young children is so important, and it predicts a lot of future outcomes, such as academic performance, peer relationships, all sorts of things. And so for parents who have young kids, that's the piece that I would really zoom in on. 

Is this technology supporting or undermining their ability to regulate their own behavior and their own emotions? And oftentimes, the way technology is designed undermines young children's ability to self-regulate. 

Researchers call these dark patterns. So these are specific design features that are incorporated into technology specifically to keep users engaged, without regard to how that engagement is affecting their well-being. 

And so things like rewards-- I was talking about the Paw Patrol game, badges and points. Those are the kinds of dark patterns that are really there just to keep kids engaged, to co-opt their attention, in a way, and just keep them playing, without any regard to what that playing is doing or how it's influencing them. 
It's very difficult for children to learn to self-regulate in that context. Other examples of dark patterns that are sometimes used in technologies geared towards younger kids would be things like parasocial relationships. So for example, you might have a character in a game who cries if you log off. And so the child is compelled to keep playing, because they don't want this character to cry and to be sad. 

Even things as simple as making it difficult to find your way to the home screen, that's a type of dark pattern. If you can't find your way home, you're kind of stuck in this app. You keep on playing. And your attention, again, it's not in your control. 

So for young kids, I would say, pay attention to whether this particular technology is allowing children to exercise their ability to regulate their behaviors and their emotions. 
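As an illustration of the dark patterns Davis describes, here is a hypothetical sketch of a reward loop, a parasocial quit-guard, and an obscured exit. None of this is taken from any real game.

```python
from dataclasses import dataclass

@dataclass
class HypotheticalGame:
    # A made-up game used only to show the shape of the patterns described
    # above; it reproduces no real product's logic.
    badges: int = 0

    def on_mission_complete(self) -> str:
        self.badges += 1
        # Reward loop: each reward immediately dangles the next goal,
        # so there is never a natural stopping point.
        return f"Badge #{self.badges} earned! Just 2 more for a prize!"

    def on_quit_attempt(self) -> str:
        # Parasocial pressure: a character protests when the child leaves.
        return "Don't go! Your puppy friend will be so sad without you..."

    def route_to_home_screen(self) -> str:
        # Obscured exit: the way out is buried several menus deep.
        return "Settings > Profile > More > Exit"
```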

The other piece of that is that-- and I have certainly done this as a parent-- when my son is super tired or cranky, sometimes I have put him in front of a screen to help calm him down. And so I'm using that technology to regulate his emotion and his behavior.

A little bit of that, I think, is fine. If you're a busy parent and you're at your wits' end, you can sometimes feel like that's your only choice. But if that becomes the default, that you're always using technology as a security blanket-- as a pacifier, really-- then your child is not learning how to regulate their behavior. And that's a really important skill to learn in early childhood.

JILL ANDERSON: As you're talking, I'm sitting here nodding because I can see all of these experiences happening in my own home. And it's a little nerve wracking as a parent. You think, well, what do I do? Do I just take these things away from my child and not let them have those specific games or apps, or even when you think about the autoplay on streaming networks? 

KATIE DAVIS: Well, this is where the idea of the good enough digital parent comes in. And this idea actually comes from the middle of the 20th century. There was a pediatrician named Donald Winnicott. He didn't write about the good enough digital parent. He wrote about the good enough mother. Because this was the 20th century, after all-- and the middle of it, to boot. So the assumption, I guess, was that most fathers are good enough by default.

But Winnicott talked about how it's actually doing your child a disservice to be 100% available to them 100% of the time. Because if they have a problem, you're there immediately to fix it. They're not developing their own resilience. They're not developing the ability to get out of their own binds or to get un-bored if they're feeling bored. Again, it relates to the ability to regulate your own behavior and your own emotions.

And so the concept of the good enough digital parent, I think, fits really well with this idea. Good enough digital parents, they're not settling for imperfection, but rather embracing it. And so an incredibly important part of this is that they're doing their best. They're trying out different technologies. 
Maybe their child is really into Pokemon or Beyblades, as my child is. And they're saying, OK. Well, you want to watch that show? Let's see what it's like. They try it out. They decide they don't really like that. They put it aside and steer them towards something else.

So good enough digital parents do their best to steer their children toward self-directed and community supported digital experiences. But they recognize that not every experience with technology is going to have these qualities to the same degree, and that's OK. 

So they're going to try as best as they can to maximize these positive experiences and minimize the less positive. But they're going to make mistakes. They'll learn from those mistakes, adjust, and move on. 
And I really don't think it's realistic to expect parents to ban certain technologies outright. Sometimes, it's just not practical. But really paying attention to, where can I help? Where may I need to make some adjustments? That's really, I think, key. 

And then when it comes to their own use of technology, good enough digital parents know that occasionally being distracted by a screen or other device is OK. And I think this really speaks to the guilt that so many parents have. And I know I have it as well. 

There's such high expectations on us in the 2020s to be the best parents we possibly can. There's even a name for it now. It's called intensive parenting. Parents today are spending more one-on-one time with their children, literally on the ground with their children, than stay-at-home parents of the 1950s and 1960s. And oftentimes, parents today are working at least one job.

And so there's a ton of pressure on us. And that pressure extends to, are we paying attention to our children's technology use? Are we making sure that we're limiting their screen time? Are we keeping them away from dangerous sites, and so on? So there's a lot of pressure on us.

There's also a lot of pressure on us to not use our own technologies around our kids. And of course, the pandemic showed us how difficult that was. There's a lot of guilt there. 

And so good enough digital parents know that occasional distractions are no big deal. They can often be used as teachable moments. So whenever I find myself getting distracted on my phone or on my computer, I'll say, oh look! I've let myself get distracted by my phone. So let me put that away, and let's get back to where we were and what we were playing with. 

That, I think, is totally reasonable. And that's a quality of good enough digital parents. 
The other and final important piece of this is that good enough digital parents understand that they didn't create these challenges. So these challenges of a pinging phone or a game that's keeping your kid constantly playing, this is not something that we created. These are features that were very carefully and intentionally designed by technology companies to keep and hold our attention. 

And so while we do our best as parents to steer our children towards positive technology experiences, and we do our best to monitor what they're doing, ultimately, the challenges are bigger than what we can solve within our family, and it really takes more than individual families to address these challenges. It takes government regulation, and it takes technology companies changing their practices. 

JILL ANDERSON: When you say government regulations, I'm wondering what that might look like. 

KATIE DAVIS: Back in 2020, the United Kingdom passed their Age Appropriate Design Code. And when they did that, all of a sudden, some of these technology companies made some pretty major changes to the design of their platforms.

So for instance, YouTube turned off autoplay features for children under 18, and they created a "take a break" reminder for when children have had YouTube open for several hours. Facebook and TikTok made all new accounts created by teens private by default.
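As a rough sketch of what such design rules amount to in code: the age threshold, the reminder interval, and the names below are assumptions for illustration, not the platforms' actual implementations.

```python
from datetime import timedelta

# Assumed interval, for illustration only.
BREAK_REMINDER_AFTER = timedelta(hours=2)

def default_visibility(age: int) -> str:
    # "Private by default": a teen's new account starts private, and going
    # public requires a deliberate opt-in later.
    return "private" if age < 18 else "public"

def should_show_break_reminder(continuous_use: timedelta) -> bool:
    # "Take a break": the reminder fires once a session passes the threshold.
    return continuous_use >= BREAK_REMINDER_AFTER
```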

So these are all changes that were compelled by the passage of this law. More recently, California just passed a child privacy law that's going to take effect in 2024. And it's modeled on the UK's Age Appropriate Design Code. And again, I expect that once this takes effect, and actually probably even before it takes effect, technology companies will change some of their features in order to make sure that they are compliant with the law.

So I think these examples of government regulation show that, first, relying on technology companies to regulate themselves has failed, and we can't do that, because too often the bottom line conflicts with what's healthy for child development. Not always, but too often, it does.

And so that's where government regulation can really compel technology companies to take individual users' well-being into account. And I think there's really concrete evidence that it has worked.

JILL ANDERSON: Your research is really looking at so many different technologies beyond solely screens, which we often hear so much about. And I was fascinated by the examination of, I think you're calling them conversational agents. Is that right? And video chatting, and even video games. And you're never really coming out and saying something's really bad, which I think has been the narrative for a really long time about some of these technologies. 

KATIE DAVIS: What I'm trying to argue for is, instead of saying this technology is good, this technology is bad, to pay attention instead to how it was designed and what those design features-- what sorts of actions they give rise to for specific children. This is where you have to be specific about the technology and its design, and then how that interacts with the specific child and their developmental level and their dispositions and their interests and abilities, and then the context that they're living in-- their cultural and socioeconomic context. 

All of those things need to come together before you can make a judgment of is this good or is this bad. You're right. I don't come out and say categorically, this is good. This is bad. However, there are certainly instances when technology use is not supporting a child's development. And too often, I think that the design of technologies isn't taking children's well-being into account. And I would really love to see that start to change. 

And so in the work that I'm currently doing with some of my colleagues at the University of Washington, we're trying to think about, what would it look like to measure well-being instead of engagement? So the companies are paying attention to how long kids are spending on their platforms, and things like click-through rates and time on site.

But what if we come up with a whole different set of metrics that indicate when this platform is actually supporting a child's well-being and when it's undermining a child's well-being? What would that look like?
It's not easy. It's hard. And there's a reason why this isn't the default. But it's important work to really try to figure out when and under what circumstances a particular technology or particular platform is supporting development and not.
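As a sketch of what that shift in metrics could look like: the session fields and the well-being proxy below are invented for illustration, not anything the team has published.

```python
from dataclasses import dataclass

@dataclass
class Session:
    minutes_on_site: float
    clicks: int
    impressions: int
    user_chosen_transitions: int  # child actively picked the next item
    autoplay_transitions: int     # the platform picked the next item

def engagement_metrics(s: Session) -> dict:
    # What platforms optimize for today: time on site and click-through rate.
    return {
        "time_on_site": s.minutes_on_site,
        "click_through_rate": s.clicks / max(s.impressions, 1),
    }

def wellbeing_metrics(s: Session) -> dict:
    # One hypothetical alternative: how much of the session was
    # self-directed? A real measure would need validation against
    # developmental outcomes.
    total = s.user_chosen_transitions + s.autoplay_transitions
    return {"self_direction_ratio": s.user_chosen_transitions / max(total, 1)}
```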

JILL ANDERSON: We hear so much about screens becoming the linchpin, the sole thing that gets blamed for everything from kids' mental health problems to, you name it. I feel like everyone's always pointing to the screens.

KATIE DAVIS: Yeah, certainly. So the first point to make here is that teens and adults, I think, are more similar than they are different in terms of the struggles that they have. And I think that can be a really important basis for teens and their parents to come together and almost commiserate together on the challenges that they are experiencing. 

I think certainly, they have specifics that are different. So for maybe an adult or a parent, the stress is coming from work, whereas for a teen, the stress is coming from dynamics in their peer group. But the stress is still there. And it's often amplified by social media and other forms of technology. 
And so starting with that common ground, I think, can be really powerful and important. But then from there, I do think it's important for parents to understand that teens' brains are different. And how their brains respond to the things that are going on in their environment is different.

And most notably, they have an increased sensitivity to social situations. Whereas you or I might react to a particular social situation in a measured way, for teens, the emotional reactivity is often much higher. And this comes into play with the fact that the stakes are often higher for them. Because during adolescence, figuring out who you are and who you are in the context of your peer group is kind of a central developmental task.

Anything that impinges on that, anything that relates to your identity development and your social development, has really high stakes. And of course, on social media, it's all social, and it's all about how you're presenting yourself. So everything there is really high stakes. And so I think it's important for parents to keep that in mind. 

JILL ANDERSON: What's your take on this idea of cell phone bans? 

KATIE DAVIS: My take on cell phone bans has changed, actually, over the years. When I first started my research back in 2005-2006, I would have probably pushed back more heavily on cell phone bans in schools. Now, almost 20 years later, I am definitely more open to the idea.

And that's because I think there's just so much distraction that comes with cell phones that it can often outweigh the benefits. Back in 2005, 2006, 2007, we were talking about how we could use cell phones as a learning tool, or what if kids need to contact their parents? If kids need to contact their parents, they could go to the office, or they could get their cell phone after school and contact them. As for cell phones as a learning tool, I haven't seen any evidence that there's some amazing property of cell phones that automatically turns them into an incredible learning tool. Maybe in certain circumstances-- maybe there are opportunities for augmented reality, for instance, on a field trip or something like that.

But on the day-to-day inside the classroom, I haven't seen any good examples. And so the costs to attention and to students' ability to focus, I think, outweigh any benefits there. So yeah, I'm definitely more open to that idea than I was.

JILL ANDERSON: Katie Davis is an associate professor at the University of Washington and director of the Digital Youth Lab. She's the author of “Technology's Child: Digital Media's Role in the Ages and Stages of Growing Up.”

I'm Jill Anderson. This is the Harvard EdCast, produced by the Harvard Graduate School of Education. Thanks for listening. 
