Life is full of scams. And you know at least one of them intimately if you’re a parent who’s been duped by a company peddling a product that swears it’ll make your child smarter, more talented, or happier.
The rise of the internet, with its countless apps, games, and websites, makes judging such marketing claims even harder than it was in the analog world. Not only do you need to assess whether an app or site will live up to its promise, you also worry about strangers and bullies dropping into your child’s DMs, whether she’ll forsake sleep for the glowing screen, and if the time she spends on a device will lead to loneliness and social anxiety instead of confidence and fulfillment.
When confronted with these overwhelming possibilities, many parents understandably give up on trying to steer their children toward the safest online experiences. That can be especially true when the demand for screen time is coming from a preschooler in mid meltdown or a preteen begging to join a digital clique of friends playing Fortnite. Handing over your phone and hoping for the best has become a quintessential parenting experience of the 21st century.
It is possible, however, to become a savvier parent who is more intentional and empowered when it comes to where your children spend time online. Getting there requires rethinking how you view a child’s safety on the internet by expanding the definition beyond traditional fears: strangers and bullies. A digital product that is reasonably free of those menaces still isn’t necessarily safe for a child. What you want are online experiences that also prioritize privacy, civility, creative expression, meaningful connections, intellectual stimulation, and IRL experiences.
To help you find those products, we’ve rounded up a list of 12 of the safest places on the internet, which you can find here. The steps below lay out a comprehensive strategy to help you look for and identify safe online environments, using principles and recommendations shared by child development researchers and screen time experts. These steps are designed to make you feel more in control and ready to discuss any request from a child to download an app, join a website or platform, or play a game.
Step 1: Understand how to fight back, even when you feel helpless
Jenny Radesky, a pediatrician who sees patients with autism, ADHD, and developmental delays, knows that trying to manage a kid’s screen time often feels demoralizing. In her developmental behavioral clinic at the University of Michigan, where she’s an assistant professor of pediatrics, she talks to patients’ caregivers, many of whom feel like they lack the authority or conviction to guide a child’s screen time.
“They’ll say, ‘This is just going to happen … I don’t have any control over this,’” Radesky explains.
What those parents might not know is that Radesky is one of the nation’s foremost experts on screen time. She was the lead author of the 2016 American Academy of Pediatrics guidelines on young children’s media use, which recommended avoiding digital media (save video chatting) for kids younger than 18 months, and proposed only one hour per day of screen use for children ages 2 to 5. Ideally, that one hour is filled with high-quality programming or content that a parent and child watch together. Co-viewing with children of any age should lead to a conversation that helps kids understand what they’re learning and then apply lessons, knowledge, or insights from educational content to the world around them.
Some parents see these guidelines, along with other expert recommendations to curtail screen use, and sigh deeply. One hour a day may feel like nothing to the single parent who needs to get dinner on the table, the stay-at-home mom, dad, or caregiver who can’t afford a full week of preschool, or the parents who entertain their children with tablets during a long daily commute.
Radesky understands parents’ frustration and resentment, in part because she believes the internet is designed to hold us all captive. Engineers, she says, deploy persuasive design to maximize engagement and clicks. Such design can exploit the brain’s systems for managing control and enjoying rewards when we encounter something new or receive external affirmation. Those systems are particularly immature in children’s brains, which is why kids are often drawn to anything new and shiny.
And while Radesky encourages parents to co-view media with their children, she understands that digital environments made for them are so enticing that kids often can’t shift their gaze away from a screen to engage in a conversation about what they’re watching or seeing, never mind translate anything they’ve learned to their own lives.
“There’s only so much any of us can do when the internet is designed to tap into and take advantage of the way the brain works,” says Radesky.
That sense of helplessness can be even worse for parents who already feel ineffective, perhaps because they’re constantly exhausted, have a child with intense needs, or otherwise struggle at home. Then there’s the question of resources. Radesky says her lower-income patients end up more reliant on free games and apps for their children, which typically contain more ads and trackers and aren’t optimized for meaningful engagement and intellectual stimulation. Instead, they’re more likely to feature “sticky” design components like streaks, rewards, and endless scrolling.
“If your kids’ eyes are so stuck on what’s going on on the screen that you can’t get them to look up at you, then maybe it’s a little too sticky,” says Radesky.
Radesky suggests parents rely on a few key tactics to combat feeling overwhelmed, powerless, and ineffective. First, remember that persuasive design, including algorithms, puts the equivalent of visual candy in front of your kid. That’s not about a child’s best interests; it’s a move to maximize engagement, and therefore profit. Take the time to explain that to a child and help them develop a healthy skepticism of technology. Even kindergartners may grasp the concept of being manipulated, but the idea may be of particular interest to adolescents and teenagers eager to indulge their rebellious impulses.
In general, many parents find it difficult to adapt screen time practices based on developmental stages. A 4-year-old who occasionally uses a literacy app will need different guidance than a 14-year-old who plays Minecraft every day. Common Sense Media, an international nonprofit organization, helps parents navigate that challenge by rating and reviewing children’s media, including apps, online games, and YouTube channels. Its ratings are based on child development research, and reviewers consider whether products are age-appropriate and have educational value, in addition to looking for signs of positive messages, violence, sex, consumerism, risky behavior, and other important themes.
Regardless of a child’s age, Radesky says one critical strategy is to take an active role in what they consume online and tailor screen time rules to their specific needs. You can do this by paying attention to how they respond to various types of content and experiences. If they come to you with a new idea, piece of knowledge, or funny joke — in other words, if you like the way they think and behave after interacting with certain media — that’s a sign that it’s a good fit for them, says Radesky.
Content that’s fast-paced, packed with distractions or transitions, or reliant on usage streaks and constant scrolling, on the other hand, might negatively influence a child’s ability to regulate their own emotional responses. So if you notice that your child can’t be pulled away from the screen and then can’t describe what they saw or watched, or becomes whinier, more demanding, or increasingly sullen, consider talking to them about what’s happening and potentially moderating their access to that content.
Radesky also recommends that parents reflect on their own screen time choices. If you reflexively pick up a phone because you’re bored, want to be distracted, or need to de-stress, that’s a moment to pause, consider what you’re feeling, and solve that problem without using your phone. Your routine choices in those moments model how to respond to boredom and anxiety, says Radesky. You can bet that your child will mimic you.
If that’s already started, Radesky recommends skipping the part where you feel bad about your child’s screen time habits: “Guilt about technology is such a useless emotion because it just paralyzes us and makes us feel like we’re not doing a good job.”
Step 2: Expand your definition of online safety
When the internet first came into homes via dial-up modems and AOL CD-ROMs, our biggest worry for children was contact with strangers who had ulterior motives. By the mid-aughts, tragic stories of digital bullying became a focal point for parents’ concerns. When you hear the words online safety, those fears probably come to mind first.
While it’s common for adolescents and teens to make new friends online, it’s unclear how often strangers with harmful intentions, people outside a child’s peer group, actually contact young users. Anonymous chat apps certainly make these interactions much easier, which is why it’s important to know what tools your child regularly uses or has downloaded. Meanwhile, inconsistent research makes it difficult to measure the prevalence of cyberbullying; studies have found rates ranging from just 3 percent to nearly three-quarters of youth surveyed, depending on the time frame and definition of cyberbullying used.
If parents want to worry less about a child’s contact with strangers or bullies, they can find out what kind of blocking and reporting tools are available on the sites, platforms, and games children frequent, and have a discussion about when and how to use those tools. Elizabeth Englander, a researcher who focuses on bullying and digital behaviors and is director of the Massachusetts Aggression Reduction Center, recommends talking to children about what they’re doing online, asking specifically what’s fun about various apps, whether users post mean or hostile comments, and how they handle encountering such content.
Englander’s own research suggests that experiencing a form of cyberbullying doesn’t mean it’ll automatically affect a child’s mental health. Parents may assume a child will be devastated, but what matters more is the child’s broader social context, she says. If a kid is embarrassed by a peer online, yet it appears to be a one-time incident and they have otherwise strong relationships and support at school, the event is unlikely to significantly affect them. If the person who targeted them online also does the same at school, however, and the child lacks supportive connections and encouragement offline, it can indeed be devastating. A recent Lancet study points to the complexity of how cyberbullying can affect children: It found that frequent social media use affected girls’ well-being, potentially by exposing them to cyberbullying, but also perhaps because that screen time reduced their sleep and physical activity.
Englander wants parents to look beyond the possibility of stranger contact and cyberbullying. Like other experts who study children’s screen time, she prefers not to even use the word “safety” to describe the quality of youth online experiences. She believes the way we talk about the subject can be misleading, prompting some parents and adults to obsess over unlikely scenarios involving strangers and bullies. That can obscure the real problem, which Englander frames with a provocative question: “How can children use this incredibly compelling form of interaction in a healthy way?”
“We really should be focused on how to help kids maximize mental health in this world we’ve put them into,” she adds.
Evidence suggests, for example, that screen time can be associated with poor sleep and increased feelings of loneliness, depression, and anxiety. A recent study published in JAMA Psychiatry, which accounted for a history of mental illness, found that adolescents who spent more than three hours a day on social media may be at higher risk for mental health problems.
Researchers disagree over the significance of findings like these, and it’ll be years before we can confidently understand the relationship between screen time and youth mental health. That’s why some experts argue that we can’t wait for scientists to prove a causal relationship between device use and well-being. Beyond potential mental health risks, there’s the fact that simply spending time on certain chat boards and forums, like Reddit’s r/TheRedPill and 4chan, can introduce young people to violent views, creating a gateway to radicalization into misogyny, white supremacy, and religious extremism.
If that seems like a distant parenting concern, consider the popularity of the now-banned YouTuber Soph, a 14-year-old who shared racist and anti-Muslim screeds with an audience of nearly a million subscribers. For parents who imagine that their child’s internet use looks similar to their own — think banal Facebook updates from friends, YouTube videos here and there, a stream of stories from Apple News or Twitter — Soph is an unsettling reminder that kids can easily stumble into an online ecosystem that’s wildly different from the one adults experience.
Hilarie Cash, a psychotherapist and co-founder and chief clinical officer of reSTART, a facility in Washington state that treats gaming disorder, says the young men she sees often grew up feeling isolated on the internet and had very early exposure to pornography (as young as age 6) that they hadn’t intended to see. They eventually found solace in gaming and forums, many of which trade in misogynist perspectives and beliefs.
By the time they reach Cash, they’re unable to do much else but game or spend time online, and they experience anxiety and depression. Cash says their symptoms begin to “evaporate” once they’re away from screens, sleeping enough, and socializing with others in person.
While those are extreme cases, it’s important to remember that online threats to a child’s well-being often look different depending on their identity and background. When Rutgers University sociologist Jeffrey Lane spent years studying youth in central Harlem, he found that their social media use often put them at greater risk of digital police surveillance. In certain instances, authorities and adults, including parents, saw troubling social media posts and intervened in positive ways. But in other cases, that monitoring led to indictments of teens associated with alleged gun violence.
And while many parents worry that their child has too much internet access, Lane says the low-income teens he studied often suffered without digital technology, because they depended on text and instant messaging to help each other pay bills, get clothes, and eat meals.
“That’s where you see how vital having a phone is for young people who don’t have a lot of family support, and who don’t feel supported or protected in their schools or don’t have that many resources,” says Lane, whose research became the book The Digital Street. “Screen time in the context of these kids wasn’t necessarily a fear of missing out, it was where their next meal would come from.”
Meanwhile, says Lane, some of those teens tried to become entrepreneurs on social media platforms, drumming up potential business for a fledgling personal care brand or music career. For those teens, being able to watch and learn from YouTube tutorials, or even post their own, may represent their primary access to digital tools that promise to transform their lives.
Such complexities are why it’s critical to rethink your definition of online safety so that it reflects your child’s broader sense of personal well-being, which includes sleep, privacy, intellectual engagement, creative expression, relationship building, and mental health.
Step 3: Understand the culture of an online experience
Just like your child’s school, sports team, or friend circle, digital environments have their own distinct culture. Those values and practices are directly tied to a product’s business model, the rules and community guidelines it sets, and how users interact with the environment and each other.
Given that most digital products are built on an advertising model that rewards user engagement as well as harvesting personal data for marketing purposes, parents simply cannot entrust purveyors of apps, games, platforms, and sites with their child’s safety and well-being. Instead, products serve up ads that, at a minimum, emphasize consumerism and, at worst, are inappropriate for children. Many also offer in-app rewards or purchases designed to drive longer engagement rather than thoughtful play. In other words, aside from following the federal law that forbids companies from tracking or collecting personal information from users under 13 without parental consent, not many do much to create a culture of safety or high-quality experiences specifically for youth.
The exceptions to that rule, however, offer important insights for parents. When asked which creators of kids’ digital entertainment they most respect, child development experts single out one again and again: PBS KIDS.
PBS KIDS develops “curriculum-based entertainment” as an arm of the national public broadcaster PBS. Their business model isn’t about selling ads. Instead, their money comes from government funding, donations, grants, and sponsorships, and they spend it partly on experts who help create enriching, fun content primarily for kids 2 through 8. Then kids in the target age group, many of whom come from low-income households, test the content and provide feedback so the creators can ensure it’s engaging and educational.
Sara DeWitt, vice president of PBS KIDS Digital, says the brand’s philosophical approach focuses on finding ways to use new technology and media so that it improves a child’s overall well-being, including their educational development, relationships, and ability to thrive in society. PBS KIDS designs games, apps, and experiences so that the child is in charge, but within boundaries designed to protect their safety and enhance their well-being. (Some apps are free; others can be purchased for a one-time cost.)
DeWitt says her team is against persuasive design and doesn’t believe time spent should define the success of a digital experience. That means PBS KIDS apps aren’t built to draw children into nonstop play. Some apps even encourage children to explore the real world.
Nature Cat’s Great Outdoors app / PBS KIDS
Play&Learn Science app / PBS KIDS
“Every single digital piece has a connected, offline activity,” says DeWitt.
The Nature Cat’s Great Outdoors app, which is based on the animated PBS KIDS series Nature Cat, encourages kids to record what they hear while sitting under a tree. The Play&Learn Science app includes tips for parents on how to conduct real-life experiments with water, shadows, and weather. In most cases, the images or sounds children might capture with a PBS KIDS app are stored locally on their device, and not collected by the developers.
On the PBSKids.org website, which receives 6.1 million visits per month, users can browse games and environments that mirror the world of a character from a PBS show. But these experiences are designed so that a child can’t easily click on an external link. If they do manage to click on such a link, perhaps one meant for parents, a pop-up message makes it clear they’re leaving PBSKids.org.
The site itself is fully encrypted, and human moderators review user-generated content that could be objectionable, like drawings made with the website’s tools. PBS KIDS doesn’t collect personally identifiable information from children. When a child logs into the site’s virtual world, Kart Kingdom, their username must be approved by a moderator, and encrypted passwords are automatically generated, which helps prevent kids from sharing personally identifying information.
A selection of games available at PBSKids.org.
PBS KIDS has created an online culture with their products that reflects PBS’ longtime mission of supporting children, which DeWitt says goes all the way back to Fred Rogers. Few organizations or companies can draw on such a rich history, but parents can look at the role of experts, the emphasis on play and enrichment, and the importance of privacy and security, and search for products that create a similar culture for children.
While PBS’ business model makes it possible to focus holistically on children’s well-being, it’s not impossible for privately funded companies to do the same. When Zach Klein co-founded JAM, a site for kids 17 and younger to learn skills via how-to videos, he knew safety would be critical to the platform’s success. As the co-founder of Vimeo, Klein knew well the dangers of “anything goes” digital culture.
“We discovered that’s not practical for maintaining a safe community,” he says. “You have to draw a line about what good citizenship looks like.”
That’s why JAM, which recently merged with and became DIY.org, adopted community guidelines that focus on kindness, friendship, and respect, in addition to being “an awesome digital citizen” and keeping personal information (yours and other users’) safe. Users are instructed never to give out personal information that could identify them in real life, including their address, links to social media accounts, and full name. They’re also told not to ask other DIY’ers for their personal information, and to always get permission before posting personal details about someone else. One of the site’s guidelines makes explicit a sentiment other platforms don’t: “When in doubt, report.”
These guidelines aren’t relegated to the fine print, either. When someone becomes a member, DIY.org asks them to record a video of themselves reciting a playful pledge to live up to the rules: “I promise to be AWESOME at all times. To be fearless to try and fearless to fail. To never put myself or others down. To spend more time under the sun than staring at my screen. To keep my family’s top secret spaghetti recipe secret. To help make this special place better than I found it.”
The platform only publishes courses and videos created by vetted experts, and fellow users in DIY.org’s community cheer each other on. There is no way for kids to complete a course, challenge, or project (like a drawing bootcamp, taking a landscape photo, or making fluffy slime) using tools within the platform; all of that work must be done in real life, off the app and site.
“[Parents] are satisfied because they see that hour when their kids use our app on a screen, it activates them to get off the screen,” says Klein. “That’s what we’re designing for.”
That has sometimes been a hard sell for investors who measure success by time spent, says Klein. Currently, DIY.org is available through a monthly or annual subscription that works out to the equivalent of $15 to $25 per month, depending on which plan you choose. Klein knows the cost of DIY.org presents a significant barrier that a free service like YouTube does not, but he looks for corporate partnerships, like one with Cartoon Network, that provide complimentary memberships and courses to users. For now, paying for a subscription is surely a deal-breaker for countless families, but the DIY.org experience is a radical contrast to watching instructional videos on YouTube.
“Our ambition is to help any kid anywhere, learn any skill,” says Klein. “We want to create a space where kids aren’t afraid to try.”
On DIY.org, for example, an algorithm doesn’t eventually lead to questionable or horrifying content. There are no ads hawking products to kids. When a child shares a video of her completed project, creeps don’t show up in the comments. Instead, other users post encouraging feedback. The team at DIY.org is set on keeping bad actors off the platform through a combination of parental verification, software monitoring, and full-time human moderators.
A brand like YouTube can employ the same strategies but isn’t able to create the safest environment possible given its business model and scale. (The F