Parents Should Be Aware of Teens’ AI Use

If you’ve opened the app store on your phone recently, chances are you’ve been bombarded by apps advertising AI companions.

Whether they’re marketed as virtual emotional support, virtual friends or virtual lovers, generative AI technology has led to a boom in chatbots that can simulate any type of relationship.

And that’s part of the risk these companions can pose to adolescents, said Brandon T. McDaniel, senior research scientist at the Parkview Mirro Center for Research and Innovation, who recently published a study on AI companion use among teens and has discussed the potential for AI attachment.

Between 2022 and mid-2025, the number of AI companion apps available online surged by more than 700%. About 70% of teens have used a chatbot of some kind, with surveys showing more than 1 in 10 have reached out to AI for emotional support.

While AI technology could potentially help teens navigate complex social and emotional interactions in their lives if used thoughtfully, it could also become addictive, isolating, or objectifying, McDaniel said.

“Parents need to realize that it’s happening, that many teens are using AI chatbots and companions, and it’s not going away,” McDaniel said. “Parents must work to understand these apps, how they operate, why they are so attractive to teens, and how they can help their teen navigate what healthy use of an AI chatbot or companion might look like.”

Simulated Intimacy

While people may be familiar with generative AI tools like ChatGPT, Grok or Claude that can hold conversations, answer queries, and create content, what differentiates those from an “AI companion” is that companion apps are built specifically to simulate relationships.

“These AI companion apps are tailored toward providing emotional support, like a friend or romantic partner,” McDaniel said, citing examples of popular apps such as Replika, Polybuzz, character.ai or Candy AI, although there are hundreds of others. “They can also be heavily customized to fit with what is attractive to the user and their interests.”

Teens are growing up in the ongoing AI boom and they are using it. National surveys have shown about 30% of teens use an AI chatbot product daily, McDaniel said. And since the teenage years can be a formative time socially, AI companions can seem like an attractive outlet for youth to interact in a space that feels safe.

“We know that teens are in a unique developmental phase of life,” McDaniel said. “They’re in this phase where identity exploration is really important, being able to make strong social connections with peers and explore romantic relationships is important.”

Practice vs. Dependence

AI companions can potentially benefit an adolescent’s social skills, but there are also numerous pitfalls and risks, McDaniel warns.

On the positive side, AI companions present an opportunity for teens to practice social interaction or explore who they are without the risk of rejection, embarrassment, or ostracization from peers.

“AI companions might be able to provide opportunities to practice self-disclosure or express how they feel, or practice expressing their emotions without the fear of being rejected by their peers, which can be concerning and uncomfortable for teens,” McDaniel said. “We could also see it possibly enhancing some of their social confidence if that ‘practice’ then leads to greater interaction in face-to-face relationships.”

“But the jury is out on whether these good effects will come or not from teens’ AI use,” McDaniel noted.

Meanwhile, the way these applications are designed to function can pose notable risks.

First, almost all AI companions are designed to always be supportive, accommodating and validating. While that can feel good in the short term, it can be damaging long-term because it may create unrealistic expectations of real human relationships. Because AI companions are often highly customizable, whether that’s changing how their virtual avatars look or tweaking their personality, likes and dislikes, users can begin to objectify their virtual companion and expect similar behavior from real people.

“Real relationships are difficult and messy,” McDaniel said. “AI doesn’t have a bad day, but real humans do. Many of those messy experiences that make us human are not often experienced with AI.”

Second, like other technologies such as smartphones or video games, AI apps can be addictive. If an AI companion is giving a user heaping doses of emotional support, some teens may become dependent on that relationship and withdraw from activities or real relationships with family, friends, or partners.

“If you’re spending time on and with an AI companion, that automatically takes away the time you could have been spending with peers or family members or a romantic partner,” McDaniel said. “If you get to the point of experiencing psychological dependence on the AI, you might become less likely to self-disclose to actual human partners and relationships.”

Third, because AI apps have boomed so quickly and with so little regulation, some interactions can quickly devolve into topics that are inappropriate for youth, whether that’s sexual content, information about illicit drugs, bad mental health advice, or encouragement of self-harm.

“Parents need to remember that these apps were designed to maximize engagement and profit,” McDaniel said. “They were not designed to maximize child well-being. It’s going to be very easy for emotional dependence to form over time or for users to access inappropriate content. And companies are harvesting all of this information, which opens the door to privacy concerns and data exploitation.”

What To Look Out For

Parents should be aware of how their children are using technology and be willing to have an open dialogue and set boundaries and guidelines.

Talk to your children about how they’re using AI, test out apps together, pose questions about how it functions, and talk together about its potential problems like over-validation and dependency, McDaniel said. Don’t lecture teens, but try to keep an open dialogue.

“You’re building their ability to recognize and be AI literate, all while building that relationship and setting rules and boundaries,” McDaniel said. “It is really important to make that process collaborative.”

Keep an eye on your teen’s behavior, too. If they’re secretive about how they’re using an AI app, spending inordinate amounts of time with the AI or withdrawing from activities or friends in favor of their AI, those are red flags of an unhealthy relationship with AI, McDaniel said.

“Nothing is truly a healthy replacement for real human interpersonal interaction,” McDaniel said. “AI companions might be a place you can go to practice relationship skills, ask for advice, or escape sometimes from life and difficulties. Ultimately, it is just a computer program. It does not feel, it does not love you and, at the end of the day, loving and rewarding human interaction is what we need.”

This article has been provided by Parkview Health.

The Waynedale News Staff