Hello Mother, Hello Father: Bipartisan AG Coalition Calls on TikTok for Stronger Parental Controls, Content Moderation

  • North Carolina AG Josh Stein and Mississippi AG Lynn Fitch led a bipartisan coalition of 44 AGs in calling on popular social media applications TikTok and Snapchat to strengthen their internal parental controls and content moderation. The coalition also urged the companies to collaborate with third-party parental control applications that alert parents when their child has been exposed to potentially harmful messages.
  • In the letter, the AGs discuss how children are exposed to a wide range of online dangers on social media platforms, including cyberbullying, drug use, sexual predation, and disturbing sexual content, including depictions of abusive sexual relationships. While acknowledging that the platforms have some content moderation policies, the letter observed that some areas, such as direct messaging, are not monitored to the same degree, and that internal parental control settings can be changed or bypassed. According to the letter, third-party parental control apps, which many platforms already use, can monitor content that the platform itself does not, such as direct or private messages, allowing parents to intervene when their child is exposed to a threat.
  • As previously reported, another bipartisan coalition of AGs recently announced an investigation into whether TikTok has violated state consumer protection laws by designing and promoting its social media platform to children, teens, and young adults in a way that exacerbates physical and mental health harms.