
Senator Hassan Calls On Social Media CEOs to Stop Potential Exploitation of Minors

WASHINGTON – Today, U.S. Senator Maggie Hassan (D-NH) called on the CEOs of Meta (the parent company of Instagram and Facebook), TikTok, and X (formerly Twitter) to respond to a recent investigation that raised concerns about the ineffective enforcement of policies designed to protect children online. Currently, many social media companies require children to be at least 13 years old to have their own account, but younger children are often featured in potentially exploitative ways on accounts managed by parents. Reporting shows that some of these accounts feature children as young as five years old, with predators leaving concerning comments and even buying their worn clothing. Senator Hassan is pushing the CEOs to review their online and child safety guidelines to better protect children.

Senator Hassan continues to work to protect children in New Hampshire and across the country. Earlier this year, the Senate passed bipartisan legislation, introduced by Senator Hassan and her colleagues, that allows current grant funds to be used to train and educate students, teachers, caregivers, and other adults who work with children in a professional or volunteer capacity on how to prevent, recognize, and report child sexual abuse. In 2022, the Senate passed bipartisan legislation, co-sponsored by Senator Hassan, that doubled the mandatory period for preserving information about child sexual abuse images that technology companies report to the CyberTipline of the National Center for Missing and Exploited Children (NCMEC).

Full letters were sent to Meta, TikTok, and X; the text of the letter sent to Meta is below:

Dear Mr. Zuckerberg:

I write today to express my serious concern over recent reporting in the New York Times regarding potentially exploitative social media accounts of minors. According to this reporting, many of the followers of these accounts are adult men, including some who frequently post predatory comments on those potentially exploitative accounts. This is extremely troubling, especially given the assurances that you made in your appearance at a recent Senate Judiciary Committee hearing on child safety online. Instagram must strengthen oversight of inappropriate activity related to these accounts as soon as possible.

In accordance with federal law, your platform requires children to be at least thirteen years old to have their own account. However, the accounts at issue often appear to be created, operated, and managed by the parents of a minor child – including children as young as five.

Further, these accounts can be monetized by the account holder through various mechanisms, including in-platform monetization tools, direct messaging between the account holder and other accounts, and directing users to other sites using links displayed on their Instagram account. For instance, the New York Times article reports that parents have sold “photos, exclusive chat sessions and even the girls’ worn leotards and cheer outfits to mostly unknown followers.”

These deeply concerning practices appear likely to violate child safety, user safety, and terms of use policies. Adults who inappropriately contact or otherwise engage with those accounts likely violate Instagram terms of use. 

Given the importance of addressing these issues as soon as possible, please respond to the following questions no later than April 8, 2024:

  1. Is Instagram aware of potentially exploitative minor accounts on its platform? Was Instagram aware of those types of accounts prior to the New York Times article?
     a. If yes, please provide deidentified, anonymized data on the number and disposition of any complaints that Instagram has received related to these types of accounts, content, and user interactions. Please provide data broken down by month and year, as well as policy changes, if any, made in response to any complaints.
  2. Does your service monetize (display ads on or within content, charge a fee for access or subscription, establish partnerships with the account, or take any other similar action on) content appearing on accounts like these?
     a. If so, please describe the process by which Instagram evaluates content for monetization initially, for monetization following complaints made against the account, and for monetization following adverse or punitive actions taken against an account after a sustained complaint or other company-determined violation of any company policy related to child online safety.
  3. Do these accounts violate Meta’s Community Standard on Child Sexual Exploitation, Abuse, and Nudity, Community Guidelines, or any other relevant policy?
     a. If they do not violate your company’s relevant policies, please describe the public and user interest that is served by allowing interactions between minors, a minor’s parent or guardian, and third-party accounts like the kind referenced in the New York Times article.
     b. Please describe the process that your company uses to determine a violation of any relevant policy, including human or machine review and analysis.
     c. What resources does your company devote to analyzing minor children’s accounts as described in this letter and in the New York Times article? Please provide, to the extent possible, information on the resources devoted to those analyses in each of the past five calendar years.
     d. Please provide a description of the evaluation process, along with deidentified, anonymized data on the number and disposition of complaints (whether user-submitted or company-identified) relevant to accounts that follow, comment, message, solicit, or otherwise engage or interact with potentially exploitative accounts that were determined by your company to violate your terms of service or child protection policies.
     e. Please provide the same information and action requested in 3(d) for these types of accounts that do not violate your terms of service or child protection policies.
  4. Does Instagram remove, suspend, or take any other action when your company becomes aware of accounts that follow, comment, message, solicit, or otherwise engage or interact in inappropriate ways with minors, regardless of whether an account is made by a minor child or a minor child’s parent or guardian? Please provide deidentified, anonymized examples and deidentified, anonymized data responsive to the question.
     a. Are accounts that are frequently blocked or reported for inappropriate messages or interactions with minor children’s accounts or accounts featuring minor children flagged, tracked, or in any way additionally scrutinized or monitored?
  5. Are there any proactive review or oversight processes specifically for accounts, including accounts that your company knows or has a reasonable basis to know are those of adult men, that follow, comment, message, solicit, or otherwise engage or interact with accounts featuring minor children?
     a. If yes, please describe the review or oversight process and provide deidentified, anonymized examples and deidentified, anonymized data responsive to the question.
     b. If yes, are there different policies in place to review accounts, including accounts that your company knows or has a reasonable basis to know are those of adult men, that comment, message, or otherwise engage with accounts:
        i. Created by minor children between 13 and 18 years of age? If so, please describe the policy and provide deidentified, anonymized examples and deidentified, anonymized data responsive to the question.
        ii. Created by adults but prominently featuring minor children, including those under 13 years of age? If so, please describe the policy and provide deidentified, anonymized examples and deidentified, anonymized data responsive to the question.
     c. If no, will Instagram commit to reviewing its online and child safety guidelines to address incidents of accounts, including accounts that your company knows or has a reasonable basis to know are those of adult men, that follow, comment, message, solicit, or otherwise engage or interact with minor children’s accounts or accounts that feature minor children?
        i. Will Instagram commit to removing those accounts that interact inappropriately with minor children’s accounts or accounts that feature minor children, and to preventing the creation of future accounts that use identifying information associated with those accounts (removed or otherwise) that interact inappropriately with minor children’s accounts or accounts that feature minor children?
  6. Will Instagram commit to reviewing its online and child safety guidelines to address this type of content, any limitations on reporting and blocking accounts, and policies covering those who engage in inappropriate comments or messages, and to take steps to address these issues on its platform?

Thank you for your prompt attention to this matter.

###