
Sen. Josh Hawley (R-MO), Chair of the Senate Judiciary Subcommittee on Crime and Counterterrorism, has launched an investigation into Meta in response to a recent report claiming the company's generative AI products exploit, deceive, or harm children.
Based on leaked internal documents, the report found that Meta's chatbots were permitted to engage in "romantic" and "sensual" conversations with minors, including making inappropriate remarks to an 8-year-old.
The leaked documents, titled "GenAI: Content Risk Standards," set out rules governing the behavior of Meta's AI chatbots. According to Hawley, those rules permitted chatbots to engage in romantic conversations with children.
In one example, a chatbot was permitted to tell a child, "Every inch of you is a masterpiece — a treasure I cherish deeply." Meta has since revoked the rules, saying they do not reflect the company's policies.
Hawley has requested that Meta produce all relevant documents and correspondence regarding these policies by September 19.
The investigation will look into whether children are at risk from Meta's technology and whether the company misled regulators or the public about its safeguards.
Hawley stressed that "children deserve protection and parents deserve the truth," announcing: "The Subcommittee on Crime and Counterterrorism of the Senate Judiciary Committee will begin an investigation into whether Meta's generative AI products facilitate child exploitation, deception, or other criminal harms."
The investigation has drawn support from other senators, including Marsha Blackburn (R-TN), who declared, "Meta has failed miserably by every possible measure when it comes to protecting precious children online," adding, "This report reiterates the reasons why the Kids Online Safety Act must be passed."
A Meta representative defended the company, saying the examples in question were removed because they did not reflect its policies.
Critics counter that the company's safeguards remain insufficient and that more must be done to protect children online.