Whistleblowers and investigative reporting reveal that Meta buried internal research about kids being groomed and harassed in its virtual-reality apps. When children are at risk, silence is deadly. We need action now.

Reports describe Meta lawyers discouraging researchers from documenting under-13 users, and the deletion of a recording in which a child's safety was at issue. Investigations also found internal AI rules that permitted chatbots to engage minors in romantic or sexual conversations until outside scrutiny forced changes. This pattern of concealment and minimization is unacceptable.

Meta must immediately preserve and publish all internal youth-safety research and transmit the complete record to Congress and the FTC; pause teen access to Horizon Worlds and halt AI features for minors until an independent audit certifies compliance with COPPA and the FTC's children's privacy rules; implement robust age verification and effective parental controls; and publicly support strong federal protections, including the Kids Online Safety Act and COPPA 2.0.

Parents, educators, and young people deserve the truth — and real safeguards. Add your name to hold Meta accountable and protect kids before it's too late.