Kids Online Health and Safety Task Force Principal Listening Session at Stanford University

Officials from the Biden-Harris Administration’s Task Force on Kids Online Health and Safety engaged with and heard from experts on the health and safety of youth online at a listening session hosted by Stanford’s Internet Observatory and Social Media Lab with the Stanford Center for Youth Mental Health and Wellbeing, in collaboration with the Task Force. Officials from the White House Office of Science and Technology Policy, the National Telecommunications and Information Administration, and the Substance Abuse and Mental Health Services Administration joined with Stanford to welcome guests and to detail the Administration’s ongoing work to advance the health and safety of youth online. Representatives from the U.S. Surgeon General’s Office (Director of Science and Policy), the Department of Justice’s Child Exploitation and Obscenity Section, and the National Institute of Standards and Technology’s Applied Cybersecurity Division were among the government attendees at the event.

In addition to fireside chats, participants discussed a range of topics, including:

  • Young people are exposed to, and navigate, online communications at young ages, often with little direct help from parents, schools, or platforms themselves.
  • There is a need for more transparency about what services and features involve – including explanations in language younger kids can understand – well beyond the point when kids first start to use a service.
  • Better mechanisms are needed—and in some cases, already exist—to help shape kids’ online experiences as they grow up. These can include increasing levels of control for kids themselves as well as tools that allow for parental oversight.
  • Young people reported feeling a continued compulsion to use online services, despite negative impacts on their lives, including loss of sleep, anxiety, and depression. They attributed this compulsion both to technical design features of online services and to social pressures (the “fear of missing out” on what their peers were doing and saying online).
  • Unlike physical safety concerns for young people (such as the appropriate age to stop using a car seat), online safety issues can lack objective measures: Medical experts cannot pinpoint for parents or for companies at what precise age specific features and media are safe for the development and well-being of young people across the board.
  • Most online platforms and services have been designed to take user privacy, safety, and satisfaction into account to some degree, but few of these services were designed to consider young people’s well-being, specifically.
  • Online methods for stopping child sexual exploitation media are not adequate. There is a particular concern, given the advent of advanced image generation technology, that both platforms and law enforcement will soon be overwhelmed by synthetic child sexual abuse material (CSAM), which could further interfere with efforts to identify and intervene in cases involving the ongoing exploitation of real children. Machine-learning-based image classifiers could help detect CSAM (whether real or synthetic), but their development is severely constrained by existing law.
  • Industry and researchers lack common data formats and metrics for measuring youth well-being that would allow for better assessments both of what is happening to kids online and of the efficacy of mitigation efforts.

This listening session will inform the Biden-Harris Administration’s ongoing efforts to address the harms America’s children and youth face online.

A diverse group of nearly 100 participants engaged in discussion, including youth advocates; experts in mental health, design, safety, and privacy; parents; and company representatives. The session was held under the Chatham House Rule; listed below are the participants who consented to be publicly identified as participating:

  • Scott Babwah Brennen, Head of Online Expression Policy, Center on Technology Policy, University of North Carolina at Chapel Hill
  • Jeff Hancock, Harry and Norman Chandler Professor of Communication, Stanford University
  • Vicki Harrison, Program Director, Stanford Center for Youth Mental Health & Wellbeing
  • Ravi Iyer, Managing Director, USC Marshall School Neely Center
  • Parker Kasiewicz, Research Assistant, Stanford Internet Observatory
  • Daphne Keller, Director, Program on Platform Regulation, Stanford University
  • Jen King, Privacy & Data Policy Fellow, Stanford HAI
  • Sunny Xun Liu, Director of Research, Stanford Social Media Lab
  • Megan Moreno, Co-Medical Director of the AAP Center for Social Media and Youth Mental Health, University of Wisconsin–Madison
  • John Perrino, Policy Analyst, Stanford Internet Observatory
  • Riana Pfefferkorn, Research Scholar, Stanford Internet Observatory
  • Andrew Przybylski, Professor of Human Behaviour and Technology, University of Oxford
  • Jenny Radesky, Associate Professor of Pediatrics, University of Michigan Medical School
  • Stephanie Reich, Professor, University of California, Irvine
  • Thomas Robinson, Irving Schulman MD Endowed Professor in Child Health and Professor of Pediatrics and of Medicine, Stanford University
  • Sara Shah, Stanford University
  • Alex Stamos, Lecturer and Researcher, Stanford Internet Observatory
  • Dave Willner, Fellow, Stanford Cyber Policy Center

  • Aliya Bhatia, Policy Analyst, Center for Democracy & Technology
  • Anne Collier, Executive Director, The Net Safety Collaborative and NetFamilyNews.org
  • Maria Conticelli, Family Online Safety Institute (FOSI)
  • Corbin Evans, Senior Director, American Psychological Association 
  • Jennifer Heifferon, Program Director, Child Well-Being, California Partners Project
  • Larry Magid, CEO, ConnectSafely
  • Matt Motyl, Senior Advisor and Resident Fellow, USC Neely and the Integrity Institute
  • Mitch Prinstein, Chief Science Officer, American Psychological Association
  • Natalie Shoup, Industry & Data Lead, Safe Online
  • David Sullivan, Executive Director, Digital Trust & Safety Partnership
  • Charlotte Willner, Executive Director, Trust & Safety Professional Association and Trust & Safety Foundation

  • Josh Blumenfeld, Government Affairs and Public Policy, YouTube / Google
  • Hank Dempsey, Head of US State Policy, Snap Inc. 
  • Brian Fishman, Chief Strategy Officer, Cinder
  • Sonia Gill, Director, U.S. Public Policy, Meta
  • Courtney Gregoire, Chief Digital Safety Officer, Microsoft Corporation 
  • Alison Huffman, VP of Community Health, Twitch
  • Jennie Ito, Senior Product Policy Manager, Roblox
  • Vaishnavi J, Founder, VYS
  • Alex Leavitt, Principal Researcher, Trust & Safety, Roblox
  • Megan Jones Bell, Clinical Director, Consumer and Mental Health, Google
  • Michelle Lee, Partner and Managing Director, IDEO
  • Patricia Noel, Mental Health Policy Manager, Discord
  • Elizabeth Plageman, Senior Trust & Safety Product Policy Advisor - Child Safety, Apple, Inc.
  • Leslie Taylor, Assistant Vice President, Trust & Safety, Genpact 
  • Catherine Teitelbaum, Principal, Family Trust, Amazon Kids, Amazon
  • Cathryn Weems, Independent Trust and Safety Consultant

  • Saahil Mishra, Founder and Co-Director, Unwiring
  • Khoa-Nathan V. Ngo, Youth Leader, GoodforMEdia
  • Zenia Rehan, Youth Leader, GoodforMEdia
  • Audrey Wang, Youth Leader, GoodforMEdia
  • Katy Zhen, Youth Leader, GoodforMEdia