While the potential harms of children's online presence are becoming clearer, mitigation tactics are not as cut-and-dried. The U.S. Federal Trade Commission remains committed to finding the best ways to navigate those harms and prevent children's sensitive data from being collected.
Concerns surrounding potentially addictive algorithms, data collection practices and predatory acts have been a key focus for the FTC's online safety efforts. Updates to the Children's Online Privacy Protection Act Rule in January and upcoming enforcement of the recently signed TAKE IT DOWN Act covering nonconsensual deepfake images represent the agency's initiatives to combat some of the issues.
At the FTC's recent children's online safety forum, Chair Andrew Ferguson said, "We can make the internet a safe place for kids … We can do this while ensuring that America remains the world's beacon of innovation, and that we win the (artificial intelligence) race against our foreign rivals."
Ferguson called for a revision of COPPA that would strengthen age verification systems to prevent "unfettered access to online services on nothing more than an unverified, self-reported birthdate."
He noted Congress should pass privacy legislation to bolster parental rights and allow children's guardians to "erase any trace left by their children on these platforms, at all levels of granularity, from individual messages to entire accounts."
Legislative work
The FTC's updated COPPA Rule implemented additional parental controls, including "a separate consent requirement for non-integral disclosures to third parties, such as for third-party advertising," which the agency said "enhances transparency and enables parents to make more deliberate and meaningful choices."
Ferguson said legislation protecting children's privacy online should be "aimed at assisting parents in the exercise of their right to exert meaningful control over their child's activities online and the data generated by those activities."
Despite highlighting the importance of parental consent, Ferguson said the FTC cannot let its "zeal to assist parents in protecting their children online lead us to regulate too heavily and too broadly."
U.S. policymakers are also looking to bolster online protection for underage users with the Kids Online Safety Act, which was recently reintroduced to Congress after being voted out of the Senate and the House Committee on Energy and Commerce in the 118th Congress.
U.S. Sen. Marsha Blackburn, R-Tenn., is working to advance the bill in the face of ongoing concern from House Republican leadership that KOSA could infringe on free speech. Blackburn underscored the bill's importance, noting it is "past time that we put in place protections for our kids in the virtual space, and it is past time that we give parents and kids the ability to protect themselves in the virtual space."
Enforcement themes
A primary concern for regulators is the use of children's data for advertising purposes. While COPPA requires parental consent to collect the data of children under age 13 for targeted advertising purposes, schools are allowed to consent on behalf of parents.
Georgetown University Communication, Culture, and Technology Program Associate Professor Meg Leta Jones raised red flags about education technology providers and their collection practices. She warned of insufficient transparency between schools and parents regarding whether edtech companies are receiving affirmative consent and how opt-out consent is being offered.
The FTC adopted a policy statement in 2022 committing to strong edtech oversight and enforcement, noting that "even as companies across the economy become more aggressive in harvesting and monetizing individuals’ data, edtech providers cannot do the same." The 2023 enforcement action against edtech provider Edmodo was the first case under the FTC's policy statement, ordering prohibitions around collection and establishing appropriate data retention schedules following allegations of nonconsensual collection.
In addition to ensuring edtech companies comply with data protection obligations, the FTC has focused its efforts on protecting underage users from harmful and addictive algorithms.
"The big thing is holding these platforms to account for the promises they make," FTC Chief Technology Officer Jake Denton said. "We have to really scrutinize the promises they've made and make sure that they're held accountable."
Denton added that social platforms' use of individuals' behavioral data can incentivize advertising companies to collect substantial amounts of user data. "The ad revenue is the treasure, and these kids are essentially the casualties, the collateral damage," he said.
TAKE IT DOWN Act
The FTC will begin enforcing the TAKE IT DOWN Act in May 2026, expanding the agency's enforcement of children's online safety requirements.
The new law, endorsed by first lady Melania Trump, will require social platforms to delete nonconsensual explicit images, including AI deepfakes. In a statement delivered by her office's director of policy, Sarah Gesiriech, the first lady indicated the legislation reflects the White House's efforts to protect individuals from harmful content online.
The Trump administration will continue to "work together to develop tools to empower parents and youth, and we will lean on tech executives in the private sector to do their part," Gesiriech said while reading the first lady's statement.
As the U.S. maintains its focus on establishing itself as a leader in technological advancement, Ferguson said the "purpose of innovation in a just society is to promote the flourishing and success of ordinary families in that society. We must keep this purpose in mind as we consider which tradeoffs we are willing to make for technological progress."
Lexie White is a staff writer for the IAPP.