Published Oct 12, 2025 ⦁ 12 min read
FTC COPPA Updates: What Parents Should Know

LongStories is constantly evolving as it finds its product-market fit. Features, pricing, and offerings are continuously being refined and updated. The information in this blog post reflects our understanding at the time of writing. Please always check LongStories.ai for the latest information about our products, features, and pricing, or contact us directly for the most current details.

The FTC's 2025 updates to the Children's Online Privacy Protection Act (COPPA) strengthen privacy rules for kids under 13, addressing modern challenges like AI and biometric data. Here's what you need to know:

  • Expanded Protections: The definition of “personal information” now includes location, voice, facial recognition, and other biometric data.
  • Parental Control: Companies must provide clear, simplified explanations of data practices and obtain separate parental consent for data use, especially for AI training.
  • Data Retention Rules: Platforms must limit how long they keep children’s data and ensure it’s deleted when no longer needed.
  • Biometric Data Restrictions: Explicit parental consent is required before collecting sensitive identifiers like voiceprints or facial metrics.

These changes aim to give parents more control and hold companies accountable in an AI-driven world. The updated rules took effect on June 23, 2025, and full compliance is required by April 22, 2026. Stay informed and review the platforms your child uses to confirm they meet these new standards.

Key Changes in the 2025 COPPA Updates

The Federal Trade Commission's (FTC) 2025 updates to the Children's Online Privacy Protection Act (COPPA) bring significant adjustments to keep pace with modern data practices, especially in the realm of AI. These updates focus on expanding protections for children's online activities, tightening parental consent procedures, and imposing stricter rules for data retention.

Expanded Definition of Personal Information

The updated rules now include a broader range of data under the term "personal information." This change reflects the diverse types of data collected through online platforms today, ensuring that more aspects of a child's digital behavior, such as location data or biometric details, are safeguarded under COPPA.

Stronger Parental Consent Requirements

The revised guidelines require a more transparent and thorough process for obtaining parental consent. Companies must clearly explain what data they collect, how it's used, and whether it will be shared with third parties. Only after providing this detailed information can they seek explicit, informed consent from parents. This gives families greater oversight and control over their children's online data.

Tighter Data Retention and Deletion Rules

The new rules also introduce firm restrictions on how long companies can retain children's data. Organizations must create and publicly share clear data retention policies, ensuring they only keep information as long as necessary for its intended purpose. Once the data is no longer needed, it must be deleted. These measures aim to minimize risks associated with prolonged data storage and protect sensitive information over time.

These updates set an important foundation for addressing consent in AI training within platforms designed for children, ensuring their privacy remains a priority.

Separate Consent for AI Training

The FTC now mandates that platforms obtain separate, verifiable parental consent before using or sharing children's personal information for AI training purposes. This requirement stems from the fact that such activities are not considered part of a platform's core services. The FTC has emphasized that any sharing of children's data with third parties for AI training must have explicit parental approval.

Under the updated COPPA Rule, which took effect on June 23, 2025, with full compliance expected by April 22, 2026, parents will encounter distinct consent requests: one for basic service functionality and another specifically for AI training.

For example, if a platform like LongStories.ai plans to use content created by children or analyze their interactions to improve its AI algorithms, it must first secure separate parental consent. This ensures parents have clear control over how their child's data is utilized in the development of AI systems.

This additional layer of consent not only increases transparency but also empowers parents to safeguard their children's digital privacy. It also sets the stage for further discussions about regulations surrounding biometric data.

Biometric Data in Children's Platforms

The 2025 COPPA updates have introduced stricter measures to address the collection and use of biometric data on platforms designed for children. This focus stems from the unique risks tied to biometric data, which can serve as permanent digital identifiers. Unlike passwords or usernames, biometric markers cannot be changed, making their protection especially critical.

What Is Biometric Data?

Biometric data refers to physical or behavioral traits that can uniquely identify an individual. On children's platforms, this often includes:

  • Voiceprints: Captured through voice-enabled AI features, these record speech patterns that could identify a child across different services.
  • Facial data: Used for creating avatars or applying filters, facial recognition technology collects precise metrics that the FTC now treats as highly sensitive.
  • Other markers: Traits like eye tracking, gait analysis, and typing patterns - common in educational and fitness apps - also fall under biometric data.

Each of these examples highlights the growing presence of biometric data in tools and apps aimed at younger audiences.

How Biometric Data Is Regulated

The FTC has prioritized biometric data due to its permanence and the risks tied to misuse. If compromised, biometric identifiers like voice or facial data cannot be reset, unlike passwords. This creates long-term vulnerabilities, especially for children who may not fully grasp the implications of such breaches.

The updated COPPA rules now require platforms to obtain explicit parental consent before collecting biometric data from children under 13. This consent must clearly explain how the data will be used, stored, and shared. The regulations also emphasize:

  • Identity theft prevention: Stolen biometric data could be exploited to impersonate a child in future digital interactions.
  • Cross-platform tracking concerns: Companies could use biometric identifiers to build detailed profiles of children's online behavior.
  • Data breach risks: Once leaked, biometric data cannot be reissued or replaced, leading to lasting consequences.

To address these risks, platforms must adopt data minimization practices, collecting only the biometric information necessary for their core functions. For example, biometric data cannot be used for advertising personalization or behavior tracking without additional parental consent. Platforms are also required to store this data securely and delete it once it's no longer needed for its original purpose.

For platforms that use voice analysis or facial recognition to create personalized content, these rules ensure that parents retain control over their child's biometric data. Parents can now decide whether the benefits of these AI-driven experiences outweigh the risks tied to biometric data collection.

These updated regulations reflect a broader commitment to protecting children's digital privacy in an increasingly complex online environment.

How to Check Platform Compliance: A Parent's Guide

With the updated COPPA rules now in place, it's more important than ever for parents to know how to assess whether platforms designed for children meet the latest privacy standards. A good starting point? Ask the right questions about how these platforms handle data.

Key Questions to Ask About Data Practices

When your child expresses interest in using a new platform, here are some critical questions to guide your evaluation:

  • What data does the platform collect? Look for clear and detailed privacy policies that outline the types of information being gathered. This could range from basic account details to more sensitive data like voice recordings or facial recognition.
  • How long is the data retained? Platforms are now required to specify how long they keep your child's information. Avoid services that rely on vague terms like "as long as necessary." Instead, look for clear timelines and assurances that data will be automatically deleted when no longer needed.
  • Who has access to the data? Platforms should disclose whether they share data with partners, advertisers, or third-party service providers. If they can't provide a straightforward answer, consider it a red flag.
  • Can you review and delete your child's data? A compliant platform should make this process simple and accessible. If the steps are overly technical or hard to find, the platform may not be prioritizing privacy.

Once you’ve clarified these data practices, it’s time to dig into the consent forms to uncover any potential issues.

Warning Signs in Consent Forms

Certain warning signs can indicate that a platform isn't adhering to the updated COPPA requirements. Here’s what to watch for:

  • Pre-checked boxes: Legitimate platforms require parents to actively give consent by checking boxes or clicking buttons. Anything pre-selected suggests non-compliance.
  • Confusing language: If the consent forms are hard to understand or don’t clearly explain what data is being collected and why, the platform may not meet the new transparency standards. Look for straightforward, plain-language explanations.
  • Bundled consent requests: Platforms must separate different types of data collection into individual consent requests. For example, they can't combine permission for account creation with biometric data collection in one form. Each type of data should require its own approval.
  • Lack of AI-related disclosures: If the platform uses AI features but doesn’t explain whether your child’s data will be used to train these systems, it’s not meeting the new requirements for AI transparency.

LongStories.ai as a Compliant Example

A great example of a platform that meets these standards is LongStories.ai, a service that has created over 5,000 personalized video adventures for children. The platform allows kids to star in fully voiced, animated stories generated from a single text prompt - all while respecting privacy.

LongStories.ai stands out for its commitment to minimal data collection. The platform gathers only the information necessary to create personalized animated content, avoiding invasive practices. For instance, it generates custom scripts, illustrations, and voices in under a minute without requiring sensitive data.

The platform also avoids common compliance pitfalls. Parents are given clear explanations of what information is needed and why. Consent forms are straightforward, with no bundled requests or pre-checked boxes to confuse users.

Beyond privacy, LongStories.ai focuses on educational value. The platform turns screen time into a learning opportunity, offering content that teaches science, history, and life lessons. Characters like Manny the Manatee and Professor Time create engaging, personalized stories without demanding excessive data, proving that platforms can balance creativity and compliance.

LongStories.ai serves as a model for what parents should look for: transparent data practices, minimal collection, and a focus on meaningful, educational content that respects both children’s privacy and parental authority.

Steps Parents Can Take

With the COPPA updates reshaping data practices, your involvement as a parent is more important than ever to protect your child’s digital privacy. The new rules provide greater control, but it’s up to you to take action and manage your child’s online footprint effectively.

Audit Current Apps and Platforms

Start by listing every app and platform your child uses. This includes everything from educational tools on tablets to gaming platforms, social media accounts, and AI-powered services they might use for homework or fun.

For each platform, locate and review their privacy policy - especially any updates made after the 2025 COPPA changes. Many companies have revised their policies to comply with the new rules, but some may still be lagging behind. Pay close attention to details like data retention timelines and whether they clearly disclose how AI is used.

To stay organized, consider creating a simple spreadsheet. Note which platforms meet the new standards, such as having clear retention timelines, transparent AI usage policies, and easy-to-follow data deletion processes. This will help you quickly spot platforms that need closer scrutiny or immediate action.
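
If you'd rather generate that checklist with a script than build it by hand, here is a minimal sketch in Python. The platform names and criteria below are placeholders for illustration, not recommendations; swap in the apps your child actually uses and the standards you want to track.

```python
# A minimal sketch of a COPPA audit checklist, written as a CSV you can open
# in any spreadsheet app. Platform names and criteria are placeholders --
# replace them with the apps your child actually uses.
import csv

CRITERIA = [
    "Clear data retention timeline",
    "Discloses whether data trains AI",
    "Separate consent for biometric data",
    "Easy data review/deletion process",
]

platforms = ["Example Learning App", "Example Game", "LongStories.ai"]

with open("coppa_audit_checklist.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Platform"] + CRITERIA + ["Notes"])
    for name in platforms:
        # Leave the criteria columns blank; fill in Yes/No as you review
        # each platform's privacy policy.
        writer.writerow([name] + [""] * len(CRITERIA) + [""])

print("Wrote coppa_audit_checklist.csv - open it in your spreadsheet app.")
```

Running it produces coppa_audit_checklist.csv, which you can fill in with Yes/No answers as you work through each privacy policy.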

After completing your audit, take a moment to review and update your consent settings on each platform to ensure they align with the new rules.

Review and Update Your Consent Settings

The updated COPPA rules give parents more control over their child’s data. One key feature is granular consent, which allows you to approve certain types of data collection while rejecting others on the same platform.

Revisit the consent agreements for your child’s accounts. If possible, withdraw permission for non-essential data collection, particularly for sensitive information like biometric data or data used for AI training purposes. Many platforms now offer consent management dashboards, making it easier to adjust these settings without needing to contact customer support.

Make it a habit to review these preferences regularly. Kids’ interests and activities change quickly, and what was necessary a few months ago might no longer be relevant. Use this as an opportunity to delete inactive accounts and tighten privacy settings for the ones still in use.

Keep a record of your consent decisions and any correspondence with platforms about data deletion or changes. If a platform doesn’t respond to your requests within a reasonable timeframe, you can file a complaint with the FTC.

Stay Informed About COPPA Compliance

Managing privacy doesn’t stop with audits and consent - it’s equally important to stay informed about the ever-changing landscape of children’s privacy laws. Subscribe to updates from the Federal Trade Commission and keep an eye on their enforcement actions to see which platforms are falling short.

Parent forums can also be a valuable resource. These communities often share updates about policy changes, compliance issues, and suggestions for alternative platforms that prioritize privacy.

You might also want to follow organizations like the Electronic Frontier Foundation or Common Sense Media. They regularly publish guides and alerts about children’s digital privacy, breaking down complex policies into actionable advice for families.

When considering new platforms for your child, treat COPPA compliance as non-negotiable. By prioritizing platforms that meet these standards, you can address potential privacy risks before they affect your family.

Conclusion: Navigating the New Era of Children's Privacy

The 2025 COPPA updates mark an important step forward in safeguarding children's privacy in an increasingly digital world. Since the original law was introduced in 1998, the online landscape has transformed dramatically, and these updates reflect the need for stronger protections.

As a parent, these changes give you more control over your child's personal information. With features like detailed consent options, stricter rules for AI training, and tighter safeguards for biometric data, you now have tools to make thoughtful decisions about which platforms deserve access to your family's data. But these tools require your active participation.

Taking charge means staying aware of your child’s online activity, regularly reviewing their digital presence, and using your consent rights wisely. Look for platforms that prioritize privacy and compliance, like LongStories.ai, which has created over 5,000 personalized video adventures while adhering to strict privacy standards. This shows how technology can innovate responsibly.

As new technologies and AI continue to advance, fresh challenges to privacy will emerge. By building strong habits now - such as questioning how data is used, demanding clear policies, and supporting platforms that respect privacy - you’re not just protecting your child today. You’re also teaching them to value privacy and make informed digital decisions as they grow.

The updated COPPA regulations shift the balance of power toward parents. Use these new rules to take decisive action and safeguard your child’s privacy in this ever-changing digital world.

FAQs

How can parents make sure their child's AI platforms follow the updated COPPA rules, especially regarding data and privacy?

Start by confirming that a platform obtains verifiable parental consent before gathering any personal or biometric data from children, as the updated COPPA rules require. It's also important to choose platforms that offer clear and accessible privacy policies, prioritize secure data handling, and provide tools for parents to review and manage their child’s information.

It's also wise to stay updated on how platforms handle sensitive details, particularly biometric data, and ensure they adhere to the enhanced privacy protections outlined in the revised regulations. Prioritizing platforms with strong transparency and reliable security measures can help protect your child’s online safety.

What should parents do if they find a platform violating the updated COPPA rules?

If you notice a platform isn't adhering to the updated COPPA rules, your first step should be reaching out to them directly. Ask them to address the issue and comply with the regulations. Take some time to review their privacy policies, and wherever possible, adjust settings to limit or completely opt out of data sharing.

If the platform fails to resolve the problem, you can escalate the matter by reporting the violation to the Federal Trade Commission (FTC). Filing a complaint is straightforward through their official website. Keeping up with FTC announcements and enforcement actions can also help you stay proactive about protecting your child's privacy.

For more serious concerns, consider consulting a legal professional. They can guide you on the best course of action to ensure your child's personal information remains secure.

Why is separate parental consent now required for AI training?

Parents are now required to give specific consent for AI training, thanks to recent updates to COPPA (Children’s Online Privacy Protection Act) by the FTC. These updates mandate explicit approval before a child’s personal data can be used for purposes like developing AI systems. The goal? To give parents more control over how their child’s information is managed.

This requirement for separate consent plays a key role in preventing unauthorized data use or sharing, especially with third-party AI platforms. It’s an extra layer of protection designed to safeguard your child’s privacy and security in today’s increasingly digital landscape.
