WhatsApp plans supervised accounts for children under 13 with parental controls, restricted features, and end-to-end encryption.
Meta Platforms is planning a major update that would let younger users access WhatsApp under strict parental supervision. The initiative aims to address growing concerns about how children use messaging apps and social media in their daily communication.
The new system will initially launch in the United Kingdom and will allow children under the age of 13 to use WhatsApp with special safeguards. Instead of creating accounts independently, younger users will require parents or guardians to set up and manage their profiles.
This move comes amid increasing debate about the effects of digital platforms on young people. Governments, educators, and parents have been urging technology companies to introduce stronger safety measures to protect children online while still allowing them to benefit from modern communication tools.
Parental Control System for Under-13 Users
Under the proposed system, children’s accounts will be directly linked to a parent or guardian’s profile. Parents will be responsible for creating the account and verifying it, ensuring that the platform knows a responsible adult is overseeing the child’s digital activity.
The link between the two accounts will remain active until the child turns 13. During this period, parents will be able to manage key safety settings and control who can contact their child on the platform.
This system is designed to balance independence with safety. Younger users can communicate with friends and family, but parents remain in control of the contacts and permissions that determine how their children interact with others on the app.
Privacy Maintained Through Encryption
Despite giving parents control over certain settings, WhatsApp will continue to maintain its core privacy principle: end-to-end encryption. Messages exchanged between users will remain encrypted so that only the sender and receiver can read them.
This means parents will not be able to access the content of their children’s conversations. While some may see this as a limitation, the company argues that maintaining encryption is essential for protecting user privacy and preventing unauthorized access to personal communications.
By combining encryption with parental controls, the platform hopes to provide a balance between protecting children’s safety and respecting the privacy expectations that millions of WhatsApp users rely on globally.
Limited Features for Child Accounts
Accounts created for children under 13 will include several restrictions designed to minimize risks. These accounts will support basic communication functions, such as sending text messages and making voice calls.
However, several advanced features will not be available to younger users. These include Meta's AI chatbot, status updates, and channels that let users follow public content streams. Removing these features reduces exposure to unknown users and potentially harmful content.
Additionally, disappearing messages — a feature that automatically deletes conversations after a certain period — will also be restricted. This measure is intended to ensure greater transparency and accountability in communications involving younger users.
Regulatory Pressure Driving the Changes
The update comes at a time when governments and regulators are closely examining how technology companies manage children’s data and digital safety. In the United Kingdom, regulators have introduced strict guidelines requiring companies to provide child-friendly digital services.
The Information Commissioner’s Office has emphasized that online platforms used by children must comply with the country’s Children’s Code, which outlines how companies should protect young users’ privacy and personal information.
These regulations require companies to design services with child safety in mind from the beginning. Features such as parental supervision, limited data collection, and safe interaction controls are all considered important components of responsible digital platform design.
Responding to Parents’ Concerns
Meta says the new parental control system was developed partly in response to requests from parents. Many families want their children to have access to messaging platforms so they can stay connected with friends and relatives, but they also want stronger safeguards in place.
By introducing supervised accounts, the company hopes to provide a solution that satisfies both needs. Parents can monitor contact permissions and ensure their children interact only with approved users while still allowing them to participate in digital communication.
The update also reflects broader changes in how technology companies approach online safety. Platforms are increasingly expected to provide tools that empower families rather than leaving children to navigate digital environments on their own.
Balancing Access and Safety
Allowing younger users to access messaging apps presents both opportunities and challenges. On one hand, digital communication tools help children stay connected with family members, classmates, and friends, especially as online communication has become the norm.
On the other hand, unrestricted access to social media and messaging platforms can expose young users to risks such as online harassment, scams, or inappropriate content. Technology companies must therefore carefully design systems that provide access while minimizing these dangers.
Meta's new approach attempts to strike this balance by limiting features and placing parents at the center of account management, ensuring that younger users never navigate the platform without oversight.
Potential Expansion Beyond the UK
While the new system is initially being introduced in the United Kingdom, it may eventually expand to other regions if the program proves successful. Many countries are currently debating how to regulate children’s access to social media platforms.
If the parental supervision model works effectively, it could become a template for similar systems worldwide. Technology companies are under increasing pressure to adopt consistent global standards for protecting young users online.
Meta has not yet confirmed which markets might receive the feature next, but industry analysts believe other European countries and North America could follow if regulators support the concept.
The Future of Child-Safe Messaging
The introduction of supervised WhatsApp accounts signals a broader shift in how messaging platforms approach youth safety. Instead of simply restricting children from using services, companies are exploring structured access models that allow participation under adult guidance.
This trend may lead to more platforms developing similar systems, where children can use technology within clearly defined boundaries. Features such as parental dashboards, content restrictions, and safety monitoring tools could become common across the digital ecosystem.
For Meta, implementing these changes is also an opportunity to strengthen trust among families and regulators. Demonstrating that children’s safety is a priority could help the company maintain its position as one of the world’s leading messaging platforms.
Conclusion
Meta’s plan to introduce supervised WhatsApp accounts for children under 13 represents a significant step toward safer digital communication for younger users. By combining parental controls, limited features, and strong encryption, the company aims to create a balanced environment where children can communicate while remaining protected.
The move also reflects growing global pressure on technology companies to prioritize child safety and responsible data practices. As regulators continue to focus on digital protection standards, similar initiatives are likely to emerge across the technology industry.
If successful, this model could reshape how messaging platforms approach younger audiences, offering families a safer way to introduce children to digital communication while maintaining essential privacy protections.
