Australia’s eSafety regulator calls out Big Tech for failing to protect children online
Australia’s online safety watchdog has sharply criticised some of the world’s biggest technology companies, including Meta, Apple, Google, and Microsoft, for failing to do enough to protect children on their platforms. The regulator says these firms have ignored repeated warnings and have not taken strong action against child sexual exploitation and abuse.
The criticism comes from Australia’s eSafety Commissioner, who says progress has been slow despite years of discussion and guidance.
Serious Gaps in Abuse Detection
According to the eSafety Commissioner, one of the biggest concerns is poor detection of live abuse during video calls. Many platforms still fail to identify abuse as it happens, even though technology exists to improve real-time monitoring.
Another major issue is the failure to detect newly created abusive material. The regulator says companies focus too heavily on previously identified content while missing new images and videos that spread quickly online. This gap allows harmful material to circulate before action is taken, putting children at serious risk.
Lack of Language Tools to Detect Extortion
The regulator also highlighted the lack of effective language analysis tools. These tools are meant to identify signs of sexual extortion, grooming, and manipulation in messages. Australian authorities have already shared common warning signs and phrases used by offenders.
Despite this, many platforms still do not use strong systems to flag harmful conversations involving children. The commissioner said it is difficult to understand why these tools are not already active, especially after clear guidance was provided.
Regulator Expresses Frustration
In a public statement, eSafety Commissioner Julie Inman Grant expressed strong disappointment with the lack of progress. She said authorities have been working with these companies for a long time and expected far better results by now. According to her, the failure to act raises serious questions about responsibility and priorities. She added that protecting children online should not be optional or delayed, especially when risks are well known.
Responses From Technology Companies
Several major companies did not respond to requests for comment, including Apple, Meta, Google, and Microsoft. Snap, the company behind Snapchat, did issue a response, saying it would continue working with the eSafety office on child protection. Snap also welcomed the regulator’s recognition of its faster response times to reports of harmful content.
However, the company did not directly address claims that it has not done enough to prevent abuse or improve detection systems.
Growing Global Pressure on Big Tech
Australia is not alone in raising these concerns. Regulators around the world are putting more pressure on large technology firms to take responsibility for harmful activity on their platforms.
Governments and watchdogs increasingly argue that tech companies benefit from massive user bases and profits, so they must also carry the duty of care that comes with that power.
Julie Inman Grant described the issue as one of corporate conscience and accountability. She said companies must move beyond promises and show real action.
Australia’s Stronger Stance on Child Safety
Australia has already taken tough steps to protect young users online. Late last year, the country introduced a world-first ban on social media use for children under 16.
The move was controversial but was defended as necessary to protect children from harm, including exploitation, bullying, and exposure to inappropriate content. The latest criticism of tech companies fits into Australia’s broader push to hold platforms accountable for user safety.
Why This Issue Matters
Online child exploitation causes lasting harm. Abuse can happen quickly and often leaves deep emotional and psychological scars. When platforms fail to act early, offenders gain more time and access to victims. This is why real-time detection, strong language analysis, and fast response systems are essential.
Experts say technology companies have the resources and data needed to improve safety. The challenge is whether they are willing to act fast enough.
What Comes Next
The eSafety Commissioner has made it clear that patience is running out. Continued failure to improve could lead to stronger regulation, penalties, or enforcement action. As governments worldwide tighten digital safety laws, companies may soon face greater legal pressure to protect children on their platforms. For now, Australia’s message is clear: child safety must come first, and tech companies can no longer afford to delay meaningful action.
