Australians will soon face mandatory age verification and identity checks when accessing online services, under new regulations set to take effect in December. The measures, developed by the tech sector in collaboration with the eSafety Commissioner, are designed to protect children from harmful content, including pornography and violent material.
The new rules, introduced under the Online Safety Act, will apply to platforms such as search engines, social media sites, and app stores. These services will be required to implement age assurance measures for all users, which could involve account history, facial recognition, or even bank card checks to confirm users’ ages, according to The Guardian.
From December, search engines will be required to enable “safe search” features for users under 18, blocking access to inappropriate content. Platforms that host harmful material, such as self-harm or violent content, will also be required to ensure that children cannot view it.
eSafety Commissioner Julie Inman Grant emphasized the importance of these measures, telling The Guardian, “It’s critical to ensure a layered safety approach, placing responsibility at key points in the tech ecosystem.”
While the new rules target specific platforms, some critics argue they hand too much power to large tech companies. Non-compliance could bring hefty penalties, including fines of up to $49.5 million or removal from search results.
Despite the concerns, the eSafety Commissioner’s office has defended the regulations, asserting that they are essential to safeguarding young internet users, particularly when it comes to search engines.
With enforcement set to begin in December, the full impact on Australia’s internet users will soon become clear.