Australia Blocks AI Nudify Tools Over Child Safety Fears

eSafety Commissioner Julie Inman Grant

Australia bans AI nudify tools after reports they were used to create child exploitation deepfakes targeting school children.

Australia has moved to shut out several major online “nudify” services after authorities found they were being used to generate exploitative AI images of school children.

The services, which were operated by a UK-based company, were taken offline for Australian users following a warning from the eSafety Commission in September.

Officials feared the platforms were enabling people to produce AI-generated child sexual exploitation content, a direct breach of Australia’s mandatory online safety code.


According to the commission, nearly 100,000 Australians accessed these tools every month. Some cases involved students creating fake nude images of their classmates, sparking widespread concern in schools and communities.

eSafety Commissioner Julie Inman Grant said the takedown was proof that Australia’s digital safety rules were working.

She noted that “nudify” tools had caused serious harm in Australian schools and that blocking access would reduce the number of children falling victim to AI-generated exploitation.

Investigators found that the company behind the tools failed to prevent misuse. The platform reportedly promoted features such as “undressing any girl,” “schoolgirl” filters, and “sex mode,” which made it easier for users to generate abusive content.


The commission revealed that reports of digitally manipulated sexual images, including deepfakes, have doubled in the past 18 months. Four out of every five cases involved women and girls, highlighting a gendered pattern of online abuse.

The crackdown also follows action against global AI hosting platform Hugging Face, which updated its terms of service after warnings that Australians were using some of its models to create child exploitation material.

The company must now enforce safeguards or risk fines of up to AU$49.5 million.

Inman Grant confirmed that her office is working with the Australian government on further reforms to restrict access to “nudify” tools and other AI technologies that pose risks to children.
