Australia takes aim at Apple, Microsoft over child protection online

In this file photo dated March 14, 2020, an Apple logo adorns the facade of the downtown Brooklyn Apple store in New York. (KATHY WILLENS / AP)

SYDNEY – An Australian regulator, after using new powers to make the tech giants share information about their methods, accused Apple Inc and Microsoft Corp of not doing enough to stop child exploitation content on their platforms.

The e-Safety Commissioner, an office set up to protect internet users, said that after sending legal demands for information to some of the world's biggest internet firms, the responses showed Apple and Microsoft did not proactively screen for child abuse material in their storage services, iCloud and OneDrive.

The two firms also confirmed they did not use any technology to detect live-streaming of child sexual abuse on video services Skype and Microsoft Teams, which are owned by Microsoft, and FaceTime, which is owned by Apple, the commissioner said in a report published on Thursday.

A Microsoft spokesman said the company was committed to combating the proliferation of abuse material but "as threats to children's safety continue to evolve and bad actors become more sophisticated in their tactics, we continue to challenge ourselves to adapt our response".

Apple was not immediately available for comment.

The disclosure confirms gaps in the child protection measures of some of the world's biggest tech firms, building public pressure on them to do more, according to the commissioner. Meta Platforms Inc, which owns Facebook, Instagram and WhatsApp, and Snapchat owner Snap Inc also got demands for information.

The responses overall were "alarming" and raised concerns of "clearly inadequate and inconsistent use of widely available technology to detect child abuse material and grooming", commissioner Julie Inman Grant said in a statement.

Microsoft and Apple "do not even attempt to proactively detect previously confirmed child abuse material" on their storage services, she said, although a Microsoft-developed detection product is used by law enforcement agencies.

An Apple announcement a week ago that it would stop scanning iCloud accounts for child abuse material, following pressure from privacy advocates, was "a major step backwards from their responsibilities to help keep children safe," Inman Grant said.

The failure of both firms to detect live-streamed abuse amounted to "some of the biggest and richest technology companies in the world turning a blind eye and failing to take appropriate steps to protect the most vulnerable from the most predatory", she added.