How your childhood photos could help stop child exploitation

Key points

  • Photos of children in safe situations will be used to teach artificial intelligence algorithms to recognise those in danger.
  • This means no real child exploitation material will need to be used to build the program, which, once complete, will help catch perpetrators and protect children.
  • Australian child exploitation investigations resulted in 114 children removed from harm in 2021 – 65 internationally and 49 domestically. In 2020, 221 children were removed from harm – 130 internationally and 91 domestically.

When setting out to develop world-first technology to identify child exploitation images, researchers faced an ethical dilemma.

How could they obtain source material to “teach” the sophisticated algorithm how to recognise children in dangerous situations without using imagery that could compromise children’s safety?

People connected with the project donated the first of 100,000 childhood images needed to help “teach” artificial intelligence programs to identify children in safe situations versus children being sexually exploited.

To make the technology work — and to help prevent psychological damage to police officers having to look at endless volumes of harmful material — 100,000 images of children in everyday “safe” situations are needed.

They will be used to train the computer program to discern children from adults, and safe situations from dangerous ones.

Although it would be for honourable reasons, taking random images of children from the internet without their consent is not ethical.

Monash University IT experts and the Australian Federal Police collaborating at the AiLECS lab were struggling with the problem when data ethics expert Dr Nina Lewis made a suggestion that Leading Senior Constable Janis Dalins described as “an absolute facepalm moment”.

Her idea to invite adults to share safe pictures from their childhood — with no identifying information — was a eureka moment.

The My Pictures Matter campaign was born. People involved with the project and the university are digging out their old childhood photos so the images can be fed into the developing model.

The crowd-sourced effort is part of a global push to better identify child exploitation material, which can be deeply disturbing for human investigators to view.

“Reviewing this horrific material can be a slow process and the constant exposure can cause significant psychological distress to investigators,” Dalins said.

By learning to recognise the presence of children, ethically, in material that also contains sexual content, the computer model could identify victims faster and allow police to intervene to remove children from harm.

Researchers are negotiating with producers of legal adult entertainment to acquire the necessary sexual material.

“It’s a response to the difficulties, legal and otherwise, of using actual abuse images to train the system,” said Lewis, who confirmed eSafety and police findings that technologically facilitated child exploitation had increased since people were spending more time online during the pandemic.

A spokeswoman for the Australian Federal Police said that each year, the Australian Centre to Counter Child Exploitation received more than 22,000 reports of child sexual exploitation, “and they continue to increase”.

Since the pandemic began, the centre has identified more than 800,000 registered accounts using anonymous platforms such as the dark web and encrypted apps “solely to facilitate child abuse material”.

The AFP arrested 233 people and laid 2032 child abuse-related charges in 2021. Investigations resulted in 114 children removed from harm – 65 internationally and 49 domestically.

In 2020, the AFP arrested 214 people and laid 2217 child abuse-related charges. Investigations resulted in 221 children removed from harm – 130 internationally and 91 domestically.

Associate Professor Campbell Wilson, AiLECS Lab co-director, said machine learning would not replace human investigations but would help because the volume of material to be sorted “is increasing exponentially”.

“Anything that could prioritise where they should look first, and give early warning [of child exploitation] as they’re going through — that you might be about to come across some of this material — even from a psychological warning viewpoint would be very helpful.”

The subject matter was so confronting that during development work, “we are really careful even talking about it in the lab”, Wilson said.

Knowing the project might help protect children was rewarding.

“This is technology-facilitated crime and we just hope we can make a difference using the same sort of technology being used for its distribution,” Wilson said.

Crisis support is available from Lifeline on 13 11 14.
