The NSPCC reports a 25% increase in UK crimes involving child abuse images.


The number of cases related to child abuse images has increased by 25% in the last year, based on data collected by the NSPCC.

Around 50% of the recorded incidents took place on Snapchat, with Meta’s suite of apps (Facebook, Instagram, and WhatsApp) making up an additional 26%.

The data, obtained through Freedom of Information Act requests to 35 police forces nationwide, reveals more than 33,000 incidents in a single year in which child abuse images (CAI) were collected and shared.

Peter Wanless, chief executive of the NSPCC, expressed concern over the continued increase in online child abuse. He believes tech companies should take action to ensure their sites are designed to be safe, rather than waiting for regulation to be implemented.

“Behind these crimes are children who have been targeted by adults who are able to organise and share sexual abuse with other offenders seamlessly across social media and messaging apps.”

The figures represent a sharp increase over the last five years. The first time the NSPCC carried out the investigation, for 2017-18, forces provided data for just under 18,000 offences, while even last year the total had barely broken 25,000.

According to the charity, the growing number of CAI posts shared online reflects rising demand for content depicting child sexual abuse. Susie Hargreaves, the CEO of the Internet Watch Foundation, emphasized that those who view, share, and distribute this material must understand that it is not a victimless act: these are real children experiencing real abuse and sexual violence, the consequences of which can have a lasting impact. The IWF collaborates with technology companies and law enforcement to promptly remove CAI from the internet.

Last month, the IWF reported that 90% of webpages containing CAI also included “self-generated” images, captured by the person in the photo, often in a pressured or forced situation. Over 100,000 webpages were identified as displaying self-generated CAI of children under 10, although the same image may have been used on multiple pages.

A representative from Snapchat stated that child sexual abuse is abhorrent and not tolerated on the platform. The company uses advanced technology to identify and remove such content, and works with law enforcement to assist in their inquiries. Snapchat also provides extra safety measures for minors aged 13 to 17, including prompts that appear if they are contacted by unfamiliar individuals.

Last year, the company’s spokesperson stated that 98% of such content was detected before it was reported, up from 94% the previous year.

The NSPCC cautioned that the numbers for identified CAI may decrease in the coming months because Meta intends to enable end-to-end encryption for direct messages on Instagram and Facebook Messenger. The organization stated that the expansion of this feature should be postponed until Ofcom has the chance to review Meta’s risk assessment for the plans, considering the new regulations implemented by the Online Safety Act passed last year.

  • In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In the US, call or text the Childhelp abuse hotline on 800-422-4453. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800; adult survivors can seek help at Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.

Source: theguardian.com
