
Online grooming crimes have doubled since 2018, with victims as young as four, charity warns




Online grooming crimes in the UK have almost doubled since 2018 to a record high, with one victim just four years old, according to the NSPCC.

Warning: Readers may find details in this story distressing.

The children’s charity said 7,263 offences were recorded by police in the year to March – almost double the 3,728 in the 12 months to March 2018.

Its new report, which described the eight-year rise as “deeply alarming”, was based on data from police across the country via freedom of information requests, with Lincolnshire Police the only force failing to provide information.

A specific tech platform was identified in just over 2,100 of the offences, with messaging app Snapchat the most widely used, featuring in around 40% of those cases.

The NSPCC said 9% of cases happened on WhatsApp, and 9% on Facebook and Instagram. All those platforms are owned by Meta.

While girls made up 80% of victims in cases where the gender was known in the past year, the youngest victim in that period was a four-year-old boy, the charity said.

The NSPCC said it wasn’t told how the boy had been groomed, and declined to say which police force recorded this crime for fear the child might be identified.

One possible reason for the increase is the introduction of a new offence of sexual communication with a child, brought into force in England and Wales in April 2017 and aimed at tackling groomers who target under-16s through mobile phones and social media.

The offence has been recorded in Northern Ireland since 2015, while a similar offence was introduced in Scotland in 2010.

The UK has also since introduced the Online Safety Act. Last month, a married father became the first person in the UK jailed for encouraging a child to self-harm, after he created a secret online world to control and abuse a young girl.

Despite the large increase, the actual number of victims could be even higher, the charity said, as each offence recorded by police may involve more than one victim and multiple methods of communication.

The society also warned that the true number of grooming offences being committed is likely to be “much higher, due to abuse happening in private spaces where harms can be harder to detect”.

The high proportion of offences on Snapchat could be down to its popularity among British children, almost three-quarters of whom use the platform, according to Matthew Sowemimo, the charity's associate head of child safety online.

Mr Sowemimo said Snapchat's "quick add" feature makes it easy for users to add one another, allowing adults to send direct messages to "a very large number of child users".


Perpetrators have adapted their methods to take advantage of the opportunities presented, the NSPCC said.

Its research found that predators create multiple profiles and manipulate young users into engaging with them across different platforms.

The charity called on tech firms to analyse the metadata they have access to, to spot suspicious patterns of behaviour.

The charity said this would not involve reading private messages, but could flag where adults repeatedly contact large numbers of children or create fake profiles – strong indicators of grooming.

A spokesperson for Snapchat said: “We work closely with the police, safety experts, and NGOs in an effort to prevent, identify, and remove this activity from our platform and, where appropriate, we report offenders to help secure justice for victims.

“We block teens from showing up in search results unless they have multiple mutual connections and they have to be mutual friends or existing phone contacts before they can communicate directly.

“We also deploy in-app warnings for teens to help prevent unwanted contact from people they may not know. We will keep strengthening our safety tools with the goal of making Snapchat an inhospitable place for people intent on doing harm.”

Meanwhile, Meta says it uses technology to "proactively identify child exploitation content" on its platforms, and that between January and March this year it removed over six million pieces of such content from Facebook and Instagram, over 97% of which, it said, was found proactively before being reported.

It says it already provides its users with the protections recommended in the NSPCC's report.

Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK
