
(Pocket-lint) - The NSPCC (National Society for the Prevention of Cruelty to Children) has reported increasing rates of inappropriate sexual communication with children on social networks, known as grooming.

Using data obtained through Freedom of Information requests to 39 of the UK's police forces, the charity established the number of reported crimes involving inappropriate online contact with a child, showing a rising rate of incidents. In some cases, initial online contact led to physical abuse.

The number of cases reported over the past 18 months stands at 5,161. The NSPCC says there has been a 200 per cent increase in the use of Instagram to target children over that period, and highlights that Snapchat, Instagram and Facebook are involved in 70 per cent of such cases, according to Sky News.

The charity is calling on the Government to speed up legislation that would hold social media platforms responsible for their content and services, with Peter Wanless, NSPCC chief executive, saying: "We cannot wait for the next tragedy before tech companies are made to act. It is hugely concerning to see the sharp spike in grooming offences on Instagram, and it is vital that the platform designs basic protection more carefully into the service it offers young people."

While the Government is due to release a white paper on online harms, the reported figures once again cast social networks in a bad light. Although their open, largely self-policing nature has been key to their success, barely a week passes without one of the online giants making headline news.

The response from social networks is likely to be to point to their minimum age requirements - 13 years old in the case of Facebook and Instagram - an age limit that is widely ignored, as parents fail to heed the repeated message that unsupervised online access can expose children to harm.

Instagram recently found itself in hot water over self-harming images, resulting in Adam Mosseri, head of Instagram, responding online saying: "we need to do more to keep the most vulnerable people who use Instagram safe."

A spokesperson for Facebook and Instagram said: "Keeping young people safe on our platforms is our top priority and child exploitation of any kind is not allowed. We use advanced technology and work closely with the police and CEOP [Child Exploitation and Online Protection command] to aggressively fight this type of content and protect young people."

The news comes hot on the heels of the Momo Challenge hoax and YouTube's confirmation that it will disable comments on videos featuring children following predatory behaviour on the platform, while TikTok has been fined for collecting children's data.

The NSPCC calls out "10 years of failed self-regulation by social networks" and is inviting people to sign its #WildWestWeb petition.

Writing by Chris Hall. Originally published on 1 March 2019.