
Number of children groomed to send sex abuse images of themselves DOUBLES


Children are increasingly being groomed or coerced by adults into sexually abusing themselves on camera.

MPs have warned of a ‘disturbing’ rise in so-called ‘self-generated’ child sexual abuse material, especially during the pandemic.

In the first six months of 2021, the Internet Watch Foundation recorded a 117 per cent increase in abusive images and videos created using webcams or smartphones.

The All-Party Parliamentary Group on Social Media said social media firms must not encrypt messages unless they can keep platforms free of illegal content.

And it says the Home Office must review legislation to ensure it is easier for children to have their online images removed.


APPG chairman Labour MP Chris Elmore said firms ‘need to get a grip, with institutional re-design, including the introduction of a duty-of-care on the part of companies toward their young users’.

Susie Hargreaves, of the UK Safer Internet Centre, said: ‘The Report Remove tool we launched this year with Childline empowers young people to have illegal images of themselves removed.’ 

Self-generated content can include material filmed using webcams, very often in the child’s own room, and then shared online.

In some cases, children are groomed, deceived or extorted into producing and sharing a sexual image or video of themselves.

The APPG’s report, Selfie Generation – What’s Behind The Rise Of Self-Generated Indecent Images Of Children?, says the trend ‘seems to have been exacerbated by the Covid-19 crisis’.

The MPs say many witnesses ‘raised very real concerns’ about the impact of encryption on child protection, saying it could ‘cripple’ the ability of programmes to detect illegal imagery.

They write: ‘The APPG believes it is completely unacceptable for a company to encrypt a service that has many child users.

‘Doing this would do so much damage to child protection. We recommend that technology companies do not encrypt their services until a workable solution can be found that ensures equivalency with the current arrangements for the detection of this imagery.’

Among 10 recommendations, the report says the term ‘self-generated’ should be replaced by ‘first person produced imagery’ to avoid inadvertent victim blaming.




