TikTok star launches ‘OnlyNans’ campaign urging young users to report anything that would offend their gran

A TikTok star has launched an Ofcom-backed campaign called ‘OnlyNans’ to encourage young people to report harmful content that would offend their grandmothers.

Lewis Leigh gained millions of followers after posting sweet videos of himself dancing with his elderly grandmother, Phyllis, on the popular video-sharing app.

He has now launched the campaign urging people to report content that would offend their grans because ‘nans are the best judges out there’.

Lewis became popular on the app during lockdown, and wanted to launch the campaign with Ofcom after scrolling through some questionable content.

He said: ‘Nans always give the best advice. So next time you’re scrolling through your phone and come across something you’re not quite sure about, ask yourself, “What would my nan think?”

‘If it’s a “no” from nan then perhaps think about reporting it.’

Ofcom, the communications regulator, found that 67 per cent of teenagers and young adults, aged 13-24, had encountered at least one potentially harmful piece of content online.

Lewis has launched the campaign with Ofcom and his grandmother Phyllis, pictured, after becoming popular on TikTok during the pandemic. He is urging other young people to report any online content that they think could potentially be harmful


I love social media, but there can be some harmful content out there! We’re all guilty of scrolling past it but the only way to get rid of it is to report it! So, I’ve teamed up with @Ofcom to play OnlyNans, and I’ve called in the big guns to help me… My Nanny Phyllis! #ad

♬ original sound – Lewis Leigh

But the research also found that only 17 per cent went on to report it, as more than 20 per cent said they don’t think reporting something would make a difference.


They also found that 12 per cent of those involved in the survey didn’t know what to do when they saw harmful content, or who to inform about it.

The most common types of content that young people came across were misinformation, scams and offensive language.

His campaign comes as the Online Safety Bill makes its way through parliament, which will give Ofcom powers to fine social media platforms if they fail in a duty of care.

Ofcom will be able to hand out fines of up to £18million, or ten per cent of the company’s qualifying revenue.

As well as forcing social media companies to delete illegal content, such as child abuse imagery, they will also have to try to get rid of some ‘hate crime offences’, even though these would be allowed in the real world because of freedom of expression protections.

News publishers have campaigned for a complete exemption from the Online Safety Bill since its white paper launch three years ago.

They are concerned that the latest version of the Bill appears not to address a recommendation from MPs for an amendment to protect Press freedom.

The joint parliamentary committee which scrutinised the Bill said it should include a ban on tech companies blocking news content unless it is in breach of criminal law.

Social media bosses could be jailed if they fail to cooperate with regulators on protecting the vulnerable online, under updated legislation.

The campaign comes after Molly Russell, pictured, took her own life in 2017 after viewing thousands of posts about suicide and self-harm on Instagram. The 14-year-old's dad Ian has been campaigning for tighter laws around online safety since her death


An earlier version of the Online Safety Bill, published last year, said tech firms could be fined huge amounts – potentially running into billions of pounds – if they did not abide by a duty of care.

Ministers had avoided making bosses personally liable for company failings, but now senior managers will face prosecution for breaching the duty of care.

The legislation is dubbed the Nick Clegg law, as the former deputy prime minister is now vice president for global affairs and communications at Facebook.

Children’s charities and worried families have long campaigned for social media companies to be prosecuted if they fail to crack down on self-harm material.

It comes after a teenage girl took her own life after viewing thousands of posts online about suicide and self-harm.

Molly Russell, 14, took her own life in 2017 after scrolling through the graphic images on Instagram, with her dad Ian Russell telling MPs that he had ‘frustratingly limited success’ when asking companies to take down content.

The online safety campaigner said tech companies only appeared to take action when ‘news stories break’ or when the Government changes regulations.

Mr Russell said the ‘corporate culture’ at the platforms must change so that they respond to harmful content in a ‘proactive’ rather than a ‘reactive’ way.

Giving evidence to MPs on the Draft Online Safety Bill Joint Committee last year, he said: ‘It’s our experience that there’s frustratingly limited success when harmful content is asked to be removed by the platforms, particularly in terms of self-harm and suicidal content, and this is particularly stressful for families and friends who are bereaved by suicide.

‘It seems only when either news stories break in a particularly public way or when perhaps regulations change that the platforms respond… so it has become our view, and increasingly so, that the corporate culture at these platforms needs to change.

‘They need to be proactive rather than reactive, and they in any case have the resources and the skills to do this.

‘But it’s so often done as an afterthought, and they should live up to their words about taking online safety seriously and wanting to make their platforms safer.’

TikTok’s latest transparency report revealed that 85.8 million pieces of content were removed in the last three months of 2021.

Of those, 5 per cent were removed as a result of user reports, while Instagram reported 43.8 million content removals.

Jo Hemmings, a behavioural psychologist, told The Times that young people not reporting potentially harmful content ‘risks a potentially serious issue going unchallenged’.

She added: ‘People react very differently when they see something harmful in real life — reporting it to the police or asking for help from a friend, parent or guardian — but often take very little action when they see the same thing in the digital world.
