WIREOPEDIA

AI driving ‘explosion’ of fake nudes as victims say law is failing them

by Wireopedia member
December 6, 2024
in Breaking News, UK News, World

Campaigners are warning the use of artificial intelligence (AI) to create realistic but fake nude images of real women is becoming “normalised”.


It’s also an increasing concern in schools. A recent survey by Internet Matters found 13% of teenagers have had an experience with nude deepfakes, while the NSPCC told Sky News “a new harm is developing”.

Ofcom will later this month introduce codes of practice for internet companies to clamp down on the illegal distribution of fake nudes, but Sky News has met two victims of this relatively new trend, who say the law needs to go further.

Earlier this year, social media influencer and former Love Island contestant Cally Jane Beech, 33, was horrified to discover that someone had used AI to turn an underwear brand photograph of her into a nude image, which was being shared online.

The original image had been uploaded to a site that uses software to digitally transform a clothed picture into a naked picture.

She told Sky News: “It looked so realistic, like nobody but me would know. It was like looking at me, but also not me.”

She added: “There shouldn’t be such a thing. It’s not a colouring book. It’s not a bit of fun. It’s people’s identity and stripping their clothes off.”


When Cally reported what had happened to the police, she struggled to get them to treat it as a crime.

“They didn’t really know what they could do about it, and because the site that hosted the image was global, they said that it’s out of their jurisdiction,” she said.

In November, Assistant Chief Constable Samantha Miller, of the National Police Chiefs’ Council, addressed a committee of MPs on the issue and concluded “the system is failing”, with a lack of capacity and inconsistency of practice across forces.

ACC Miller told the women and equalities committee she’d recently spoken to a campaigner who was in contact with 450 victims and “only two of them had a positive experience of policing”.

The government says new legislation outlawing the generation of AI nudes is coming next year, although it is already illegal to make fake nudes of minors.

Meanwhile, the problem is growing with multiple apps available for the purpose of unclothing people in photographs. Anyone can become a victim, although it is nearly always women.

Professor Clare McGlynn, an expert in online harms, said: “We’ve seen an exponential rise in the use of sexually explicit deepfakes. For example, one of the largest, most notorious websites dedicated to this abuse receives about 14 million hits a month.

“These nudify apps are easy to get from the app store, and they’re advertised on TikTok. So, of course, young people are downloading them and using them. We’ve normalised the use of these nudify apps.”

‘Betrayed by my best friend’

Sky News spoke to “Jodie” (not her real name) from Cambridge who was tipped off by an anonymous email that she appeared to be in sex videos on a pornographic website.

“The images that I posted on Instagram and Facebook, which were fully clothed, were manipulated and turned into sexually explicit material,” she said.

Jodie began to suspect someone she knew was posting pictures and encouraging people online to manipulate them.

Then she found a particular photograph, taken outside King’s College in Cambridge, that only one person had.

It was her best friend, Alex Woolf. She had airdropped the picture to him alone.

Woolf, who once won BBC Young Composer of the Year, was later convicted of offences against 15 women, largely thanks to Jodie’s perseverance and detective work.

Even then, his conviction related only to the offensive comments attached to the images, because while it is illegal to share such images, it is not a crime to ask others to create them.

He was given a suspended sentence and ordered to pay £100 to each of his victims.

Jodie believes it’s imperative new laws are introduced to outlaw making and soliciting these types of images.

“My abuse is not your fun,” she said.

“Online abuse has the same effect psychologically that physical abuse does. I became suicidal, I wasn’t able to trust those closest to me because I had been betrayed by my best friend. And the effect of that on a person is monumental.”

‘A scary, lonely place’

A survey in October by Teacher Tapp found 7% of teachers answered yes to the question: “In the last 12 months, have you had an incident of a student using technology to create a fake sexually graphic image of a classmate?”

In their campaigning, both Cally and Jodie have come across examples of schoolgirls being deepfaked.

Cally said: “It is used as a form of bullying because they think it’s funny. But it can have such a mental toll, and it must be a very scary and lonely place for a young girl to be dealing with that.”


The NSPCC said it has had calls about nude deepfakes to its helpline.

The charity’s policy manager for child safety online, Rani Govender, said the pictures can be used as “part of a grooming process” or as a form of blackmail, as well as being passed around by classmates “as a form of bullying and harassment”.

“Children become scared, isolated and they worry they won’t be believed that the images are created by someone else,” Ms Govender said.

She added: “This is a new harm, and it is developing, and it will require new measures from the government with child protection as a priority.”

Alex Davies-Jones, under-secretary of state for victims, told MPs in November: “We’ve committed to making an offence of creating a deepfake illegal and we will be legislating for that this session.”

For campaigners like Jodie and Cally, the new laws cannot come soon enough. However, they worry the legislation will not contain strong enough clauses banning the soliciting of content and ensuring images are removed once they have been discovered.

Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email [email protected] in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.


© 2024 WIREOPEDIA - All rights reserved.

No Result
View All Result
  • Home
  • Breaking News
  • World
  • UK
  • US
  • Entertainment
  • Business
  • Technology
  • Defense
  • Health Care
  • Politics
  • Strange
  • Crypto News
  • Contribute!

© 2024 WIREOPEDIA - All right reserved.

  • bitcoinBitcoin(BTC)$102,063.001.93%
  • ethereumEthereum(ETH)$3,397.385.03%
  • tetherTether(USDT)$1.000.01%
  • rippleXRP(XRP)$2.284.40%
  • binancecoinBNB(BNB)$992.916.19%
  • solanaSolana(SOL)$159.024.26%
  • usd-coinUSDC(USDC)$1.000.00%
  • staked-etherLido Staked Ether(STETH)$3,396.185.09%
  • tronTRON(TRX)$0.2922102.68%
  • dogecoinDogecoin(DOGE)$0.17822010.10%
  • cardanoCardano(ADA)$0.578.89%
  • wrapped-bitcoinWrapped Bitcoin(WBTC)$101,776.001.77%
  • chainlinkChainlink(LINK)$15.546.78%
  • bitcoin-cashBitcoin Cash(BCH)$494.543.75%
  • stellarStellar(XLM)$0.2850536.19%
  • litecoinLitecoin(LTC)$100.3214.84%
  • avalanche-2Avalanche(AVAX)$17.567.64%
  • shiba-inuShiba Inu(SHIB)$0.0000109.53%
  • polkadotPolkadot(DOT)$3.2718.72%
  • crypto-com-chainCronos(CRO)$0.1275174.63%
  • daiDai(DAI)$1.000.01%
  • uniswapUniswap(UNI)$6.0212.83%
  • nearNEAR Protocol(NEAR)$2.9526.05%
  • okbOKB(OKB)$123.555.92%
  • filecoinFilecoin(FIL)$3.4562.09%
  • algorandAlgorand(ALGO)$0.1786059.88%
  • vechainVeChain(VET)$0.01715014.08%
  • cosmosCosmos Hub(ATOM)$2.998.62%
  • elrond-erd-2MultiversX(EGLD)$10.4821.15%