In some cases, sexual abuse, such as forcible rape, occurs during production. Pornographic pictures of minors are also often produced by children and teenagers themselves, without the involvement of an adult. Referring to child sexual abuse material as pornography puts the focus on how the materials are used rather than on the impact they have on the children involved.
AI-generated child sexual abuse images are spreading. Law enforcement is racing to stop them
The National Center for Missing & Exploited Children’s CyberTipline last year received about 4,700 reports of content involving AI technology, a small fraction of the more than 36 million total reports of suspected child sexual exploitation. By October of this year, the group was fielding about 450 reports of AI-involved content per month, said Yiota Souras, the group’s chief legal officer. According to the child advocacy organization Enough Abuse, 37 states have criminalized AI-generated or AI-modified CSAM, either by amending existing child sexual abuse material laws or by enacting new ones. More than half of those 37 states enacted new laws or amended their existing ones within the past year.
Hertfordshire Police told us that a 14-year-old girl had managed to use her grandmother’s passport and bank details to sell explicit images. Leah’s age was reported directly to OnlyFans by an anonymous social media account in late January. The company says this led to a moderator reviewing the account and double-checking her ID. She told her mum she had originally intended only to post pictures of her feet, after making money selling them on Snapchat, but this soon escalated to explicit videos of her masturbating and playing with sex toys. BBC News has also heard from child protection experts across the UK and US, spoken to dozens of police forces and schools, and obtained anonymised extracts from Childline counsellor notes about underage experiences on OnlyFans.
Sites featuring terrorism or child pornography to be blocked in France
In many states, reports can be filed with child protection authorities anonymously, which means you can file without providing identifying information about who you are. If you have questions about filing, you can call a confidential helpline such as Child Help USA or the Stop It Now! helpline. If you file with an authority that is not best suited to take the report, ask them specifically who you should contact. Typically, reports should be filed in the area where you believe the abuse took place, not necessarily where the people involved are right now. The government says the Online Safety Bill will allow the regulator Ofcom to block access to, or fine, companies that fail to take more responsibility for users’ safety on their social-media platforms.
- A U.S. Army soldier accused of creating images depicting children he knew being sexually abused.
- It was shut down last year after a UK investigation into a child sex offender uncovered its existence.
- With the recent significant advances in AI, it can be difficult, if not impossible, for law enforcement officials to distinguish images of real children from AI-generated ones.
- “Dark web child sex offenders…cannot hide from law enforcement,” the UK’s National Crime Agency investigations lead, Nikki Holland, said.
- More and more police departments are establishing Internet Crimes Against Children (ICAC) teams.
US: Alaska man busted with 10,000+ child sex abuse images despite his many encrypted apps
A report drawn up by SaferNet, an NGO that has promoted human rights online since 2005, found that 1.25 million users of the messaging app Telegram are in group chats or channels that sell and share images of child sexual abuse and pornographic material. One of these communities alone, which was still active when the survey was conducted, had 200,000 users. In addition, the NGO identified a further 66 links that had never been reported before and that also contained criminal content. Analysts upload the URLs of webpages containing AI-generated child sexual abuse images to a list that is shared with the tech industry so it can block the sites.
Their primary objective is to make sure the child is safe in their own home or when with adults who are responsible for their care. They also “restrict specific sensitive media, such as adult nudity and sexual behaviour, for viewers who are under 18 or viewers who do not include a birth date on their profile”. “We use a combination of state-of-the-art technology together with human monitoring and review to prevent children under the age of 18 from sharing content on OnlyFans,” the company says. OnlyFans says it cannot respond to these cases without being provided with account details, which the police were unable to pass on to us. It says it has a number of systems in place to prevent children from accessing the site and continues to look for new ways to enhance them. BBC News has discovered that under-18s are also appearing in explicit videos on accounts run by adults, in violation of OnlyFans’ guidelines.
Because the reports were provided to the BBC without any identifying details of the children or the OnlyFans accounts in question, we were unable to provide the platform with account names. As part of the investigation, we also spoke to schools, police forces and child protection experts, who told us they are hearing from under-18s whose experiences on the site have had serious consequences. BBC News was told the account was reported to police in the US in October 2020 but had not been removed until we contacted OnlyFans about the case this month. According to his friend Jordan, Aaron didn’t have his own account but instead “got sucked into” appearing in explicit videos posted by his girlfriend, Cody, who was a year older than him.