How People Use the Internet to Sexually Exploit Children and Teens
Tlhako urged parents to monitor their children’s phone usage and the social media platforms they use. A massive amount of child sexual abuse material is traded on the dark web, a hidden part of the internet that cannot be accessed through regular browsers. Some people accidentally find sexual images of children and are curious or aroused by them; they may justify their behavior by saying they weren’t looking for the pictures and just “stumbled across” them.
Of the 2,401 ‘self-generated’ images and videos of 3–6-year-olds that we hashed this year, 91% were of girls, and most (62%) were assessed as Category C by our analysts. These images showed children in sexual poses, displaying their genitals to the camera.
- The Organization for Pornography and Sexual Exploitation Survivors (PAPS) is a nonprofit organization that offers counseling on how to request deletion of online child porn images and videos.
- Breaking a federal CSAM law is a serious crime, and those convicted of creating, sharing, accessing or receiving CSAM can face fines and other severe legal consequences.
- “If you’ve got a social-media site that allows 13-pluses on, then they should not be able to see pornography on it.”
- Yet, to be considered child sexual abuse, behaviors do not have to involve penetration of the vagina, anus, or mouth (by a penis, tongue, finger or object), or involve force.
- Our e-learning courses will help you manage, assess and respond to sexual harassment and abuse in primary and secondary schools.
- To be clear, the term ‘self-generated’ does not mean that the child is instigating the creation of this sexual content themselves; rather, they are being groomed, coerced and, in some cases, blackmailed into engaging in sexual behaviour.
Findings based on hashed image analysis
There is no obligation for a website to investigate, but OnlyFans told the BBC it checks social media when verifying accounts. According to Firman, it is not only users and the government who must strive to minimize negative content and harmful effects on digital platforms; platform providers are also responsible for ensuring that their services are friendly and safe for everyone. Child sexual abuse videos circulate widely on social media, in closed groups, on messaging applications, and on the dark web.
“Should I wait until I’m sure before I file a report?”
The Internet Watch Foundation has joined with a consortium of partners to develop the Artemis Survivor Hub (ASH), a victim-focused response to online child sexual exploitation. Image Intercept, the Internet Watch Foundation’s new tool for small businesses and startups, is designed to detect and stop known illegal imagery using hash-matching technology, helping eligible companies meet online safety obligations and keep users safe.
However, there was also a higher percentage of Category B images featuring more than one child. Category B images include those in which a child is rubbing genitals (categorised as masturbation) or in which there is non-penetrative sexual activity, where the children are interacting, perhaps touching each other in a sexual manner.
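The hash-matching mentioned above works, in principle, by reducing each file to a compact fingerprint and checking that fingerprint against a list covering already-identified illegal images. The sketch below only illustrates that general idea; it is not the IWF’s Image Intercept implementation, and the file name and hash list are hypothetical placeholders.

```python
# Minimal sketch of hash-matching against a known-image hash list.
# The hash values and file names here are hypothetical placeholders,
# not real data from any hotline or safety organisation.

import hashlib
from pathlib import Path


def sha256_of_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def matches_known_hash(path: Path, known_hashes: set[str]) -> bool:
    """True if the file's digest appears in the supplied hash list."""
    return sha256_of_file(path) in known_hashes


if __name__ == "__main__":
    # In practice the hash list would be supplied securely by a hotline
    # or safety organisation, never hard-coded like this placeholder.
    known_hashes = {
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }
    upload = Path("incoming_upload.jpg")
    if upload.exists() and matches_known_hash(upload, known_hashes):
        print("Match against known-hash list: block upload and escalate for review.")
    else:
        print("No match: continue with normal moderation checks.")
```

In practice, services of this kind typically rely on perceptual hashes rather than plain cryptographic digests, so that a re-encoded or resized copy of a known image still matches; a SHA-256 digest, as in this sketch, only matches byte-for-byte identical files.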
Top technology companies, including Google, OpenAI and Stability AI, have agreed to work with the anti-child-sexual-abuse organization Thorn to combat the spread of child sexual abuse images. The Supreme Court’s decisions in Ferber and Ashcroft could be used to argue that AI-generated sexually explicit images of real minors should not be protected as free speech, given the psychological harm inflicted on those minors, while the ruling in Ashcroft may permit AI-generated sexually explicit images of entirely fictional minors.
The city of Lancaster, Pennsylvania, was shaken by revelations in December 2023 that two local teenage boys had shared hundreds of nude images of girls in their community over a private chat on the social chat platform Discord. Witnesses said the photos could easily have been mistaken for real ones, but they were fake.
CSAM is illegal because it documents an actual crime: child sexual abuse. Children cannot legally consent to sexual activity, so they cannot participate in pornography. Encouraging youth to send sexually explicit pictures of themselves also produces child sexual abuse material (CSAM). The U.S. Department of Justice defines CSAM, or child pornography, as any sexually explicit image or video involving a minor (a child or teen under 18 years old). The legal definition of “sexually explicit” does not require that an image or video depict a child or teen engaging in sex; a picture of a naked child may be considered illegal CSAM if it is sufficiently sexually suggestive.