January 2024
Any content on our site that sexualizes, exploits, or endangers minors is strictly prohibited. We will report it if we locate it or are made aware of it.
NottyHub is firmly opposed to the distribution of child sexual abuse material (CSAM). This includes text, illustrations, and computer-generated imagery. We consider it our obligation to ensure that our platform is not used to share or consume CSAM, and to deter users from searching for it.
Any material that features or depicts a child (actual, fictional, or animated) or promotes child sexual exploitation is strictly prohibited on our site and constitutes a serious violation of our Terms of Service. This applies equally to written material, including but not limited to comments, content titles, content descriptions, messages, and usernames.
Further, NottyHub supports and endorses the Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse, a collaborative initiative launched by the Five Country Ministerial (Five Eyes) and backed by industry-leading technology companies. While some bad actors seek to exploit advances in technology and the digital world, we believe that participation in global cross-sector collaboration, together with rigorous, effective, and adaptable rules, can curb the spread of online abuse.
If you encounter child sexual abuse material on NottyHub, please report it to us via our Content Removal Request Form. For information on how to report Child Sexual Abuse Material please refer to the Additional Resources and Support section below.
Anyone can report potential violations of this policy.
For more information on how to report content, see the section titled "How can you help us?" below.
All reports submitted to NottyHub are kept confidential and are evaluated by human moderators, who work quickly to address the content responsibly. If you believe a child is in imminent danger, please notify the appropriate law enforcement authorities immediately.
DO NOT upload any content (whether visual, audio, or written) that*:
Features, involves, or depicts a child.
Sexualizes a child. This includes any content that shows, involves, or portrays a child (including any illustrated, computer-generated, or other realistic depiction of a human child) engaging in sexually explicit or suggestive behavior.
* This list is illustrative and not intended to be exhaustive. Please see the Prohibited Uses section of our Terms of Service for a more thorough overview. NottyHub reserves the right at all times to determine whether content is appropriate and complies with our Terms of Service, and may remove content at any time, without prior notice and at its sole discretion.
We have robust policies, operational procedures, and technology in place to combat and respond quickly to CSAM. We also cooperate with authorities on investigations and respond promptly to valid legal requests to help stop the spread of CSAM on our platform.
Our team of human moderators works around the clock to review all uploaded content and prevent any content that violates our CSAM or other policies from appearing on our site. Furthermore, when we are notified of an actual or suspected instance of CSAM on the platform, we remove the content, investigate it, and report any material identified as CSAM.
As part of our continuous efforts, we audit our websites on a regular basis to maintain and broaden our list of prohibited search phrases, titles, and tags in order to keep our community safe, inclusive, diverse, and free of abusive and unlawful content.
In addition to our team of human moderators and regular platform audits, we rely on industry-standard technology solutions to help find, report, and remove CSAM and other forms of unlawful content from our platform. To keep CSAM off our platform, we deploy automated detection technologies as additional layers of defense.
These technologies include:
YouTube’s CSAI Match, a tool that assists in identifying known videos of child sexual abuse;
Microsoft’s PhotoDNA, a tool that aids in detecting and removing known images of child sexual abuse;
Google's Content Safety API, a cutting-edge artificial intelligence (AI) tool that scores and prioritizes content based on the likelihood of illegal imagery to assist reviewers in detecting unknown CSAM;
Safer, Thorn's comprehensive CSAM detection tool, used to keep platforms free of abusive material;
Vobile®'s MediaWise® service, a state-of-the-art fingerprinting software and database that scans all new user uploads to help prevent previously identified offending content from being re-uploaded;
Safeguard, our proprietary image fingerprinting and recognition technology, designed to combat both child sexual abuse imagery and the distribution of non-consensual intimate images, and to help prevent the re-upload of that content to our platform.
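In essence, fingerprint-matching tools like those listed above compare a compact signature of each new upload against a database of signatures from previously identified material. The following hypothetical Python sketch illustrates only the matching idea, using a plain cryptographic hash; production systems such as PhotoDNA instead use perceptual hashes that survive resizing and re-encoding, which a cryptographic hash cannot do. All names and data below are illustrative, not part of any real system.

```python
import hashlib

# Illustrative database of fingerprints of previously identified
# offending files. In a real system these would be perceptual hashes
# supplied by trusted sources, not cryptographic hashes computed here.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"previously-identified-content").hexdigest(),
}

def is_known_content(upload_bytes: bytes) -> bool:
    """Return True if the upload's fingerprint matches a known one.

    A cryptographic hash only catches byte-identical re-uploads; this
    is a simplification for illustration.
    """
    fingerprint = hashlib.sha256(upload_bytes).hexdigest()
    return fingerprint in KNOWN_FINGERPRINTS

# A matched upload would be blocked before publication and escalated
# for review; an unmatched one proceeds through normal moderation.
print(is_known_content(b"previously-identified-content"))  # True
print(is_known_content(b"a new, unrelated upload"))        # False
```

The design rationale for hash matching is that the platform never needs to store or redistribute the offending material itself: only the irreversible fingerprints are retained and compared.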
To reinforce the many approaches we use to prevent the upload and publication of potential or actual CSAM, we also use age estimation capabilities to review content submitted to our platform, using a combination of internal proprietary algorithms and the Microsoft Azure Face API.
Together, these capabilities are critical to our fight against the spread of CSAM on our platform, and to our support of industry efforts to eradicate the heinous worldwide crime of online child sexual exploitation and abuse.
How can you help us?
If you feel you have discovered CSAM or any other content that violates our Terms of Service, please notify us immediately by reporting the item for our examination.
If you are a victim or have firsthand knowledge that content violates our CSAM policy, please report it to us by filling out and submitting our Content Removal Request Form. Please include any pertinent URL links to the content in question, and we will respond to your request discreetly and promptly.
Anyone, whether or not they have an account on our site, can report breaches of this policy using our Content Removal Request Form.
We maintain a zero-tolerance policy toward any content involving children or containing child sexual abuse material. Any child sexual abuse material that we discover or become aware of results in removal of the material and suspension of the uploader. All instances of suspected CSAM are reported to the National Center for Missing & Exploited Children (NCMEC).
If you believe a child is in imminent danger, you should reach out to your local law enforcement agency to report the situation immediately.
You may also contact and report incidents of child sexual exploitation or abuse material to any of the resource groups committed to eradicating and preventing child sexual exploitation listed below. Reports may be filed anonymously and play an important role in ensuring the protection of minors.
International Association of Internet Hotlines
Canadian Centre for Child Protection
We collaborate with a number of groups devoted to combating child sexual exploitation across the world.
The Five Country Ministerial is an annual gathering of the homeland security, public safety, and immigration ministers of Australia, Canada, New Zealand, the United Kingdom, and the United States to address shared security concerns.