Trust & Safety


Community Guidelines & Content Policies


Our team reviews reports of abuse, ranging from racist remarks and body shaming to revenge porn and sexual offences.

See: Community Guidelines & Content Policy

Profile Verification

Verification photos are used solely for verification purposes. Verification is required by our banking partners and helps us combat child sexual abuse material (CSAM) and non-consensual intimate imagery.

All verification photos are encrypted upon upload, and we maintain a secure audit trail recording who views each verification photo and when.
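To give a rough idea of what such an audit trail can look like, here is a minimal sketch in Python. The function, file path, and field names are hypothetical and only illustrate the general idea of recording who viewed a verification photo and when; they do not describe WAP.nl's actual systems.

    import json
    from datetime import datetime, timezone

    def record_photo_view(audit_log_path: str, photo_id: str, viewer_id: str) -> None:
        # Append one audit entry per view: which verification photo was opened,
        # by which staff member, and at what time (UTC).
        entry = {
            "photo_id": photo_id,
            "viewer_id": viewer_id,
            "viewed_at": datetime.now(timezone.utc).isoformat(),
        }
        with open(audit_log_path, "a", encoding="utf-8") as log:
            log.write(json.dumps(entry) + "\n")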

Age Verification

During the upload process, we use machine learning to estimate the age of individuals appearing in photos or videos. This helps us identify potentially underage members while upholding our promise to protect and respect everyone’s privacy.

Transparency Report

We share our Transparency Report every month with our partners to give them insight into the platform’s overall health.

Content Safeguards

Content Review During Upload

When content is uploaded, we use AI and machine learning to detect whether a photo or video might contain illegal material or content that violates our policies. Examples include, but are not limited to, CSAM, hate symbols, and graphic violence.

If the system flags content as likely violating our guidelines, it is reviewed by a human moderator. We aim to review 95% of flagged items within 30 minutes.
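At a high level, this kind of upload-time screening can be sketched as follows. The classifier callback, threshold value, and review queue are hypothetical stand-ins used purely for illustration; they are not WAP.nl's actual moderation system.

    from dataclasses import dataclass
    from typing import Callable, Dict, List

    FLAG_THRESHOLD = 0.5  # illustrative threshold, not a real setting

    @dataclass
    class FlaggedItem:
        uploader_id: str
        reason: str
        score: float

    review_queue: List[FlaggedItem] = []  # stand-in for the human-moderation queue

    def screen_upload(media: bytes, uploader_id: str,
                      classify: Callable[[bytes], Dict[str, float]]) -> str:
        # Score the upload against each policy category, e.g.
        # {"csam": 0.01, "hate_symbol": 0.72, "graphic_violence": 0.10}.
        scores = classify(media)
        label, score = max(scores.items(), key=lambda kv: kv[1])
        if score >= FLAG_THRESHOLD:
            # Likely violation: hold the content and let a moderator decide.
            review_queue.append(FlaggedItem(uploader_id, label, score))
            return "pending_human_review"
        return "published"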

We also work with Cloudflare to prevent the sharing of CSAM and non-consensual intimate images.

CSAM Detection & Reporting

Every uploaded photo or video is analyzed by AI to estimate the age of all persons depicted.

If there’s a possibility that someone appears to be underage, the content is reviewed by a moderator. When content is confirmed as CSAM, the account is immediately terminated, the evidence is preserved, and the case is reported to the appropriate authorities.
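The escalation path described above can be sketched roughly like this. The age threshold, return values, and case record are illustrative assumptions for the sake of the example, not WAP.nl's actual procedure.

    from datetime import datetime, timezone
    from typing import List

    AGE_THRESHOLD = 18  # illustrative cut-off for flagging possible minors

    def handle_age_flags(estimated_ages: List[float], account_id: str,
                         moderator_confirms_csam: bool) -> dict:
        # No one in the content appears underage: nothing to do.
        if all(age >= AGE_THRESHOLD for age in estimated_ages):
            return {"action": "none"}
        # A moderator reviewed the flagged content and did not confirm CSAM.
        if not moderator_confirms_csam:
            return {"action": "cleared_after_review"}
        # Confirmed CSAM: terminate the account, preserve the evidence,
        # and report the case to the appropriate authorities.
        return {
            "action": "account_terminated",
            "account_id": account_id,
            "evidence_preserved": True,
            "reported_to_authorities_at": datetime.now(timezone.utc).isoformat(),
        }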

Easy Removal Requests for People in Photos or Videos

We believe that consent can be withdrawn at any time. If you appear in a photo or video on another user’s profile and wish to have it removed, we will remove it for you, no questions asked.

See: Removal Requests for People in Photos or Videos

DMCA Takedown Requests

We take content ownership seriously and respond to all valid DMCA notices.
We encourage users to use our official DMCA form when submitting a takedown request.

See: Submit a DMCA Takedown Request

No Download Option

We do not offer any feature that allows others to download your photos or videos, and we have implemented technical measures to make it more difficult to extract content from wap.nl.

Personal Safeguards

Data Protection by Design

We treat your private data as we’d want others to treat ours.
We only ask for information that’s essential for WAP.nl to function.

See: Privacy Policy

Meta Information

Metadata attached to photos or videos can sometimes reveal sensitive details about the uploader — such as where the content was captured.

When content is uploaded, all metadata is stripped before the content is processed or displayed on WAP.nl.
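As a rough illustration of this kind of metadata stripping, the sketch below uses the Pillow imaging library to re-save an image with only its pixel data, dropping EXIF fields such as GPS coordinates. It shows the general technique only and is not WAP.nl's actual upload pipeline.

    from PIL import Image

    def strip_metadata(src_path: str, dst_path: str) -> None:
        # Copy only the pixel data into a fresh image, so EXIF, GPS, and
        # other embedded metadata are not carried over when re-saving.
        with Image.open(src_path) as original:
            clean = Image.new(original.mode, original.size)
            clean.putdata(list(original.getdata()))
            clean.save(dst_path)

    # Example: strip_metadata("upload.jpg", "upload_clean.jpg")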

Inbox Controls

Members can choose their level of message control, deciding who can send them private messages. They can also limit who may send photos in private chats.

Reporting

Every page includes an option to report content or behavior. The report button appears next to all types of content, including posts, photos, videos, events, groups, and comments.

While reporting through the platform is the fastest way to alert us, you can also email us at support@wap.nl.

For copyright issues, please file a DMCA Takedown Request.


 
