Child Sexual Abuse & Exploitation (CSAE) Zero‑Tolerance Policy

Last updated: September 11, 2025 · Applies to: All Alinfate products and services (alinfate.com)

1. Our Stance (Zero Tolerance)

Alinfate strictly prohibits any form of child sexual abuse and exploitation (CSAE), including the creation, possession, distribution, or solicitation of, or linking to, child sexual abuse material (CSAM). We remove such content, permanently ban the accounts involved, and cooperate with law enforcement where required or permitted by law.

2. Definitions

3. Prohibited Content & Conduct (Non‑exhaustive)

4. Age Assurance & Access Restrictions

Alinfate does not target minors. Where appropriate, we implement a neutral age screen or equivalent measures to prevent minors from accessing adult‑only features and content. If any experience could be accessed by both children and older audiences, additional safeguards (including ad/SDK restrictions) will apply.
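
For illustration only, a neutral age screen can be as simple as a date‑of‑birth check whose prompt never reveals the cut‑off. The Kotlin sketch below is a minimal example; the ADULT_AGE_THRESHOLD constant and type names are placeholders, and the actual threshold depends on the jurisdiction and the feature being gated.

    import java.time.LocalDate
    import java.time.Period

    // Hypothetical threshold; the applicable age of majority varies by jurisdiction.
    const val ADULT_AGE_THRESHOLD = 18

    enum class AgeGateResult { ADULT_FEATURES_ALLOWED, RESTRICTED_EXPERIENCE }

    // Neutral age screen: the UI asks only for a date of birth and never hints at
    // the required age, so users are not nudged toward entering a false date.
    fun evaluateAgeGate(birthDate: LocalDate, today: LocalDate = LocalDate.now()): AgeGateResult {
        val age = Period.between(birthDate, today).years
        return if (age >= ADULT_AGE_THRESHOLD) {
            AgeGateResult.ADULT_FEATURES_ALLOWED
        } else {
            // Below the threshold: adult-only features stay hidden and ad/SDK
            // restrictions for younger audiences apply.
            AgeGateResult.RESTRICTED_EXPERIENCE
        }
    }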

5. In‑App Reporting & Blocking

Users can report content or accounts and block unwanted users directly in the app via profile menus, content overflow menus, and chat actions. Reports can be submitted under the “Child Safety / CSAE” category with optional evidence. Misuse of reporting features may lead to restrictions.
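
As an illustration of what such a report might carry, the Kotlin sketch below models a minimal report payload with a dedicated child‑safety category and a blocking hook; the field, category, and interface names are assumptions, not Alinfate's actual schema.

    // Illustrative report payload and safety hooks; names are not Alinfate's actual schema.
    enum class ReportCategory { CHILD_SAFETY_CSAE, HARASSMENT, SPAM, OTHER }

    data class AbuseReport(
        val reporterId: String,
        val targetUserId: String?,        // set when reporting an account
        val targetContentId: String?,     // set when reporting a piece of content
        val category: ReportCategory,
        val evidenceNote: String? = null  // optional free-text evidence
    )

    interface SafetyActions {
        fun submitReport(report: AbuseReport)                    // routed to Trust & Safety review
        fun blockUser(blockerId: String, blockedUserId: String)  // hides the blocked account's content and messages
    }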

6. Moderation & Enforcement

7. Legal Reporting

Where U.S. law applies, we report apparent CSAM to the National Center for Missing & Exploited Children (NCMEC) CyberTipline upon obtaining actual knowledge of it, pursuant to 18 U.S.C. §2258A. We also report to competent authorities in other jurisdictions, consistent with applicable law.

8. Evidence Preservation & Privacy

We preserve only the minimum data necessary to fulfill legal obligations and protect users. Under the U.S. REPORT Act (2024) amendments to 18 U.S.C. §2258A, CyberTipline report materials are preserved for at least one year, and we may voluntarily preserve them longer to reduce the proliferation of child sexual exploitation. These materials are stored in a manner aligned with the latest NIST Cybersecurity Framework, and access to them is strictly permissioned and audited.
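
As a rough sketch of that retention floor (the function and constant names below are hypothetical, and any voluntary extension is an internal policy choice rather than a legal requirement):

    import java.time.LocalDate

    // One-year minimum from the REPORT Act amendments to 18 U.S.C. §2258A.
    const val MINIMUM_PRESERVATION_DAYS = 365L

    // Earliest date on which CyberTipline report materials could be deleted.
    fun earliestDeletionDate(reportFiledOn: LocalDate, voluntaryExtensionDays: Long = 0L): LocalDate =
        reportFiledOn.plusDays(MINIMUM_PRESERVATION_DAYS + voluntaryExtensionDays)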

9. User Education & Safer Design

10. Third Parties & Generative AI

Third‑party SDKs, ad providers, content partners, and generative AI capabilities integrated into Alinfate must comply with this policy. Any AI‑generated or manipulated content that sexualizes a minor—or makes an adult appear to be a minor in sexualized content—is strictly prohibited.

11. Appeals

Users may appeal enforcement decisions by contacting us with relevant details. For CSAE/CSAM, we will not delay protective actions during an appeal.

12. Designated Child Safety Point of Contact

Name/Role: Child Safety Team (Trust & Safety)
Email: thecoffecoder@gmail.com
Website: https://www.alinfate.com/

Developer checklist (for Google Play):
  1. Publish this policy on your website/help center and provide the URL in Play Console (Child Safety Standards declaration).
  2. Ensure in‑app reporting/blocking and continuous UGC moderation are active.
  3. Document an internal SOP for CSAM handling: immediate takedown → human review → legal reporting (as applicable) → evidence preservation (see the sketch after this list).
  4. Implement a neutral age screen and Families compliance if any experience could include children.
  5. Keep the child‑safety point of contact up‑to‑date.
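
A minimal sketch of the SOP in item 3, assuming hypothetical placeholder functions rather than Alinfate's actual internal systems:

    // Placeholder types and functions standing in for internal tooling.
    data class CsamCase(val contentId: String, val accountId: String)

    fun takeDownImmediately(case: CsamCase) { /* remove content and restrict the account */ }
    fun humanReview(case: CsamCase): Boolean = true /* trained reviewer confirms apparent CSAM */
    fun fileLegalReport(case: CsamCase) { /* e.g., NCMEC CyberTipline where U.S. law applies */ }
    fun preserveEvidence(case: CsamCase) { /* retain report materials for at least one year */ }

    fun handleSuspectedCsam(case: CsamCase) {
        takeDownImmediately(case)    // 1. immediate takedown
        if (humanReview(case)) {     // 2. human review
            fileLegalReport(case)    // 3. legal reporting, as applicable
        }
        preserveEvidence(case)       // 4. evidence preservation
    }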