1. Our Stance (Zero Tolerance)
Alinfate strictly prohibits any form of child sexual abuse and exploitation (CSAE), including the creation, possession, distribution, solicitation, or linking of child sexual abuse material (CSAM). We remove such content, permanently ban involved accounts, and cooperate with law enforcement where required or permitted by law.
2. Definitions
- Child / Minor: As defined by applicable law (generally under 18 years of age).
- CSAE: Sexual exploitation or abuse of a minor, including grooming, coercion/extortion, trafficking, or any act that places a minor at sexual risk.
- CSAM: Any visual depiction of a minor engaged in sexually explicit conduct, including AI‑generated, manipulated, or deepfaked material.
3. Prohibited Content & Conduct (Non‑exhaustive)
- Producing, uploading, distributing, requesting, trading, or linking to CSAM.
- Sexualizing minors or using minors in sexualized contexts (text, images, audio/video, stickers/filters, AI/deepfake).
- Grooming, coercion/extortion (“sextortion”), enticement for sexual activity, or arranging in‑person meetings with minors for sexual purposes.
- Organizing, facilitating, or promoting child sex trafficking or any commercial sexual activity involving a minor.
- Attempting to obtain sexual imagery from minors or to exchange value for sexualized interactions.
4. Age Assurance & Access Restrictions
Alinfate does not target minors. Where appropriate, we implement a neutral age screen or equivalent measures to prevent minors from accessing adult‑only features and content. If any experience could be accessed by both children and older audiences, additional safeguards (including ad/SDK restrictions) will apply.
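For illustration only, the sketch below shows how a neutral age screen could be evaluated. It is written in Kotlin; the threshold constant, the AgeScreenResult type, and the session behaviour are assumptions made for explanation, not Alinfate's actual implementation.

```kotlin
import java.time.LocalDate
import java.time.Period

// Hypothetical names for illustration; the threshold and behaviour are assumptions.
const val ADULT_FEATURE_MIN_AGE = 18

enum class AgeScreenResult { ALLOW_ADULT_FEATURES, RESTRICT_TO_GENERAL, INVALID_INPUT }

fun evaluateAgeScreen(birthDate: LocalDate, today: LocalDate = LocalDate.now()): AgeScreenResult {
    // A "neutral" screen asks for a full date of birth without revealing the cutoff,
    // so users are not nudged toward entering a qualifying age.
    if (birthDate.isAfter(today)) return AgeScreenResult.INVALID_INPUT
    val age = Period.between(birthDate, today).years
    return if (age >= ADULT_FEATURE_MIN_AGE) {
        AgeScreenResult.ALLOW_ADULT_FEATURES
    } else {
        // Under-age users keep general features only; adult-only surfaces,
        // ads and SDKs are suppressed for this session.
        AgeScreenResult.RESTRICT_TO_GENERAL
    }
}

fun main() {
    println(evaluateAgeScreen(LocalDate.of(2012, 5, 1))) // RESTRICT_TO_GENERAL
    println(evaluateAgeScreen(LocalDate.of(1990, 5, 1))) // ALLOW_ADULT_FEATURES
}
```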
5. In‑App Reporting & Blocking
Users can report content or accounts and block unwanted users directly in‑app from profile menus, content overflow menus, and chat actions. Reports can be submitted under the “Child Safety / CSAE” category with optional evidence. Misuse of reporting features may lead to restrictions.
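As a non-authoritative sketch, the Kotlin snippet below shows the kind of data an in-app report in the “Child Safety / CSAE” category might carry and how such reports could be prioritized. ReportCategory, AbuseReport, and submitReport are hypothetical names used only for explanation, not Alinfate's real reporting API.

```kotlin
import java.time.Instant

// Hypothetical types for illustration; not Alinfate's real reporting API.
enum class ReportCategory { SPAM, HARASSMENT, CHILD_SAFETY_CSAE, OTHER }

data class AbuseReport(
    val reporterId: String,
    val targetContentId: String,
    val category: ReportCategory,
    val note: String? = null,                     // optional free-text context
    val evidenceUrls: List<String> = emptyList(), // optional evidence
    val createdAt: Instant = Instant.now(),
)

fun submitReport(report: AbuseReport) {
    // CSAE reports jump the normal moderation queue so a trained reviewer sees them first.
    val priority = if (report.category == ReportCategory.CHILD_SAFETY_CSAE) 0 else 10
    println("Queued report on ${report.targetContentId} at priority $priority")
}

fun main() {
    submitReport(
        AbuseReport(
            reporterId = "user-123",
            targetContentId = "post-456",
            category = ReportCategory.CHILD_SAFETY_CSAE,
            note = "Sexualized depiction of a minor.",
        )
    )
}
```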
6. Moderation & Enforcement
- Immediate action: Suspected CSAE/CSAM is prioritized for removal; live streams/chats may be interrupted; accounts may be frozen pending review.
- Human review: Trained trust & safety staff triage and make enforcement decisions.
- Enforcement ladder: Content removal → feature limits → permanent account bans → device/payment/network‑level countermeasures; referrals to law enforcement where appropriate (a simplified sketch follows this list).
- Recurrence prevention: Signals from repeat or organized abuse inform model and rule updates; we collaborate with relevant partners where lawful.
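As referenced above, a simplified sketch of the enforcement ladder expressed as an ordered Kotlin enum. The names and the escalation rule are illustrative assumptions; real decisions are made by trained reviewers, and confirmed CSAE always results in at least a permanent ban plus legal reporting.

```kotlin
// Hypothetical, simplified model of the ladder; real decisions are made by trained staff.
enum class EnforcementAction {
    CONTENT_REMOVAL,
    FEATURE_LIMITS,
    PERMANENT_BAN,
    NETWORK_LEVEL_COUNTERMEASURES, // device / payment / network signals
}

fun nextAction(previous: EnforcementAction?, confirmedCsae: Boolean): EnforcementAction = when {
    // Confirmed CSAE is never handled with a lighter first step.
    confirmedCsae -> EnforcementAction.PERMANENT_BAN
    previous == null -> EnforcementAction.CONTENT_REMOVAL
    previous.ordinal >= EnforcementAction.PERMANENT_BAN.ordinal ->
        EnforcementAction.NETWORK_LEVEL_COUNTERMEASURES
    else -> EnforcementAction.values()[previous.ordinal + 1]
}

fun main() {
    println(nextAction(previous = null, confirmedCsae = true))                               // PERMANENT_BAN
    println(nextAction(previous = EnforcementAction.CONTENT_REMOVAL, confirmedCsae = false)) // FEATURE_LIMITS
}
```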
7. Legal Reporting
Where U.S. law applies, upon obtaining actual knowledge of apparent CSAM, we will report to the National Center for Missing & Exploited Children (NCMEC) CyberTipline pursuant to 18 U.S.C. §2258A. We will also report to competent authorities in other jurisdictions, consistent with applicable law.
8. Evidence Preservation & Privacy
We preserve only the minimum data necessary to fulfill legal obligations and protect users. Under the U.S. REPORT Act (2024) amendments to 18 U.S.C. §2258A, CyberTipline report materials are preserved for at least one year; we may voluntarily preserve them longer to reduce the proliferation of child sexual exploitation, and we store them in a manner aligned with the latest NIST Cybersecurity Framework. Access is strictly permissioned and audited.
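A minimal sketch, assuming the one-year statutory floor described above, of how a preservation record for CyberTipline report materials could be modelled. The field names and the voluntary extension are illustrative, not a description of Alinfate's actual storage system.

```kotlin
import java.time.Duration
import java.time.Instant

// Hypothetical record; field names and the voluntary extension are illustrative.
data class PreservationRecord(
    val reportId: String,
    val preservedAt: Instant,
    val minimumRetention: Duration = Duration.ofDays(365), // statutory floor under §2258A as amended
    val voluntaryExtension: Duration = Duration.ZERO,       // optional longer preservation
) {
    // Earliest moment the materials may be considered for deletion.
    val eligibleForDeletionAt: Instant
        get() = preservedAt.plus(minimumRetention).plus(voluntaryExtension)
}

fun main() {
    val record = PreservationRecord("cybertip-0001", Instant.parse("2025-01-01T00:00:00Z"))
    println(record.eligibleForDeletionAt) // 2026-01-01T00:00:00Z
}
```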
9. User Education & Safer Design
- We provide help‑center guidance on recognizing grooming and sextortion and how to seek help.
- We apply safety defaults and term filters in search/recommendation to reduce exposure to risk, without limiting legitimate help‑seeking content (a minimal filtering sketch follows this list).
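As referenced in the list above, a minimal sketch of a search-term safety filter. The allowlist/blocklist approach, the term sets, and the function name are assumptions for illustration only; real term lists would be curated and reviewed.

```kotlin
// Hypothetical term lists and function; real lists are curated and reviewed.
val helpSeekingAllowlist = setOf("report csam", "sextortion help", "grooming warning signs")
val blockedQueryTerms: Set<String> = emptySet() // curated risk terms would be loaded here

fun isQueryAllowed(rawQuery: String): Boolean {
    val query = rawQuery.trim().lowercase()
    if (query in helpSeekingAllowlist) return true // never suppress help-seeking queries
    return blockedQueryTerms.none { it in query }  // suppress queries containing risk terms
}

fun main() {
    println(isQueryAllowed("sextortion help")) // true
}
```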
10. Third Parties & Generative AI
Third‑party SDKs, ad providers, content partners, and generative AI capabilities integrated into Alinfate must comply with this policy. Any AI‑generated or manipulated content that sexualizes a minor—or makes an adult appear to be a minor in sexualized content—is strictly prohibited.
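For illustration, a hedged sketch of how generated or manipulated media could be gated before publication. MinorSexualizationClassifier is a hypothetical interface standing in for whatever detection model or vendor is actually used; the threshold is an assumption, biased toward over-blocking for this category.

```kotlin
// MinorSexualizationClassifier is a hypothetical stand-in for the detection model
// or vendor actually used; the threshold is an assumption, biased toward over-blocking.
interface MinorSexualizationClassifier {
    /** Risk score in [0.0, 1.0] that the media sexualizes a minor. */
    fun score(mediaBytes: ByteArray): Double
}

sealed class PublishDecision {
    object Allow : PublishDecision()
    data class BlockAndReview(val reason: String) : PublishDecision()
}

fun gateGeneratedMedia(
    mediaBytes: ByteArray,
    classifier: MinorSexualizationClassifier,
    blockThreshold: Double = 0.5,
): PublishDecision {
    val risk = classifier.score(mediaBytes)
    return if (risk >= blockThreshold) {
        // Blocked media never publishes; it is routed to human review (Section 6).
        PublishDecision.BlockAndReview("possible sexualized depiction of a minor (risk=$risk)")
    } else {
        PublishDecision.Allow
    }
}

fun main() {
    val stubClassifier = object : MinorSexualizationClassifier {
        override fun score(mediaBytes: ByteArray) = 0.9
    }
    println(gateGeneratedMedia(ByteArray(0), stubClassifier)) // BlockAndReview(...)
}
```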
11. Appeals
Users may appeal enforcement decisions by contacting us with relevant details. For CSAE/CSAM, we will not delay protective actions during an appeal.
12. Designated Child Safety Point of Contact
Name/Role: Child Safety Team (Trust & Safety)
Email: thecoffecoder@gmail.com
Website: https://www.alinfate.com/
13. Operational Commitments
- We publish this policy on our website/help center and provide its URL in the Google Play Console Child Safety Standards declaration.
- We keep in‑app reporting/blocking and continuous UGC moderation active.
- We maintain an internal SOP for CSAM handling: immediate takedown → human review → legal reporting (as applicable) → evidence preservation.
- We apply a neutral age screen and Google Play Families compliance wherever an experience could include children.
- We keep the child‑safety point of contact up to date.