Youth Safety Policy

At Newrich, we want to support people who are learning to participate in the online economy—including younger users who are beginning to explore digital opportunities. To ensure a safe environment, we have established additional safeguards for users under the age of 18 so that their experience on the platform remains secure, responsible, and compliant with applicable laws.

We recognize that many young people are increasingly involved in online communities, digital marketplaces, and content creation. While we encourage responsible participation, we also prioritize safety, privacy, and appropriate supervision. For this reason, Newrich maintains systems designed to protect younger users from harmful interactions, comply with applicable regulatory standards, and provide transparency for parents and guardians.

This policy explains the measures we take to protect underage users, what parents and guardians should know, and how concerns can be reported if issues arise.

Overview

This document outlines the platform-wide safeguards and compliance policies that apply to Newrich users under the age of 18.

These safeguards include:

  • Age verification procedures
  • Default privacy and design protections for minor accounts
  • Content monitoring and moderation standards
  • Parental consent requirements and optional parental supervision tools
  • Trust & Safety escalation protocols

These policies apply to all Newrich staff members, moderators, and automated moderation systems, including internal monitoring tools used to review platform activity.

Age Verification

Implementation

Newrich uses several safeguards to verify user age and apply appropriate protections:

  • Users may be required to provide their date of birth (DOB) during account registration, after email verification.
  • To comply with applicable child protection regulations such as COPPA, individuals under the age of 13 are not permitted to create accounts.
  • Accounts belonging to users aged 13–17 are classified as minor accounts and are subject to additional protections and monitoring.
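For illustration only, the sketch below shows how a submitted date of birth could map to these account classes. The function, constants, and return values are hypothetical and do not describe Newrich's actual implementation.

    from datetime import date

    MINIMUM_AGE = 13   # under 13: account creation is not permitted (COPPA)
    ADULT_AGE = 18     # 18 and over: standard account without minor protections

    def classify_account(dob: date, today: date | None = None) -> str:
        """Classify a registrant by age: 'rejected', 'minor', or 'standard'."""
        today = today or date.today()
        # Whole years elapsed, adjusted down if this year's birthday has not occurred yet.
        age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
        if age < MINIMUM_AGE:
            return "rejected"
        if age < ADULT_AGE:
            return "minor"    # 13-17: subject to additional protections and monitoring
        return "standard"

A registration flow could call classify_account with the submitted DOB and refuse account creation whenever it returns "rejected".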

User Interface Protections for Minor Accounts

Dashboard Notification

Accounts identified as belonging to users under 18 may display a persistent notification reminding the user of platform safety rules and restrictions. In some cases, the user may be required to acknowledge these protections before accessing certain platform features.
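A minimal sketch of how such an acknowledgment gate could work is shown below; the account fields and feature names are assumptions for illustration, not Newrich's actual schema.

    def can_access_feature(account: dict, feature: str, gated_features: set[str]) -> bool:
        """Gate certain features for minor accounts until the safety notice is acknowledged."""
        if account.get("is_minor") and feature in gated_features:
            return account.get("safety_notice_acknowledged", False)
        return True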

Restricted Categories

Users under the age of 18 may be restricted from accessing certain types of communities or content categories. These may include areas related to:

  • Gambling or sports betting
  • Dating or adult relationship communities
  • Other categories determined by Newrich to be inappropriate for minors

Creators may also apply their own age-based restrictions to communities or products, and minor accounts will not be permitted to access those restricted areas.
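As an illustration of how platform-level and creator-applied restrictions might be combined, consider the sketch below. The category and field names are assumptions, not Newrich's actual configuration.

    # Categories the platform itself restricts for minor accounts.
    PLATFORM_RESTRICTED_CATEGORIES = {"gambling", "sports_betting", "dating"}

    def minor_may_access(community: dict) -> bool:
        """A minor account may access a community only if neither the platform
        nor the creator has restricted it by age."""
        if community.get("category") in PLATFORM_RESTRICTED_CATEGORIES:
            return False
        # Creators can set their own minimum age; anything 18+ excludes minor accounts.
        return community.get("minimum_age", 0) < 18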

Privacy, Design, and Data Protections for Minors

To align with evolving laws and best practices for youth safety (including standards reflected in the California Age-Appropriate Design Code, the UK Children’s Code, and state laws governing social and digital platforms), Newrich applies the following to accounts identified as belonging to users under 18:

  • Default privacy settings: Minor accounts receive the most protective privacy settings by default (e.g., who can see profile or activity, who can send messages). We default to the safest available options rather than more permissive ones unless a parent or guardian has consented to different settings where required by law (see the sketch after this list).
  • No behavioral profiling or targeted advertising: We do not use personal data associated with minor accounts to build behavioral profiles for advertising or to target minors with personalized ads based on their activity. Where advertising is shown to minor accounts, it is not based on tracking their behavior across the platform or third-party services.
  • Geolocation: We do not collect, sell, or retain precise geolocation data for minor accounts beyond what is strictly necessary to provide the service (e.g., region for compliance or security).
  • Recommendations and feeds: Where we provide discovery, recommendations, or feeds, we do not use addictive or manipulative design aimed at minors. Recommendations for minor accounts may be limited or simplified, and we comply with applicable laws that restrict algorithmic or “addictive” feeds for users under 18 (including where parental consent is required).
  • Notifications and usage: We may limit or restrict push and in-app notifications for minor accounts (e.g., during overnight or quiet hours) in line with applicable state laws, and we may offer optional tools to help minors and parents manage screen time or usage.
  • Parental supervision tools: Where available, parents and guardians may use optional supervision or family-account tools to view privacy and safety settings, manage messaging or discovery options, or receive high-level summaries of account activity, in compliance with applicable law.
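The sketch below illustrates how protective defaults like these could be represented and applied to a minor account. All field names and values are assumptions for illustration only.

    # Hypothetical protective defaults applied to accounts classified as minors.
    MINOR_ACCOUNT_DEFAULTS = {
        "profile_visibility": "restricted",     # limit who can see profile and activity
        "who_can_message": "no_one",            # messaging closed by default
        "behavioral_ads": False,                # no personalized ads based on activity
        "precise_geolocation": False,           # keep only coarse region where needed
        "simplified_recommendations": True,     # limited, non-addictive discovery
        "quiet_hours": ("22:00", "07:00"),      # suppress notifications overnight
    }

    def apply_minor_defaults(settings: dict) -> dict:
        """Overlay the protective defaults; they win over any more permissive stored
        values unless lawful parental consent later changes specific settings."""
        return {**settings, **MINOR_ACCOUNT_DEFAULTS}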

We describe our youth safety and privacy practices in clear, accessible language so that both parents and younger users can understand how we protect under-18 accounts.

Platform Safety Policy: Protection of Underage Users

1. Content Monitoring and Moderation

Newrich monitors user-generated content across the platform in order to protect users and enforce safety standards. This includes listings, profiles, comments, messages, and community discussions.

  • Content may be reviewed through a combination of automated moderation tools and human moderation teams.
  • Accounts identified as belonging to minors are subject to heightened moderation standards.
  • Content involving grooming behavior, sexual exploitation, manipulation, or other harmful activity is escalated immediately.
  • Any content involving minors that is flagged as inappropriate is reviewed by the Trust & Safety team (see the routing sketch after this list).
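The sketch below is a hypothetical illustration of how flagged content might be routed under these rules; the flag names and queue names are illustrative, not Newrich's actual pipeline.

    SEVERE_FLAGS = {"grooming", "sexual_exploitation", "manipulation"}

    def route_flagged_item(flags: set[str], involves_minor: bool) -> str:
        """Decide where a flagged item goes in the review pipeline."""
        if flags & SEVERE_FLAGS:
            return "escalate_immediately"     # harmful activity goes straight to Trust & Safety
        if involves_minor:
            return "trust_and_safety_review"  # heightened standard for content involving minors
        return "standard_moderation_queue"    # routine automated and human review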

Minor accounts are not permitted to participate in communities that creators have restricted by age. This includes communities focused on adult themes such as gambling (including betting and casinos), sexually suggestive material, nicotine-related content, or other categories determined to be unsuitable for minors.

2. Messaging and Communication Controls

Direct messages and chat groups may be monitored using both automated systems and human review processes.

If conversations are flagged as potentially inappropriate or harmful—particularly when minors are involved—they are escalated immediately for review by the Trust & Safety team.

3. Purchasing by Minors

Users under the age of 18 acknowledge that purchases may require parental consent. Additional verification steps may be required before transactions involving minor accounts are completed.

4. Earnings and Identity Verification for Minors

Users under 18 are not permitted to earn money through the platform unless additional safeguards are satisfied.

Before a minor can earn through Newrich:

  • A parent or legal guardian’s identity must be verified by our payment processor, and
  • The guardian must provide consent through an approved verification process (such as an email confirmation code).

These measures are designed to ensure appropriate supervision and compliance with financial regulations.
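As a minimal sketch of the two-part safeguard described above, a check like the following could gate earnings for a minor account. The account fields are hypothetical.

    def earnings_enabled_for_minor(account: dict) -> bool:
        """A minor account may earn only after both guardian safeguards are satisfied:
        identity verification by the payment processor and a completed consent step
        (for example, an emailed confirmation code)."""
        return (account.get("guardian_identity_verified", False)
                and account.get("guardian_consent_confirmed", False))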

5. Legal Compliance

Newrich’s youth safety practices are designed to comply with applicable laws and regulatory frameworks, including but not limited to:

  • COPPA (Children’s Online Privacy Protection Act): Users under 13 are not permitted to create accounts, and we do not knowingly collect personal data from them.
  • California Business & Professions Code §22581: Minor users may request deletion of content they have posted.
  • California Age-Appropriate Design Code Act (AB 2273): Where applicable, we apply default high privacy settings, avoid profiling minors by default, and do not collect or retain children’s precise geolocation for non-essential purposes.
  • State laws governing minors on digital platforms: We follow state-specific requirements where in effect, which may include age verification, verifiable parental consent for certain features (e.g., algorithmic or recommendation feeds), optional parental supervision tools, and restrictions on notifications or usage hours for minor accounts. These may include laws in states such as New York (e.g., SAFE for Kids Act), Florida, Tennessee, Utah, Connecticut, and others as they become effective.
  • GDPR and the UK Children’s Code (Age-Appropriate Design Code): Additional safeguards apply to minors regarding consent, data use, behavioral tracking, and design that prioritizes the best interests of the child.

Trust & Safety Protocols for Communications Involving Minors

1. Support Interactions with Minors

When communicating with underage users, support staff must follow strict guidelines.

Support representatives should avoid:

  • Informal or overly familiar language
  • Requesting personal or identifying information unless required to resolve a legitimate issue
  • Continuing conversations longer than necessary once the issue has been addressed

2. Handling Inappropriate Content

If inappropriate content involving a minor is identified, the following actions may occur:

  • Immediate removal of the content
  • Flagging the account and documenting the incident internally
  • Escalation to the Trust & Safety leadership team

If there are indications of grooming, sexual exploitation, or predatory behavior:

  • Relevant content and metadata may be preserved for investigative purposes
  • Involved accounts may be suspended immediately
  • Parents or guardians may be notified when legally required
  • Reports may be submitted to relevant authorities, including the National Center for Missing & Exploited Children (NCMEC) when applicable
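The sequence below sketches these response steps in code form; the case fields and action names are illustrative only and do not represent Newrich's internal tooling.

    def respond_to_predatory_behavior(case: dict) -> list[str]:
        """Assemble the response actions for a suspected grooming or exploitation case."""
        actions = ["preserve_content_and_metadata", "suspend_involved_accounts"]
        if case.get("guardian_notification_required"):   # e.g., where legally required
            actions.append("notify_parent_or_guardian")
        if case.get("reportable"):                        # e.g., NCMEC report when applicable
            actions.append("report_to_authorities")
        return actions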

Internal Auditing and Oversight

Accounts identified as belonging to users under 18 may undergo periodic internal reviews to ensure continued compliance with safety policies.

These reviews may include checks related to:

  • Identity verification status
  • Moderation flags or reported incidents
  • Purchase or earning activity associated with the account

These reviews help ensure that safety protections remain effective and that the platform continues to meet regulatory and ethical standards for protecting younger users.
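For illustration, a periodic review of a minor account might run checks along the lines of the sketch below; the field names are assumptions rather than Newrich's actual audit criteria.

    def audit_minor_account(account: dict) -> list[str]:
        """Return audit findings for a minor account; an empty list means no issues found."""
        findings = []
        if not account.get("identity_verified", False):
            findings.append("identity verification incomplete")
        if account.get("open_moderation_flags", 0) > 0:
            findings.append("unresolved moderation flags or reported incidents")
        if account.get("earnings_enabled") and not account.get("guardian_consent_confirmed"):
            findings.append("earning activity without confirmed guardian consent")
        return findings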