CSAE (Child Sexual Abuse and Exploitation) Policy

MESA Child Safety & Community Well-being Policy

At MESA, our primary goal is to foster a safe, positive, and connected local community. The safety of all our users, especially children, is our highest priority. This policy outlines our strict standards against child endangerment and our commitment to maintaining a secure platform. All users are required to adhere to these standards.


1. Zero-Tolerance for Child Endangerment

MESA has a zero-tolerance policy for any content or behavior that endangers, exploits, or abuses children. We strictly prohibit users from creating, uploading, sharing, or distributing any content that facilitates such harm.

This includes, but is not limited to:

  • Child Sexual Abuse Material (CSAM): Any visual or written depiction of child sexual abuse is strictly forbidden. We will immediately remove such content, terminate the responsible user’s account, and report the incident and the user to the National Center for Missing & Exploited Children (NCMEC) and to the appropriate law enforcement agencies.
  • Predatory Behavior: We prohibit any use of our app to engage in predatory behavior towards children, such as:
    • Child Grooming: Attempting to befriend and build trust with a child in order to facilitate sexual contact or the exchange of sexual imagery.
    • Inappropriate Interaction: Any sexualized or otherwise exploitative contact or communication directed at a minor.
    • Sexualization of a Minor: Content that depicts or encourages the sexual abuse or exploitation of children.
    • Sextortion: Threatening or blackmailing a child using real or alleged access to their intimate images.
    • Child Trafficking: Advertising, soliciting, or facilitating the commercial sexual exploitation of a child.

2. Content Appropriate for a Community Platform

While MESA is a hyper-local app for a general audience, we are committed to keeping the platform free of adult themes that are inappropriate for the younger users who may encounter its content. Therefore, we prohibit:

  • Excessive or Gratuitous Violence: Content featuring extreme violence, blood, and gore is not permitted.
  • Harmful and Dangerous Activities: We do not allow content that depicts, encourages, or provides instructions for dangerous activities that could lead to serious harm.
  • Promotion of Negative Body Image: Content created for entertainment that depicts or promotes cosmetic procedures (such as plastic surgery), extreme weight loss, or other significant physical alterations is not allowed.

3. User Responsibilities & In-App Reporting

Your safety and the community’s well-being depend on everyone’s participation. We empower our users to help us maintain a safe environment.

  • In-App Reporting: We provide a clear and easy-to-use in-app mechanism for you to report any user, content, or behavior that violates this policy or raises safety concerns. Please use the “Report” feature found on user profiles and content posts.
  • Immediate Danger: If you believe a child is in immediate danger or has been subjected to abuse, exploitation, or trafficking, please contact your local law enforcement immediately in addition to reporting it within the app.

4. Our Commitment to Action & Compliance

MESA is committed to enforcing this policy vigorously and complying with all applicable child safety laws and regulations.

  • Enforcement: Upon receiving a report or becoming aware of a violation, we will take appropriate action, which may include removing the content, suspending or permanently terminating the user’s account, and/or reporting the incident to law enforcement.
  • Legal Compliance: We have processes in place to comply with all relevant child safety laws, including the mandatory reporting of confirmed CSAM to NCMEC or other regional authorities.
  • Point of Contact: We have designated a child safety point of contact to work directly with Google Play and relevant authorities on any issues related to child safety on our platform.