Reporting Policy
Reporting policy for safety, privacy, content, and account concerns.
What can be reported
Members and visitors may report content, accounts, or behavior involving privacy concerns, suspected impersonation, harassment, consent violations, illegal content, or other safety issues and platform misuse.
Who can report
Reports may come from affected members, witnesses, guardians or representatives where appropriate, or anyone who encounters a potential policy violation in public content.
How reports are reviewed
Reports may be reviewed using submitted information, account context, content metadata, policy history, and other relevant signals available to Vorx. Not every report will result in action.
Information that may help
Helpful reports usually include the account or content involved, what happened, when it happened, why it may violate policy, and whether there is an immediate safety concern. Reporters should avoid sharing unnecessary sensitive details.
Possible outcomes
Depending on context, Vorx may take no action, ask for more information, limit content visibility, remove content, restrict features, suspend accounts, or take other steps permitted by policy.
Abuse of reporting tools
False, malicious, repetitive, or retaliatory reports undermine community safety and may themselves be reviewed under the Community Guidelines.
Emergency situations
Vorx reporting tools are not emergency services. If someone is in immediate danger, contact local emergency services or appropriate crisis resources.
Contact and support
Use in-app reporting tools for content or account reports when available. For broader safety concerns, email safety@vorx.app. For general support, email support@vorx.app.
