Content Moderation Panel

Learn how the Content Moderation Panel works, how messages are flagged, how moderators take action, and how to manage permissions.

What Is the Content Moderation Panel?

The Content Moderation Panel is a centralized dashboard that allows clubs to review and manage potentially inappropriate or concerning messages sent in Team Chat.

It brings together:

  • Messages flagged automatically by AI

  • Messages manually reported by staff

From one place, club leaders can review incidents, see message context, check a user's history, and take action to maintain healthy and safe team communication.

This tool is designed to support safety, consistency, and accountability across your club.


Where to Find It

The Content Moderation Panel is available on Web.

You can access it under:

Communications > Content Moderation

A summary view may also appear as a dashboard panel, depending on permissions.

Limited monitoring is available on Mobile through the Chat Moderation tile on the Admin tab. Full moderation workflows are available on Web.


How Messages Get Flagged

Messages can enter the Moderation Queue in two ways.

AI Detection

AI scans Team Chat messages for high-risk language patterns and surfaces messages that may require review.

Examples of patterns evaluated include:

  • Critical safety concerns

  • Threats of violence

  • Self-harm language

  • Grooming or predatory behavior

  • Hate speech or discriminatory language

  • Bullying or harassment

AI detection evaluates language in context. Not every flagged phrase results in the same severity or outcome. All flagged messages require human review before action is taken.


Staff Reporting (Mobile)

Staff members can manually report a Team Chat message in Mobile.

When reporting, staff select:

  • A report reason:

    • Bullying or Harassment

    • Inappropriate Language

    • Threatening Behavior

    • Other

  • A severity level:

    • Low

    • Medium

    • High

The reported message appears in the Moderation Queue for review.

Direct Messages and Staff Chats are not included in moderation.


How Moderation Works

When a message is flagged, it appears as an open item in the Moderation Queue.

A moderator can:

  1. Review the message

  2. View surrounding context

  3. Review the user’s prior moderation history

  4. Take action

Moderation actions include:

Archive

  • Keeps the message in the user’s moderation history

  • Removes it from the open queue

  • Allows follow-up actions such as deleting the message or contacting the user

Dismiss

  • Removes the message from the Moderation Queue

  • Does not retain it in the user’s moderation history

Moderator actions are logged for transparency.

AI does not automatically discipline or notify users. All decisions are made by club staff.


What Is Included and Not Included

Included:

  • Team Chat messages

  • AI-flagged messages

  • Staff-reported Team Chat messages

Not Included:

  • Direct Messages

  • Staff-only chats

  • Automatic notifications to parents or players

Moderation decisions remain internal to your club.


Edge Cases

Deleted Message

  • If a message is flagged and later deleted from Team Chat, its moderation record may still appear in the panel so it can be reviewed.

Deleted User

  • If a user account is deleted, their moderation history remains visible and the user is labeled as Deleted.

Duplicate Reports

  • If the same message is reported multiple times, the system consolidates reports into a single moderation entry.

Suspended Players

  • Messages from suspended players are automatically blocked and do not appear in Team Chat or the Moderation Panel.


Access and Permissions

Access is controlled by role permissions.

To grant access:

  1. Go to Roles & Permissions

  2. Enable Content Moderation Panel under Dashboard Permissions

Default behavior:

  • Club Administrators have access

  • Other roles must be explicitly granted access

Clubs are encouraged to limit moderation access to a small, trained group to ensure consistent and fair decision-making.

Players and parents cannot see moderation data.


What This Panel Is Designed For

The Content Moderation Panel helps clubs:

  • Catch issues early

  • Protect player safety

  • Maintain appropriate communication standards

  • Create consistent moderation processes

  • Document moderation decisions for accountability

This tool supports proactive leadership and safety monitoring.


Frequently Asked Questions

Does AI automatically punish users?

No. AI only surfaces messages for review. Club staff always make the final decision.

Can players or parents see moderation activity?

No. Moderation data and actions are internal and only visible to authorized staff.

Can we control who moderates messages?

Yes. Access is managed under Roles & Permissions by enabling the Content Moderation Panel permission.

Are Direct Messages included?

No. Direct Messages and Staff Chats are not included in content moderation at this time.

Did this answer your question?