Digital Workplace Content Moderation: The Key to Managing Insider & Compliance Threats
by Aware
Enterprise collaboration boosts productivity and employee retention and shortens onboarding times for new hires.
However, organizations cannot overlook the risks that come with enterprise-scale collaboration platforms like Workplace, Yammer, Microsoft Teams and Slack Enterprise Grid. The concerns range from insider threats (1 in every 149 messages contains confidential information) to HR violations (private messages are 160% more likely to be toxic than public content).
The demands of community management can be overwhelming, especially in a large organization with an integrated digital workplace. A comprehensive content moderation strategy lets you manage human behavior risk while fully embracing the benefits of collaboration.
What is Content Moderation?
Content moderation is the practice of monitoring user-generated communications and applying a pre-determined set of rules and guidelines to decide whether the content is permissible. It is an effective strategy for companies to protect their employees, customers, intellectual property (IP) and brand while connecting employees in the free-flowing digital workplace.
Enterprise Collaboration Content Moderation Best Practices
With titles like Workplace Community Manager or Digital Workplace Consultant becoming commonplace in enterprises, intentional community management of the digital workplace is here to stay.
Use these 5 best practices to successfully employ content moderation strategies in your collaboration platforms:
Create clear community guidelines
Clearly define what is and is not acceptable in your community. Common areas of concern include sensitive or confidential information, inappropriate employee behavior and risky file types.
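Guidelines are easiest to enforce consistently when they are also captured in a machine-readable form that moderation tooling can act on. Below is a minimal sketch in Python of what that might look like; the category names, rules and actions are illustrative assumptions, not a schema from any specific product.

```python
# A hypothetical, machine-readable version of community guidelines.
# Category names, rules, and actions are illustrative assumptions only.
COMMUNITY_GUIDELINES = {
    "confidential_information": {
        "description": "No sharing of customer data, credentials, or financials.",
        "action": "quarantine",  # hold for manual review
    },
    "inappropriate_behavior": {
        "description": "No harassment, slurs, or threats.",
        "action": "remove_and_escalate",  # remove and notify HR
    },
    "risky_file_types": {
        "description": "No executable or script attachments.",
        "blocked_extensions": [".exe", ".bat", ".vbs", ".ps1"],
        "action": "block",  # reject the upload outright
    },
}
```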
Use the right type of moderation
Social Media Today outlines the common types of content moderation, and a holistic strategy often combines several of them. For example, your organization might apply post-moderation to messages that contain swear words, but automated moderation to any content with a numerical pattern that matches a Social Security number (see the sketch after this list).
- Pre-Moderation: A pre-moderation strategy requires a moderator to check all submitted content before it is shared with the community. In the context of enterprise collaboration, this approach is rarely recommended because the delay in content sharing inhibits the organic, collaborative nature of the platforms.
- Post-Moderation: A post-moderation strategy publishes content to the community immediately, while replicating it in a queue for a moderator to review and potentially remove afterward. This type of moderation can be useful for human resources investigations, where additional context gathering is often required before taking action.
- Reactive Moderation: Reactive moderation strategies rely on community members to flag content that breaches community rules or that members deem undesirable.
- Automated Moderation: An automated moderation strategy uses technology to process shared content and apply defined rules to reject or approve submissions. This might include quarantining or tombstoning content for further manual review.
- No Moderation: This is exactly as it sounds—no moderation at all! Given the high level of potential risk that comes with any tool containing employee communications, this is not recommended for most communities and certainly not in an enterprise environment.
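To make the automated approach concrete, here is a minimal sketch in Python of the Social Security number example mentioned above: a rule that quarantines any message matching a common SSN layout. The regex, function name and actions are illustrative assumptions, not the behavior of any particular moderation product.

```python
import re

# Matches the common U.S. SSN layout (e.g. 123-45-6789).
# A simple pattern like this is an assumption for illustration; real
# systems use stricter validation to cut down on false positives.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def moderate_message(text: str) -> str:
    """Automated rule: quarantine any message containing an SSN-like pattern."""
    if SSN_PATTERN.search(text):
        return "quarantine"  # tombstone the post pending manual review
    return "approve"         # publish immediately

# Example usage:
print(moderate_message("New hire's SSN is 123-45-6789"))  # -> quarantine
print(moderate_message("Lunch at noon?"))                 # -> approve
```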
Moderate all types of private and public content, including text, images and shared files
Don’t forget, employees collaborate in a variety of contexts: text, images and file attachments in public and private groups, as well as 1-to-1 chat. In fact, 43% of all collaboration messages are private. Have plans in place to moderate every one of these communication contexts.
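As a sketch of what moderating every context might look like in practice, the Python dispatcher below routes each item by content type to a type-appropriate check, regardless of whether it was posted publicly, in a private group, or in 1-to-1 chat. Every function name and the item schema are hypothetical placeholders.

```python
# Hypothetical dispatcher: route each item to a type-appropriate check.
# All function names and the item schema are illustrative assumptions.

def moderate_text(body: str) -> bool:
    """Placeholder: scan message text against rule-based or ML checks."""
    return "confidential" not in body.lower()

def moderate_image(data: bytes) -> bool:
    """Placeholder: in practice, call an image-classification service."""
    return len(data) > 0  # stand-in for a real classifier result

def moderate_file(filename: str) -> bool:
    """Placeholder: block risky attachment types by extension."""
    return not filename.lower().endswith((".exe", ".bat", ".vbs"))

def moderate_item(item: dict) -> bool:
    """Apply the same checks to public, private, and 1-to-1 content."""
    handlers = {
        "text": lambda i: moderate_text(i["body"]),
        "image": lambda i: moderate_image(i["data"]),
        "file": lambda i: moderate_file(i["filename"]),
    }
    return handlers[item["type"]](item)
```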
Communicate your moderation practices with your community
In the spirit of transparency, it is best practice to communicate any form of monitoring or content moderation to your employee community.
Don’t over-moderate
While you want to remove inappropriate content from your collaboration platform, you don’t want to over-moderate. Too much moderation will delay knowledge sharing, collaboration and organic communication, and push employees to find 'shadow' solutions for communicating with one another.
Digital workplaces are a place for employees to collaborate and express their authentic selves, and, just like real life, that’s not always sunshine and roses. Instead of over-policing, use a qualitative insights solution to understand your community and identify where unhealthy or negative conversations might be happening. This knowledge will enable you to respond strategically at the individual employee or broader community level.
The digital workplace is taking the spotlight in the modern enterprise. The key to keeping executive buy-in and maximizing the benefits of enterprise collaboration is managing and mitigating risky behavior on the platform. That means moderating public and private content throughout your company-owned tools.
Download the whitepaper on Creating a Human-Centered Organization to discover how innovative leaders are managing and moderating the digital workplace.