Enterprise collaboration boosts productivity and employee retention and shortens onboarding times for new hires.
However, organizations cannot overlook the potential risks in enterprise-scale collaboration platforms like Workplace, Yammer, Microsoft Teams and Slack Enterprise Grid. The concerns range from insider threats (1 in every 149 messages contains confidential information) to HR violations (private messages are 160% more likely to be toxic than public content).
The demands of community management can be overwhelming, especially in a large organization with an integrated digital workplace. To fully embrace the benefits of collaboration, manage human behavior risk with a comprehensive content moderation strategy.
Content moderation is the practice of monitoring user-generated communications and applying a pre-determined set of rules and guidelines to determine whether the content is permissible. It is an effective strategy for companies to protect their employees, customers, intellectual property (IP) and brand while connecting employees in a free-flowing digital workplace.
With titles like Workplace Community Manager or Digital Workplace Consultant becoming commonplace in enterprises, intentional community management of the digital workplace is here to stay.
Use these 5 best practices to successfully employ content moderation strategies in your collaboration platforms:
Clearly define what is okay and what is not okay in your community. Common areas of concern include sensitive or confidential information, inappropriate employee behavior and risky file types.
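As a purely illustrative sketch, a policy like this can be captured as simple, reviewable configuration. The category names, examples and actions below are assumptions for illustration, not a prescribed standard:

```python
# Hypothetical moderation policy; categories, examples and actions are
# illustrative assumptions, not a prescribed or complete standard.
MODERATION_POLICY = {
    "confidential_information": {
        "description": "Content that may expose sensitive company or personal data",
        "examples": ["Social Security numbers", "customer records", "unreleased financials"],
        "action": "block_and_notify",
    },
    "inappropriate_behavior": {
        "description": "Harassment, hate speech or other toxic conduct",
        "examples": ["slurs", "threats", "targeted insults"],
        "action": "flag_for_review",
    },
    "risky_file_types": {
        "description": "Attachments that can carry malware or leak data",
        "examples": [".exe", ".bat", ".js"],
        "action": "quarantine_attachment",
    },
}
```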
Social Media Today outlines common types of content moderation, and a holistic strategy often combines several of them. For example, your organization may employ post-moderation for messages that contain swear words, but automated moderation for any content with a numerical pattern that matches a Social Security number.
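To make that example concrete, here is a minimal sketch of how those two rules might be routed. The function name, word list and Social Security number pattern are assumptions for illustration; a real deployment would rely on your platform's moderation tooling and far more robust detection:

```python
import re

# Illustrative word list; a real deployment would use a maintained lexicon.
FLAGGED_WORDS = {"damn", "hell"}

# Simple pattern resembling a U.S. Social Security number (e.g. 123-45-6789).
# Real detection would also validate context and number ranges.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def route_message(text: str) -> str:
    """Return a hypothetical moderation decision for a single message."""
    if SSN_PATTERN.search(text):
        # Automated moderation: block before the message is posted.
        return "auto_block"
    if any(word in FLAGGED_WORDS for word in text.lower().split()):
        # Post-moderation: the message posts, but is queued for human review.
        return "queue_for_review"
    return "allow"

print(route_message("My SSN is 123-45-6789"))    # auto_block
print(route_message("What the hell happened?"))  # queue_for_review
print(route_message("Meeting at 3pm"))           # allow
```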
Don’t forget, employees collaborate in a variety of contexts, including text, images and file attachments in public or private groups, as well as 1-to-1 chat. In fact, 43% of all collaboration messages are private. Have plans in place to moderate all of these communication contexts.
In the spirit of transparency, it is best practice to communicate any form of monitoring or content moderation to your employee community.
While you want to remove inappropriate content from your collaboration platform, you don’t want to over-moderate. Too much moderation will delay knowledge sharing, collaboration and organic communication, as well as push employees to find other 'shadow' solutions to communicate with one another.
The digital workplace is where employees collaborate with one another and express their authentic selves, and, just like real life, that’s not always sunshine and roses. Instead, use a qualitative insights solution to understand your community and identify where unhealthy or negative conversations might be happening. This knowledge will enable you to respond strategically at the individual employee or broader community level.
The digital workplace is taking the spotlight in the modern enterprise. The key to keeping executive buy-in and maximizing the benefits of enterprise collaboration is managing and mitigating risky behavior on the platform. That means moderating public and private content throughout your company-owned tools.
Satisfy governance, risk and compliance (GRC) stakeholders while also delivering data-driven stories about the topics, behavior and sentiment of your digital communities, all with one partner.
Integrate Aware with Slack Enterprise Grid, Workplace from Facebook, Yammer and Microsoft Teams to get the most ROI out of your digital workplace.