{"id":4267,"date":"2018-04-25T14:47:07","date_gmt":"2018-04-25T14:47:07","guid":{"rendered":"https:\/\/thesocialelement.agency\/?p=4267"},"modified":"2020-10-21T09:27:37","modified_gmt":"2020-10-21T09:27:37","slug":"invest-in-your-community-social-media-moderation","status":"publish","type":"post","link":"https:\/\/thesocialelement.agency\/invest-in-your-community-social-media-moderation","title":{"rendered":"Invest in Your Community: They\u2019re the Future of the Web"},"content":{"rendered":"

If you\u2019ve been following conversations at the intersection of politics and tech, you\u2019re aware of the current debate around the role platforms should play in moderating the content they display. Facebook has come under fire for its moderation policies, Twitter is being held accountable for fake users on its platform, and Snapchat is under scrutiny for showing inappropriate ads and content to underage users. These are big conversations that involve examining and reworking complex algorithms as well as simply increasing resources (Facebook hires more moderators than the entire headcount of Twitter<\/a>).<\/p>\n

 <\/p>\n

Platforms need to do better.<\/h2>\n

The Social Element\u2019s own CEO has said on multiple panels that yes, platforms need to do better. Unfortunately, no one seems to have quite cracked the code yet on how to do this.<\/p>\n

 <\/p>\n

Honestly, the problem of inappropriate content on platforms isn\u2019t a tech problem: it\u2019s a human problem. Barring high-profile examples involving Russian bots, all content online is created by humans. And the internet really isn\u2019t a lawless place \u2013 content creators are subject to the laws of their countries. So why do platforms keep having to atone for content that a small group of users creates?<\/p>\n

 <\/p>\n

Simply put, platforms are easier targets.<\/h2>\n

They are more likely to comply with content removal requests, they\u2019re easier to blame, and, importantly, they have more money. The internet and its content see no borders, and working with governments to penalize their citizens for the content they produce online is a complicated issue with international and diplomatic ramifications.\u00a0Nonetheless, it is an essential point for us to contend with: a human produced that content, and purposefully chose to upload it and share it with the world.<\/p>\n

 <\/p>\n

Which brings us to the critical importance of online communities, particularly ones fostered by brands. An expert team of human moderators is absolutely necessary to keep your brand safe and your community healthy. Beyond this, how can brands engage the humans in their community to produce content that is uplifting rather than problematic? It boils down to engaging in human, authentic ways.<\/p>\n

 <\/p>\n

Here are a few tips to bear in mind:<\/h2>\n