The Rise and Regulation of User-generated Content Platforms

Platforms built on user-generated content (UGC) have taken the internet by storm over the past twenty or so years, most notably Facebook, Instagram, YouTube and Roblox. By definition, UGC is any content – text, videos, images, reviews and so on – created by people rather than brands. It is a great way for brands to build trust, authenticity, engagement and a sense of community. For these reasons, UGC platforms have become incredibly popular: YouTube alone attracts 2.1 billion users worldwide, according to Statista. However, the astronomical number of users on many UGC platforms makes it increasingly difficult to moderate the content being posted effectively, both in terms of copyright and user safety.

Erly Stage Studios recently spoke to Sydney Liu, who founded the online literary community Commaful in 2015. Commaful is a platform where users can publish short stories on a range of topics and genres. Sydney told Erly Stage:

‘In a world where “stories” means lip syncing, dance videos, fake pregnancies, and filming dead bodies for virality, we want to reclaim the meaning of storytelling. Commaful is a platform for actual stories and we do it by designing a new content format that makes storytelling actually fun’.

Commaful has been creator-focused since day one. As a result, the team has got to know its users, which has informed platform improvements and shaped the wonderful community that exists on Commaful today.

(Source: Commaful Instagram)

UGC platforms can also give creators a way to earn an income on the side, or even the opportunity to take up content creation as a lucrative full-time job. This is known as the ‘Creator Economy’, in which creators earn money directly from their audience. Gaming platform Roblox boasts 199 million users worldwide, according to SEO specialist BackLinko. Roblox users can create and monetise their own content through the ‘Avatar Shop’, as well as custom Models and Plugins. In 2020 alone, creators and developers on Roblox collectively made over $328 million.

Another example of a UGC platform feeding the Creator Economy is Patreon, which allows creators to offer exclusive content to their fans who pay a monthly subscription. When fans subscribe, they are also able to enjoy being part of a community of people with similar interests. 

A growing number of ‘new kids on the block’ are jumping on the UGC bandwagon, such as Clubhouse, which launched in April 2020. Clubhouse combines elements of radio and conference calls, and is often compared to the popular app Houseparty. According to BackLinko, Clubhouse has 10 million weekly active users, up from 600,000 in December 2020, and a valuation of $1 billion – proof that UGC platforms are not going anywhere soon.

One of the greatest challenges facing UGC platforms is regulation. When a brand hands the reins to its users, it gives up a degree of control over its platform. This becomes problematic when posts violate copyright or are potentially harmful to other users, particularly children. For this reason, UGC platforms face a constant battle to strike a balance between giving users the freedom to create and avoiding violations of the law.

New rules introduced by national governments have shifted the responsibility for content moderation onto the UGC platforms themselves. For example, in December 2020 the UK government published its ‘Online Harms Full Government Response’, which set out stricter rules placing greater responsibility on platforms to regulate their own content. Under these new rules:

“All companies in scope will need to tackle illegal content on their services and protect children from harmful and inappropriate content, such as pornographic or violent content. The regulator will have additional powers to ensure companies take particularly robust action to tackle terrorist activity and child sexual abuse and exploitation online.” (Source: gov.uk)

Some large companies will also be required to set out clearly in their terms and conditions exactly what content is acceptable. If these terms are not enforced effectively, senior members of staff could face criminal sanctions. Sydney Liu told Erly Stage that content moderation has always been one of Commaful’s top priorities: the team has built a number of tools to filter inappropriate content and review issues as efficiently as possible, but it remains an ongoing battle and learning process.

Moderation becomes even more important for platforms like Roblox, where 55% of users – around 110 million people – are under the age of 13, according to the platform’s investor presentation from February 2020. To moderate its content, Roblox explicitly lists inappropriate behaviour in its ‘Community Rules’:

(Source: Roblox Community Guidelines) 

To actively regulate content, Roblox encourages its users to flag and report anything inappropriate. According to its Community Guidelines: “If we are notified that your uploaded content violates another’s intellectual property rights, we will block or remove that content without notice”.
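
As a rough, purely illustrative sketch – and not a description of Roblox’s actual system – a flag-and-takedown flow like the one quoted above could be modelled as follows. The Report and ModerationQueue types here are hypothetical:

```python
from dataclasses import dataclass, field
from enum import Enum


class ReportReason(Enum):
    INAPPROPRIATE = "inappropriate"
    IP_VIOLATION = "ip_violation"


@dataclass
class Report:
    content_id: str
    reason: ReportReason
    details: str = ""


@dataclass
class ModerationQueue:
    pending: list = field(default_factory=list)   # reports awaiting human review
    removed: set = field(default_factory=set)     # content ids blocked or removed

    def submit(self, report: Report) -> None:
        if report.reason is ReportReason.IP_VIOLATION:
            # Intellectual-property notices lead straight to blocking/removal,
            # mirroring the "block or remove without notice" policy quoted above.
            self.removed.add(report.content_id)
        else:
            # All other flags wait for a human moderator's decision.
            self.pending.append(report)


queue = ModerationQueue()
queue.submit(Report("asset-123", ReportReason.IP_VIOLATION))
queue.submit(Report("game-456", ReportReason.INAPPROPRIATE, "offensive language"))
print(queue.removed)        # {'asset-123'}
print(len(queue.pending))   # 1
```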

However, Roblox’s approach to content moderation has left many users unhappy. According to the Roblox Fandom wiki, the platform outsources its moderation team: “these third-party moderators work on multiple sites with different guidelines (possibly the reason why many people consider Roblox’s moderation bad)”. Sure enough, a quick Google search for ‘Roblox moderation’ turns up a series of forum threads such as “Why is Roblox moderation like this?”, “Why is Roblox moderation so awful?” and “bro Roblox’s moderation can be trash”. When platforms are overzealous and ruthless in taking down content, they risk disappointing fans and damaging the brand’s reputation.

Roblox also indicates in its Community Guidelines that it partially relies on Artificial Intelligence (AI): “Some violations are blocked automatically through the use of filters and other detection systems.” In 2019, Cambridge Consultants published a report on behalf of the UK communications regulator Ofcom that emphasises the role of AI in moderating online content; without AI, it describes moderation through traditional human methods alone as quite literally “impossible”. Recent advances in technology have allowed AI to handle complex inputs including speech, images and text, which gives it many powerful uses. The diagram below is an example of a workflow where AI can be beneficial in content moderation:

(Source: Cambridge Consultants)
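
To make the idea concrete, here is a minimal, hypothetical sketch of such a hybrid workflow: an automated classifier scores each post, clear-cut violations are removed automatically, borderline cases are escalated to human reviewers, and low-risk content is published. The classify_content function and the thresholds below are placeholder assumptions, not any platform’s real system:

```python
def classify_content(text: str) -> float:
    """Placeholder classifier: return a 0-1 score that the content violates policy.
    A real system would use trained models over text, images or speech."""
    banned_terms = {"spam", "abuse"}
    hits = sum(term in text.lower() for term in banned_terms)
    return min(1.0, 0.6 * hits)


def moderate(text: str, remove_above: float = 0.9, review_above: float = 0.4) -> str:
    """Route a post based on the classifier's confidence."""
    score = classify_content(text)
    if score >= remove_above:
        return "removed"        # high confidence: block automatically
    if score >= review_above:
        return "human_review"   # uncertain: escalate to a moderator
    return "published"          # low risk: publish immediately


for post in ["A short story about the sea", "spam spam abuse links here"]:
    print(post, "->", moderate(post))
```

The key design point, echoed in the Ofcom report, is that automation absorbs the sheer volume of posts while humans handle the judgement calls.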

However, when it comes to content regulation, there are often cultural and societal sensitivities that shape what is considered ‘acceptable’, and this is where the capabilities of AI often fall short. In some instances, users will post ‘deepfakes’ – synthetic images or videos manipulated using AI that can be extremely difficult to detect, whether by automated systems or by human reviewers – which poses a serious risk to user safety.

UGC holds huge potential as a way to engage users, but it is its own worst enemy when the freedom given to users is abused. AI tools can be effective in preventing the spread of explicit content, but more subtly harmful content is complex and difficult to crack down on. The use of AI also raises ethical questions: for example, to what extent can content still be considered ‘user-generated’ when it is so carefully picked apart by advanced AI systems? This will be a continuous learning process for all companies as both UGC platforms and the pressure to regulate them accelerate.
