Over the past decade, social media platforms have faced persistent questions about how freely users can express their opinions online without having their content removed or flagged as inappropriate.
YouTube has been slammed for its new advertiser-friendly guidelines for moderating content, which have left digital series creators on the site with their work demonetised or removed. So what has moved YouTube to update its policies with stricter terms and conditions, and ultimately to decide what content is acceptable?
Why should I use YouTube to share my content?
YouTube, launched in 2005, is the third largest social media platform, behind Facebook and the professional network LinkedIn. The platform consists mainly of videos made by independent creators, including music videos, sitcoms, short films and vlogs. Its shareability has garnered a huge following over the years, currently attracting 1 billion active users each month.
Despite its huge success, over the past year YouTube’s content moderation system has pushed creators to fight for their right to monetise videos on their channels. Popular American YouTuber Philip DeFranco, who publishes daily content discussing current news and pop culture, first noticed the change after a few of his videos had been flagged as inappropriate for monetisation without his knowledge. The YouTuber posted a video titled “Youtube Is Shutting Down My Channel and I’m Not Sure What To Do”, calling out YouTube for flagging him under its rules on “graphic content or excessive strong language”. He described YouTube’s content moderation as “taking away the ability to monetise a video where you’re saying things that they don’t deem ‘okay’, that’s been described as censorship with a different name – because if you do this on the regular, and you have no advertising, it’s not sustainable”.
How can I make sure my videos aren’t removed from YouTube?
In YouTube’s community guidelines, the site states that its platform is for “free expression”. However, it does not allow content that “promotes or condones violence against individuals or groups based on race or ethnic origin, religion, disability, gender, age, nationality, veteran status, or sexual orientation/gender identity, or whose primary purpose is inciting hatred on the basis of these core characteristics.” Whilst the guidelines protect users from harmful and hateful content, videos are still regularly flagged as inappropriate for the wrong reasons. Respecting YouTube’s guidelines and ensuring you only use your own content will save you from having your videos removed.
“Respect the YouTube community: We’re not asking for the kind of respect reserved for nuns, the elderly, and brain surgeons. Just don’t abuse the site. Every cool, new community feature on YouTube involves a certain level of trust. We trust you to be responsible, and millions of users respect that trust. Please be one of them.” – YouTube’s community guidelines.
Can my content be removed if I’m using Creative Commons content?
For all content creators out there, YouTube allows users to mark their videos with a Creative Commons licence, and any video made using Creative Commons content automatically credits the source videos by title. Monetisation is still permitted under a Creative Commons licence; however, if a video is flagged as ‘inappropriate’, the chances of a creator’s series being demonetised increase, and there is more than one way it can be removed.
On 20 September 2016, YouTube announced its newest content moderation addition, “YouTube Heroes”, giving users the chance to earn points for flagging inappropriate videos. As you flag more videos as inappropriate, you can take part in ‘hero video chats’ and ‘mass flag videos’, and on reaching level 4 you can go behind the scenes and contact YouTube staff directly. Giving users around the world the power to affect whether a video is removed could be a harmful way of censoring videos online; however, YouTube says the Heroes program is currently in “beta and subject to change”.
So does YouTube have a censorship problem?
That is the question. Or is YouTube taking the right steps towards better safety online by moderating content based on what it deems ‘advertiser friendly’? YouTube, which first began as a private entity, will need to work harder to protect its reputation as a platform where creators can share their talents. As YouTube and other social media platforms become ever more embedded in today’s growing digital world, censorship remains an important factor when it comes to sharing creative work.