YouTube, the American video-sharing website, has released its first detailed statement about the incidents, as well as its decision not to remove hateful and violent content.
A letter from head of communications Chris Dale, posted on the @TeamYouTube account, explains that if a creator’s content does not violate YouTube’s community guidelines, the company will look at the broader context and impact. If that content is nonetheless egregious and harms the broader community, YouTube may take action.
YouTube explained its two key policies: harassment and hate speech.
For harassment, YouTube will mainly look at whether the purpose of the video is to incite harassment, threaten, or humiliate an individual, or whether it reveals personal information. For hate speech, YouTube will look at whether the primary purpose of the video is to incite hatred toward or promote supremacism over a protected group, or whether it seeks to incite violence. A person using racial, homophobic, or sexist epithets on their own would not necessarily violate either policy.
For more details, read YouTube’s official announcement letter: