YouTube is encouraging and “empowering” its users to “flag” content that does not meet the site’s community guidelines, while also offering more transparency on how its flagging feature works.
The video site’s renewed call on users to help enforce its community guidelines by flagging inappropriate content on the platform comes hot on the heels of YouTube’s apparent demonetization of some videos for content considered unfriendly to advertisers.
Empowering the YouTube community to use the flagging feature located beneath every video and comment will help “keep YouTube a platform where openness and creative expression are balanced with responsibility,” the Google-owned video site said on the official YouTube blog.
Some of the community guideline no-nos include:
- No spam, misleading descriptions, tags, titles or thumbnails in order to increase views.
- No copyright infringing content. Only upload videos that you made or are authorized to use.
- No predatory behavior, stalking, threats, harassment, intimidation, invading privacy, revealing other people’s personal information, and inciting others to commit violence.
- No posting videos that encourage others to do things that might cause them to get badly hurt, especially kids.
“We have trained teams, fluent in multiple languages, who carefully evaluate your flags 24 hours a day, seven days a week, 365 days a year in time zones around the world. They remove content that violates our terms, age-restrict content that may not be appropriate for all audiences, and are careful to leave content up if it hasn’t crossed the line,” announced the video-sharing site on the YouTube blog.
How Flagging Content on YouTube Works
When flagging, any user from anywhere in the world can report which policy they think a video violates, from spam and sexual content to harassment and violent content, YouTube said. This in turn helps the video site’s team route and review flagged content more efficiently and effectively.
Interestingly, over 90 million people have flagged videos on YouTube since 2006 — that’s more than the population of Egypt, according to YouTube. And over a third of these people have flagged more than one video. “As YouTube grows, the community continues to be very active in flagging content: the number of flags per day is up over 25 percent year-on-year,” stated the video company.
What this notable rise in flagging activity, and the renewed push to encourage it, really means is that small business owners who depend on YouTube for their revenue need to be increasingly careful not to accidentally violate the video site’s guidelines. There have been reports that YouTube’s complaints-and-appeals system is not particularly user-friendly: users whose videos are flagged and who appeal a complaint can reportedly get trapped in a confusing automated system that is quite inconvenient to navigate.
Not all flagged content automatically gets removed, though. YouTube says it also takes into account local laws in the countries where it operates: if it receives a valid legal notice that content violates a local law, it will restrict that content in the local country domain.
“We hope this additional transparency will help you continue reporting responsibly,” the video platform said.
Source: smallbiztrends