YouTube is disabling comments on most videos that include children and minors, in response to renewed brand-safety concerns from advertisers.
The dramatic changes to the platform’s commenting system came after a number of high-profile advertisers, including Disney, Hasbro and AT&T, pulled all ads from YouTube.
The move followed reports that comments sections of certain videos were being used by pedophiles to share suggestive images of children.
YouTube on Thursday announced a handful of changes to combat those concerns.
Comments on videos featuring minors will be disabled, with only “a small number of creators” being allowed to keep comments open on those videos. Those creators “will be required to actively moderate their comments, beyond just using our moderation tools, and demonstrate a low risk of predatory behavior,” according to YouTube.
In addition, YouTube is fast-tracking a new comments classifier meant to proactively identify and remove predatory comments.
YouTube will also be more aggressive in terminating channels that include any content that endangers minors.
In a note sent to creators on the platform, YouTube executives said: “We know that many of you have been closely following the actions we’re taking to protect young people on YouTube and are as deeply concerned as we are that we get this right.”
The company added: “We recognize that comments are a core part of the YouTube experience and how you connect with and grow your audience. At the same time, the important steps we’re sharing today are critical for keeping young people safe.”
The solution is dead simple: “... Those creators will be required to actively moderate their comments, beyond just using our moderation tools, ...”
If content creators or platform hosts want to offer a "clean" comments function, their only option is to do the work themselves. Pre-approval by actual human moderators, before comments are posted, is by far the most effective way to ensure a civil comments section.
Relying on any "bad post" reporting system is almost useless, no matter how quickly the offending post is identified and removed. As the article states, moderation must be active and preemptive, not reactive. If a site isn't willing to spend what's required to moderate comments, it has no grounds to complain when its comments sections are shut down by the platform owner.
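The pre-approval workflow described above can be sketched in a few lines: comments are held in a queue and nothing is published until a human moderator explicitly approves it. This is a minimal illustration, not any real platform's API; the class and method names are hypothetical.

```python
# Minimal sketch of a pre-moderation ("approve before publish") comment queue.
# Every submitted comment is held as pending; only a human moderator's
# explicit approval moves it to the published list. Rejected comments are dropped.
from dataclasses import dataclass


@dataclass
class Comment:
    author: str
    text: str
    approved: bool = False


class PreModerationQueue:
    def __init__(self) -> None:
        self._pending: list[Comment] = []
        self._published: list[Comment] = []

    def submit(self, author: str, text: str) -> None:
        # Nothing goes live on submission; the comment only enters the queue.
        self._pending.append(Comment(author, text))

    def pending(self) -> list[Comment]:
        # A shallow copy for the moderator's review screen.
        return list(self._pending)

    def review(self, comment: Comment, approve: bool) -> None:
        # A human moderator decides; rejections are simply discarded.
        self._pending.remove(comment)
        if approve:
            comment.approved = True
            self._published.append(comment)

    def published(self) -> list[str]:
        return [c.text for c in self._published]
```

The design point is the inversion of the usual "publish first, report later" flow: the default state of every comment is invisible, so an abusive comment that no moderator approves never appears at all.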
There are solutions available now, today, for influencers and brands. Respondology offers a tool that removes toxic comments – hateful, predatory, or otherwise – on YouTube and Instagram, and can also surface who the bad actors are. Filtering technology plus 1,000 U.S.-based moderators make it happen. As President of Respondology and the father of two boys who are online, this is a personal mission of mine. We’re here to help. Respondology.com or Erik@respondology.com