YouTube is launching a new feature that will push notifications to commenters, aimed at making them reconsider posting hateful responses to videos. The move comes as toxicity has been an ongoing problem on the platform, with comment sections of various videos devolving into a deluge of offensive and abusive remarks, often aimed at the creators of the videos.
As TechCrunch reports, Google will also test a new filter for creators that will automatically hide offensive comments targeted at them and hold those comments for review.
Starting now, users will start seeing an AI-generated pop-up when they are about to post something abusive in the comment section. The notification will remind them to check whether what they are about to say is acceptable under community guidelines and urge them to reconsider posting it.
Google says that the feature will only ask users to reconsider their comment but will not stop them from posting it if they so choose. If you feel the algorithm has made a mistake, you can also report it to Google directly from within the pop-up.
The idea is to delay the posting of the comment and give users a little time to reconsider what they are about to say. This should discourage them from posting recklessly or in anger.