YouTube to Warn Users Before Posting Offensive Comments, Rolls Out Features to Support Diverse Communities


YouTube is introducing new features to its platform to support diverse communities and encourage respectful interactions. The streaming platform will warn users when a comment they are about to post may be offensive to others, giving them the option to reflect before posting. YouTube will also test a new filter in YouTube Studio for potentially inappropriate and hurtful comments that have been automatically held for review, so that channel owners won’t have to look at those comments if they don’t want to.

Announcing the updates in a blog post, YouTube said it was working to close any existing gaps in how its products and policies work for everyone, particularly the Black community.

The new Community Guidelines reminder for potentially offensive comments is rolling out on Android. If commenters want to go ahead with their comment even after seeing the pop-up notification, they can post it anyway, or choose to edit or delete it first. Users who think their comment has been wrongly flagged can let YouTube know through the same pop-up.


YouTube will warn users when a comment they are about to post may be offensive to others

According to a YouTube support page, the system learns from content that has been repeatedly reported by users. YouTube will also streamline its comment moderation tools so that creators can avoid reading or engaging with potentially inappropriate and hurtful comments.

In an effort to identify gaps in the system that could limit a creator's opportunity to reach their full potential, YouTube will ask creators, on a voluntary basis, to provide their gender, sexual orientation, race, and ethnicity starting next year. Using this information, the video streaming platform said it would examine how content from different communities is treated in its systems, such as search and discovery and monetisation.

The Google-owned company said it would also look into possible patterns of hate, harassment, and discrimination that could affect some communities more than others.

YouTube said that it had invested in technology to help its systems better detect and remove hateful comments by taking into account the topic of the video and the context of the comment.

Meanwhile, YouTube launched three new features earlier this week aimed at helping creators, artists, and publishers deliver an "even more interactive experience" with Premieres. Two of the features, Live Redirect and Trailers, have begun rolling out to eligible creators, while the third, Countdown Themes, will be available in the coming months.

Live Redirect will allow creators to host a live stream as a pre-show just before the Premiere airs. The live audience will automatically be directed to the upcoming Premiere right before it starts. Trailers, on the other hand, will let creators upload a pre-recorded hype video that will be played on the watch page in advance of the Premiere. This clip can range from 15 seconds to three minutes. Countdown Themes will let creators select a custom countdown for their Premiere across a range of themes, vibes, and moods.


