YouTube launches new ‘Creator Safety Center’ to help creators manage unwanted attention
YouTube is looking to help creators avoid unwanted in-app interactions and attention with the launch of a new Creator Safety Center platform, which collects key links, tips, advice and notes on creator safety issues.
As explained by YouTube:
“In a recent survey, we learned that while 95% of creators experienced unwanted behavior across multiple social platforms, only 50% said they had access to the resources or support needed to manage those interactions.”
Based on this information, YouTube worked with a range of third-party experts, including ConnectSafely, The Family Online Safety Institute and the National Cybersecurity Alliance, to put together its in-depth new collection of tips to help creators stay safe.
The new Creator Safety Center includes tips on how to handle these issues as a channel grows, and how to combat bullying, trolling, account takeovers and more.
Much of the information included here isn’t new; it’s just been put into a more coordinated and centralized space, with a special focus on creators to help address key concerns.
The announcement is part of a larger YouTube initiative that examines how it is working to advance underrepresented communities and opportunities for minority creators.
Over the coming weeks, YouTube will share more information about how it helps these communities, as well as the tools and processes it is developing to maximize the potential of all users.
You can access the new YouTube Creator Safety Center here.
TikTok expands downvote test for video responses and adds new prompts to highlight its safety tools
After years of internal testing and experimenting with downvote options in social apps, we’re now seeing more platforms actually testing them, though not exactly in the way that many would have assumed, based on previous discussions.
TikTok is the latest platform to test a downvote option, with some users now seeing downvote prompts that allow them to flag video comments they don’t want to see.
As TikTok explains it:
“We have started testing a way to allow people to identify comments that they think are irrelevant or inappropriate. This community feedback will add to the range of factors we already use to help keep the comments section consistently relevant and a place for genuine engagement. To avoid creating bad feelings among community members or demoralizing creators, only the person who registered a dislike on a comment will be able to see that they did so.”
As you can see from the example above (posted by social media expert Matt Navarra), some users now see a dislike icon to the right of video comments, allowing them to signal their disinterest or dissatisfaction with the comment. TikTok first started experimenting with the option in 2020, but is now expanding it to a broader group of users.
TikTok hasn’t explained how this will affect comment ranking, or whether it will change how comments are displayed for individual users or more broadly, but the idea is that it will give TikTok more data on which comments users don’t like, allowing it to further hone its systems to prioritize the most relevant and engaging replies.
So it’s not exactly a downvote option like you see on Reddit, where the community dictates the ranking of replies. But it could be, eventually, depending on how TikTok decides to develop these ideas.
As noted, several platforms are running similar tests, with Facebook also testing downvotes for replies and Twitter working on the same. Maybe that leads to variable ranking of replies, or maybe it helps the platforms spot trends and patterns in negative responses so they can act on them more effectively. By staying vague about how the downvote data is used, they leave the door open for different approaches, but presumably it will eventually enable better ranking of comments, which could help improve engagement.
Really, it might be better to follow a Reddit-style model, where users can see the full vote data on the replies their comments prompt, so they can better understand the community’s response; that could create a better educational feedback loop and encourage more civil interaction. At the same time, there’s some hesitation about surfacing potentially negative signals like this, and with so few users already posting replies (10% of Twitter users create 80% of tweet content, according to research), it’s probably not the best way to encourage wider interaction in each app.
For context, past analysis has suggested that over 98% of Reddit’s monthly active users never post or comment on the app.
In addition to downvotes, TikTok is also experimenting with new reminders that will guide creators to its comment filtering, bulk blocking and removal options.
“Reminders will be shown to creators whose videos appear to receive a high proportion of negative comments. We will continue to remove comments that violate our Community Guidelines, and creators can continue to report comments or accounts individually or collectively for our review.”
TikTok will now also allow users to create in-app reports via video, adding another way to provide more context in reports while better aligning with how the app is actually used.

Combined, the new tools will improve the feedback cycle for TikTok users and give TikTok more information with which to refine and improve its systems, which could help make the app experience safer, more welcoming and more engaging.