TikTok Users Now Able to Downvote Comments
Oct-04-2022
While some services (looking at you, YouTube!) seek to minimize negative feedback by hiding dislikes, others see it as a way of trimming inappropriate responses. After months of testing, TikTok is now rolling out the ability to downvote comments to all users. In doing so, it trusts the community to flag toxic or misleading comments on its media.
As you may have guessed, the downvote icon looks traditional: a “thumbs down” image. The number of downvotes will be shown to the right of the comment in question, so the most downvoted ones attract more attention from both account owners and visitors. You won’t see who downvoted a comment, though; only the count is visible, not the names of those who left it. TikTok’s algorithms will also use this data to detect the most controversial comments and the accounts that leave them.
According to TikTok, the feature is less about showing the community’s reaction to a particular statement and more about letting the community help the platform remove toxic users. Downvotes show the algorithms which comments provoke the most condemnation, giving them material to learn from. TikTok has always been built on machine learning and recommendations, and here, too, it hopes the community’s input will provide enough data to detect toxic accounts and comments.
A comment doesn’t have to contain slurs or misinformation to be downvoted; it may simply be inappropriate, irrelevant, disrespectful, or outright stupid. This type of unwanted comment will also be studied and acted on. In the end, TikTok expects this measure to make the environment friendlier and cleaner.
What do you think about downvotes? Will they do the job, or will they become another tool of harassment? How do you react when your own comment is downvoted on Reddit or other social media? Let’s share our expectations and experiences in the comments! None will be downvoted here.