YouTube's new update comes as social media companies face mounting criticism for failing to curb spam, harassment, and abuse, which can harm users' mental health.
YouTube has also added a new feature that will warn users if their behaviour is found to violate the platform's Community Guidelines.
According to YouTube's official help website, the platform's machine-learning algorithms will now be able to detect more sophisticated spamming tactics used by malicious individuals. YouTube also stated that 1.1 billion spam comments were removed in the first half of 2022 alone.
The Alphabet-owned video-sharing site has also enhanced its automatic bot-identification algorithms, which will be deployed in live chats to guard against abuse. The new system of comment removal, warnings, and timeouts is among YouTube's most significant updates.
When a comment posted from a user's account is found to have broken YouTube's Community Guidelines, the new feature will alert that user with a warning. According to YouTube's support page post, repeat offenders will have their commenting privileges temporarily restricted for up to 24 hours. These features currently apply only to comments in English, but YouTube intends to extend them to other languages in the coming months.
In an effort to reduce the amount of inaccurate health-related material on YouTube, the platform also recently announced that it will begin certifying physicians, nurses, and other healthcare providers.