TikTok Warns Users About A Suicide Video That Is Going Viral
TikTok is working to remove a suicide video that was uploaded to the platform on Sunday. Responding to a group of users who have re-uploaded the video or used clips from it in their own content, TikTok's management announced that the video will continue to be deleted and the offending accounts will be banned.
TikTok has been in the headlines for several days over a disturbing video. Footage of a suicide went viral on the platform within hours of appearing on Sunday, and a group of users has been doing everything it can to spread it. In response, TikTok's management has issued a firm statement.
TikTok spokeswoman Hilary McQuaide said the video was removed as soon as it was shared, but some users uploaded it again. McQuaide said that users who build content around clips of the suicide are being monitored and that such videos will be deleted. She added that the accounts of these users are also at risk of suspension.
In fact, the suicide footage was not originally uploaded to TikTok. Last week, a man in Mississippi, in the United States, ended his own life while broadcasting on Facebook Live, and the recording subsequently made its way to TikTok. Strikingly, some users then set out to make the video go viral, drawing far more attention to it than anyone expected, and it must be said that they succeeded.
Suicide footage is automatically detected and banned by TikTok’s algorithm
In a statement, TikTok said that footage of suicides, statements about suicide, and content that promotes suicide violate the platform's rules. The company says every piece of content uploaded to the platform is screened by its algorithms, and that these videos will be deleted immediately, no matter how many times they are re-uploaded.
“Your accounts can be closed”
TikTok spokeswoman Hilary McQuaide said the accounts of users who repeatedly try to share the suicide video will be banned immediately. McQuaide noted that many users have already reported the video and thanked them for helping to protect the platform.
This is not the first video of its kind, by the way, and Facebook Live has often been the venue in such cases. In 2017, an investigation by BuzzFeed News identified at least 45 instances of violence broadcast on Facebook Live, including suicide, armed assault, murder, torture, and child abuse. Although Facebook says its algorithms block such content, some of it still slips through.