
TikTok takes action against hoaxes and dangerous challenges

TikTok has pledged to do more to curb the spread of hoaxes and dangerous challenges. Most challenges on TikTok are harmless fun, but some are dangerous, such as this year’s milk crate challenge, which resulted in many injuries. Meanwhile, well-meaning parents and other adults who try to warn others about a supposed danger may unwittingly amplify it, even when the “challenge” is just a hoax.

The company commissioned research surveying more than 10,000 teens, parents and teachers in several countries, including the US and UK. It found that 31 percent of teens had participated in some kind of online challenge.

The teens were asked about a challenge they had recently seen online, not necessarily on TikTok. About 48 percent said it was safe and fun, 32 percent said it was a little risky and 14 percent described it as risky or dangerous. Respondents deemed three percent of challenges “really dangerous,” while 0.3 percent said they had taken part in a challenge they put in that category.

The study found that 46 percent of teens wanted more information and help in understanding the risks of challenges, while 31 percent said they had felt a negative impact from hoaxes related to self-harm and suicide. Recognizing and dealing with hoaxes isn’t easy either: 37 percent of the adults surveyed said they find it difficult to discuss self-harm and suicide hoaxes without drawing attention to them.

TikTok says it already removes such hoaxes and takes action to reduce their spread, but it plans to do more. It will also remove “alarmist” warning videos about self-harm hoaxes. “This research showed how warnings about self-harm hoaxes – even when shared with good intentions – can affect young people’s wellbeing, because they often treat the hoaxes as real,” TikTok said. “We will continue to allow conversations that seek to dispel panic and promote accurate information.”

One well-known viral hoax, for example, fooled many people a few years ago. Its spread was fueled by those warning of its “dangers,” with many falsely claiming it was encouraging children to harm themselves.

Another safety change TikTok is making: it has developed “technology that alerts our safety teams to sudden increases in violating content linked to a hashtag.” And whenever a user searches for a hoax or other harmful challenge, they will see a warning label.
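TikTok hasn’t explained how this alerting technology works. As a purely illustrative sketch, a “sudden increase” detector could be a rolling-baseline check that flags a hashtag when today’s count of violating videos far exceeds its recent average; the window size, threshold and function names below are all hypothetical assumptions, not TikTok’s actual system.

```python
from collections import deque
from statistics import mean, stdev

def make_spike_detector(window=7, threshold=3.0):
    """Flag a spike when a daily count exceeds the rolling
    mean plus `threshold` standard deviations (hypothetical sketch)."""
    history = deque(maxlen=window)

    def check(count):
        # Need a few days of history before a baseline is meaningful.
        if len(history) >= 3:
            baseline = mean(history)
            spread = stdev(history) or 1.0  # avoid zero spread
            is_spike = count > baseline + threshold * spread
        else:
            is_spike = False
        history.append(count)
        return is_spike

    return check

# Daily counts of violating videos for one hashtag; the last day jumps.
detector = make_spike_detector()
flags = [detector(c) for c in [2, 3, 2, 4, 3, 40]]
# flags -> [False, False, False, False, False, True]
```

A real system would presumably weigh many more signals, but the design choice here (compare against each hashtag’s own baseline rather than a global limit) matches the quoted goal of catching sudden increases.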

The company worked with adolescent psychiatrists and psychologists to shape the language of those labels. Users who search for hoaxes and harmful challenges will be encouraged to visit TikTok’s Safety Center to learn more. If the search relates to suicide or self-harm, they will instead see resources such as the National Suicide Prevention Lifeline.

In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. The Crisis Text Line can be reached by texting HOME to 741741 (US), 686868 (Canada), or 85258 (UK). Wikipedia maintains a list of crisis lines for people outside those countries.

All products recommended by Engadget are selected by our editorial team, independent of our parent company. Some of our stories include affiliate links. If you buy something through one of these links, we may earn an affiliate commission.

