NewsGuard has carried out a new study on disinformation on social networks, this time focusing on TikTok and analyzing how the platform's algorithm behaves around a particular topic: the war in Ukraine. According to The Guardian, the results were quite conclusive: TikTok steers its users toward false news about the conflict after they spend about 40 minutes on the application.
NewsGuard is dedicated to monitoring the veracity of news outlets across the web. In its report on TikTok, the team behind the investigation states that “near the end of a 45-minute experiment, the researchers' feed was populated by both correct and false information related to the war in Ukraine, with no distinction made between disinformation and reliable media,” and adds:
“At a time when false narratives about the Russian-Ukrainian conflict are rampant online, none of the videos served to our analysts by the TikTok algorithm contained any information about the reliability of the source: no warnings, fact checks, or additional context that could provide users with reliable information.”
The most common fake news about Ukraine on TikTok
Of course, the selection of fake news on the app was quite varied, but the most prominent items are narratives we have heard before. Among them are claims that the United States operates biological weapons laboratories in Ukraine, and others alleging that Putin's statements during a press conference in early March were edited.
Other posts try to pass off fake video material as real, while some genuine footage is labeled misleading on the platform. Among the former, for example, are the videos of the “Ghost of Kyiv,” who allegedly appears shooting down Russian aircraft; in reality, they are scenes taken from a video game. Meanwhile, Kremlin-aligned accounts claim that some authentic content is false.
In addition, some videos shown to the researchers during the tests are recognized Kremlin propaganda, yet the TikTok algorithm does not seem to mind and serves them to users anyway.
How was the study carried out?
To conduct the investigation, the NewsGuard team started by creating fresh TikTok accounts, giving them a blank canvas for their tests. With that step done, they spent about 45 minutes scrolling through the app's For You tab, stopping to watch in their entirety all the videos that discussed the war in Ukraine.
While TikTok doesn't provide a detailed breakdown of how its algorithm weighs signals, the company says it takes into account watch time on individual videos, as well as other signals including likes, comments, and who a user follows or has blocked.
In this way, the researchers tried to train the application's algorithm without favoring either accurate or misleading information; the aim was for TikTok to keep showing more and more content about the war. The result, as mentioned above, was a mixture of true and false news, without any kind of warning.
The team further found that when searching the terms “Ukraine”, “Russia”, “War”, “Kyiv” and “Donbas”, the TikTok algorithm kept mixing real and fake content among the top 20 results.
TikTok has responded
Of course, the company has been quick to state its position on NewsGuard's results. These types of experiments “only offer limited conclusions about how the application works in the real world,” a TikTok spokesperson told The Guardian. In the company's view, the research fails to imitate the viewing behavior typical of the social network's users:
“We continue to respond to the war in Ukraine with increased security resources as we work to eliminate harmful misinformation and help protect the TikTok experience. We also partner with independent fact-checking organizations to reinforce our goals of helping keep TikTok safe and authentic.”
Of course, the company does not have full control over its users' posts. However, other platforms have already established tools to curb disinformation in their communities, whether by banning certain media outlets or by labeling news as fake or aligned with the Russian government. If TikTok wants to ensure a safe space in its app, it had better follow suit before becoming an unchecked incubator of fake content.