Article

COVID-Related Misinformation Migration to BitChute and Odysee

by Olga Papadopoulou *, Evangelia Kartsounidou and Symeon Papadopoulos
Centre for Research and Technology Hellas—CERTH, Information Technologies Institute—ITI, 6th km Harilaou-Thermi, 57001 Thessaloniki, Greece
* Author to whom correspondence should be addressed.
Future Internet 2022, 14(12), 350; https://doi.org/10.3390/fi14120350
Submission received: 31 October 2022 / Revised: 16 November 2022 / Accepted: 21 November 2022 / Published: 23 November 2022

Abstract

The overwhelming amount of information and misinformation on social media platforms has created a new role that these platforms are inclined to take on, that of the Internet custodian. Mainstream platforms, such as Facebook, Twitter and YouTube, are under tremendous public and political pressure to combat disinformation and remove harmful content. Meanwhile, smaller platforms, such as BitChute and Odysee, have emerged and provide fertile ground for disinformation as a result of their low content-moderation policy. In this study, we analyze the phenomenon of removed content migration from YouTube to BitChute and Odysee. In particular, starting from a list of COVID-related videos removed from YouTube due to violating its misinformation policy, we find that ∼15% (1114 videos) of them migrated to the two low content-moderation platforms under study. This amounts to 4096 videos on BitChute and 1810 on Odysee. We present an analysis of this video dataset, revealing characteristics of misinformation dissemination similar to those on YouTube and other mainstream social media platforms. The BitChute–Odysee COVID-related dataset is publicly available for research purposes on misinformation analysis.

1. Introduction

Millions of Internet users share their opinions and beliefs on social media platforms. Given that some users’ posts are considered offensive, misleading or, in general, harmful, social media platforms are responsible for removing them to maintain a healthy online environment and protect their user community. To this end, they turn to content-moderation policies and systems. For instance, Facebook has publicly stated its commitment to allocating 5% of the firm’s revenue, which amounts to USD 3.7 billion, to content moderation (https://knowledge.wharton.upenn.edu/article/social-media-firms-moderate-content/—accessed on 24 October 2022).
While large social media platforms arguably invest in fighting disinformation through content moderation, an ecosystem of alternative, low content-moderation platforms has emerged and become popular. For instance, platforms such as Gab, Voat, Minds and 4chan are known for their tolerance of misinformation and harmful content in general. These platforms allow free speech with minimal standards for content [1], and most of them have been extensively studied due to their close connection to the QAnon movement [2,3]. BitChute and Odysee have recently joined this ecosystem of alternative low content-moderation platforms, attracting public attention. BitChute introduced itself in 2017 as a free-speech content-sharing platform, while Odysee has served as an alternative video-sharing option since the end of 2020. To the best of our knowledge, these two platforms have not been analyzed extensively, even though they have attracted sizable user communities. A recent Reuters report highlights both platforms’ significant impact on the disinformation landscape, spreading conspiracy theories and abusive content (https://www.reuters.com/investigates/special-report/usa-media-misinformation/—accessed on 24 October 2022). Reuters reported that BitChute users falsely claimed COVID-19 vaccines caused cancers that “literally eat you” and spread the debunked claim that Microsoft founder Bill Gates caused a global baby-formula shortage. A video entitled ‘DR. RYAN COLE: COVID-19 VACCINES CAUSING AN ALARMING UPTICK IN CANCERS’ (https://www.bitchute.com/video/BFrivyg3Yut0/—accessed on 24 October 2022) claimed that the COVID-19 vaccine is responsible for an increase in cancer, whereas the American Cancer Society explains that no information suggests that COVID-19 vaccines cause cancer.
BitChute said in a statement to Reuters that BitChute’s North Star is free speech, which is the cornerstone of a free and democratic society. Nevertheless, what happens when freedom of speech has negative consequences for public health? A search on the BitChute platform with the keywords ‘covid19 children’ returns many results. The top 5 results (as of June 2022) are videos claiming that children should not be vaccinated and that vaccines will kill children. These videos containing false claims received high engagement from BitChute users, with more than 15,000 views and an average of ∼284 likes, posing risks to children’s health by influencing many parents’ decisions on child vaccination. Only a meagre percentage of users seem to contradict these claims (average number of dislikes ≈7).
A 26 min video called “The Plandemic” went viral on social media, affecting a large number of users and gaining millions of views and comments. The video was removed from mainstream platforms soon after it went viral but had a major impact on COVID-19 misinformation. Although the original video was removed from the platforms, many videos analyzing its content and spread continue to appear on them (Figure 1a). Additionally, a search on BitChute with the keyword ‘COVID-19’ returns several videos, with the first result being the Plandemic (Figure 1b). This example is telling of a more general trend: users whose videos are removed by mainstream platforms turn to other online spaces to share their content. Motivated by this observation, in this study, we analyze this content-migration trend from YouTube to BitChute and Odysee, using as a starting point a publicly available dataset of COVID-related misinformation videos that YouTube has removed.
Google Trends data also suggest that the BitChute platform has been growing in popularity since early 2019, peaking in May 2020, when the Plandemic video appeared, and again in early January 2021. Figure 2 illustrates the Google Trends analysis for the BitChute and Odysee search terms worldwide over the last five years. Odysee follows a similar growth path to BitChute but at somewhat lower rates.
In addition, some studies focus mainly on the content generated on BitChute [4,5]; however, little is known about the platforms’ user base and network, the content they generate (especially on Odysee) and their interaction with other platforms. To address these gaps, this study contributes to the following aspects:
  • Migration of misinformation from YouTube, a mainstream platform, to alternative low content-moderation platforms, namely BitChute and Odysee.
  • Analysis of the characteristics of misinformation dissemination on BitChute and Odysee.
  • Analysis of how users engage with misinformation videos on BitChute and Odysee.
To this end, we seek answers to the following research questions:
  • RQ1: Are misleading videos posted on mainstream platforms migrating to the low content-moderation platforms BitChute and Odysee?
  • RQ2: Are there similarities between the dissemination of misinformation on BitChute/Odysee versus on YouTube?
  • RQ3: How are users engaging with misinformation videos on BitChute and Odysee?
  • RQ4: Do users who share misleading videos on BitChute and Odysee maintain accounts on other platforms?
In the next section, we present previous work on health-related misinformation, focusing on misinformation studies on mainstream and low content-moderation platforms. Then, in the methodology section, we describe our dataset-creation process and its characteristics. Finally, we discuss our main findings, highlighting similarities and differences between the platforms and conclude the paper.

2. Related Work

In this section, we organize related work into the following three areas. We first present previous studies on misinformation and conspiracy theories related to health issues, focusing on public health policies and health-protective behaviors, a rather critical topic during the COVID-19 pandemic. The spread of online misinformation during the pandemic has become a major public health concern, making it difficult to find credible and trustworthy information sources. Since social media platforms are highly associated with the spread of misinformation online, our second topic of interest is the spread of misinformation on mainstream social media platforms, with a particular focus on YouTube videos. Since most of these mainstream platforms have adopted policies to reduce conspiratorial and anti-social content, low content-moderation platforms have emerged in the information ecosystem, promoting such content. Hence, we also present a few studies on misinformation on low content-moderation platforms, with a special interest in BitChute and Odysee, which are the focus of our study.

2.1. Health-Related Misinformation and Conspiracy Theories

Health-related misinformation, with a particular focus on conspiracy theories, has been widely studied in the past [6]. It has been observed that a tendency to believe that major public events are secretly orchestrated by powerful and malevolent entities [7] is negatively associated with health-protective behaviors [8]. Many studies found a negative relationship between medical conspiracy beliefs and health-protective behaviors [9], such as vaccination [10,11,12,13,14] or safer sex [15,16]. Moreover, people who believe in conspiracy theories tend to express higher levels of distrust of public institutions [17,18]; hence, their support for public health policies to address the pandemic is decreased [19]. These behaviors raise concerns about severe consequences for public health [20]. Furthermore, there is evidence that many Twitter users used metaphorical expressions to discuss infections during the pandemic. Metaphors cannot be easily validated, and when they are associated with negative emotions, such as fear, anticipation and sadness, they hinder participation and awareness in healthcare discussions, contributing to the creation of a problematic health environment [21].
Health-related misinformation during the COVID-19 pandemic was characterized as another type of pandemic, known as an “infodemic” [22]. Misinformation in the context of COVID-19 can include inaccurate information regarding the virus and its transmission, ‘fake news’, hoaxes, pseudoscience, conspiracy theories, and fabricated reports regarding methods of prevention and treatment. The most common topics of COVID-19 misinformation included, among others, the origin of the virus, cures and remedies, vaccination, masks and self-protection (https://covidinfodemiceurope.com/report/covid_report.pdf—accessed on 24 October 2022).
Conspiracy theories are frequently associated with explanations of the COVID-19 pandemic. With severe political, economic and social implications, the current health crisis has created a powerful context and conditions in which conspiracy beliefs are likely to develop, such as vulnerability, insecurity and isolation. Some recent studies provided evidence of a link between COVID-19 conspiracy or unfounded beliefs and reluctance to engage in health-protective behaviors related to COVID-19 prevention, such as minimizing the time spent outside the home, maintaining social distancing, wearing a face mask and hand-washing [23,24,25,26,27]. Other studies found that people who believed conspiracies had lower intentions to vaccinate and to support COVID-19 public health policies [28,29]. In general, exposure to online COVID-19 misinformation is associated with prior misinformation beliefs and sometimes with low preventive behaviors [23,30].

2.2. Misinformation on Mainstream Social Media and Video Platforms

Social media platforms play a key role in the information ecosystem. However, they are also associated with disinformation and political propaganda [31]. In the past few years, conspiracy groups have relied on social media manipulation, using bots and orchestrated online disinformation campaigns to spread fake scientific information regarding vaccination and other anti-science topics, creating a massive threat to public health [32,33].
During the COVID-19 pandemic, social media have proven to be a mixed blessing. On the one hand, social media can be used effectively to provide essential health-related information, promoting debate within the global scientific community. Yet, digital platforms can also disseminate flawed studies, inaccurate information and misinformation, while concerns have arisen regarding the fast and extended spread of COVID-19 conspiracy theories on social media [34]. One study showed that COVID-19 misinformation is as likely as accurate information to spread and engage users on social media [35]. Another study showed that social network users have an extremely low capability to determine whether platform content is real or fake [36].
YouTube is one of the platforms that has been strongly associated with conspiracy beliefs. Some findings show that 60% of people who believed that 5G networks caused COVID-19 symptoms were informed about the virus through YouTube [8]. Moreover, YouTube [37,38,39,40] and Facebook [41,42,43] were identified as major vectors for the dissemination of conspiracy beliefs and misinformation, on medical and other topics. Previous studies have analyzed YouTube videos on health-related topics, such as smoking [44,45], obesity [46] and vaccination [47,48], and their potential impact on the audience. A 2017 quantitative analysis of 560 YouTube videos related to the link between vaccines and autism or other serious side effects on children revealed an increasing trend in the annual number of uploaded videos and that most videos were negative in tone [49]. Likewise, another study pointed out that most HPV vaccine-related YouTube videos were negative in tone, adding that negative videos had higher ratings than positive or ambiguous ones [50]; the latter finding was also confirmed by the authors of [51]. In addition, studies on Twitter suggest that it plays a similar role in the spread and impact of misinformation as YouTube and Facebook [35,52,53,54].

2.3. Misinformation on Low Content Moderation Platforms

YouTube has received considerable criticism for potentially radicalizing individuals and promoting conspiracy videos [55]. To improve the quality of content on its platform, in January 2019, YouTube reduced recommendations of conspiratorial and alternative news content (https://blog.youtube/news-and-events/continuing-our-work-to-improve/—accessed on 24 October 2022) but allowed the videos to remain on the platform [56]. Therefore, exposure to anti-social video content occurs on many platforms beyond YouTube (via shared links). There is evidence that reducing exposure to anti-social videos on YouTube potentially reduces sharing on other mainstream platforms, such as Twitter and Reddit [57]. In reaction to perceived risks of censorship in mainstream media and platforms, so-called alt-tech spaces emerged and have become increasingly popular, especially among users who have been suspended from mainstream social networks for violating their terms of service. Most of them are low content-moderation platforms, producing and promoting anti-social content (i.e., disinformation, political propaganda, conspiracy theories, pseudoscience and hate speech). For this reason, these alternative platforms have also received attention from researchers, practitioners and policy makers [58].
Many of these platforms, including Gab, 4chan and Parler, are very popular among fringe communities, especially far-right groups, hosting hateful and extremist content [59,60,61,62,63]. According to [61], Gab is predominantly used for the dissemination and discussion of news and world events; it has a significant rate of hate speech, much higher than Twitter, and it attracts alt-right users, conspiracy theorists and other trolls. A study analyzing comments, users and websites discussed in Dissenter, a browser extension created by Gab, identified a high degree of comment toxicity among a core group of users [64]. 4chan has also received considerable attention, mainly because of its politically incorrect /pol/ board, offensive culture, openly antisemitic rhetoric, users’ associations with the alt-right, and meme virality on social media [65,66,67,68]. In addition, Pieroni et al. [69] studied the spread of misinformation in Parler as an “alternative” place for free speech from a data science and UX design perspective. Another study on a Parler dataset [70] found that discussion on the platform mainly focused on conservative topics, President Trump, and conspiracy theories (e.g., QAnon). Another prominent example resulting from the mainstream social media self-regulation strategy is the launch of Gettr in 2021. Gettr is an alternative social network platform launched by former US President Donald Trump’s team, which hosts pro-Trump content mixed with conspiracy theories and anti-left stances [71], though without significant organic engagement and activity and with a lower level of toxicity compared to other fringe social networks [72].
Unlike most of the aforementioned platforms, which have received attention from the scientific community, BitChute and Odysee remain relatively neglected by researchers. Concerning the BitChute platform, the authors of [4] claimed that a handful of channels on the platform receive high engagement, and almost all of those channels contain conspiracies or hate speech. They found that BitChute has a higher rate of hate speech than Gab but less than 4chan, and that some BitChute content producers have been banned from other platforms. Additionally, a BitChute dataset, called the MeLa BitChute Dataset [5], is publicly available, providing a basis for investigating the content on the platform.
BitChute is already one of the most popular alt-tech platforms shared on Telegram, according to [73], and one of the leading destinations for Internet celebrities who have been banned from mainstream platforms [74]. In 2019, approximately 20% of deplatformed far-right channels on YouTube had a BitChute presence [75]. Furthermore, there has been increased interest in alt-tech platforms, such as Gab and BitChute, after the January 6 attack on the US Capitol and the “Great Deplatforming” [76]. Moreover, while YouTube’s de-recommendation strategy seems to suppress the sharing of misinforming content on Twitter and Reddit [57], it may simply move this content to alt-tech spaces, such as BitChute, as in the case of the Plandemic video, which was quickly removed from Facebook, YouTube, and Twitter but remained untouched on BitChute [57,77].
As for Odysee, it is also an alternative video-sharing platform based on “free speech” and low content-moderation principles. It is built on the LBRY (https://lbry.com/—accessed on 24 October 2022) blockchain-based file-sharing and payment network. Some early studies indicated that Odysee hosts a variety of subjects (https://www.nhpr.org/nh-news/2021-04-19/from-cooking-videos-to-qanon-n-h-based-video-platform-attracts-users-banned-elsewhere—accessed on 24 October 2022), while The Guardian reported “scores of extremist videos” promoting antisemitic conspiracy theories, Nazi and neo-Nazi content, and COVID-19 misinformation (https://www.theguardian.com/world/2021/may/14/odysee-video-platform-nazi-content-not-grounds-for-removal—accessed on 24 October 2022). Out of the approximately 10 million videos hosted on the platform, the most-viewed video was one questioning the safety of COVID-19 vaccines (https://www.nhpr.org/nh-news/2021-04-19/from-cooking-videos-to-qanon-n-h-based-video-platform-attracts-users-banned-elsewhere—accessed on 24 October 2022). In addition, there are cases in which Odysee operates as a backup archive for videos that express extremist rhetoric and are more likely to be removed from other platforms (Dr Eviane Leidig, ‘Odysee: the new YouTube for the RWE’, GNET Insights, 17 February 2021, https://gnet-research.org/2021/02/17/odysee-the-new-youtube-for-the-far-right/—accessed on 24 October 2022). However, there seems to be no other study or review of this platform, its users or its generated content.

3. Methodology

Our starting point is a YouTube dataset of COVID-related misinformation videos. The authors of [78] released a dataset of 8105 COVID-related misinformation videos, called the Oxford COVID-related misinformation dataset, which, according to the authors’ findings, amounts to less than 1% of all YouTube videos related to the COVID pandemic. YouTube had removed all but 50 of these videos (marked as ‘misinformation videos’) at the time the dataset was collected. We queried YouTube with these 50 videos 16 months after the dataset was created, and only 31 of them were still available on the platform. For the videos already unavailable when the dataset was collected, the authors of [78] managed to recover the titles and metadata and provide them publicly to the community for analysis. We first conducted a short analysis of these videos and then used them as a background collection to investigate their online presence on alternative video platforms.

3.1. Oxford COVID-Related Misinformation Dataset

The Oxford dataset is composed of the URLs of the videos and a set of metadata (e.g., video title, video description, view count and channel ID) in cases where they were recoverable. The videos have been removed from YouTube, resulting in a lack of visual information. For that reason, we could only use the video titles to investigate their migration to other online platforms. In the dataset, we found 7220 videos (out of 8104) whose video title had been recovered and used those videos as a reference dataset for our analysis.
We first compared the video titles to identify the unique cases of COVID misinformation. We used SentenceTransformers (https://www.sbert.net/), a Python framework for state-of-the-art sentence, text and image embeddings, to compare the titles. The sentence-similarity process revealed that the dataset consists of 7066 unique cases of COVID-related misinformation, of which 110 cases have between one and twelve duplicate videos. An example of such a case is presented in Table 1. However, to simplify the processing, in the rest of the analysis, we consider all 7220 videos as unique cases.
We also applied the en_core_web_sm model of the spaCy library (https://spacy.io/—accessed on 24 October 2022) to detect the languages of the 7220 video titles. This revealed that misinformation is disseminated in several languages, although English dominates, making up more than 43% of the videos. Table 2 presents the top 5 languages detected in the titles.
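As a rough illustration of this step, the sketch below counts detected title languages. Since spaCy itself needs an add-on for language identification and the exact setup is not specified in the text, the standalone langdetect package serves here as a stand-in; the German sample title is hypothetical.

```python
from collections import Counter

from langdetect import DetectorFactory, detect
from langdetect.lang_detect_exception import LangDetectException

DetectorFactory.seed = 0  # make langdetect deterministic across runs

def language_distribution(titles):
    """Count detected ISO 639-1 language codes over a list of video titles."""
    counts = Counter()
    for title in titles:
        try:
            counts[detect(title)] += 1
        except LangDetectException:
            counts["unknown"] += 1  # empty or undecidable titles
    return counts

titles = [
    "DR. RYAN COLE: COVID-19 VACCINES CAUSING AN ALARMING UPTICK IN CANCERS",
    "Coronavirus: was steckt wirklich dahinter?",  # hypothetical German title
]
print(language_distribution(titles).most_common(5))
```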
Regarding propagation to other social media platforms, the authors of [78] provided the IDs of tweets sharing those YouTube videos. We queried the Twitter API and found that only 1% (2360 out of 245,064 tweets) were still available, while the remaining 99% had been removed by Twitter. This implies that tweets sharing misinformation videos from YouTube were also largely removed.
Finally, we found that the number of views is provided for only ∼10% of the videos in the dataset. Although this number is too small to draw firm conclusions from, the average number of views amounted to 150,138, showing that these videos gained considerable attention before being removed.

3.2. Video Retrieval in BitChute and Odysee

We applied a semi-automatic process to collect videos removed by YouTube for violating the platform’s policies and then posted on BitChute and Odysee. Figure 3 illustrates the steps we followed.
First, we collected the titles of the 7220 COVID-related misinformation videos from the Oxford dataset and used them to query BitChute and Odysee. Even though the Oxford dataset contains duplicate cases of COVID-related misinformation (i.e., several videos refer to the same case), we considered these cases individually as the video titles were slightly different, which increased our chances of finding them on BitChute and Odysee.
Since BitChute and Odysee do not provide a public API for data collection, we used a custom-built scraper to search with the query text (video titles) and collect the returned results and their metadata at the time of collection. Due to this limitation, several metadata fields could not be collected.
Since we rely solely on the text of the title, we submitted the titles without pre-processing. A primary limitation of our analysis is that the initial YouTube videos are unavailable online, so visual inspection of the retrieved videos against them was not possible. Additionally, we rely on the accuracy of the search algorithm of each platform. In the BitChute case, videos with identical or similar titles were returned. At the same time, there were cases where the search returned no videos, meaning that the query title did not match any videos on the platform. The queries resulted in 13,336 BitChute videos corresponding to 1416 YouTube COVID-related misinformation cases. Concerning Odysee, most queries returned at least 50 results, even if the results differed significantly from the query text. This poses a challenge for the next steps of the process, where we filter the results to find exact duplicates of the YouTube videos appearing on Odysee. For Odysee, 324,300 videos were collected in this step. A sketch of the kind of scraper used in this step is shown below.
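The following is an illustrative sketch of such a custom search scraper, not the actual tool used in the study. Since neither platform exposes a public API, the BitChute search URL, parameters and CSS selector below are assumptions and may not match the live site.

```python
import requests
from bs4 import BeautifulSoup

def search_bitchute(query_title, timeout=30):
    """Submit a video title to BitChute's web search and return (title, link) pairs."""
    resp = requests.get(
        "https://www.bitchute.com/search/",            # assumed endpoint
        params={"query": query_title, "kind": "video"},
        headers={"User-Agent": "research-scraper/0.1 (academic use)"},
        timeout=timeout,
    )
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    # The selector below is an assumption about the result-page markup.
    return [
        (a.get_text(strip=True), a.get("href"))
        for a in soup.select("a.video-result-title-link")
    ]
```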
The filtering process consists of multiple steps. In the first step, we used text similarity to split the collection into three subsets: (1) near duplicates; (2) videos that can be automatically processed, taking visual information into account; and (3) videos that need to be manually annotated. In the following, we describe the process in detail.

3.2.1. Text Similarity

We used the SentenceTransformers (https://www.sbert.net/) Python library to compare the video titles. Each query video was compared to the retrieved videos from BitChute and Odysee, resulting in a title-similarity value between 0 and 1, with 1 indicating that the titles are identical. We set a threshold of 0.9 and considered videos with similarity above this threshold as near duplicates. This led to a subset of 3860 BitChute videos corresponding to 893 YouTube cases and 2088 Odysee videos corresponding to 369 YouTube cases. The rest of the videos, with a similarity below 0.9, were provided as input to the second filtering step.
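A minimal sketch of this filter follows: each query title is compared to the candidate titles, and pairs scoring at or above 0.9 are kept as near duplicates. The specific embedding model is our assumption; the text does not name one.

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed model choice

def near_duplicates(query_title, candidate_titles, threshold=0.9):
    """Return (candidate index, cosine similarity) pairs above the threshold."""
    query_emb = model.encode(query_title, convert_to_tensor=True)
    cand_embs = model.encode(candidate_titles, convert_to_tensor=True)
    scores = util.cos_sim(query_emb, cand_embs)[0]
    return [(i, float(s)) for i, s in enumerate(scores) if float(s) >= threshold]

print(near_duplicates(
    "Plandemic documentary",
    ["PLANDEMIC Documentary (mirror)", "Cooking with garlic"],
))  # expect the mirror title to score far above the unrelated one
```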

3.2.2. Visual Similarity

To apply visual similarity, we need a reference video. Since the original YouTube videos are not available, we used as reference videos the BitChute and Odysee videos identified as near duplicates based on text similarity, namely those of the 893 and 369 YouTube cases coming out of the previous step. For each of these cases, we set as the reference video the BitChute or Odysee video with the highest similarity value (if multiple videos had the same similarity value, we randomly selected one of them). Then, we applied the visual near-duplicate detection approach of [79] to the reference videos and the videos with textual similarity below 0.9. This resulted in the discovery of 4256 duplicate BitChute videos corresponding to 372 YouTube cases and 487 duplicate Odysee videos corresponding to 119 YouTube cases.
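The method of [79] is a learned near-duplicate detector; as a much simpler illustrative stand-in, the sketch below compares two videos via perceptual hashes of sampled frames (OpenCV plus the imagehash package). The sampling step and distance thresholds are arbitrary choices for illustration.

```python
import cv2                      # pip install opencv-python
import imagehash                # pip install imagehash
from PIL import Image

def frame_hashes(video_path, step=30):
    """Perceptual hash of every `step`-th frame of a video file."""
    cap, hashes, idx = cv2.VideoCapture(video_path), [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            hashes.append(imagehash.phash(Image.fromarray(rgb)))
        idx += 1
    cap.release()
    return hashes

def likely_duplicate(path_a, path_b, max_dist=8, min_overlap=0.5):
    """Heuristic match: most sampled frames of A have a close hash somewhere in B."""
    ha, hb = frame_hashes(path_a), frame_hashes(path_b)
    if not ha or not hb:
        return False
    close = sum(1 for x in ha if min(x - y for y in hb) <= max_dist)
    return close / len(ha) >= min_overlap
```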

3.2.3. Manual Similarity

Where the retrieved videos had no titles similar to those of the YouTube cases, neither textual nor visual similarity was applicable. In this case, we resorted to manual annotation of the videos. This required two annotators and ≈50 h per annotator to inspect the videos. The annotators were asked to manually annotate the BitChute and Odysee videos based on the metadata provided for the YouTube query and on references to debunking articles and other sources on the Internet. For example, for a video entitled ‘Coronavirus and 5G by dr Thomas Cowan’, the annotator found a debunking article on CBC (https://www.cbc.ca/news/science/fact-check-viral-video-coronavirus-1.5506595), where screenshots of the original video were provided, and could label BitChute and Odysee videos depicting these screenshots as duplicates. In total, 523 and 329 YouTube cases were manually annotated for BitChute and Odysee, respectively. Regarding BitChute, 15,813 videos were manually annotated, resulting in 1293 duplicates corresponding to 182 YouTube cases. Similarly, for Odysee, the manual inspection resulted in 968 duplicates corresponding to 222 YouTube cases.

BitChute and Odysee Dataset

In summary, we created a dataset of 5906 videos posted on the BitChute and Odysee platforms after their removal from YouTube, which addresses RQ1. These videos were derived from 1048 (BitChute) and 591 (Odysee) COVID-related misinformation cases posted on YouTube, i.e., 1114 unique YouTube videos, which amounts to ≈15% of the Oxford dataset. We observe that a relatively small percentage of the videos migrated to BitChute and Odysee, but we also need to consider the following:
  • We used a semi-automated process to recover these videos, so there may be videos that migrated but that our process could not recover;
  • We relied solely on the video titles in the first step to search for and compare videos; videos might have been posted on the target platforms under a different title, in which case it would not be possible to recover them.
Table 3 presents the number of BitChute and Odysee videos retrieved in each process and the number of YouTube videos they correspond to. We collected the metadata of 3994 BitChute videos (since we used a custom-built scraper to collect metadata, there are videos for which metadata are missing) and 1810 Odysee videos. The unique counts derive from the fact that, as presented above, the Oxford dataset contains duplicate cases of COVID-related misinformation, which means that the collected videos are the same for several cases.

4. BitChute Analysis

In this section, we present the analysis of the videos we managed to collect from the BitChute platform in order to address RQ2, RQ3 and RQ4.

4.1. Temporal Distribution

We first study the temporal distribution of the BitChute videos. Figure 4 illustrates a timeline of the BitChute videos relative to their corresponding YouTube videos. Each line corresponds to one YouTube video and its BitChute duplicates, and the horizontal axis corresponds to the time between the posting of the YouTube video and that of its BitChute duplicates. We plotted the BitChute videos posted within at most a day of the posting of the YouTube video. Due to missing metadata, we examined 297 YouTube videos and their 2335 BitChute duplicates. In total, 12% of the BitChute duplicates were posted between minutes and a day after the posting of the YouTube video, while 50% appeared on the platform between one day and a week after the YouTube video. The remaining 34% of videos were shared more than a week after the original YouTube video. Therefore, more than half of the videos appeared on BitChute at most a week after the original YouTube video. A sketch of this delay bucketing is shown below.
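The following minimal pandas sketch reproduces the bucketing logic; the toy timestamps and column names are illustrative, not from the actual dataset.

```python
import pandas as pd

pairs = pd.DataFrame({
    "yt_posted": pd.to_datetime(["2020-05-04", "2020-05-04", "2020-06-10"]),
    "bc_posted": pd.to_datetime(["2020-05-04", "2020-05-09", "2020-08-01"]),
})
delay_days = (pairs["bc_posted"] - pairs["yt_posted"]).dt.total_seconds() / 86400
buckets = pd.cut(
    delay_days,
    bins=[-float("inf"), 1, 7, float("inf")],
    labels=["minutes to a day", "a day to a week", "more than a week"],
)
# On the real data, these shares are the 12% / 50% / 34% reported in the text.
print(buckets.value_counts(normalize=True))
```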

4.2. Video Interactions

We then study the likes, dislikes and views acquired by the videos. There are 961 BitChute videos (24%) with zero likes, and 53% of the BitChute videos had at most 10 likes. Only 13 of the BitChute videos, i.e., 0.3%, had a significant impact with more than 1000 likes. Inspecting the dislikes, we noticed that there is not much dislike activity, since 84% of the videos have no dislike reactions, while the remaining 16% have at most 77 dislikes. The like–dislike analysis revealed that users on the BitChute platform applaud the misinformation to some extent, while there are only limited objections. We plotted a histogram of the likes and dislikes of the top 20 most disliked videos in Figure 5.
Concerning video views, the most viewed video in the collected BitChute dataset is the Plandemic documentary, with 2,350,179 views. The second most viewed video is also from the Plandemic misinformation case, with 2,337,274 views. The popularity of the Plandemic video is in line with YouTube, where the documentary led to very high engagement. It is worth noting that 70.2% of the videos attracted fewer than 1000 views, which is low compared to the engagement that misinformation videos attract on YouTube [38]. Only 5.5% of the BitChute videos managed to attract more than 10,000 views and are thus comparable in popularity to the YouTube content.

4.3. Titles and Descriptions

We analyzed the titles of the BitChute videos and extracted parts of speech using the spaCy library; a sketch of this analysis follows below. The average number of tokens in a BitChute title is 8.2, of which 2.1 are nouns, 0.7 verbs, 0.36 numbers, 1.85 pronouns and 0.82 punctuation marks, with the rest belonging to other parts of speech, such as adjectives and auxiliaries. Nouns dominate, as expected, while verb usage in BitChute titles is limited. It is noticeable that pronouns are used almost as much as nouns. In general, there is evidence that possessive pronouns (e.g., “my”), Wh-determiners (e.g., “which”) and Wh-adverbs (e.g., “where”) occur more often in clickbait headlines than in legitimate news headlines, creating a curiosity gap by asking questions that entice users to click in order to satisfy their need for information [80].
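A minimal sketch of this part-of-speech profiling with spaCy's English pipeline (the library named in the text); the sample title is made up.

```python
from collections import Counter

import spacy

nlp = spacy.load("en_core_web_sm")

def pos_profile(titles):
    """Average token count per title, plus the average count of each coarse POS tag."""
    pos_counts, total_tokens = Counter(), 0
    for doc in nlp.pipe(titles):
        total_tokens += len(doc)
        pos_counts.update(token.pos_ for token in doc)
    n = len(titles)
    return total_tokens / n, {pos: count / n for pos, count in pos_counts.items()}

avg_len, profile = pos_profile(["They DELETED this video: what are they hiding?"])
print(avg_len, profile)  # e.g. averages for NOUN, VERB, PRON, PUNCT per title
```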
Additionally, we applied the aforementioned spaCy language-detection model to the BitChute titles and, similar to the Oxford dataset, found that English dominates (≈72% of titles). The second most frequent language is German (≈15%), and Spanish is third (≈2%).
For the video descriptions, the average number of tokens is 128; descriptions are typically much longer than titles. As a next step, we searched for keywords (deleted, banned, removed) that refer to content take-downs to investigate whether the BitChute users informed viewers that the video had been deleted from other platforms. This resulted in only 99 video descriptions with the keyword ‘deleted’, 330 with the keyword ‘banned’ and 153 with the keyword ‘removed’, corresponding to 2.5%, 8.3% and 3.9% of the videos, respectively. Example sentences that appear in video descriptions and contain these keywords include: ‘Deleted YouTube video of a German Journalist visiting a “hospital with coronavirus patients”, but finds no one there’, ‘This video is VERY BIG and it’s getting deleted in YouTube. Make sure to download it and spread it around.’ and ‘David Icke’s explosive interview with London Real, which sparked controversy after being banned from YouTube and Vimeo.’ We calculated the average number of likes for the videos containing the keyword ‘deleted’ in their description and compared it with the rest; a sketch of this comparison follows below. The average number of likes for those containing the keyword is 54, while the average over 10 random subsets of the same number of videos (99) without the keyword in their description was much lower, at 24.6. Similarly, for the 330 videos containing the keyword ‘banned’, the average number of likes was 42, while for 10 random subsets, the average was 25.5. Different behavior appears for the keyword ‘removed’, where the average number of likes for videos containing the keyword and for 10 random samples was quite close, at 24.5 and 22, respectively.
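The comparison logic can be sketched as follows: average likes of videos whose description contains a take-down keyword versus the mean over 10 random equally sized subsets of the remaining videos. The DataFrame column names ('description', 'likes') are assumptions.

```python
import pandas as pd

def keyword_like_gap(df, keyword, n_subsets=10, seed=0):
    """Return (mean likes with keyword, mean of means over random keyword-free subsets)."""
    has_kw = df["description"].str.contains(keyword, case=False, na=False)
    with_kw, without_kw = df[has_kw], df[~has_kw]
    subset_means = [
        without_kw.sample(n=len(with_kw), random_state=seed + i)["likes"].mean()
        for i in range(n_subsets)
    ]
    return with_kw["likes"].mean(), sum(subset_means) / n_subsets
```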

4.4. Categories

According to Figure 6, the dominant video category is ‘None’, followed by ‘News and Politics’ and ‘Education’. The category is assigned by the user before sharing the video on the platform. Although the BitChute platform offers a ‘Health and Medical’ category, users do not prefer this option but rather assign COVID-related videos to ‘None’ or other unrelated categories.

4.5. BitChute and YouTube Video IDs

The BitChute platform provides users with functionality that automatically mirrors YouTube videos to BitChute given the YouTube channel ID. In this case, videos from YouTube are shared on BitChute using the same video ID. In the created dataset, there are only 93 videos with identical video IDs between BitChute and YouTube. We cannot determine whether users do not take advantage of this feature because they are unaware of it or because they prefer not to reveal the source of the video.

4.6. Cross-Platform Diffusion

We also applied a cross-platform search to investigate the diffusion of BitChute videos on mainstream social media platforms. We selected Facebook and Twitter, the most popular platforms accessible through APIs. We used the Twitter Academic search API to retrieve historical tweets containing BitChute videos, and CrowdTangle for Facebook posts, by searching with the keyword ‘Bitchute’; a sketch of the Twitter side of this search follows below. Although the number of BitChute videos shared through both platforms is large, the number of videos from the created dataset shared through these platforms is limited. Specifically, we retrieved only 285 BitChute URLs from the created dataset among the 2,057,195 tweets sharing a BitChute video URL (0.01%) and, similarly, 52 BitChute video URLs among the 28,867 Facebook posts (0.2%). We note that since these videos are already banned as misinformation-related, there might have been tweets and Facebook posts sharing them that were removed by the platforms. The authors of [81] categorized YouTube and BitChute content during U.S. election fraud discussions on Twitter. They revealed that far more YouTube than BitChute content is shared through Twitter, in line with the popularity and size of each platform. However, their investigation concluded that each platform’s content varies drastically in terms of topic.
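A minimal sketch of such a Twitter query, assuming the v2 full-archive search endpoint that Academic Research access provides; the bearer token is a placeholder and pagination is omitted for brevity.

```python
import requests

def bitchute_urls_in_tweets(bearer_token, max_results=100):
    """Fetch matching tweets and extract the BitChute URLs they share."""
    resp = requests.get(
        "https://api.twitter.com/2/tweets/search/all",
        headers={"Authorization": f"Bearer {bearer_token}"},
        params={
            "query": 'url:"bitchute.com" -is:retweet',
            "max_results": max_results,
            "tweet.fields": "created_at,entities",
        },
        timeout=30,
    )
    resp.raise_for_status()
    urls = [
        u.get("expanded_url")
        for tweet in resp.json().get("data", [])
        for u in tweet.get("entities", {}).get("urls", [])
    ]
    return [u for u in urls if u and "bitchute.com" in u]
```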

4.7. Comments

Just 670 of the 3994 BitChute videos in the dataset have at least one comment (16.8%). In total, 5919 comments were posted on these 670 videos, and for 311 of them, we retrieved only one comment. The highest number of comments collected for a single video was 403.
We analyzed the collected comments using an emotion-detection method [82], since related studies point out that misinformation can spread rapidly when a group falls into negative emotions, such as anxiety. In general, there is evidence that viewers tended to watch and like negative videos more than positive videos on YouTube [50,51]. The comments are classified into five emotion categories (happy, sad, fear, angry and surprise) with a score in the range between 0 and 1, where the higher the score, the more strongly the emotion appears in the comment text. As Figure 7 shows, the negative emotion of fear dominates the video comments. Table 4 presents examples of positive and negative emotions.
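The paper cites the classifier of [82] for this step; its five categories match those of the text2emotion package, which serves as a stand-in in the sketch below. The sample comment is made up.

```python
import text2emotion as te  # pip install text2emotion

comment = "They are lying to us and nobody is safe anymore."
scores = te.get_emotion(comment)
print(scores)  # {'Happy': ..., 'Angry': ..., 'Surprise': ..., 'Sad': ..., 'Fear': ...}

# The dominant emotion is simply the highest-scoring category.
print(max(scores, key=scores.get))  # likely 'Fear' for this comment
```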
Additionally, we applied a toxicity-detection library (https://github.com/unitaryai/detoxify) that detects five categories: toxicity, obscenity, insult, threat and identity attack. A score in the range of 0 to 1 was calculated for each category. In Figure 8, we observe that comments mainly exhibit toxicity (e.g., ‘A year later we are still putting up with this crap. Humanity is lost.’) and, to a lesser extent, insult (e.g., ‘Stupid MSM reporters cannot think for themselves, much less investigate anything.’). Obscenities, threats and attacks appear at a very low rate.
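A minimal sketch of the toxicity scoring with the Detoxify library linked above; its 'original' model returns a score in [0, 1] per category.

```python
from detoxify import Detoxify

model = Detoxify("original")
scores = model.predict(
    "Stupid MSM reporters cannot think for themselves, much less investigate anything."
)
# Keys include: toxicity, severe_toxicity, obscene, threat, insult, identity_attack.
print({k: round(float(v), 3) for k, v in scores.items()})
```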

4.8. Channels

In addition to the analysis conducted on the collected videos, we also studied the channels sharing the BitChute videos. We collected the channel names of 3636 videos and found 1708 unique channels sharing those videos. A quarter of the channels have fewer than 10 subscribers, while the 10 most popular channels have more than 74,000 subscribers. In Table 5, we list the 10 most popular channels with their number of subscribers and the number of videos from the collected dataset they shared.
We then used the ‘sherlock’ (https://github.com/sherlock-project/sherlock) Python tool to search for social media accounts by username across social networks (https://github.com/sherlock-project/sherlock/blob/master/sites.md), including mainstream platforms such as Facebook and Twitter; however, YouTube and Odysee are missing, since they are not among the supported sites. Supposing a user selects the same username across different platforms, which is common, we retrieved the platforms on which the channels sharing the BitChute videos also have accounts; a sketch of this lookup follows below. We need to consider that, although it is typical for users to use the same username on different platforms, there might be several cases in which different users appear under the same username. Such a case is the account with username ‘thecrowhouse’, an influential account on BitChute (Table 5). We manually inspected the account on both platforms (BitChute and Facebook) and concluded that the accounts were most likely created by different users who coincidentally selected the same username. We manually inspected all accounts in Table 5. Three of these channels do not appear on Facebook (or sherlock could not retrieve the Facebook accounts). For six of these channels, sherlock retrieved a Facebook account with an identical username, but on manual inspection, we discovered they are no longer available on the platform. We cannot be sure whether the users deleted their accounts on Facebook or Facebook removed them, but in any case, these accounts might be considered suspicious. The account with username ‘styxhexenhammer666’ appears on both BitChute and Dailymotion (https://www.dailymotion.com/) and shares similar content.
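Sherlock is distributed as a command-line tool, so a thin wrapper such as the sketch below can run it per channel name and collect the sites reporting a matching username. The flags come from the project's documented CLI, though option names can differ between versions.

```python
import subprocess

def find_accounts(username, timeout=600):
    """Run sherlock for one username and return the lines reporting found profiles."""
    result = subprocess.run(
        ["sherlock", "--print-found", "--no-color", username],
        capture_output=True, text=True, timeout=timeout,
    )
    # Sherlock prefixes each hit with '[+]', e.g. '[+] Quora: https://...'
    return [line for line in result.stdout.splitlines() if line.startswith("[+]")]

print(find_accounts("styxhexenhammer666")[:5])
```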
In Figure 9, we present the top 20 platforms on which the channels appear. Facebook is in fourth position, with almost half of the channels having a matching Facebook account, while Twitter appears in a lower position. Quora (https://www.quora.com/), Y Combinator (https://www.ycombinator.com/) and ICQ (https://icq.com/mobile/en) occupy the top three positions. These platforms have not been associated with misinformation; however, there is some, albeit limited, investigation of inaccurate content on such platforms, such as a recent case study [83] focusing on content posted on Quora to evaluate different mechanisms and algorithms for filtering insincere and spam content.

4.9. Comparison with MeLa Dataset

We also made use of the MeLa dataset, which the authors of [5] claim is a near-complete dataset (June 2019 to December 2021), and retrieved 9958 and 74,948 videos containing one or more of the keywords ‘covid’, ‘coronavirus’, ‘covid19’ and ‘covid-19’ in the title or the description of the video, respectively. These videos can be considered a complementary COVID-related misinformation dataset, since videos on BitChute often depict misinformation cases. Our collected dataset accounts for 7% of the COVID-related videos appearing in the MeLa dataset and, therefore, on the platform for the period covered by the dataset.

5. Odysee Analysis

In this section, we present the analysis of the videos recovered from the Odysee platform, addressing RQ2, RQ3 and RQ4.

5.1. Temporal Distribution

We created a timeline showing the posting times of the Odysee matches relative to the YouTube videos. Each line corresponds to one YouTube video and its matched Odysee duplicates, and the horizontal axis corresponds to the time between the posting of the YouTube video and that of its Odysee counterparts. Figure 10 shows the Odysee videos posted within at most a day of the posting of the original video (Odysee does not provide the exact time of posting, only the day). Due to missing metadata, we examined 162 YouTube videos and their 966 Odysee counterparts. In total, 35.4% of these Odysee videos were posted between minutes and a day after the posting of the YouTube video, and 27.8% were shared between one day and a week after the YouTube video. Similar to BitChute, more than half of the matched Odysee videos appeared on the platform at most a week after first appearing on YouTube.

5.2. Video Interactions

Odysee videos receive few interactions. We collected the number of views for each Odysee video and present them in a histogram in Figure 11. We notice that most videos have a low number of views (<200), while only a few videos approach 1000 views. While misinformation-bearing videos often gain much popularity on mainstream platforms, it seems that Odysee has only a small user community for now. Concerning the like–dislike reactions on the videos, Figure 12 shows that there are few reactions on the collected videos. Likes greatly outnumber dislikes: only 3.5% of the videos have at least one dislike, while 37% of the videos have at least one like. The number of likes per video is, in most cases, less than 10.

5.3. Video Titles and Descriptions

We applied the aforementioned language-detection model to the Odysee titles. Similar to the Oxford dataset, English dominates (≈67% of titles). The second most frequent language is German (≈12%), and Spanish is third (≈4.5%).
We searched for keywords (deleted, banned and removed) that refer to deletion to investigate whether the Odysee users informed viewers that the video had been deleted from other platforms. Of the 1810 processed descriptions, only 34 (1.9%) contained the keyword ‘deleted’, 3.1% the keyword ‘banned’ and 2.8% the keyword ‘removed’.

5.4. Cross-Platform Diffusion

Similarly to BitChute, we used the Twitter API and CrowdTangle to search for Odysee videos in tweets and Facebook posts. Concerning Twitter, we collected 868,110 tweets with the keyword Odysee. We then compared the URLs shared through these tweets with the collected Odysee videos and found that only 178 videos from the collected dataset (0.02% of the tweets) were tweeted. We need to consider here that the number of tweets may have been higher, as tweets could have been removed by the time of searching. With respect to Facebook posts, we collected a small number of posts (∼5000) containing the keyword ‘Odysee’, indicating that Facebook users do not disseminate content coming from the Odysee platform. None of the collected Odysee videos appears in these posts.

5.5. Comments

We observed that Odysee users comment very little on the videos shared on the platform. Only 12% of the collected videos have at least one comment, and in total, 644 comments were posted on the videos of our collection. We analyzed the comments in the same way as the BitChute comments, using the emotion-detection and toxicity-detection approaches. It is noteworthy that Odysee’s comments carry more positive emotions (Figure 13). Additionally, the toxicity detection (Figure 14) reveals that user feedback is not as offensive as in the case of BitChute.

5.6. Channels

The collected videos were shared by 765 unique channels. We collected the number of followers for each channel and noticed that 19 have no followers, 16% have fewer than 10 followers, and the most popular channel has 129,000 followers. Additionally, we applied ‘sherlock’ to search for the Odysee accounts on other platforms. Similarly to BitChute, we present in Figure 15 the top 20 platforms on which the users appear to have an account. Facebook and Twitter are listed in the top 20 platforms, which shows that users on Odysee also maintain accounts on mainstream social media platforms.

6. Discussion

This paper provides evidence of the diffusion of COVID-related disinformation videos from YouTube, a mainstream video platform, to BitChute and Odysee, which are low content-moderation platforms. Starting from a dataset of 7220 videos that were banned from YouTube after being flagged as spreading COVID-19 misinformation, we collected a dataset of 4096 BitChute and 1810 Odysee videos.
Our analysis indicates that COVID-related misinformation videos on YouTube are shared on low content-moderation platforms soon after their original posting. The analysis showed that, for both BitChute and Odysee, most videos were shared at most a week after their upload on YouTube. One might hypothesize that users are aware that their content will not have a long lifespan on the mainstream platforms and choose to also post it on platforms with low content moderation. This finding is also in line with another study, which shows that the number of shares of the “Plandemic” video on BitChute reached a peak just the day after the video was banned from YouTube [57].
As for the popularity of the migrated videos on BitChute and Odysee, the number of views on both platforms is relatively low. The number of views and other viewer-interaction metrics are also used in other studies of misleading health-information videos on online video platforms such as YouTube [37]. However, the extremely low number of views of the videos in our study does not permit the study of more complex correlations as in [78]. Furthermore, we analyzed the numbers of likes and dislikes, which are also relatively low on both platforms. Yet, it is worth mentioning that videos on BitChute featuring the keywords “banned” or “deleted” in their description tend to gather more likes, which is in agreement with previous work on YouTube that found a tendency for negative-tone videos to be liked more by viewers than videos with a positive or ambiguous tone [50].
Finally, emotion detection has been investigated in relation to the spread of misinformation. The authors of [84] proposed an emotion-based misinformation detection framework to learn content- and comment-emotion representations. They found that the proportion of anger in fake news content is 9% higher than in real news, while the percentage of happiness is 8% lower. In our analysis, we detected more negative emotions on BitChute, especially fear, compared to Odysee, where we found more positive emotions.

7. Conclusions

Given the increasing efforts by mainstream platforms to moderate misinformation, malicious actors are looking for alternative low content-moderation platforms on which to disseminate misleading content. In this work, we retrieved COVID-related misinformation videos that were shared on and removed from YouTube and subsequently appeared on BitChute and Odysee. We created and analyzed a dataset of 5906 COVID-related misinformation videos. There is limited investigation of BitChute and Odysee in the literature, and this work provides initial findings on the characteristics of misinformation dissemination through these platforms. Misinformation shared through low content-moderation platforms seems to follow the misinformation practices also used on mainstream social media platforms. Our findings reveal that these platforms receive misleading videos and provide a space that allows them to remain online. Although our analysis has shown little user engagement on these platforms (at least in comparison with mainstream platforms), the large amount of misleading content hosted on them could pose a risk in the future. Future research should focus on broader categories of misinformation, beyond the coronavirus, and on the dissemination of misinformation from platforms other than YouTube, such as Facebook and Twitter. In addition, future work should investigate the characteristics of disinformation spread through these platforms to derive features for developing automated methods to detect misleading videos. Finally, future research could focus on the analysis of the channels that publish misleading videos.

Author Contributions

Conceptualization, O.P. and S.P.; Data curation, O.P.; Formal analysis, O.P. and S.P.; Funding acquisition, S.P.; Investigation, O.P. and E.K.; Methodology, O.P. and S.P.; Project administration, S.P.; Software, O.P.; Supervision, S.P.; Validation, O.P., E.K. and S.P.; Visualization, O.P.; Writing—original draft, O.P., E.K. and S.P.; Writing—review and editing, O.P., E.K. and S.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the MediaVerse project, which is funded by the European Commission under contract number 957252, and has also received funding from the European Union under the Horizon Europe vera.ai project, Grant Agreement number 101070093.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data will be available upon request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Fair, G.; Wesslen, R. Shouting into the void: A database of the alternative social media platform gab. In Proceedings of the International AAAI Conference on Web and Social Media, Munich, Germany, 11–14 June 2019; Volume 13, pp. 608–610. [Google Scholar]
  2. Hanley, H.W.; Kumar, D.; Durumeric, Z. No Calm in The Storm: Investigating QAnon Website Relationships. In Proceedings of the International AAAI Conference on Web and Social Media, Atlanta, GA, USA, 6–9 June 2022; Volume 16, pp. 299–310. [Google Scholar]
  3. Papasavva, A.; Blackburn, J.; Stringhini, G.; Zannettou, S.; Cristofaro, E.D. “Is it a qoincidence?”: An exploratory study of QAnon on Voat. In Proceedings of the Web Conference 2021, Virtual Conference, 19–23 April 2021; pp. 460–471. [Google Scholar]
  4. Trujillo, M.; Gruppi, M.; Buntain, C.; Horne, B.D. What is BitChute? Characterizing the “Free Speech” Alternative of YouTube. In Proceedings of the 31st ACM Conference on Hypertext and Social Media, Online, 13–15 July 2020; pp. 139–140. [Google Scholar]
  5. Trujillo, M.Z.; Gruppi, M.; Buntain, C.; Horne, B.D. The MeLa BitChute Dataset. In Proceedings of the International AAAI Conference on Web and Social Media, Atlanta, GA, USA, 6–9 June 2022; Volume 16, pp. 1342–1351. [Google Scholar]
  6. Andrade, G. Medical conspiracy theories: Cognitive science and implications for ethics. Med. Health Care Philos. 2020, 23, 505–518. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  7. Douglas, K.M.; Uscinski, J.E.; Sutton, R.M.; Cichocka, A.; Nefes, T.; Ang, C.S.; Deravi, F. Understanding conspiracy theories. Political Psychol. 2019, 40, 3–35. [Google Scholar] [CrossRef] [Green Version]
  8. Allington, D.; Duffy, B.; Wessely, S.; Dhavan, N.; Rubin, J. Health-protective behaviour, social media usage and conspiracy belief during the COVID-19 public health emergency. Psychol. Med. 2021, 51, 1763–1769. [Google Scholar] [CrossRef] [PubMed]
  9. Oliver, J.E.; Wood, T. Medical conspiracy theories and health behaviors in the United States. JAMA Intern. Med. 2014, 174, 817–818. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  10. Dunn, A.G.; Surian, D.; Leask, J.; Dey, A.; Mandl, K.D.; Coiera, E. Mapping information exposure on social media to explain differences in HPV vaccine coverage in the United States. Vaccine 2017, 35, 3033–3040. [Google Scholar] [CrossRef]
  11. Jolley, D.; Douglas, K.M. The effects of anti-vaccine conspiracy theories on vaccination intentions. PLoS ONE 2014, 9, e89177. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  12. Zimmerman, R.K.; Wolfe, R.M.; Fox, D.E.; Fox, J.R.; Nowalk, M.P.; Troy, J.A.; Sharp, L.K. Vaccine criticism on the world wide web. J. Med. Internet Res. 2005, 7, e369. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  13. Andrade, G.E.; Hussain, A. Polio in Pakistan: Political, sociological, and epidemiological factors. Cureus 2018, 10, e3502. [Google Scholar] [CrossRef] [Green Version]
  14. Karafillakis, E.; Larson, H.J. The benefit of the doubt or doubts over benefits? A systematic literature review of perceived risks of vaccines in European populations. Vaccine 2017, 35, 4840–4850. [Google Scholar] [CrossRef]
  15. Grebe, E.; Nattrass, N. AIDS conspiracy beliefs and unsafe sex in Cape Town. AIDS Behav. 2012, 16, 761–773. [Google Scholar] [CrossRef]
  16. Thorburn, S.; Bogart, L.M. Conspiracy beliefs about birth control: Barriers to pregnancy prevention among African Americans of reproductive age. Health Educ. Behav. 2005, 32, 474–487. [Google Scholar] [CrossRef] [PubMed]
  17. Jolley, D.; Douglas, K.M. The social consequences of conspiracism: Exposure to conspiracy theories decreases intentions to engage in politics and to reduce one’s carbon footprint. Br. J. Psychol. 2014, 105, 35–56. [Google Scholar] [CrossRef] [PubMed]
  18. Wood, M.J.; Douglas, K.M.; Sutton, R.M. Dead and alive: Beliefs in contradictory conspiracy theories. Soc. Psychol. Personal. Sci. 2012, 3, 767–773. [Google Scholar] [CrossRef] [Green Version]
  19. Earnshaw, V.A.; Bogart, L.M.; Klompas, M.; Katz, I.T. Medical mistrust in the context of Ebola: Implications for intended care-seeking and quarantine policy support in the United States. J. Health Psychol. 2019, 24, 219–228. [Google Scholar] [CrossRef] [PubMed]
  20. Goertzel, T. Conspiracy theories in science: Conspiracy theories that target specific research can have serious consequences for public health and environmental policies. EMBO Rep. 2010, 11, 493–499. [Google Scholar] [CrossRef] [Green Version]
  21. Alathur, S.; Chetty, N.; Pai, R.R.; Kumar, V.; Dhelim, S. Hate and False Metaphors: Implications to Emerging E-Participation Environment. Future Internet 2022, 14, 314. [Google Scholar] [CrossRef]
  22. Colomina, C.; Margalef, H.S.; Youngs, R. The Impact of Disinformation on Democratic Processes and Human Rights in the World; Directorate-General for External Policies, European Parliament: Strasbourg, France, 2021; Volume 29, p. 2021. [Google Scholar]
  23. Allington, D.; Dhavan, N. The Relationship between Conspiracy Beliefs and Compliance with Public Health Guidance with Regard to COVID-19; King’s College: London, UK, 2020. [Google Scholar]
  24. Freeman, D.; Waite, F.; Rosebrock, L.; Petit, A.; Causier, C.; East, A.; Jenner, L.; Teale, A.L.; Carr, L.; Mulhall, S.; et al. Coronavirus conspiracy beliefs, mistrust, and compliance with government guidelines in England. Psychol. Med. 2022, 52, 251–263. [Google Scholar] [CrossRef]
  25. Imhoff, R.; Lamberty, P. A bioweapon or a hoax? The link between distinct conspiracy beliefs about the coronavirus disease (COVID-19) outbreak and pandemic behavior. Soc. Psychol. Personal. Sci. 2020, 11, 1110–1118. [Google Scholar] [CrossRef]
  26. Hornik, R.; Kikut, A.; Jesch, E.; Woko, C.; Siegel, L.; Kim, K. Association of COVID-19 misinformation with face mask wearing and social distancing in a nationally representative US sample. Health Commun. 2021, 36, 6–14. [Google Scholar] [CrossRef]
  27. Bierwiaczonek, K.; Kunst, J.R.; Pich, O. Belief in COVID-19 conspiracy theories reduces social distancing over time. Appl. Psychol. Health Well-Being 2020, 12, 1270–1285. [Google Scholar] [CrossRef]
  28. Earnshaw, V.A.; Eaton, L.A.; Kalichman, S.C.; Brousseau, N.M.; Hill, E.C.; Fox, A.B. COVID-19 conspiracy beliefs, health behaviors, and policy support. Transl. Behav. Med. 2020, 10, 850–856. [Google Scholar] [CrossRef] [PubMed]
  29. Pavela Banai, I.; Banai, B.; Mikloušić, I. Beliefs in COVID-19 conspiracy theories, compliance with the preventive measures, and trust in government medical officials. Curr. Psychol. 2021, 41, 7448–7458. [Google Scholar] [CrossRef] [PubMed]
  30. Rosário, R.; Martins, M.R.; Augusto, C.; Silva, M.J.; Martins, S.; Duarte, A.; Fronteira, I.; Ramos, N.; Okan, O.; Dadaczynski, K. Associations between covid-19-related digital health literacy and online information-seeking behavior among portuguese university students. Int. J. Environ. Res. Public Health 2020, 17, 8987. [Google Scholar] [CrossRef] [PubMed]
  31. Ferrara, E.; Varol, O.; Davis, C.; Menczer, F.; Flammini, A. The rise of social bots. Commun. ACM 2016, 59, 96–104. [Google Scholar] [CrossRef] [Green Version]
  32. Bessi, A.; Coletto, M.; Davidescu, G.A.; Scala, A.; Caldarelli, G.; Quattrociocchi, W. Science vs conspiracy: Collective narratives in the age of misinformation. PLoS ONE 2015, 10, e0118093. [Google Scholar] [CrossRef] [Green Version]
  33. Del Vicario, M.; Bessi, A.; Zollo, F.; Petroni, F.; Scala, A.; Caldarelli, G.; Stanley, H.E.; Quattrociocchi, W. The spreading of misinformation online. Proc. Natl. Acad. Sci. USA 2016, 113, 554–559. [Google Scholar] [CrossRef] [Green Version]
  34. Lee, J.J.; Kang, K.A.; Wang, M.P.; Zhao, S.Z.; Wong, J.Y.H.; O’Connor, S.; Yang, S.C.; Shin, S. Associations between COVID-19 misinformation exposure and belief with COVID-19 knowledge and preventive behaviors: Cross-sectional online study. J. Med. Internet Res. 2020, 22, e22205. [Google Scholar] [CrossRef]
  35. Kouzy, R.; Abi Jaoude, J.; Kraitem, A.; El Alam, M.B.; Karam, B.; Adib, E.; Zarka, J.; Traboulsi, C.; Akl, E.W.; Baddour, K. Coronavirus goes viral: Quantifying the COVID-19 misinformation epidemic on Twitter. Cureus 2020, 12, e7255. [Google Scholar] [CrossRef] [Green Version]
  36. Bokovnya, A.Y.; Khisamova, Z.I.; Begishev, I.R.; Sidorenko, E.L.; Ilyashenko, A.N.; Morozov, A.Y. Global Analysis of Accountability for Fake News Spread About the Covid-19 Pandemic in Social Media. Appl. Linguist. Res. J. 2020, 4, 91–95. [Google Scholar]
  37. Bora, K.; Das, D.; Barman, B.; Borah, P. Are internet videos useful sources of information during global public health emergencies? A case study of YouTube videos during the 2015–16 Zika virus pandemic. Pathog. Glob. Health 2018, 112, 320–328. [Google Scholar] [CrossRef]
  38. Li, H.O.Y.; Bailey, A.; Huynh, D.; Chan, J. YouTube as a source of information on COVID-19: A pandemic of misinformation? BMJ Glob. Health 2020, 5, e002604. [Google Scholar] [CrossRef]
  39. Pandey, A.; Patni, N.; Singh, M.; Sood, A.; Singh, G. YouTube as a source of information on the H1N1 influenza pandemic. Am. J. Prev. Med. 2010, 38, e1–e3. [Google Scholar] [CrossRef] [PubMed]
  40. Pathak, R.; Poudel, D.R.; Karmacharya, P.; Pathak, A.; Aryal, M.R.; Mahmood, M.; Donato, A.A. YouTube as a source of information on Ebola virus disease. N. Am. J. Med. Sci. 2015, 7, 306. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  41. Buchanan, R.; Beckett, R.D. Assessment of vaccination-related information for consumers available on Facebook®. Health Inf. Libr. J. 2014, 31, 227–234. [Google Scholar] [CrossRef] [PubMed]
  42. Seymour, B.; Getman, R.; Saraf, A.; Zhang, L.H.; Kalenderian, E. When advocacy obscures accuracy online: Digital pandemics of public health misinformation through an antifluoride case study. Am. J. Public Health 2015, 105, 517–523. [Google Scholar] [CrossRef] [PubMed]
  43. Sharma, M.; Yadav, K.; Yadav, N.; Ferdinand, K.C. Zika virus pandemic—Analysis of Facebook as a social media health information platform. Am. J. Infect. Control 2017, 45, 301–302. [Google Scholar] [CrossRef] [PubMed]
  44. Kim, K.; Paek, H.J.; Lynn, J. A content analysis of smoking fetish videos on YouTube: Regulatory implications for tobacco control. Health Commun. 2010, 25, 97–106. [Google Scholar] [CrossRef]
  45. Paek, H.J.; Kim, K.; Hove, T. Content analysis of antismoking videos on YouTube: Message sensation value, message appeals, and their relationships with viewer responses. Health Educ. Res. 2010, 25, 1085–1099. [Google Scholar] [CrossRef] [Green Version]
  46. Yoo, J.H.; Kim, J. Obesity in the new media: A content analysis of obesity videos on YouTube. Health Commun. 2012, 27, 86–97. [Google Scholar] [CrossRef]
  47. Ache, K.A.; Wallace, L.S. Human papillomavirus vaccination coverage on YouTube. Am. J. Prev. Med. 2008, 35, 389–392. [Google Scholar] [CrossRef]
  48. Basch, C.; Zybert, P.; Reeves, R.; Basch, C. What do popular YouTubeTM videos say about vaccines? Child Care Health Dev. 2017, 43, 499–503. [Google Scholar] [CrossRef] [PubMed]
  49. Donzelli, G.; Palomba, G.; Federigi, I.; Aquino, F.; Cioni, L.; Verani, M.; Carducci, A.; Lopalco, P. Misinformation on vaccination: A quantitative analysis of YouTube videos. Hum. Vaccines Immunother. 2018, 14, 1654–1659. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  50. Briones, R.; Nan, X.; Madden, K.; Waks, L. When vaccines go viral: An analysis of HPV vaccine coverage on YouTube. Health Commun. 2012, 27, 478–485. [Google Scholar] [CrossRef] [PubMed]
  51. Keelan, J.; Pavri-Garcia, V.; Tomlinson, G.; Wilson, K. YouTube as a source of information on immunization: A content analysis. JAMA 2007, 298, 2482–2484. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  52. Broniatowski, D.A.; Jamison, A.M.; Qi, S.; AlKulaib, L.; Chen, T.; Benton, A.; Quinn, S.C.; Dredze, M. Weaponized health communication: Twitter bots and Russian trolls amplify the vaccine debate. Am. J. Public Health 2018, 108, 1378–1384. [Google Scholar] [CrossRef] [PubMed]
  53. Ortiz-Martínez, Y.; Jiménez-Arcia, L.F. Yellow fever outbreaks and Twitter: Rumors and misinformation. Am. J. Infect. Control. 2017, 45, 816–817. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  54. Oyeyemi, S.O.; Gabarron, E.; Wynn, R. Ebola, Twitter, and misinformation: A dangerous combination? BMJ 2014, 349, g6178. [Google Scholar] [CrossRef] [Green Version]
  55. Ribeiro, M.H.; Ottoni, R.; West, R.; Almeida, V.A.; Meira Jr, W. Auditing radicalization pathways on YouTube. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency, Barcelona, Spain, 27–30 January 2020; pp. 131–141. [Google Scholar]
  56. Faddoul, M.; Chaslot, G.; Farid, H. A longitudinal analysis of YouTube’s promotion of conspiracy videos. arXiv 2020, arXiv:2003.03318. [Google Scholar]
  57. Buntain, C.; Bonneau, R.; Nagler, J.; Tucker, J.A. YouTube recommendations and effects on sharing across online social platforms. In Proceedings of the ACM on Human-Computer Interaction, Málaga, Spain, 22–24 September 2021; Volume 5. [Google Scholar]
  58. Wilson, T.; Starbird, K. Cross-platform Information Operations: Mobilizing Narratives & Building Resilience through both’Big’&’Alt’Tech. In Proceedings of the ACM on Human-Computer Interaction, Málaga, Spain, 22–24 September 2021; Volume 5. [Google Scholar]
  59. Jasser, G.; McSwiney, J.; Pertwee, E.; Zannettou, S. ‘Welcome to# GabFam’: Far-right virtual community on Gab. New Media Soc. 2021, 22, 700–715. [Google Scholar]
  60. Lima, L.; Reis, J.C.; Melo, P.; Murai, F.; Araujo, L.; Vikatos, P.; Benevenuto, F. Inside the right-leaning echo chambers: Characterizing gab, an unmoderated social system. In Proceedings of the 2018 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM), Barcelona, Spain, 28–31 August 2018; pp. 515–522. [Google Scholar]
  61. Zannettou, S.; Bradlyn, B.; De Cristofaro, E.; Kwak, H.; Sirivianos, M.; Stringini, G.; Blackburn, J. What is gab: A bastion of free speech or an alt-right echo chamber. In Proceedings of the Companion Proceedings of the The Web Conference 2018, Lyon, France, 23–27 April 2018; pp. 1007–1014.
  62. Zhou, Y.; Dredze, M.; Broniatowski, D.A.; Adler, W.D. Elites and foreign actors among the alt-right: The Gab social media platform. First Monday 2019, 24. [Google Scholar] [CrossRef]
  63. Donovan, J.; Lewis, B.; Friedberg, B. Parallel ports: Sociotechnical change from the alt-right to alt-tech. In Post-Digital Cultures of the Far Right: Online Actions and Offline Consequences in Europe and the US; Fielitz, M., Thurston, N., Eds.; Transcript Publishing: Bielefeld, Germany, 2019. [Google Scholar]
  64. Rye, E.; Blackburn, J.; Beverly, R. Reading In-Between the Lines: An Analysis of Dissenter. In Proceedings of the ACM Internet Measurement Conference, Pittsburgh, PA, USA, 27–29 October 2020; pp. 133–146. [Google Scholar]
  65. Colley, T.; Moore, M. The Challenges of Studying 4chan and the Alt-Right:‘Come on in the Water’s Fine’. New Media Soc. 2022, 24, 5–30. [Google Scholar] [CrossRef]
  66. Hine, G.E.; Onaolapo, J.; De Cristofaro, E.; Kourtellis, N.; Leontiadis, I.; Samaras, R.; Stringhini, G.; Blackburn, J. Kek, cucks, and god emperor trump: A measurement study of 4chan’s politically incorrect forum and its effects on the web. In Proceedings of the Eleventh International AAAI Conference on Web and Social Media, Montreal, QC, Canada, 15–18 May 2017. [Google Scholar]
  67. Mittos, A.; Zannettou, S.; Blackburn, J.; De Cristofaro, E. “And we will fight for our race!” A measurement study of genetic testing conversations on Reddit and 4chan. In Proceedings of the International AAAI Conference on Web and Social Media, Atlanta GA, USA, 8–11 June 2020; Volume 14, pp. 452–463. [Google Scholar]
  68. Zannettou, S.; Caulfield, T.; Blackburn, J.; De Cristofaro, E.; Sirivianos, M.; Stringhini, G.; Suarez-Tangil, G. On the origins of memes by means of fringe web communities. In Proceedings of the Internet Measurement Conference 2018, Boston, MA, USA, 31 October–2 November 2018; pp. 188–202. [Google Scholar]
  69. Pieroni, E.; Jachim, P.; Jachim, N.; Sharevski, F. Parlermonium: A data-driven UX design evaluation of the Parler platform. arXiv 2021, arXiv:2106.00163. [Google Scholar]
  70. Aliapoulios, M.; Bevensee, E.; Blackburn, J.; Bradlyn, B.; De Cristofaro, E.; Stringhini, G.; Zannettou, S. A Large Open Dataset from the Parler Social Network. In Proceedings of the ICWSM, Virtually, 7–10 June 2021; pp. 943–951. [Google Scholar]
  71. Sharevski, F.; Jachim, P.; Pieroni, E.; Devine, A. “Gettr-ing” Deep Insights from the Social Network Gettr. arXiv 2022, arXiv:2204.04066. [Google Scholar]
  72. Paudel, P.; Blackburn, J.; De Cristofaro, E.; Zannettou, S.; Stringhini, G. An Early Look at the Gettr Social Network. arXiv 2021, arXiv:2108.05876. [Google Scholar]
  73. van Doesburg, J. An Alternative Rabbit Hole? An Analysis of the Construction of Echo Chambers within Coronavirus Activism Groups op Telegram. Master’s Thesis, Utrecht University, Utrecht, The Netherlands, 2021. [Google Scholar]
  74. Rogers, R. Deplatforming: Following extreme Internet celebrities to Telegram and alternative social media. Eur. J. Commun. 2020, 35, 213–229. [Google Scholar] [CrossRef]
  75. Rauchfleisch, A.; Kaiser, J. Deplatforming the Far-Right: An Analysis of YouTube and BitChute; SSRN: Rochester, NY, USA, 2021. [Google Scholar]
  76. Bond, S. Kicked off Facebook and Twitter, Far-Right Groups Lose Online Clout. 2022. Available online: https://www.npr.org/2022/01/06/1070763913/kicked-off-facebook-and-twitter-far-right-groupslose-online-clout (accessed on 24 October 2022).
  77. Kearney, M.D.; Chiang, S.C.; Massey, P.M. The Twitter Origins and Evolution of the COVID-19 “Plandemic” Conspiracy Theory; Harvard Kennedy School Misinformation Review: Cambridge, MA, USA, 2020; Volume 1. [Google Scholar]
  78. Knuutila, A.; Herasimenka, A.; Au, H.; Bright, J.; Nielsen, R.; Howard, P.N. Covid-related misinformation on YouTube: The spread of misinformation videos on social media and the effectiveness of platform policies. COMPROP Data Memo 2020, 6, 1–7. [Google Scholar]
  79. Kordopatis-Zilos, G.; Tzelepis, C.; Papadopoulos, S.; Kompatsiaris, I.; Patras, I. DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval. arXiv 2021, arXiv:2106.13266. [Google Scholar] [CrossRef]
  80. Naeem, B.; Khan, A.; Beg, M.O.; Mujtaba, H. A deep learning framework for clickbait detection on social area network using natural language cues. J. Comput. Soc. Sci. 2020, 3, 231–243. [Google Scholar] [CrossRef]
  81. Childs, M.; Buntain, C.; Trujillo, M.Z.; Horne, B.D. Characterizing Youtube and Bitchute content and mobilizers during us election fraud discussions on twitter. In Proceedings of the 14th ACM Web Science Conference 2022, Barcelona, Spain, 26–29 June 2022; pp. 250–259. [Google Scholar]
  82. Luo, H.; Cai, M.; Cui, Y. Spread of misinformation in social networks: Analysis based on Weibo tweets. Secur. Commun. Netw. 2021, 2021, 7999760. [Google Scholar] [CrossRef]
  83. Al-Ramahi, M.A.; Alsmadi, I. Using Data Analytics to Filter Insincere Posts from Online Social Networks. A case study: Quora Insincere Questions. In Proceedings of the 53rd Hawaii International Conference on System Sciences, Maui, HI, USA, 7–10 January 2020. [Google Scholar]
  84. Guo, C.; Cao, J.; Zhang, X.; Shu, K.; Yu, M. Exploiting emotions for fake news detection on social media. arXiv 2019, arXiv:1903.01728. [Google Scholar]
Figure 1. (a) Screenshot of the Plandemic misinformation video entitled ‘VERIFY: “The Plandemic” documentary being shared on social media is full of misinformation’; the video still exists on the platform with a verification tag (accessed in June 2022). (b) BitChute search results for the keyword ‘COVID 19’ (accessed in June 2022).
Figure 2. Google Trends analysis of the ‘BitChute’ and ‘Odysee’ search terms.
Figure 3. Semi-automatic filtering method diagram.
Figure 4. Temporal distribution of BitChute videos.
Figure 5. Histogram of likes and dislikes for the top 20 disliked BitChute videos.
Figure 6. Categorization of the BitChute videos.
Figure 7. Histogram of emotions detected in BitChute video comments.
Figure 8. Histogram of toxicity detected in BitChute video comments.
Figure 9. Top 20 social networks with BitChute accounts found by username search. The x-axis shows the social networks and the y-axis the number of accounts found in each network.
Figure 10. Temporal distribution of Odysee videos.
Figure 11. Histogram of views on Odysee videos.
Figure 12. Histogram of like and dislike reactions on Odysee videos.
Figure 13. Histogram of emotions detected in Odysee video comments.
Figure 14. Histogram of toxicity detected in Odysee video comments.
Figure 15. Top 20 social networks with Odysee accounts found by username search. The x-axis shows the social networks and the y-axis the number of accounts found in each network.
Table 1. Duplicate videos.
YT Video Title
EXCLUSIVE Dr Rashid Buttar BLASTS Gates, Fauci, EXPOSES Fake Pandemic Numbers As Economy Collapses
Dr. Rashid Buttar BLASTS Gates, Fauci, EXPOSES Fake Pandemic Numbers As Economy Collapses
EXCLUSIVE# Dr Rashid Buttar BLASTS Gates, Fauci, EXPOSES Fake Pandemic Numbers As Economy Collapses
Table 2. Top 5 languages detected on the titles of the Oxford COVID-related misinformation dataset.

Language | #Videos | Percentage
EN | 3162 | 43.8%
ES | 840 | 11.6%
DE | 709 | 9.8%
PT | 480 | 6.5%
FR | 311 | 4.3%
Table 3. Number of BitChute and Odysee videos retrieved by each filtering process and the number of YT videos they correspond to.

         | Textual         | Visual          | Manual          | Total/Unique | Total
         | #dupl  | #yt    | #dupl  | #yt    | #dupl  | #yt    | #dupl        | #yt
BitChute | 3860   | 893    | 4256   | 372    | 1293   | 182    | 9409/4096    | 1048
Odysee   | 2088   | 487    | 487    | 119    | 968    | 222    | 3543/1810    | 591
Table 4. Examples of positive and negative comments posted on BitChute videos.

Positive | Negative
Cheers for uploading this critically important information | What’s it like being an idiot? Do people laugh at you or do they just walk away?
Great video i wish many more see this. | You sound like a hateful Nazi demon Troll!
I love that man | Gary…Shut up!!!
Excellent documentary. | why is it that evil people like Faucci get away with murder?
Glad I found this, and you, here on BitChute | Age of deceit!
Table 5. Top 10 popular BitChute channels with their number of subscribers and the number of videos they shared from the collected dataset.

Channel | Subscribers | #Videos
banned-dot-video | 155,242 | 18
styxhexenhammer666 | 143,602 | 1
sgt-report | 125,163 | 2
fallcabal | 114,087 | 7
amazingpolly | 113,036 | 1
computingforever | 112,613 | 1
davidicke | 103,933 | 7
thecrowhouse | 89,838 | 13
timpool | 84,212 | 1
free-your-mind | 74,222 | 11
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
