

Image by geralt on Pixabay

During the Covid-19 pandemic, YouTube introduced new policies and guidelines aimed at limiting the spread of medical misinformation on the platform, but the comments feature remains relatively unmoderated and has low barriers to entry for posting publicly. We studied a dataset of 38,564 YouTube comments, drawn from three Covid-19-related videos posted by the news media organisations Fox News, Vox, and China Global Television Network. Each video featured Bill Gates and, at the time of data extraction, had between 13,000–14,500 comments posted between April 5, 2020, and March 2, 2021. Through topic modelling and qualitative content analysis, we found the comments on each video to be heavily dominated by conspiratorial statements, covering topics such as Bill Gates's hidden agenda, his role in vaccine development and distribution, his body language, his connection to Jeffrey Epstein, 5G network harms, and human microchipping. Results suggest that during the Covid-19 pandemic, YouTube's comments feature may have played an underrated role in participatory cultures of conspiracy theory knowledge production and circulation. The platform should consider design and policy changes that respond to the discursive strategies used by conspiracy theorists, to prevent similar outcomes for future high-stakes public interest matters.

YouTube has been a popular source of information among diverse populations throughout the Covid-19 pandemic (Khatri et al., 2020). In 2020, in response to criticisms that it was amplifying misinformation about the virus (Bruns et al., 2020; Shahsavari et al., 2020), the platform introduced a range of policies and design changes aimed at limiting the spread of Covid-19 medical misinformation. These included a system for amplifying authoritative content in automated video recommendations (Matamoros-Fernandez et al., 2021) and amendments to the Community Guidelines to prohibit content "about COVID-19 that poses a serious risk of egregious harm" (YouTube Help, 2022a). The revised guidelines specify that Covid-19 medical misinformation includes any content that contradicts health authorities' guidance on Covid-19 treatments, prevention, diagnosis, physical distancing, and the existence of Covid-19.

YouTube states that its Covid-19 medical misinformation rules apply to all content posted to the platform, including video comments. Claims about vaccines that "contradict expert consensus," including claims that vaccines cause death or infertility, or contain devices used to track or identify individuals, as well as false claims about the effectiveness of vaccines, are all explicitly in violation of YouTube's Covid-19 medical misinformation guidelines (YouTube Help, 2022a). In our study, however, we found comments posted on three news videos to be dominated by conspiratorial statements about Covid-19 vaccines in violation of these rules. A significant portion of comments appeared in obvious violation of YouTube's rules, for example comments proposing that vaccines are used for mass sterilisation or to insert microchips into recipients. Other comments could be considered "borderline content," which YouTube defines as content that "brushes up against" but does not cross the lines set by its rules (The YouTube Team, 2019).

Research Questions

RQ1a: What are the dominant conspiratorial themes discussed among YouTube commenters on news videos about Bill Gates and Covid-19?
RQ1b: Which conspiratorial topics on these news videos attract the most user engagement?
RQ2: What discursive strategies are YouTube commenters using to formulate and share conspiracy theories about Bill Gates and Covid-19?
