•
YouTube’s policy of demonetizing harmful content and other changes to its recommender algorithm may have lowered the visibility of anti-vaccine videos.
•
Anti-vaccine videos are less likely to lead users to pro-vaccine videos because of a homophily effect observed in the recommendation network.
•
Public health agencies ought to collaborate with social media platforms to audit AI-driven recommendations and address biases built into recommendation models.
Abstract
Objective
This research examines how YouTube recommends vaccination-related videos.
Materials and methods
We used social network analysis to evaluate how YouTube recommends vaccination-related videos to its users.
Results
Pro-vaccine videos (64.75%) outnumbered anti-vaccine videos (19.98%) on YouTube, with the remaining 15.27% of videos neutral in sentiment. YouTube was more likely to recommend neutral and pro-vaccine videos than anti-vaccine videos. We also observed a homophily effect: pro-vaccine videos were more likely to recommend other pro-vaccine videos than anti-vaccine ones, and vice versa.
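The homophily effect reported above can be quantified on a recommendation network by comparing the observed share of same-sentiment recommendation edges with the share expected under random mixing. The sketch below is purely illustrative: the function name, the toy edge list, and the sentiment labels are hypothetical and are not drawn from the study’s dataset or code.

```python
# Illustrative sketch (not the study's actual method or data): measure
# homophily in a directed recommendation network whose nodes carry a
# sentiment label (pro / anti / neutral).
from collections import Counter

def homophily_index(edges, sentiment):
    """Observed fraction of same-sentiment recommendation edges minus
    the fraction expected if recommendation targets were chosen at
    random. Positive values indicate homophily."""
    same = sum(1 for u, v in edges if sentiment[u] == sentiment[v])
    observed = same / len(edges)
    # Expected same-sentiment share under random mixing: sum of squared
    # label proportions.
    counts = Counter(sentiment.values())
    n = sum(counts.values())
    expected = sum((c / n) ** 2 for c in counts.values())
    return observed - expected

# Toy example with hypothetical videos a–e and their sentiment labels.
sentiment = {"a": "pro", "b": "pro", "c": "anti", "d": "anti", "e": "neutral"}
# Each pair (u, v) means "video u recommends video v".
edges = [("a", "b"), ("b", "a"), ("c", "d"), ("d", "c"), ("a", "e")]
print(homophily_index(edges, sentiment))  # positive value → homophily
```

In this toy network, four of the five recommendation edges connect same-sentiment videos, well above the random-mixing baseline, so the index is positive.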
Discussion
Compared to our prior study, the number of recommendations for pro-vaccine videos has significantly increased, suggesting that YouTube’s demonetization policy for harmful content and other changes to its recommender algorithm might have been effective in reducing the visibility of anti-vaccine videos. However, there are concerns that anti-vaccine videos are less likely to lead users to pro-vaccine videos because of the homophily effect observed in the recommendation network.
Conclusion
The study demonstrates the influence of YouTube’s recommender systems on the types of vaccine information users discover on YouTube. We conclude with a general discussion of the importance of algorithmic transparency in how social media platforms like YouTube decide what content to feature and recommend to their users.