Works Cited
Chaslot, G. (2017, March 31). How YouTube’s A.I. boosts alternative facts. Medium. Retrieved from https://medium.com/@guillaumechaslot/how-youtubes-a-i-boosts-alternative-facts-3cc276f47cf7
A former YouTube engineer who worked on the recommendation algorithm argues that it tends to promote conspiracy theories. He presents self-gathered data showing that YouTube’s recommendation system is much more likely to present false claims to users than a YouTube search or a Google search for the same topics. He hypothesizes that this is because the recommendation algorithm is designed to maximize watch time, whereas the search algorithms favor relevance. He calls for YouTube to publish more information about how its algorithm works and for the development of better tools to measure its behavior.
Covington, P., Adams, J., & Sargin, E. (2016). Deep Neural Networks for YouTube Recommendations. Proceedings of the 10th ACM Conference on Recommender Systems – RecSys 16. doi:10.1145/2959100.2959190
In this paper, Google engineers describe the technical design of the YouTube recommendation system. They state its goal: to recommend videos that are fresh, up-to-date, and tailored to each user’s interests. The authors then delineate the architecture of the system’s three major parts: data collection, the candidate generation subsystem, and the ranking subsystem. For each part, they explain in detail how deep learning models learn users’ interests and generate corresponding recommendations.
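The two-stage architecture described above can be illustrated with a minimal sketch: a cheap candidate-generation pass narrows a large corpus to a few plausible videos, and a costlier ranking pass orders the survivors. All data, field names, and scoring rules here are hypothetical stand-ins, not the paper’s actual neural models.

```python
# Minimal sketch of a two-stage recommender. Stage 1 (candidate generation)
# keeps videos whose topics overlap most with the user's watch history;
# stage 2 (ranking) orders those candidates by a proxy for expected watch
# time. All data and scoring functions are hypothetical.

def generate_candidates(user_history, corpus, k=3):
    """Stage 1: keep the k videos sharing the most topics with the user."""
    watched_topics = {t for v in user_history for t in v["topics"]}
    overlap = lambda v: len(watched_topics & set(v["topics"]))
    return sorted(corpus, key=overlap, reverse=True)[:k]

def rank(candidates):
    """Stage 2: order candidates by average watch time (a stand-in signal)."""
    return sorted(candidates, key=lambda v: v["avg_watch_seconds"], reverse=True)

corpus = [
    {"id": "a", "topics": ["cooking"], "avg_watch_seconds": 120},
    {"id": "b", "topics": ["cooking", "travel"], "avg_watch_seconds": 300},
    {"id": "c", "topics": ["news"], "avg_watch_seconds": 90},
    {"id": "d", "topics": ["cooking", "travel"], "avg_watch_seconds": 240},
]
history = [{"id": "x", "topics": ["cooking", "travel"]}]

recommendations = rank(generate_candidates(history, corpus, k=2))
print([v["id"] for v in recommendations])  # videos b and d survive stage 1
```

The split matters because stage 1 must be cheap enough to scan millions of videos, while stage 2 can afford richer features on only a handful of candidates.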
Dave, P. (2017, November 29). YouTube sharpens how it recommends videos despite fears of… Reuters. Retrieved from https://www.reuters.com/article/us-alphabet-youtube-content/youtube-sharpens-how-it-recommends-videos-despite-fears-of-isolating-users-idUSKBN1DT0LL
YouTube is adding a new signal to its recommendation algorithm: user satisfaction with videos, measured in order to predict and promote videos viewers find worthwhile, rather than leaving people with negative feelings after wasting hours on uninspired programming. The tool is fairly mature, but concerns persist that filter bubbles will spread misinformation and reinforce like-minded opinions. According to a YouTube product manager, the company is still combating misinformation.
Davidson, J., et al. (2010). The YouTube Video Recommendation System. Proceedings of the Fourth ACM Conference on Recommender Systems. doi:10.1145/1864708.1864770
This article introduces the algorithms used in the second version of YouTube’s recommendation system, which recommends videos similar in topic to those a user has previously watched, based on the user’s activity on the site. The authors also describe the methods used to implement and test the recommendation algorithms.
Lee, J. (2013, September 14). Sick Of Irrelevant YouTube Recommendations? Here’s What You Need To Do. Retrieved from https://www.makeuseof.com/tag/sick-of-irrelevant-youtube-recommendations-heres-what-you-need-to-do/
This short article teaches you how to adjust YouTube’s recommendations if the recommended videos become strange or stray from your interests. It offers two remedies: delete the search history from your account to reset your recommendations, or remove YouTube recommendations entirely.
Maheshwari, S. (2017, November 04). On YouTube Kids, Startling Videos Slip Past Filters. Retrieved from https://www.nytimes.com/2017/11/04/business/media/youtube-kids-paw-patrol.html
This article addresses one of the key social issues that arose with the popularization of YouTube’s recommendation algorithms: the system has been criticized for recommending inappropriate videos to children. These videos often show well-known cartoon characters in offensive and violent ways, and use deceptive titles that get them promoted. Children who watched these videos mimicked the inappropriate behaviors shown in them. Worried parents blamed YouTube for relying on algorithms to filter recommended videos; YouTube defends itself by pointing to the tremendous volume of video content it must handle.
Ma, X., Wang, H., Li, H., Liu, J., & Jiang, H. (2014). Exploring sharing patterns for video recommendation on YouTube-like social media. Multimedia Systems, 20(6), 675-691. doi:10.1007/s00530-013-0309-1
The authors believe that video-sharing behavior on sites like YouTube is shaped by social relationships between users. They attempt to demystify how much these relationships actually influence recommendation lists, especially through the videos users share on their social media. They propose a similarity-based strategy to enhance YouTube’s video recommendation algorithm. Their results show the strategy is fairly effective at improving the precision and recall of recommendations, compared with widely adopted strategies that ignore social information. For further research, they argue it is essential to better understand the social relationships among users of video-sharing sites in order to produce a better-suited recommendation algorithm.
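The idea of folding social signals into recommendations can be sketched simply: a video’s base relevance score is boosted when the user’s friends have shared it. The weights, names, and data below are invented for illustration and are not the paper’s actual method.

```python
# Hypothetical sketch of a social-aware recommendation score: a video's
# base relevance is boosted for each friend of the user who shared it.
# All weights and data are invented for illustration.

def social_score(user, video, base_scores, shares, friends, boost=0.5):
    """Base relevance plus a fixed boost per friend who shared the video."""
    friend_shares = sum(
        1 for f in friends.get(user, []) if video in shares.get(f, set())
    )
    return base_scores[video] + boost * friend_shares

friends = {"alice": ["bob", "carol"]}
shares = {"bob": {"v2"}, "carol": {"v2", "v3"}}
base_scores = {"v1": 1.0, "v2": 0.6, "v3": 0.4}

ranked = sorted(
    base_scores,
    key=lambda v: social_score("alice", v, base_scores, shares, friends),
    reverse=True,
)
print(ranked)  # v2 overtakes v1 because two friends shared it
```

Even a crude boost like this changes the ordering, which is why measuring how strongly social ties should weigh against content similarity is the paper’s central question.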
Oremus, W. (2017, November 07). Those Disturbing Kids’ YouTube Videos Are a Symptom of Tech’s Deepest Problem. Slate. Retrieved from http://www.slate.com/articles/technology/technology/2017/11/those_disturbing_youtube_videos_for_kids_are_a_symptom_of_tech_s_scale_problem.html
Journalist Will Oremus uses the story about YouTube showing disturbing content to children to make a broader argument that algorithms are creating a society-wide problem with the quality and trustworthiness of media. He argues that there are no realistic solutions on the horizon; because the algorithms have been designed to make decisions faster and at a greater scale than any human could, it will never be feasible to have humans review their decisions effectively.
Tufekci, Z. (2018, March 10). YouTube, the Great Radicalizer. The New York Times. Retrieved from https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-politics-radical.html
Sociologist Zeynep Tufekci argues that YouTube’s recommendation algorithm tends to recommend more and more extreme or radical content over time. For example, starting by watching a conservative video, and allowing YouTube to continue recommending videos, will often quickly lead to far-right content. The same phenomenon exists on the left-wing side of the political spectrum, and along other ideological axes: following recommendations from videos about vegetarianism leads to videos about veganism. Tufekci makes reference to her own informal research, as well as a 2018 New York Times investigation that provides evidence to back up some of her observations.
Zhou, R., Khemmarat, S., Gao, L., Wan, J., & Zhang, J. (2016). How YouTube videos are discovered and its impact on video views. Multimedia Tools and Applications, 75(10), 6035-6058. doi:10.1007/s11042-015-3206-0
This article answers the question of how YouTube videos are discovered through three major view sources: related-video recommendation, YouTube search, and video highlight, and how each affects video views. The authors perform a measurement-driven analysis of the three sources and report four findings: (1) the recommendation system increases view diversity, while search and video highlight create a rich-get-richer effect; (2) search and recommendation contribute the most views in the long term, and their contributions stabilize to an approximately constant rate as a video ages; (3) the set of top referrer videos is fairly stable, and the view rate of referrer videos is the most important factor driving views from them; (4) the contribution of views from highlight fades quickly after the promotion and generally does not increase a video’s long-run view rate, compared with videos that are not highlighted. The authors conclude it is essential to develop methods for gaining views from the search engine and the recommendation system, such as choosing appropriate keywords and getting placed on the related-video lists of more popular videos.
Zhou, R., Khemmarat, S., Gao, L., Wan, J., Zhang, J., Yin, Y., & Yu, J. (2016). Boosting video popularity through keyword suggestion and recommendation systems. Neurocomputing, 205, 529-541. doi:10.1016/j.neucom.2016.05.002
This article examines ways to increase a video’s views by leveraging the recommendation system. The authors begin by studying how videos are propagated through related-video links and identifying the factors that carry weight in the recommendation algorithm, which helps readers understand the algorithm better. Their initial measurements show that similarity in video metadata is crucial in connecting videos. They propose keyword suggestion as a potential strategy: it uses video clusters on a referrer-video graph to obtain relevant keywords and ranks them by both their relevance and their potential to attract views. The case study demonstrates only the potential of the keyword-suggestion algorithm to increase video views and extend watch time per playback; the strategy’s limitations, particularly the use of visualization within the algorithm, need to be reviewed in future studies.
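The core of the keyword-suggestion idea, ranking candidate keywords by combining relevance with their potential to attract views, can be sketched as a simple weighted score. The weight `alpha`, the field names, and the numbers are all hypothetical, standing in for the paper’s graph-based measures.

```python
# Hypothetical sketch of keyword ranking: each candidate keyword gets a
# score combining its relevance to the video with its estimated potential
# to attract views. The weight alpha and all data are invented.

def rank_keywords(candidates, alpha=0.7):
    """Return keywords ordered by a weighted relevance/attraction score."""
    score = lambda kw: alpha * kw["relevance"] + (1 - alpha) * kw["view_potential"]
    return [kw["word"] for kw in sorted(candidates, key=score, reverse=True)]

candidates = [
    {"word": "tutorial", "relevance": 0.9, "view_potential": 0.4},
    {"word": "funny",    "relevance": 0.3, "view_potential": 0.9},
    {"word": "review",   "relevance": 0.7, "view_potential": 0.7},
]
print(rank_keywords(candidates))  # relevance-heavy weighting favors "tutorial"
```

Weighting relevance above raw attractiveness (alpha > 0.5 here) reflects the paper’s caution: clickbait-like keywords may draw views without extending watch time per playback.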
Zhou, R., Khemmarat, S., & Gao, L. (2010). The impact of YouTube recommendation system on video views. Proceedings of the 10th Annual Conference on Internet Measurement – IMC 10. doi:10.1145/1879141.1879193
This article studies how the YouTube recommendation system influences video views, based on a dataset crawled from YouTube pages. The authors find that related-video recommendation is one of the main sources of YouTube video views, accounting for about 30%. They also investigate the correlation between a video’s views and the views of the videos that refer to it. The study finds that the recommendation system has a positive impact on the aggregate diversity of video views, helping viewers find more videos they would be interested in.
Images cited:
Georgetown logo:
Culture, Communication and Technology graduate school logo, Georgetown University
Frontpage background image:
4K: woman browses a website on a smartphone with business buildings and urban traffic in the background. gh2_11594_4k, Shutterstock
https://www.shutterstock.com/zh/video/clip