Sociotechnical system

Proudly presented by Team Vector

 

YouTube’s video recommendation algorithm

A sociotechnical description

YouTube is an online platform that provides video uploading, searching and sharing services. At the core of YouTube's services is its video recommendation system, which provides users with personally tailored video suggestions. As one of the world's largest video-viewing communities, YouTube's recommendation system affects almost everyone in some way. As the amount of information generated in the digital age has grown tremendously, people are looking for a platform where they can socialize and stay up to date. This need socially shapes the YouTube recommendation system; in return, the recommendation system has social impacts of its own. Our sociotechnical diagram dissects the relationships between the actants that interact with the YouTube recommendation algorithm, from both the frontend (users and audiences) and the backend (engineers and programmers who work at YouTube), as well as the social factors that influence the system or are affected by it as a whole.

 

On the backend, the YouTube recommendation system is an algorithm built by the engineers and programmers at YouTube to recommend a series of videos to users. Specifically, the algorithm uses deep learning to extract user histories and user profiles, and neural networks that filter those data in three passes to narrow down a list of videos the user may be interested in watching. Programmers and engineers are also in charge of designing the user interfaces (laptop and mobile) that present the list of recommended videos in an appealing way. Each time a user clicks on a recommended video or actively searches for one, YouTube stores that activity in its database for future use by the algorithm. Users' private data are protected and used only to produce better video recommendations. In other words, YouTube is not jeopardizing users' privacy for profit, or at least it tries not to.
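The multi-stage filtering described above can be sketched in miniature: a first stage narrows a large catalog to a small candidate pool using the user's watch history, and a second stage ranks the survivors against the user's profile. This is only an illustrative toy, not YouTube's actual implementation; the catalog, topic tags, and function names are all hypothetical stand-ins for learned embeddings and neural networks.

```python
# Toy two-stage recommendation pipeline (candidate generation, then
# ranking). All data and names here are hypothetical illustrations.

from collections import Counter

CATALOG = {
    # video_id: set of topic tags (a crude stand-in for learned embeddings)
    "v1": {"cooking", "baking"},
    "v2": {"gaming", "esports"},
    "v3": {"cooking", "travel"},
    "v4": {"music"},
    "v5": {"gaming", "speedrun"},
}

def candidate_generation(watch_history, catalog, pool_size=3):
    """Stage 1: narrow the full catalog to videos that share topics
    with the user's watch history."""
    watched_topics = Counter()
    for vid in watch_history:
        watched_topics.update(catalog.get(vid, set()))
    scored = [
        (sum(watched_topics[t] for t in tags), vid)
        for vid, tags in catalog.items()
        if vid not in watch_history          # never re-recommend history
    ]
    scored.sort(reverse=True)
    return [vid for score, vid in scored[:pool_size] if score > 0]

def ranking(candidates, catalog, profile_topics):
    """Stage 2: order surviving candidates by overlap with the user's
    longer-term profile."""
    return sorted(candidates,
                  key=lambda vid: -len(catalog[vid] & profile_topics))

history = ["v1"]                    # user previously watched v1
profile = {"cooking", "travel"}     # inferred long-term interests
pool = candidate_generation(history, CATALOG)
recommendations = ranking(pool, CATALOG, profile)
print(recommendations)              # only v3 shares a topic with v1
```

Splitting the work into stages is the key design idea: the cheap first pass cuts the search space so the more expensive ranking step only runs on a handful of candidates.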

 

On the frontend, users, producers, advertisers and commentators all access the YouTube interface, and each sees a list of tailored videos that is unique to their watch history on YouTube. Their usual activities include watching, sharing and searching on the platform for their own pleasure and purposes. Users watch videos according to their preferences, either from the list of recommendations or by actively searching for what they want to watch. Overall, users are the key source of the data YouTube collects to perfect both its platform and its recommendation algorithm and keep its users active.

 

Commentators and influencers share a similar purpose with users, with a slight twist: they usually have a louder voice on the platform, meaning their opinions are more likely to be heard by the YouTube community than those of ordinary users. They have the potential to determine which videos are watched the most, or which video draws the most comments. In other words, they are the ones who can shape the next trend on YouTube. Producers, for their part, are probably shown videos on how to expand their channels, get more clicks and otherwise build an audience. They may also see videos whose content is similar to what they make on their own channels. Because producers play one more role on the platform than audiences do, their recommended videos may not be based solely on their watch histories. Advertisers, on the other hand, may relate to the YouTube platform quite differently from the other actants. Advertisers must learn what videos their customers watch on YouTube so that they can place their ads on the right videos; in that case, their list of recommended videos could be the very videos their customers watch. For advertisers, watching videos on YouTube is a way to better understand their customers' preferences.

 

Many theories have been advanced about the social effects of the recommendation system and similar systems. The most prominent are critical. Commentators have claimed that it distorts facts (Chaslot 2017; Lewis 2018), radicalizes people politically (Tufekci 2018), shows children disturbing content (Oremus 2017) even on the ostensibly filtered YouTube Kids version (Maheshwari 2017), and harms independent media (Hess 2017). However, it has also been lauded as a revolution for independent content creators.

Sources

Tufekci, Zeynep (2018). YouTube, the Great Radicalizer. The New York Times. https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-politics-radical.html

 

Oremus, Will (2017). The Problem With Silicon Valley’s Playthings. Slate. https://www.slate.com/articles/technology/technology/2017/11/those_disturbing_youtube_videos_for_kids_are_a_symptom_of_tech_s_scale_problem.html

 

Lewis, Paul (2018). ‘Fiction is outperforming reality’: how YouTube’s algorithm distorts truth. The Guardian. https://www.theguardian.com/technology/2018/feb/02/how-youtubes-algorithm-distorts-truth

 

Chaslot, Guillaume (2017). How YouTube’s A.I. boosts alternative facts. Medium. https://medium.com/@guillaumechaslot/how-youtubes-a-i-boosts-alternative-facts-3cc276f47cf7

 

Hess, Amanda (2017). How YouTube’s Shifting Algorithms Hurt Independent Media. The New York Times. https://www.nytimes.com/2017/04/17/arts/youtube-broadcasters-algorithm-ads.html

 

Maheshwari, Sapna (2017). On YouTube Kids, Startling Videos Slip Past Filters. The New York Times. https://www.nytimes.com/2017/11/04/business/media/youtube-kids-paw-patrol.html
