Laurier researchers assess the social aspect of streaming service algorithms


Photo by Darien Funk

Content-streaming services such as Netflix and Spotify have users sharing their account information with one another, giving them access to TV, movies, sports and music.

Patrick Watson, assistant professor of Criminology at Wilfrid Laurier University, is interested in how people are making sense of technology in their lives. 

Watson and his team collaborated with Netflix a year and a half ago. They wanted to know more about how people orient to the algorithm Netflix uses and to understand how the company pushes content towards its viewers.

"We were able to look at Twitter data through an analytics package to learn what other people were tweeting about," Watson said.

"We were able to use data that we found to start doing some interviews. We interviewed those who identified themselves as people who shared streaming accounts and who could tell us more about how the algorithm works and what experience they have with it."

Watson believes this research is relevant and important in many different ways. 

Watson said people do not know what they are dealing with when they are interacting with these algorithms. "There are a lot of questions regarding what the algorithm does. We don't spend a lot of time asking people how they feel."

"An aspect we deal with is privacy rights, and protecting individuals' privacy. The courts in Canada break down privacy along two lines. The first is a normative line, which describes law that has set out conditions for what is and is not privacy. The second is the social side: a general perception about how the common person on the street orients to notions of privacy and notions of information integrity."

"We don't know much about the social side of the algorithm. We are learning more about how people feel about, react to and expect of the digital algorithms operating in their lives." Watson said that will help inform policy and legal debates and shape new legislation.

Watson is very hopeful about the findings from this study. "I know a lot more about how people feel about these sorts of little things."

Surprisingly enough, romantic relationships are affected by these algorithms. 

"We compared the algorithm interaction to sharing keys to someone's apartment with their significant other. When someone shares their algorithm and passwords to their accounts, then it's kind of opening themselves up," Watson said.


There is a lot of discussion on Twitter from people who are not hesitant to share such things, but there is also a little reservation: the algorithm surfaces things you may watch or listen to that you do not want others to see.

"This is a dodgy activity. People aren't supposed to be sharing accounts with people outside of their household. The thing we have noted is that there is a ton of chatter, which comes from our interviews as well," Watson said.

"Everybody wants to engage in the same stuff, but if they were to do that, they would have to spend hundreds of dollars a month on streaming services," Watson said. "They recognize that as being the problem."

"I'm always a little curious because we live in a world where things that used to be quite taboo and very private are almost like resume building. Privacy does seem to be a concept criminologists and sociologists would like to interrogate a bit more, and get some idea of what type of concept privacy is," he said.

Watson questions how people feel about privacy and what our understanding of that concept is.

Some of the findings from their research have already been presented.

"We've presented digitally at the American Sociological Association. We were at a workshop in Austin, Texas, pre-COVID-19, and we have a paper that is under review," Watson said.

Watson noted that people are making sense of technology in their lives in all sorts of different ways.

"People talk about it being their algorithm. It's not an abstract digital thing, it's an avatar for them. They are saying their algorithm is a representation of them, and they take ownership in that."

In the next couple of weeks, Spotify will do its year-end wrap-up. When Spotify did the 'Year-End List' in 2019, the internet responded positively. If someone has a shared account, the algorithm shown may not actually be a representation of them.

"An example that comes to mind is somebody who presented on Twitter as a male in his late teens or early twenties, and his top stream was Adele. He's like, 'Oh, this is what happens when I share my Spotify account with my sister.' He's justifying this output, which makes us question, why would you even tweet about this? We sort of draw the conclusion that if you don't tweet about it, it becomes accountable," Watson said.

People show off their Spotify playlists at the end of the year, which are a representation of their algorithm.

"Somebody is putting effort into going in and making these posts. They are participating in these rituals and they are getting something out of it."

Watsonโ€™s research assistant, Hayden Flight, has also played a big part in their research.

"He is the one bringing these important tweets to us and showing us what he finds interesting. He helps us out a lot, and it's really interesting to see that type of work. It's been really fun working with him," Watson said.

So, the question still remains: does privacy need to be adjusted?

