Laurier researchers assess the social aspect of streaming service algorithms
Users of content-streaming services such as Netflix and Spotify often share their account information with each other, gaining access to TV, movies, sports and music.
Patrick Watson, assistant professor of criminology at Wilfrid Laurier University, is interested in how people make sense of technology in their lives.
Watson and his team began collaborating with Netflix a year and a half ago. They wanted to know more about how people orient to the algorithm that Netflix uses, and to understand how the company pushes content towards its viewers.
“We were able to look at Twitter data through an analytics package to learn what other people were tweeting about,” Watson said.
“We were able to use data that we found to start doing some interviews. We interviewed those who identified themselves as people who shared streaming accounts and who could tell us more about how the algorithm works and what experiences they have had with it.”
Watson believes this research is relevant and important in many different ways.
Watson said people do not know what they are dealing with when they interact with these algorithms. “There are a lot of questions regarding what the algorithm does. We don’t spend a lot of time asking people how they feel.”
“An aspect we deal with is privacy rights, and protecting individuals’ privacy. The courts in Canada break down privacy along two lines. The first is a normative line, which describes law that has set out conditions for what is and is not privacy. The second is a social line: the general perception of how the common person on the street orients to notions of privacy and notions of information integrity.”
“We don’t know much about the social side of the algorithm. We are learning more about how people feel about, how they react to and what they expect of the digital algorithms operating in their lives.” Watson said that will help inform policy and legal debates, and help inform new legislation.
Watson is very hopeful about the findings from this study. “I know a lot more about how people feel about these sorts of little things.”
Surprisingly enough, romantic relationships are affected by these algorithms.
“We compared sharing the algorithm to giving a significant other the keys to your apartment. When someone shares their algorithm and the passwords to their accounts, they’re kind of opening themselves up,” Watson said.
There is a lot of discussion on Twitter about people who are not hesitant to share such things, but there is also some reservation: the algorithm reveals things you may watch or listen to that you do not want others to see.
“This is a dodgy activity. People aren’t supposed to be sharing accounts with people outside of their household. The thing we have noted is that there is a ton of chatter, which comes through in our interviews as well,” Watson said.
“Everybody wants to engage in the same stuff, but if they were to do that, they would have to spend hundreds of dollars a month on streaming services,” said Watson. “They recognize that as being the problem.”
“I’m always a little curious because we live in a world where things that used to be quite taboo and very private are almost like resume building. Privacy does seem to be a concept criminologists and sociologists would like to interrogate a bit more, to get some idea of what type of concept privacy is,” he said.
Watson questions how people feel about privacy and what our understanding of that concept is.
Some of the findings from their research have already been presented.
“We’ve presented digitally at the American Sociological Association. We were at a workshop in Austin, Texas, pre-COVID-19, and we have a paper that is under review,” Watson said.
Watson noted that people are making sense of technology in their lives in all sorts of different ways.
“People talk about it being their algorithm. It’s not an abstract digital thing, it’s an avatar for them. They are saying their algorithm is a representation of them, and they take ownership in that.”
In the next couple of weeks, Spotify will release its year-end wrap-up. When Spotify did the ‘Year-End List’ in 2019, the internet responded positively. For someone with a shared account, however, the algorithm shown may not actually represent them.
“An example that comes to mind is somebody who presented on Twitter as a male in his late teens or early twenties, and his top stream was Adele. He’s like, ‘Oh, this is what happens when I share my Spotify account with my sister.’ He’s justifying this output, which makes us question: why would you even tweet about this? We sort of draw the conclusion that if you don’t tweet about it, it becomes accountable,” Watson said.
People show off their Spotify playlists at the end of the year as a representation of their algorithm.
“Somebody is putting effort into going in and making these posts. They are participating in these rituals and they are getting something out of it.”
Watson’s research assistant, Hayden Flight, has also played a big part in the research.
“He is the one bringing these important tweets to us and showing us what he finds interesting. He helps us out a lot, and it’s really interesting to see that type of work. It’s been really fun working with him,” said Watson.
So the question remains: does our concept of privacy need to be adjusted?