When users scroll through Facebook feeds, pausing to watch cat videos and make witty comments on their friends’ statuses, most don’t expect the social media site to have any impact on what they choose to post.
According to the New York Times, a recent study by Facebook and researchers at Cornell revealed the site has been altering its users’ news feeds without their knowledge, controlling the amount of positive and negative posts to which they’re exposed in order to see how emotions spread via social media.
This wasn’t done to just a few users, either — more than half a million users were studied. As it turns out, there is a correlation between exposure to positive posts and making positive posts, and likewise for negative posts.
The study has received monumental backlash. The Electronic Privacy Information Center, a privacy group, complained about it to the Federal Trade Commission.
Those who are concerned about the ethics of the study certainly should be. Facebook wasn’t just manipulating the content of almost 700,000 feeds: It was manipulating the users themselves. The only reason it was able to do so in the first place was a loophole — by agreeing to the site’s terms of service, users had already given their consent to be tested.
It was a despicable move for the site to make its users vulnerable to its own pursuits, and Facebook shouldn’t be commended for using psychological stunts to check whether emotions are contagious, especially when it comes to exposing users to negative posts. Still, one New York Times columnist suggests the study might hand the public a defense against such a powerful entity.
The study was meant to determine whether emotions could be contagious without physical contact, and it found they are. As the columnist noted, the site’s critics can now argue Facebook is far too powerful — and cite research conducted by Facebook itself as proof.
Aside from ways the study can be used against Facebook, its results have inherent value just because they prove users should approach the site with more awareness, if not skepticism. When relaxing and watching those cat videos or ramping up online cleverness, users can’t be too complacent.
When Facebook stops being the platform people use to share and communicate and starts affecting its users on a personal level, users should be more alert to the ways the site can manipulate them.
People have long argued Facebook is too intrusive — from concerns about the site requesting private information such as cell phone numbers to more recent criticism of the site tailoring its advertising to users based on their Internet history.
Beyond the ethical questions the study raises, its results have made it even harder for Facebook users to enjoy the site comfortably, without worrying about how their information is used or how Facebook is controlling what they see.
That’s why those who want to continue using Facebook should do so with more awareness. Or, as tends to be an option when the public doesn’t like the choices a company makes, there are always boycotts.
Isabelle Cavazos is a junior majoring in English and Spanish.