It’s official. Facebook can control your emotions. At least, that’s what the social media behemoth set out to test in a January 2012 study that has only now come to light.
For a week, Facebook intentionally changed the posts you could see in your news feed to find out whether it could affect your emotions. During this study, it meddled with around 700,000 users’ news feeds, making about 155,000 people sad as a result of the tinkering. Of course, to be fair to the company, an equal number were made happy by the posts that popped up in their feeds.
The more pertinent questions to ask are whether Facebook had any right to do this, and whether the research was ethical. Unfortunately, the Terms and Conditions that we so often ignore state that Facebook reserves the right to change the feed on your timeline. So, essentially, you and I have consented to let Facebook decide what posts we see every day.
That may not sound too bad in itself; after all, Facebook already takes liberties with our news feeds, showing advertisements we don’t want to see based on the posts we like and don’t. But toying with your emotions is a whole different ball game. While the sample size may have been quite small compared to the total number of users on Facebook, it is still a significantly high number. For a week, more than 100,000 people could have undergone bouts of depression. And that’s never a good thing.
The worst part about this is not Facebook doing some harmless research in an evangelical zeal to see if its posts are really harming people’s emotions. No, the really scary part is that this could all be an experiment to sell products designed to tap into users’ emotions. For example, if a user is sad, which can be gauged from his or her activity, Facebook could very well show ads for products that claim to make you happy. Considering how specific you can be about the demographics you target with your advertising on Facebook, we could well have companies lining up to advertise to people who are in a bad mood, or others advertising with claims that they can improve your mood. That’s a scary thought, and a very creepy one too.
The other controversial aspect of this “research” is the lack of consent from the participants. Sure, we did check the box in the Terms and Conditions that says Facebook can curate our news feeds, but that does not imply we agreed to be part of a research study. For any research on humans, there has to be “informed consent” from the participants, i.e., they should know that they are part of a study group and should also know what the topic of the research is. Without this “informed consent”, a study is not deemed ethical.
It is indeed a surprise that a study such as this was published in an esteemed journal without a proper ethical review, but more importantly, Facebook needs to stop meddling with people, especially since it is now such a major part of their lives.