If you missed the story, Facebook (in cooperation with Cornell and the University of California) conducted an experiment involving almost 700,000 unknowing and potentially unwilling subjects. The study was originally designed to debunk the idea that positive social media updates somehow make people feel like losers. Instead, it affirmed something most sociologists, many psychologists, and a few marketers already know.
"Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks," concluded the study. Negative and positive emotional content can influence our moods.
The significance of the study from a socio-psychological viewpoint.
The summary of the study is clear-cut. The researchers showed via a massive experiment on Facebook that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. They also provided experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient) and in the complete absence of nonverbal cues.
The experiment itself consisted of manipulating the amount of positive and negative content people received from their friends and relatives over the course of a week. For some users, the test reduced exposure to their friends' "positive emotional content," which resulted in fewer positive posts of their own. For others, it reduced exposure to "negative emotional content," with the opposite effect.
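To make the design concrete, here is a minimal, hypothetical sketch in Python. Nothing in it comes from Facebook's actual systems: the real study classified posts with the LIWC word-counting tool, while this toy simulation simply flags posts as positive or negative, withholds a fraction of one type from a "treated" group's feeds, and compares how often each group subsequently posts positive content under an assumed contagion parameter.

```python
import random

# Toy illustration of the feed-filtering design, not Facebook's method.
# The real study used LIWC word counts on actual posts; every number
# and function here (contagion rate, baselines, etc.) is an assumption.

random.seed(42)

def make_feed(n_posts=100, p_positive=0.5):
    """Generate a toy feed where each post is 'positive' or 'negative'."""
    return ["positive" if random.random() < p_positive else "negative"
            for _ in range(n_posts)]

def filter_feed(feed, withhold="positive", rate=0.3):
    """The manipulation: randomly withhold a fraction of one emotion type."""
    return [post for post in feed
            if post != withhold or random.random() > rate]

def simulate_user(feed, baseline=0.5, contagion=0.2):
    """Assumed contagion model: a user's chance of posting positively
    drifts toward the emotional mix of the feed they actually saw."""
    share_positive = feed.count("positive") / len(feed)
    p_positive_post = (1 - contagion) * baseline + contagion * share_positive
    # Fraction of the user's next 50 posts that come out positive.
    return sum(random.random() < p_positive_post for _ in range(50)) / 50

control = [simulate_user(make_feed()) for _ in range(1000)]
treated = [simulate_user(filter_feed(make_feed())) for _ in range(1000)]

print(f"control positive-post rate: {sum(control) / len(control):.3f}")
print(f"treated positive-post rate: {sum(treated) / len(treated):.3f}")
```

Even in this sketch, the gap between the groups is small in absolute terms, which mirrors the study: the measured effects were tiny, and it took a sample of hundreds of thousands of users to detect them at all.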
The study confirmed that the changes to a person's newsfeed had the potential to alter their mood. While interesting, it's not surprising. Everything we let into our heads influences us.
The books we read. The television programs we watch. The news we subscribe to. The advertising we see. The people we hang around. It's human nature. We are prone to adapt to our social settings and seek out affirmation for acceptance or validation. And the only remedy is awareness: knowing the truth, or at least constantly recognizing that someone is attempting to influence you.
The ethical lines of emotional manipulation and big data have blurred.
It is naive to think that affirmation media doesn't have an agenda, just as it is naive to think that marketers don't have a brand agenda (one that can be far more powerful than direct sales). They do, and so does Facebook. The more the social network learns about where our ethical lines are drawn, the more freely it will tap whatever data it wants, for whomever it wants.
The only reason this experiment touched a nerve is that people were forced to look at something they don't want to believe, much in the same way people who track down an online catfish are often disappointed. The truth isn't something people necessarily want. They want their truth.
As privacy issues have waxed and waned over the years, so has public tolerance. People are all too willing to opt in (or neglect to opt out) for the most marginal of benefits. And as they do, online and offline privacy will continue to erode. The only changes since some of the earliest online privacy debates have been semantic. Consumer profiling has morphed into big data. Shaping public opinion has drifted toward mass manipulation. And all of it is covered somewhere in the terms of service.
At least, that is what some people think about privacy. What do you think? Is manipulation in the eye of the beholder? Is an apology enough? Would it be all right to promote one hair color over another without product identification just before introducing a new hair dye? Or maybe it is fine to dedicate more airtime to isolated tragedies in an effort to change public policy. The comments are yours.