The readings this week were really fascinating. I had not read a full autopsy of the Facebook/Cornell study, and the Atlantic's article delivered!
The Atlantic piece critiques both Facebook users for overreacting and Facebook/Cornell for being surreptitious and shady. It's interesting to see users/audiences react so vehemently to the study, yet people didn't actually leave; Facebook continues to grow. Still, the research Facebook is doing should give audiences pause. Though Facebook claims not to have read the personal messages of people in the study, it is manipulating users' experiences as they engage not only with content but with each other. This is what I find most ethically questionable: Facebook's algorithm has the power to manipulate our relationships with family and friends. Yes, it may be useful in helping us connect with one another, but now that the company is a publicly traded entity, its focus has shifted to serving its shareholders. The Huffington Post article describes how Facebook may have impacted an election, and though on its face encouraging people to vote is a positive thing, I think audiences should be wary that a company like Facebook can influence whom we vote for. Facebook is not simply a tech company; it also subscribes to certain political beliefs.
On the other hand, the strong reactions from Facebook users are interesting in their own right. We are seemingly manipulated every day by similar A/B tests from our cable and mobile providers, and perhaps even the people at the Belo coffee shop. There appears to be a strong disconnect between people's understanding of what Facebook provides and what Facebook actually is, and when its true colors are brightly lit, people freak out en masse. Yet the reaction is ephemeral: people keep using Facebook's service. Would the reaction be different if we paid for Facebook?