Social Ambivalence: Facebook the Nonhuman Actor

By Jesper Tække, 26 March 2014

With Facebook we see a built-in commercialization of our communication infrastructure to an unprecedented degree. Instead of choosing transparency, Facebook builds its own business motives invisibly into the software architecture, so that the software becomes a nonhuman actor. Behind our backs, Facebook monitors us and directs our communication and behavior in ways that are unpredictable to us.

At the macro level, we have seen similar phenomena in previous media revolutions, such as censorship in relation to the printing press and television. Historically, such measures may hold back development for a while, but in the longer term society takes full advantage of the opportunities the new media provide. In this view, Facebook’s model has a limited lifespan before the transparent affordances of digital media take effect.

At the micro-sociological level, following Joshua Meyrowitz, every time a new communication medium comes into being, it results in new situations, and we need to develop new and adequate norms to cope with them (Meyrowitz 1986). Until such norms have developed, the new situations caused by the new medium give rise to social ambivalences, because we cannot cope adequately with them. Digital media generally give rise to a number of social ambivalences, because our system of norms needs time to catch up with the new situations they provide. For example, is it okay to concentrate on your smartphone while drinking coffee with your family, or during school hours?

When we look specifically at Facebook, the company’s implementation of business motives in its software architecture gives rise to a number of ambiguities that result in socially ambivalent situations for its users, situations that are not caused by the new possibilities of the digital medium itself.

The argument of this paper is that Facebook not only provides us with the opportunities of digital media – which the company does – but also presents us with ambiguities resulting in social ambivalences. It is also argued that these ambiguities may hold back the evolution of new norms adequate to the new media environment. In the following, the paper puts forward six ambiguities caused by Facebook’s functional architecture.[1]

Uncertainty about Facebook’s use of our content

It is unclear how far our “Status Updates”, photos and activities spread on Facebook. For example, it is unclear whether and when third-party companies working with Facebook use our “Likes” for advertising purposes directed at other users.

Uncertainty about who gets our Status Updates

We do not know how many of our “Friends” will receive our “Status Updates” in their “News Feed” – this is regulated by algorithms (EdgeRank) shaped to optimize the time we spend on Facebook. Only a few of our “Facebook Friends” actually get our “Status Updates” in their “News Feed”, and we do not know who they are.
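Facebook has never published EdgeRank’s details, but the commonly cited description is that each story is scored by summing, over its interactions, the product of affinity, edge weight, and time decay, and that only the top-scoring stories reach a given “News Feed”. The Python sketch below is purely illustrative under that assumption; the names, weights, and half-life are hypothetical, not Facebook’s actual implementation.

```python
# Illustrative sketch of an EdgeRank-style ranking, following the commonly
# cited description (affinity * edge weight * time decay, summed per story).
# All values and names here are hypothetical assumptions.
import time
from dataclasses import dataclass

@dataclass
class Edge:
    """An interaction ('edge') between a viewer and a story."""
    affinity: float    # closeness of viewer to the story's author (0..1)
    weight: float      # interaction type, e.g. comment > like > view
    created_at: float  # Unix timestamp of the interaction

def edge_rank(edges, now=None, half_life_hours=24.0):
    """Sum affinity * weight * exponential time decay over a story's edges."""
    now = now or time.time()
    score = 0.0
    for e in edges:
        age_hours = (now - e.created_at) / 3600.0
        decay = 0.5 ** (age_hours / half_life_hours)  # older edges count less
        score += e.affinity * e.weight * decay
    return score

# A feed shows only the top-ranked stories, which is why most "Friends"
# never see a given "Status Update":
stories = {"story_a": [Edge(0.9, 3.0, time.time() - 3600)],    # 1 hour old
           "story_b": [Edge(0.2, 1.0, time.time() - 86400)]}   # 1 day old
feed = sorted(stories, key=lambda s: edge_rank(stories[s]), reverse=True)
print(feed)  # ['story_a', 'story_b']
```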

Ambiguity of “common” frame of reference

It adds confusion that Facebook is not a community but consists of as many parallel networks as there are users. Many users get the feeling that they are communicating with all of their friends and will thus be met as they would be within a community. In contrast to community communication, the few friends who actually see our “Status Update” do not have a common ground to respond from. Instead, our Facebook friends often do not feel committed enough to “Like” or “Comment”, because these “friends” lack a common frame of reference and mostly care about their own reputation and social identity.

Uncertainty about when to use filters

It is unclear when we should filter other people’s access to our profiles, when others filter us, and why they do it. As it is now, some users, because they cannot see through the many privacy settings, choose to “defriend” looser relationships (Sørensen 2013). In addition, we get no feedback explaining why other users filter our opportunities to see what they write, prevent us from writing on their “Wall”, or “defriend” us.

Uncertainty about news criteria

It adds confusion that Facebook is a quasi mass medium, as we do not know its news criteria or its mechanisms of viral spread. Suddenly we all sit with an editor’s responsibility and risk making fools of ourselves or writing something illegal.

Uncertainty about social obligations

It is unclear how much we have to be on Facebook, and how active we have to be there, to meet our social obligations to our Facebook friends (and to EdgeRank). It is equally unclear when, and for how long, we can take the liberty of going on Facebook with regard to the people we are physically together with.

All these uncertainties make it ambiguous and socially ambivalent to be on Facebook. But if we are not on Facebook, we exclude ourselves from a large part of societal communication. This is a built-in commercialization of our communication infrastructure to an unprecedented degree.

Imagine a parallel to old-fashioned postal mail

Imagine that you sent a letter to some particular addressees, but the letter was also distributed to others, while not all of those you had addressed received it.

It would perhaps be bearable to send letters for free on the condition that printed advertisements appeared on the envelopes. But if information about yourself, what you were doing, and whom you sent letters to was printed, with your name and photo, on other people’s envelopes in their mutual correspondence, it would probably be too much for most people.

Business motives built into the architecture

Instead of choosing transparency, Facebook builds its own business motives invisibly into the software architecture, so that the software becomes a nonhuman actor. Behind our backs, Facebook monitors us and directs our communication and behavior in ways that are unpredictable to us. If Facebook, or whatever medium replaces it in the future, gave up the nontransparent elements and instead optimized the transparency of its functional architecture, it could take part in the evolutionary process: not only would social norms develop on the user side, but the medium itself would contribute, accommodating its users by designing its functional architecture to be as clear and transparent as possible.

Not only Facebook but also Google, through their algorithms, amputate and distort the outlook and the possibilities offered by digital media. Similar transitional phenomena have been seen during previous media revolutions. Historically, such measures may hold back development for a while, but in the longer term society takes full advantage of the opportunities the new media provide. In the meantime, however, the ambiguities hold development back, and in that period they make it harder to overcome the ambivalences the new media throw us into.

References

Finnemann, N. O. (2001). “The Internet – A New Communicational Infrastructure”. CFI Monograph Series, Vol. 2.

Sørensen, A. S. (2013). “Facebook – kommunikation for kommunikationens skyld” [Facebook – communication for communication’s sake]. In Tække & Jensen (Eds.), Facebook – fra socialt netværk til metamedie [Facebook – from social network to metamedium]. København: Samfundslitteratur, pp. 117–136.

Meyrowitz, J. (1986). No Sense of Place: The Impact of Electronic Media on Social Behavior. New York: Oxford University Press.

[1] I take the concept of “functional architecture” from Finnemann (2001).
