Abstract
This study investigates social network site affordances and their implications for perceptions of marginalized communities. I employ Facebook as a case study and speak with young adult users to comprehend how socially marginalized groups are perceived through Facebook’s affordances. In particular, I consider: How familiar are users with Facebook’s tools and functionalities? How are issues of gender and race represented through the site’s interface? How do users conceive of gender and race? The findings suggest that gender is perceived as a more important identifier than race and that Facebook is post-racial, because of the user interface choices made. In addition, my participants view Facebook as an official social space that should include “authentic” identities; although Facebook has shaped authentic to mean accurate. I conclude that while the construction of affordances is a negotiation between user, interface, and designer, the designers have the most power because they have created the spaces in ways that will most benefit Facebook. In addition, users who are more situated in the socio-cultural majority have no desire to enact agency within Facebook’s structure because they are accustomed to forms and official documents that are well suited to fit their identification needs.
With each new technological advancement comes a declaration of some “great social equalizer” (boyd, 2014). When the Internet was first entering households, a common belief was that its integration would bring a cultural and social shift. These sentiments were guided by the idea that virtual communities allowed users to leave their bodies behind; users met new people and experimented with their identities (Rheingold, 1996; Turkle, 1995). Prejudices were assumed to soon be a thing of the past—race, gender, and physical appearances would no longer be delineating factors.
Today, these utopian visions are criticized for their optimism. It seems that technologies cannot solve social issues and may even work to emphasize social divisions (boyd, 2014). The prejudices that we learn offline are likely to journey with us into digital spaces. Although Facebook, for example, allows users to connect to people in new ways, it also reinforces existing networks and norms transferred from offline spaces. In other words, Facebook relationships are “anchored” and compel users to value nonymous (Zhao, Grasmuck, & Martin, 2008), and perhaps even anti-anonymous (Cirucci, 2015), identifications over anonymous ones.
The goal of this study is to better comprehend social network site affordances as they relate to race and gender, along with related interpretations of gender and race identities. Employing Facebook as a case study, I seek to develop answers to questions such as: How familiar are users with Facebook’s tools and functionalities? How are issues of gender and race represented through Facebook? And how do users conceive of gender and race?
Affordances
The use of “affordance” as a noun was coined by Gibson in his 1979 book The Ecological Approach to Visual Perception. Writing about animals and their biological environments, Gibson uses his theory of affordances to explain that, to really understand where and how animals live, we must comprehend how animals visually perceive what their environments offer them:
I mean by it something that refers to both the environment and the animal in a way no existing term does. It implies the complementarity of the animal and the environment . . . Affordance cuts across the dichotomy of subjective-objective and helps us to understand its inadequacy. It is equally a fact of the environment and a fact of behavior. It is both physical and psychical, yet neither. An affordance points both ways, to the environment and to the observer. (pp. 119, 121)
In other words, there is a symbiotic relationship between the animal and its environment.
Gibson continues by explaining what has happened now that humans have added on to the environment—the shapes and substances of our world have been changed because humans want to make more available what benefits them. The large, and important, change to the conception of affordances then is that human-made objects are no longer neutral. The symbiotic process of animal–environment takes on a third party—designers.
Gibson’s original argument actually sets aside the material makeup of objects in favor of analyzing affordances—what the object can do, is supposed to do, and is promoted as doing. At the time of his writing, Gibson believed that people paid too much attention to the dimensions or physical qualities of things—their smaller pieces and their material makeup. Instead, he wanted researchers to focus on what objects may afford animals in the symbiotic relationship of animal–environment. We have perhaps come full circle, often omitting material analyses of mediated structures. And media technologies present different issues than trees or flowers because they are made by humans and are thus not neutral. Indeed, the general social network site user is rarely, if ever, provoked to think about the material makeup of the site.
In sum, to apply Gibson’s theory of affordances is to focus on the negotiation that exists at the intersection of user, interface, and designer; affordances do not exist without interaction (Nagy & Neff, 2015). Without considering all three pieces, we miss the dynamic relationship that all users have with mediated technologies. Users display a stark differentiation between their practices and the actual interfaces. Theoretically, we can think of things like features, apps, and devices as distinct objects, but users make sense of technological systems in a variety of ways that span and conflate these objects and interfaces (McVeigh-Schultz & Baym, 2015).
McVeigh-Schultz and Baym (2015, p. 2) define affordances as the “perception between bodies and artifact.” Similarly, Van Dijck (2013) argues that platforms are techno-social constructs that are some aggregation of technology, content, and users. However, neither explicitly states the new third prong—designers. Thus, I argue, to study affordances is to study the negotiation that happens between users and designers through the created interface. While it is important to view interfaces and digital objects more like mediators than intermediaries (because they do not merely move information but transform it; e.g., Galloway, 2013; Latour, 2005), it is also integral to investigate how and why specific designer choices have implications for users.
As such, using affordance to mean “choice” or “constraint” is not helpful when examining the three-pronged negotiation (McVeigh-Schultz & Baym, 2015). Instead, affordances are relational negotiations (e.g., Hutchby, 2001a) that are constantly occurring as the cycle of designers designing and users using continues. While objects may visually convey action capacities (Norman, 1988), digital capacities can be hidden by designers in an effort to promote desired perceptions and uses (McVeigh-Schultz & Baym, 2015).
In addition, because designers have the most power in affordance negotiation, their decisions work to prime, resist, and shape the ways users make sense of the technology (McVeigh-Schultz & Baym, 2015). “When a programmer decides which gesture to render, then [they are] deciding not what to communicate, but what possible messages to allow; such decisions dictate the communication potential of a space” (Kolko, 1999, p. 180). Interfaces are both semiotic and institutional structures (Giddens, 1984) that influence how narratives are shared and shaped (Duguay, 2016). As media technologies become more advanced, designers work harder to design “user-friendly” spaces that obscure how those spaces actually function. Less space is provided to tinker and, as a general culture, we are encouraged to adopt a passive stance toward technology (Gillespie, 2006).
Previous research in online gaming has highlighted the importance of avatar design because of its strong tendencies to affect offline interactions (e.g., Kolko, 1999). While this work was situated in games, social network site profiles are very similar to personalized avatars (e.g., Cirucci, 2013). At sign up, as with initial avatar creation before a game can be played, Facebook begins to shape user performances and experiences (Light & McGrath, 2010). Facebook’s “real name” policy, for example, along with a myriad of other design choices, is generally in contention with identities that are fluid and complex (Lingel & Golub, 2015).
Instead, Facebook is concerned with investing time and money into singular profiles (Lingel & Golub, 2015). Some identity choices are inescapable, while others are simply omitted (Zhao et al., 2008). These flat profiles produce data that conflate identity performances and contexts and feed Facebook data that can then be repurposed as both targeted ads for users and valuable fodder for third parties ready to cross-reference databases and build even more dynamic “profiles” of people. However, these databases are never perfect and privilege one point of view, while muzzling many others (Light, 2007). Facebook has, thus, become a polemical and political site of analysis (e.g., Bowker & Star, 1999).
With the above in mind, this study aims to better understand the non-neutral interfaces that guide user identity perception on Facebook. Specifically, I explore identifications related to gender and race in an effort to better realize how and why Facebook potentially proliferates social division and misrepresentation. Because digital technologies are created by people, they are necessarily couched in political, economic, and cultural powers (e.g., Winner, 1980). Therefore, a look into designer choices, paired with users’ perceptions and daily usage habits, offers insight into Facebook’s affordances.
Methods
Nagy and Neff (2015) outline a new way of conceptualizing affordances—imagined affordances. The notion of imagined affordance takes into account the experiences not consciously realized by users. They argue that users realize specific affordances, not some full set as presented by each space. On the other hand, McVeigh-Schultz and Baym (2015) found that people make sense of material structures at different, nested layers and that this sense-making process does not involve speaking about material parts separately. Clearly, there is a disconnect in how we investigate structures, their design, and their implications for users.
In response to these two studies, this study was conducted in two parts: a structural discourse analysis and focus groups. Pairing an analysis of presented, non-neutral tools with users’ experiences with these tools provides a dynamic look into the negotiation and interaction that is affordances.
Structural Discourse Analysis
As Galloway (2013) proposes, I viewed Facebook’s interface not as a medium but as a mediator that does not transport, but transforms, information (Latour, 2005). Before examining identity performances through the interface, I first explored the ways in which Facebook mediates identifications. In January 2014, I reviewed and cataloged all of Facebook’s buttons, tools, and functionalities. I then conducted a discourse analysis, noting what the tools made possible, what was impossible, and what actions and perceptions were privileged over others. Furthermore, mostly through gaining access to The Zuckerberg Files, I searched news archives to document, when possible, when, why, and how interface changes were made.
In sum, I conducted a discourse analysis not of user-generated content but of Facebook’s structure—a structural discourse analysis. My technique was similar to Duguay’s (2015) walkthrough method, wherein she notes parts of technologies’ architectures such as content navigation tools, features, and buttons. However, the moral values embedded in Facebook (Light & McGrath, 2010) were more deeply considered through the discourse analysis. My overall process was guided by Fairclough (1995) and partially inspired by Papacharissi’s (2009) study of social network “geographies.” I attempted to examine the social powers at play and how those in power attempt to control content and their structures. I view Facebook as a socio-cultural system whose presented language and structure play a role in shaping identities, social relations, and systems of knowledge (Fairclough, 1995, p. 55).
Focus Groups
After many readings of each tool and functionality, I conducted nine focus groups with college students (n = 45) at a large, urban, east-coast university in the United States. Beyond an analysis of Facebook’s interface, I was interested in learning how everyday users identify and make sense of the site. Instead of just presenting architectural pieces as choices or constraints, I provided a space for informants to share their sense-making processes (McVeigh-Schultz & Baym, 2015). As Duguay (2015) notes, employing only a walkthrough method is limiting because there is no inclusion of user perception. At the same time, only completing an analysis of user performances fails to recognize and take into account the mandatory setting of the mediating interface.
Participants were aged 18–30 and declared their racial affiliations as: White (71%), Black (13%), Asian (9%), Latinx (4%), and other (2%). Each focus group lasted between 45 min and 1.5 hr. Topics for conversation were derived from my structural discourse analysis findings. Informants were asked to share stories regarding how they have used different Facebook tools and to speak about the extent to which they are aware of the tools’ non-neutrality. While some general questions were inspired by the discourse analysis, the focus groups were generally open-ended.
It is important to note that all participants were given a short demographics survey to complete. Spaces to define gender and race affiliations were open-ended. However, no participants included a gender affiliation beyond female or male. Focus groups were also mixed gender and race, which may have silenced those who may have felt their comments would have been seen as marginalized or “different.” However, I chose focus groups because they are social and, thus, relevant when the content up for discussion is also social (Frey & Fontana, 1993). As discussed in detail below, the makeup of my sample became a finding in and of itself, displaying how those in more privileged socio-cultural positions (or those who at least feel the need to speak in a way that conforms to the majority) negotiate affordances in very specific ways.
Findings
The following sections outline three main themes that emerged through the structural discourse analysis and focus groups. These three themes are rooted in Facebook’s gender and race affordances. Each section outlines related functionalities, presents a brief discourse analysis, and shares participant experiences, as a way of exploring Facebook’s affordances.
Digital Gender
In early 2014, beyond binary options, Facebook provided US users with 50+ gender affiliations. It is safe to say that gender selection is important to the site; while on the About Page users can choose something other than a binary affiliation, at sign up new users still must choose from only “female” and “male.” For Facebook, third-party marketers, and database companies, gender is a crucial demographic.
The study of digital gender is certainly not new. The importance of ascribing binary gender to virtual bodies has been integral to online worlds since early multi-user dungeons (MUDs), gaming spaces, and chat rooms. While some may have thought that users logged in and became “disembodied,” it was quickly made apparent that electronic worlds are not separable from the physical self (e.g., Kolko, 1999). In a study of Gaydar, a dating site for gay men, Light (2007) found that the suggestions offered to users as they created their dating profiles were stereotypically masculine, omitting certain groups, specifically effeminate men, from dropdown boxes. Through its digital structure, Gaydar pressured users to conform to certain cultures with little room for resistance. This 2007 study is but one example of the ways in which the designer choices of digital spaces make the negotiation of gender affordances unequal and compel users to adhere to heteronormative expectations.
Although Facebook’s About Page gender change occurred shortly before I spoke with my informants, only half were familiar with the additions, and only one had changed the gender option (from female to cisfemale). For my informants, and perhaps in line with Facebook’s goals, the gender prompt is not perceived as a space for expression but perceived as a box to check that mirrors their birth certificates or medical forms:
Cheryl, 18-year-old white female: Facebook’s going to keep upgrading to try and make people want to go on it and feel more comfortable, like, they’re accepted there. But, I just think that’s ridiculous. Like, on your birth certificate I really doubt it’s ever going to be more than just male or female.
Ryan A., 19-year-old white male: I think as soon as I got a Facebook I put it [gender selection], and my gender hasn’t changed; though, I haven’t really changed.
This brief excerpt from one of Ryan’s stories nicely summarizes most of my informants’ views. For my sample, changing gender affiliation on the site has nothing to do with avoiding stereotypes or enacting agency. Instead, it is purely about filling in what is usually required on legal documents. For Facebook, this is defined as being “authentic” (putting in “correct” or “real” information). In reality, profile information that matches other identity performances, and thus buying habits, makes users more valuable.
When the gender field was first introduced, Facebook users were not obligated to select a gender; declining to choose was possible, in which case the allotted pronouns defaulted to “they” and “their.” Eventually, users who had chosen this neutral option received a message:
Right now your mini-feed may be confusing. Please choose how we should refer to you. [user-first-name] edited her profile. [user-first-name] edited his profile. (McNicol, 2013, p. 204)
On 27 June 2008, Naomi Gleit, Facebook Director of Product for Growth and Engagement, posted a Facebook note explaining the change. She claimed that translators were having some trouble:
However, we’ve gotten feedback from translators and users in other countries that translations wind up being too confusing when people have not specified a sex on their profiles. People who haven’t selected what sex they are frequently get defaulted to the wrong sex entirely in Mini-Feed stories. (Gleit, 2008)
In her post, Gleit labels the affiliations “sex.” This is how Facebook previously labeled the category. She includes this in her post to connote a more objective sense of the identity label. Besides conforming to the English (US) delineation on the site, she is also implying that all users need to check off a sex, just as the hospital did when they were born. Later in the post, however, Gleit refers to the selection as “gender.”
Some of my participants even noted that using the space to perform more than gender ascribed at birth is identification “overload”:
Davina, 19-year-old Black female: If you are non-binary, then . . . whoever knows you would know that. Like, does that have to be the first thing that somebody sees on your profile?
Davina shares a story that explains her distaste for non-binary gender performances online. Although she claims that people offline may already know this about you, she questions why it is necessary online. For this participant, a change is confusing because Facebook compels users to perform “legal” or “real” identifications and because these performances are closely linked to official forms that rarely offer more than “female” and “male.” It is clear that Facebook’s gender affordances are both unused by and confusing for my participants.
I argue that these perceptions are in part because the more fluid gender options feel inconsistent with other affordances Facebook offers that the site links to “authenticity.” For example, users, at sign up, still are provided only “female” and “male” from which to choose. They must also input their full birthday and their full “real” name (as it appears on their birth certificate or driver’s license). This type of rhetoric, paired with other options, prompted my informants to share that the increased gender options were probably just a pandering tactic.
From a programming perspective, the obvious heuristic is a model in which data are pulled for algorithmic ranking, personalized site characteristics, and third-party marketing at three layers: first, as “user gender is binary” or “user gender is not binary”; second, as “user is female,” “user is male,” or “user is other”; and third, as whatever is actually selected as the custom affiliation(s). In other words, users’ digital gender identifications are still largely reliant on some binary estimation. On one hand, the new gender choice is appeasement; on the other, it serves as additional marketing data (Kellaway, 2015).
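To make this layered heuristic concrete, the sketch below shows one minimal way such nested bucketing could work. It is purely illustrative: the function and field names are my own assumptions, not Facebook’s actual schema or code.

```python
# Hypothetical sketch of the three-layer gender model described above.
# All names and categories are illustrative assumptions, not Facebook's schema.

def gender_buckets(selection):
    """Reduce a free-form gender selection to nested marketing buckets."""
    binary = {"female", "male"}
    return {
        # Layer 1: is the stated gender one of the two binary options?
        "is_binary": selection in binary,
        # Layer 2: collapse every non-binary affiliation into one "other" bucket
        "coarse": selection if selection in binary else "other",
        # Layer 3: the custom affiliation actually selected, kept verbatim
        "custom": selection,
    }

print(gender_buckets("cisfemale"))
# {'is_binary': False, 'coarse': 'other', 'custom': 'cisfemale'}
```

Whatever the real implementation, the point stands: the verbatim custom affiliation can be preserved as marketing data while ranking and targeting fall back on the coarser, still largely binary, buckets.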
Gender Just Matters More
In comparison with the added explicit space for gender performance, Facebook provides no such space inviting users to input their race/ethnicity identifications. The absence of ethnicity is surprising considering how social media dramatize other identifications (as Facebook did with gender: changes to the gender tools were publicly announced and revised multiple times). The lack of race/ethnicity options speaks volumes about the assumptions of the designers—there is a “cultural map of assumed whiteness” (Kolko, 2000, p. 225). This “map” of whiteness is essential to understanding how designers’ cultural expectations are baked into the spaces they create. At some point, someone decided that an explicit race/ethnicity category was not essential.
The omission of race/ethnicity space led my informants to share explanations that compared gender and race, almost as competing social issues. Three general themes emerged: (1) complexity, (2) post-racial culture, and (3) visual culture.
Complexity
Some of my informants spoke about Facebook’s lack of racial identification space as rooted in the fact that race is more complex and complicated than gender. Therefore, some of them concluded that it makes sense that Facebook stays out of the conversation:
Dana, 22-year-old white female: In our country, it wouldn’t make sense . . . There are so many different types of race and ethnicities. Deb, 18-year-old African American female: I know a lot of people who have a lot of race identity.
When thinking about Facebook race/ethnicity affordances, participants were quick to view them as too sticky to deal with, specifically when compared to gender. However, this way of thinking necessarily assumes that gender is not complex, even with the (new) knowledge that Facebook added 50+ gender affiliations. Users are at risk of devaluing the struggles of the many communities fighting for more than just “female” and “male.” At the same time, these assumptions also put users at risk of thinking that because an issue is complex, they should stay out of it. In sum, the identification choices provided perhaps devalue one fight, while pushing another to the sidelines. These perceptions are either carried from offline spaces and reified through Facebook or formed online by users who otherwise have not considered them.
Post-racial culture
As a second explanation, some participants discussed the possibility that race is purposely left out because Facebook is “color blind”:
LJ, 19-year-old white female: Umm, I like that there isn’t one actually; I think that it’s good that [Facebook’s] color blind. Ashley, 20-year-old white female: I just don’t think it [race] matters that much. It doesn’t define you. Ryan B., 21-year-old white male: I’m just, I’m indifferent about it, I guess. I mean, it’s something that I don’t think, you know, represents the individual.
In line with other racial discourses present in the United States, participants read Facebook’s lack of explicit racial identification spaces as an indication that Facebook positions race as no longer a defining characteristic of people. Again, Facebook’s omission promotes a very specific perception of racial affairs. Whether these are perceptions that started offline and were reified online, or they are perceptions that Facebook helped to cultivate, my participants were quick to assume that digital spaces are creators of new, equal environments.
In contrast, many argue that we learn to interact with one another through physical features (e.g., Alcoff, 2006; Nakamura & Chow-White, 2012). As Nakamura (2002) argues, the Internet is a space for cybertyping. Digital spaces harbor hegemonic ideals, and race becomes just as important online as it is offline (Martin, Trego, & Nakayama, 2010; Tynes, Reynolds, & Greenfield, 2004). Thus, it could be argued that Facebook works to support notions that the United States is post-racial.
The point of this analysis is not to critique my informants for their opinions and interpretations of Facebook’s goals. Rather, I intend to show one of many instances in which affordances online can have strong impacts on users. When considering how Facebook’s interface is constantly constructing reality, highlighting specific traits and trends while squelching others, we can conclude that many young users’ views of the United States are cultivated through digital affordances.
Visual Culture
Ultimately, discussions with the focus groups led to considerations of our current, highly visual culture. Visible, corporeal identifiers, namely profile pictures, are seen as deeming explicit racial/ethnic affiliation unnecessary:
Stephanie, 27-year-old white female: I think it’s the fact that you can post a picture of your race but you can’t post a picture of your gender. JM, 20-year-old white female: What I’m saying is you can see what they look like. And, if you want to know what their race is, you can ask them, sort of. But looking at them you really can’t, like, sexuality doesn’t have a color. I think people identify with that more, not to say that I agree with that; but I feel like that’s how Facebook is saying it.
Throughout my focus groups, both when speaking directly about gender and race, and otherwise, being visible was integral. Users want to see those with whom they interact. There is a certain cultural anxiety that exists for many when they cannot decipher another’s gender or race. This is perhaps rooted in the fact that first impressions help us to define people, and that stereotypes aid in the process. As such, online functionalities exploit this “stereotypical shorthand” (Kolko, 1999, p. 181). Stereotypes are easy because they quickly describe users and are easy to fit into databases. Designers’ first choices are often the tools and functionalities that are derived from a very small selection of stereotypes (Kolko, 1999).
In particular, Facebook promotes a very visible culture (Cirucci, 2015) that begins with the profile photograph. My informants shared many stories in which they were “creeped out” or annoyed when another Facebooker did not have a recognizable image of their face included in their profile photograph. Users rely on these photographs as a means to quickly summarize others’ lives in a matter of minutes (Farquhar, 2013). My participants labeled the profile picture as “prime real estate” and placed much value in its ability not only to add aesthetic value to a profile but also to provide identification validation. Many admitted to not interacting with faceless users even when they know the person offline and know for certain that the profile is controlled by their offline friend.
The adherence to, and expectation for, visible selves allows users to rely on visible tells for race/ethnicity. As Stephanie suggests, there must be a gender selection because gender can be “hidden” in a photograph, but you cannot “hide your race.” In the case of my participants, the assumption that visible qualities make it easy to guess someone’s race/ethnicity is driven by the site’s omission of race and ethnicity categories, as well as by a stark adherence to selves that are extremely visible.
In addition, it is intriguing to question why Facebook would leave out race/ethnicity, seeing that it is a valuable marketing tool. Beyond the notion that Facebook employs powerful algorithms that likely abstract what a user’s race/ethnicity is through stereotypical likes and browsing history, Stephanie may be correct. It is not impossible, or even improbable, that Facebook “guesses” race/ethnicity based on profile, uploaded, and tagged photographs among other data collected.
Gaver (1991) describes hidden affordances as functionalities that are afforded but with no information available about them. This is essentially what Facebook does—it provides spaces that do not ask for race or ethnicity socially, while building databases that still collect this information at the institutional level. In a sense, users aid in the negotiation of the affordances without even realizing it. It is perhaps important to note here that what a “profile” or “user” looks like to a user is drastically different from what a “profile” or “user” looks like in Facebook’s databases. Thus, it would be naïve to assume that just because Facebook makes a tool visible, it matters greatly to the database, just as an affordance that is hidden can still matter greatly. In sum, when an affordance is hidden by Facebook, the full process and purpose are hidden, and the company does not want users to have much active say in the meaning negotiation.
Indeed, Facebook has implemented two processes, in particular, that are fairly hidden from the general user and that play a large role in how the company categorizes race and ethnicity—DeepFace and Multicultural Affinity Targeting. DeepFace is facial-verification software, built on 3D modeling, that can verify faces with 97.35% accuracy, less than 1% away from human-level performance (Taigman, Yang, Ranzato, & Wolf, 2014). Thus, it is quite easy for Facebook to document skin color and other facial features that may align with racial and ethnic differences. There is no mention of DeepFace in Facebook’s Help section when searching for and reading about what the site does to and with user images. In fact, a search for “deepface” in the Help Center returns no matches.
Multicultural Affinity Targeting helps advertisers, and Facebook, target people with “multicultural interests.” Facebook defines multicultural affinity as “the quality of people who are interested in and likely to respond well to multicultural content” (Fussell, 2016). As in Nakamura’s (2002) early study of LambdaMOO, in which she shows that White is the “default” race and all others must be defined otherwise, White Facebookers do not have ethnic affinities—they are reserved for African American, Asian, and Latinx users (Thomas, 2016). Thus, paired with DeepFace data, how users perform through the site allows Facebook to ascribe an “affinity.” Advertisers can then choose to exclude certain affinities from their marketing sample. This means that I could post an advertisement that excludes all users who have been deemed African American, Asian, or Latinx, with the hope that my ad would be visible only to White users. Facebook is careful to label these “affinities” and not ethnicities to avoid lawsuits or bad press (Thomas, 2016).
Above, I quote JM who suddenly realized, toward the end of the focus group, that Facebook perhaps is sending an implicit message through the omission of explicit race/ethnicity spaces. This is representative of many of my participants’ comments toward the end of their focus group. After much thought and conversation, some began to think that perhaps Facebook compels them to think about gender and race in specific ways. To be clear, just as Facebook cannot solve social divisions, it is not my goal to argue that the site is solely creating new social divisions. Clearly, the issues discussed herein are not new. However, in a space that has been so seamlessly folded into many lives, it is important to investigate what stereotypical norms are promoted.
Facebook’s active decision to leave race/ethnicity off the user interface, while constantly updating and posting about gender affiliation, led my informants to view gender as a more important, but less messy, identification piece. Viewing US society as post-racial and guessing race/ethnicity through skin color and other physical “tells” are normalized through Facebook’s functionalities—our identities are “necessarily shaped by platform design choices” (Lingel & Golub, 2015, p. 547).
None of the Above
Finally, participants consistently shared experiences in which they spoke to feeling like they had no choice—either be on Facebook and follow its rules or not use the space at all. It is true that space for transgressive performances is lacking. But my participants were also highly unlikely to mention attempting to subvert the mainstream metanarratives offered by Facebook. Because of this, I asked them to consider why there is not more space allotted for agency. In particular, what about more “none of the above” spaces?
One common response across groups was to choose the privacy setting “only me.” Users still fill out the form as Facebook has designed it, but “no one else” gets to see the choice. This is an interesting and relevant example of what Raynes-Goldie (2010) terms social privacy versus institutional privacy. At the social level—friends, networks, and the public—any information marked as “only me” is hidden. However, at the institutional level—Facebook, third-party marketers, and database companies—these choices are still saved, used, filtered, and converted into profit. So, yes, Facebookers can hide their choices from their social networks, but these data are still blended into their online experiences, affecting ads displayed, news stories offered, and friends suggested.
Indeed, agency is often a tough issue when considering digital spaces because they are discrete, binary systems. Just as in a video game like Super Mario Brothers, say, where the gamer should have no expectation that Mario will be able to do something he has not been programmed to do (like move on the z-axis or be besties with Bowser), how social network sites are built, designed, and presented necessarily define what users can and cannot do. And, while structures do have the possibility of being transformed when users decide to take thoughtful action, the possibility of this kind of agency is often staged, made less visible, or quietly removed (Gillespie, 2006).
As one example, when Facebook’s About Page only offered binary gendered options, there was a bug that also, accidentally, allowed users to alter a small line of code so that their gender would appear as neither female nor male. With some programming knowledge, this hack was easy to achieve. However, Facebook quickly patched the hole, and it is likely that type of mistake, or bug, in the code will never occur again. Anonymity is also key to agency within digital systems (Magnet, 2007), but, as discussed earlier, Facebook’s drive for legal, official, and visible users is anything but anonymous.
What I really learned from these discussions is that there is a strong need for completion baked into Facebook’s architecture. Some options, like gender, are mandatory. Others, while perhaps not mandatory, are strongly suggested, and Facebook constantly reminds users that they have not completed or recently updated a section—perhaps a user has not updated their profile picture in some time, and Facebook fears it no longer “accurately” represents the user’s corporeal self. Thus, my informants were compelled to believe that Facebook is official. Although not a governmental or medical space, it is the official social space. They expect one another to be honest online, and the way to “enforce” this “authenticity” is to take the site’s prompts as seriously as one would the US Census or a job application.
Options for gender and race were integral to these conversations. Because Facebook is official for my participants, and similar to other forms people fill out, the site “obviously” needs to collect this identifying information. This interpretation—that Facebook is official and calls for legal and corporeal data—is quite lucrative for the site. It ensures that Facebookers are accurately, not necessarily authentically, broadcasting themselves:
JM, 20-year-old white female: I think when I filled out Facebook it was, like, so long that it was just kind of like, kind of like checking off a physical form, like, male, female, what are you interested in . . .
Alessia, 20-year-old white female: When you first start out with Facebook, it’s an application process too . . .
The perception that Facebook is some official, patrolled space is in line with comments from its creator, Mark Zuckerberg:
You have one identity. The days of you having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly. Having two identities for yourself is an example of a lack of integrity. (Kirkpatrick, 2011, p. 199)
Immersed in the “official” Facebook space, users are compelled to believe that accuracy is authenticity. Thus, to perform “authentically,” my participants were likely to input identifying information that is deemed to be important through Facebook’s affordances. As such, to be “authentic” is to accurately include the gender that was assigned at birth and to not explicitly define racial or ethnic affiliations.
Facebook’s promoted and morphed definition of authentic that has come to mean accurate, legal, and official is highlighted by a phenomenon seen through another Facebook platform—Instagram. Instagrammers create second, socially private, accounts known as Finstas (Williams, 2016). In complete accordance with Facebook’s definition of authentic, a Finsta—a fake Instagram account—is actually a more real representation of a user. The account is only shared with close friends and often includes embarrassing, unfiltered, mundane images. Thus, the “fakeness” of this type of account is not that it is inauthentic to the user but that it is not in line with the definition of authentic that the architectures of spaces like Facebook, Instagram, and the like have cultivated (e.g., Papacharissi, 2009).
Conclusion
It is perhaps becoming less and less a secret that Facebook strategically decides how identities will be shaped in an effort to construct more efficient data collection, algorithmic, and marketing models. The process of selecting which identification affiliations to request, and which to simply leave off the user interface, places value on specific identifications. As supported through my findings, Facebookers are led to adopt specific expectations and norms regarding the identification process and important cultural issues. As my focus group participants demonstrated, some believe that gender is a more important issue than race because Facebook explicitly asks users to define it. Others noted that race is a more complex and important fight than gender, and Facebook is right in “staying out.” Thus, just as offline expectations follow us into online spaces, prejudices that we learn online journey with us into offline spaces—they are naturalized and reified through our constant, digital performances guided by the site’s design.
Through the structural discourse analysis and focus groups, two main conclusions emerged. First, the negotiation of affordances, as defined by Gibson and updated for social network sites, is not, and in fact cannot be, equal because the power roles at play are not equal. Facebook, in Gibson’s terms, creates a non-neutral space that makes more available what is beneficial to the company. It controls both the institutional data and the social interface. Facebook’s employees decide how tools, functionalities, and buttons will be designed, how the data will be cataloged and saved, and what will happen to those data over time. They decide which data are important and which are “throwaway,” included at the social level for user appeasement. Thus, while users are certainly allowed to do as much as they can within the site, they play only a small role in what the affordances are. This is made especially clear by the way in which my participants view the site as an “official” space or a social utility. Just as they would not want to lie on a form, they do not want to cheat the Facebook system.
The second conclusion situates my sample within the generally heteronormative Facebooker type—when users affiliate with more privileged and socially accepted identifications (whether by choice or through social pressure), they are not inspired to tinker with the site or resist the norms being cultivated. This would at first seem counter-intuitive—those with more social power should have more power in the negotiation of affordances. However, those with marginalized identities are used to fighting against how they are shoved into categories in spaces exactly like job applications or medical forms.
Marginalized individuals, especially when considering gender and race/ethnicity affiliations, are not concerned with being “accurate” because many official forms do not even provide them with the correct spaces and options to be “accurate.” For these groups, being authentic still means defying what forms present as identifications. Therefore, it is clear that when a part of privileged identity groups, as were most of my participants, users are not likely to even think about how they could, or should, subvert the Facebook architecture from the inside. Facebook’s functionalities and policies reflect particular assumptions of identity that privilege some users over others. But it is those who are marginalized that attempt to find workarounds (Lingel & Golub, 2015).
The structural rules implemented by Facebook, although not always designed as visible affordances at the user level, set parameters for what is possible (Hutchby, 2001b) while also compelling users to act in particular ways and, in turn, to implicitly support and adhere to heteronormative identity expectations. It is not that each user is determined by Facebook’s structure, but that, through a “regulated process of repetition that both conceals itself and enforces its rules precisely through the production of substantializing effects,” users are molded (Butler, 2006, p. 198). Agency, then, can only be located in some break of that repetition. This subversion is difficult, however, because unlike dressing in drag offline, if a cisfemale user decides to upload a profile photograph wherein she is dressed in a stereotypically masculine way, Facebook will have enough other data points to continue to view, and market to, her as “female.” In addition, until mainstream media break down the complex negotiation of digital affordances, most users will remain comfortable in, or at the very least unaware of, Facebook’s promoted culture.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
