Amplifying “Otherness”
This article by Jasmine McNealy, Associate Professor of Telecommunication at the University of Florida College of Journalism and Communications and Associate Director of the Marion B. Brechner First Amendment Project, originally appeared in Human Rights, Racial Equality & New Information Technologies: Mapping the Structural Threats, a publication of The Promise Institute for Human Rights at the UCLA School of Law and the UCLA Center for Critical Internet Inquiry.
In their 2016 essay on inclusiveness and belonging, john a. powell and Stephen Menendian called “othering” “[t]he problem of the twenty-first century.” Defining othering as a set of systems and structures that marginalize people and perpetuate inequality based on categories of identity, including religion, sex, and race, among others, the scholars identified the political and social conditions and power dynamics that promote group-based othering around the world. The human tendency toward categorization and unconscious bias helps to explain the dynamics of othering; segregation, secession, and assimilation have all been failed responses to othering, or the problem of the other. powell and Menendian proposed inclusion and belongingness – an “unwavering commitment to not simply tolerating and respecting difference but to ensuring that all people are welcome and feel that they belong in the society” – as the way forward.
I agree with powell and Menendian’s assessment of othering, and I argue that emerging information technology amplifies otherness through neglect, exclusion, and disinformation, all of which have significant consequences. Neglect, while perhaps the most recognized problem with emerging technology, is persistent. By neglect I mean the creation, use, and deployment of technology without regard for the disparate impacts it may have on communities different from the imagined audience. Ignorance of the effects of technology can be either intentional or unintentional. Unintentionality presumes that a developer did not know, or had not considered, the possible impacts of their technology. Creators embed their creations with their own values, and values reflect culture and politics. If communities fall outside the creator’s purview, the creator may fail to recognize the consequences of the technology for those communities.
More insidious, perhaps, is intentional neglect: when, in the creation, use, or deployment of technology, the impact on a community is both known and ignored. A readily available example of this amplified othering through neglect is the implementation of algorithmic decision-making systems in the criminal justice process in the United States. Though touted as a way to circumvent bias in human decision-making in pretrial and sentencing determinations, these machine learning systems are trained on data reflecting the societal biases and systemic racism of the American criminal justice system. And although the organizations creating these systems are aware of the biases in the training data, and of their consequences, they continue to sell the systems to state and local governments, which then deploy them on their constituents. Whether neglect is intentional or unintentional, then, the discriminatory impact on communities should not be acceptable.
Exclusion – keeping particular groups from participating in various ways – is a significant impact of amplified othering. Algorithmic decision systems, like those mentioned above, are more likely to exclude members of some communities from full participation based on biased historical data. These systems are deployed not only in the United States criminal justice system but also in the financial sector, where they are used to decide whether an institution should extend credit for home or business loans. Such systems have also been shown to have a discriminatory impact when deployed in human resources software that selects candidates to interview for jobs, as well as candidates for graduate and professional schools. Unlike the unconscious bias that powell and Menendian discuss in their essay, the biases in these systems are now known to developers, scholars, journalists, and others. Yet adoption continues. This may reflect the persuasive framing used by the governments and organizations that create and implement these technologies, despite public outcry.
Indeed, persuasion through framing is a part of language. Individuals and organizations persuade us to accept particular meanings and interpretations by making certain aspects of an idea more salient than others. Advances in communications technology have allowed the persuasive messages of disinformation campaigns to swell around the world, amplifying otherness and resulting in violence based on race, gender, sexuality, and other identities. Social media manipulators are able to obscure the source of false information while convincing those with significant audiences to propel their misleading messages. As a result, a larger audience may encounter deceptive communications, which may increase the vulnerability of certain communities. Social media disinformation campaigns have been identified as abetting the genocide of the Rohingya in Myanmar and as influencing elections in Kenya, Brazil, and the United States, among other countries. Emerging disinformation technology is amplifying othering in additional ways. Deepfake technology, for example, allows a user to make it appear as though an individual has said or done something they have not. Because of the severe ramifications of this technology for our political systems and for those targeted, legislators are considering passing laws to address it.
But can technology-specific laws change othering? Certainly, legislation aimed at banning particular uses of technology and the deployment of harmful technology on the public is welcome. The recent successful campaigns to ban the use of facial recognition technology in San Francisco, Somerville, Massachusetts, and Oakland, for example, are important in helping to push back against government surveillance and its disparate impacts. But even more impactful would be the passage or strengthening of laws aimed at remedying othering and its historic and current effects. Voting rights, gender equity, fair pay, and comprehensive privacy and data governance legislation, among other measures, together with the enforcement of those laws, would go a long way toward remedying the underlying social issues amplified by emerging technology. While we may, and should, prohibit the use and deployment of harmful technology, it is important that we use law to manifest the belongingness and inclusion powell and Menendian identify – “that all people are welcome and feel that they belong in the society.”
Posted: July 8, 2020