A Power Analysis for Platforms: Expression, Equitable Governance and Participation
Legal, Policy and First Amendment
What do the internet and newspapers have in common? When it comes to regulation, just about everything.
The internet of the '90s was very different from the polished, mainstream technology it is today. Content was mostly produced by tech-savvy individuals and delivered to small, like-minded audiences on CompuServe forums and Internet Relay Chat.
The current internet landscape is far different, dominated by society-spanning tech organizations such as Meta, X, Amazon, Google, and TikTok. In many ways, these organizations and their platforms have become the internet, commanding audiences so vast and torrents of content so large that they are practically unavoidable.
What happens when the services people use become more powerful than the people themselves? Increasingly, the answer seems to be a trampling of the public good for organizational gain.
Jasmine McNealy, associate professor of Media Production, Management, and Technology at the University of Florida College of Journalism and Communications, examined the skewed power dynamics between large internet platforms and their users in her legal essay, “A Power Analysis for Platforms: Expression, Equitable Governance, and Participation.”
In her essay, McNealy proposes that internet platforms have become so ubiquitous and indispensable for communication that they can easily place organizational benefits above the needs of society, using recent issues with TikTok to provide a stark example.
In 2020, a platform glitch made it appear that posts tagged #BlackLivesMatter and #GeorgeFloyd had received no views, prompting complaints of censorship and blacklisting. While TikTok claimed this was a systemic issue that affected all posts, it acknowledged that its automatic content moderation systems, which flag and remove content, could make errors. After a massive complaint campaign against the platform, TikTok promised several changes, including better moderation, a diversity council, a content creator portal, and stronger internal diversity, equity, and inclusion efforts.
These efforts seemed to bring about little actual change, however, and Black creators continued to face major issues. For example, hate speech detection tools used on TikTok flagged Marketplace profiles containing terms like “BLM” and “Black success” as inappropriate, making it much harder for these creators to monetize their content.
Notably, users have very little recourse in situations like these. Black creators could leave the platform, but there is not necessarily anywhere else for them to realistically go. With the current regulatory landscape doing little to curb the damage done to users, the result is an endless cycle of complaint, apology, and minimal fixes in which nothing really changes.
Many of the issues with regulation stem from the stark difference between the internet of the '90s and the internet of today. Since 1996, regulation of communication on the internet has centered on Section 230 of the Communications Decency Act, where “…the idea was to offer a modest claim of immunity for internet service providers that were taking action to moderate content on their platforms.” At the time, the courts decided that the internet was similar in scope and impact to print media. Information from the internet was considered neither scarce nor pervasive; if something was seen on the internet, it was because it had been actively searched for. While information from the internet certainly trickled into society at large, the reach was simply not great enough, nor the audiences big enough, for there to be much passive impact. This made users of the internet akin to “pamphleteers” in the eyes of the courts.
McNealy uses a power analysis to argue that using print media as a basis for internet regulation in this way no longer makes sense.
The rise of highly curated and privately controlled platforms like Facebook and TikTok has made the internet much more like broadcast media, where huge audiences are held at the mercy of a few major players. Users have interests that need protecting, yet they face potential censorship, lost economic opportunities, and suppression on internet platforms that control access, information, and user interactions. Despite this, so much power has been recentralized into these platforms that opting out is unfeasible or, at times, even impossible. All of this creates a media environment typified by both pervasiveness and scarcity, just like broadcast.
This matters because broadcast media is more strictly regulated than print, especially in terms of reach and access. The government has shown concern for the public interest in broadcast media, targeting the features that make the technology powerful in order to create something fairer and more balanced. The same should be done with internet platforms to place some power back in the hands of users, creating something more egalitarian for all.
This essay, A Power Analysis for Platforms: Expression, Equitable Governance, and Participation, was commissioned by PolicyLink.
The essay was summarized by Vaughn James, UFCJC Ph.D. 2022.
Posted: November 30, 2023