By Tom Fleischman
Cultural norms and trends used to be set by influential people in a few specific ways – in music, in films and on media outlets such as television and newspapers. Those days are long gone: Now there are myriad platforms through which “influencers” can set or reset cultural mores.

Platformization – as defined by Brooke Erin Duffy, associate professor in the Department of Communication in the College of Agriculture and Life Sciences, and her collaborators – involves the rise of digital platforms in cultural industries, as well as the organization of cultural practices of labor, creativity and citizenship around these platforms.

Duffy, also a member of Cornell’s Feminist, Gender and Sexuality Studies program in the College of Arts and Sciences, has been studying the idea of platformization for several years with collaborators David Nieborg, assistant professor of media studies at the University of Toronto, and Thomas Poell, professor of data, culture and institutions at the University of Amsterdam.

The three have just come out with a new book, “Platforms and Cultural Production,” which explores both the processes and implications of platformization in the cultural industries. The book, set for release in the U.S. this month, analyzes platformization from numerous angles – examining previous literature on business, critical political economy, software studies, media industry studies and sociologies of culture – and builds on case studies from around the globe to reveal crucial differences and surprising parallels in the trajectories of platformization.

The Chronicle spoke with Duffy about the book and her sense of the short- and long-term future of platforms and the cultural changes they spawn.

How do you see the problem of “filtering” content – which can be seen either as protecting users from harmful content or as censorship, depending on your point of view – evolving in the coming years?

More than a decade ago, Tarleton Gillespie (an adjunct associate professor in my department) noted how slickly the term “platforms” is deployed; it functions as a rhetorical strategy that allows platform companies to appeal to diverse interest groups while deflecting responsibility. “Filtering” – and relatedly, “curation” – are other strategic terms: To some, they are mechanisms to protect users from upsetting or offensive content, whereas others see platforms as overstepping their boundaries by engaging in censorship. I see such “filtering” as a fundamental responsibility of platforms, especially given that their technical systems make it possible for users to upload anything. In practice, the very real challenges of scale, algorithmic imprecision and user subjectivity make such filtering or moderation difficult. My hope, at least, is that recent events in the platform ecosystem – particularly the release of the “Facebook Files” – mean that policymakers are taking their role in overseeing platforms’ content moderation practices more seriously.

Can government regulation be implemented in a platform ecosystem?

My graduate training is in the area of media industry studies, so I view many forms of external regulation of media and cultural content as beneficial to consumers. Historically, the U.S. government’s regulation of media has sought to protect consumers from excessive commercialism, content partisanship and ownership concentration. These are, of course, all concerns with the platform ecosystem, but implementation is infinitely more complicated given the sheer scale of content, its decentralized status (presumably, anyone can be a content creator), and its flow across regional and national borders. Such difficulty, of course, does not mean it shouldn’t be pursued; for far too long, platforms have evaded responsibility through assurances of “self-regulation.”

Has the relatively easy spread of proven misinformation on platforms such as Facebook and Twitter hurt the trust the public has both in them and in “legacy” media such as network television and newspapers? Or has it just served to polarize, generally along political lines?

As we saw with the 2016 U.S. presidential election, platforms’ mechanisms of algorithmic filtration amplified political polarization. Such polarization is not without precedent: In the context of legacy media, conservatives and liberals have gravitated toward Fox News and CNN, respectively. But there was a level of agency there, wherein a TV viewer chose which network to view. With algorithmic filtering, we have little insight into the decisions and processes that shape which content we do and don’t see. In more recent years, concerns about vaccine misinformation, social media addiction and youth self-esteem seem to have led to the public’s mounting distrust in platforms. At the same time, we rely on these platforms for important social needs such as community, entertainment and creative expression. So I don’t expect to see a mass exodus from Facebook or Twitter anytime soon.

In the book’s introduction, you cite work from Rietveld (2020) that states “the more dominant a platform company becomes, the less bargaining power cultural producers – game publishers, creators, and news organizations – seem to have.” What effect, if any, do you think this will eventually have on the goliaths, like Facebook (now Meta) and Twitter? Is a revolt along the lines of the mass advertiser boycott “Adpocalypse” inevitable?

My sense from both the book and more recent research I collaborated on over the summer, on social media creators, is that cultural producers consider their dependence on platforms a double-edged sword: Platforms are increasingly central to the processes of creating, distributing and monetizing cultural content, but cultural producers have little control over platform operations. Boycotts and revolts take a cue from earlier forms of labor organization, and thus I see them as vital to challenging the uneven power dynamics. Of course, the fiercely competitive nature of cultural production in the digital age means that individuals and organizations involved in cultural production may be reluctant to participate in boycotts and strikes.

Given the pace at which platforms have evolved and grown in recent years, what do you see in your crystal ball for 2022, both from a platform perspective and (maybe harder to predict) a culture-production perspective?

Given how often platform companies tend to emulate each other (with features like streaming and short-form video), I expect that trend will continue apace. I also imagine that we’ll see increased efforts to court the top creator-personalities; given wider economic precarity in the cultural sector, the creator economy is showing few signs of wear. One of the implications of this for cultural producers is that they’ll face increased demands to work, and engage audiences, across platforms. In recent months, Facebook and Instagram (now Meta) have been placed under high-powered microscopes, thanks in large part to the efforts of whistleblower Frances Haugen. So, ideally, my hope is that such public and policy scrutiny will spark tightened regulations in 2022. That, of course, remains to be seen.

This article was originally published in the Cornell Chronicle.
