Open access
Research Article
1 August 2023

Pornhub and Policy: Examining the Erasure of Pornography Workers in Canadian Platform Governance

Publication: Canadian Journal of Communication
Volume 48, Number 2

Abstract

Background: In 2021, the Canadian Parliamentary Standing Committee on Access to Information, Privacy and Ethics (ETHI) conducted an inquiry into Pornhub, following allegations that parent company MindGeek profits from non-consensual content.
Analysis: This article offers a discourse analysis of the ETHI’s process, testimony, and report on Pornhub using Carol Bacchi’s policy analysis method, “What is the problem represented to be?”
Conclusions and implications: This study reveals a policy process blatantly influenced by anti-porn sentiments, resulting in hearings that framed porn as sexual violence rather than sex industry labour. It exposes how ETHI’s approach failed to constructively engage existing regulations, precarious labour conditions, or platform operations. The result is ineffective policy recommendations that procedurally exclude relevant stakeholders and do not adequately protect platform users from harm.

Résumé

Contexte : En 2021, le Comité permanent de l’accès à l’information, de la protection des renseignements personnels et de l’éthique du Parlement canadien (ETHI) a mené une enquête sur Pornhub à la suite d’allégations que sa maison mère MindGeek y incluait des contenus non consensuels.
Analyse : En employant la méthode de Carol Bacchi, « Quel est le problème représenté? », cet article propose une analyse du discours du processus, témoignage et rapport effectués sur Pornhub par ETHI.
Conclusions et implications : Cette étude révèle un processus clairement influencé par un sentiment anti-pornographique, avec des auditions publiques mettant l’accent sur la violence sexuelle en pornographie plutôt que sur les travailleurs et travailleuses de l’industrie du sexe. Elle montre comment l’approche d’ETHI a échoué à s’adresser de manière constructive à la réglementation actuelle, aux conditions de travail précaires, et aux opérations de la plateforme. Il en résulte des recommandations inefficaces qui excluent les parties prenantes concernées et qui ne réussissent pas à offrir une protection adéquate aux utilisateurs de la plateforme.

Introduction

“Why does Canada allow [Pornhub] to profit off videos of exploitation and assault?” This was the hook for a New York Times opinion piece by journalist Nicholas Kristof (2020). The article alleged that the porn streaming site Pornhub had long acted irresponsibly regarding its content moderation practices and that, as a result, countless abusive and non-consensual videos of women and girls appeared on the site while the company failed to remove flagged content promptly (or at all). Controversy had been building around Pornhub for years, but in a sudden move, the House of Commons Standing Committee on Access to Information, Privacy and Ethics (ETHI) launched an investigation, “Ensuring the Protection of Privacy and Reputation on Platforms such as Pornhub,” just days after the publication of Kristof’s article.
Spurred by this sensationalist piece, the ensuing ETHI hearings were blatantly influenced by an anti-porn sentiment that framed the problem as a matter of sexual violence rather than one of platform governance and labour rights. As such, the committee’s recommendations failed to engage with the problem constructively. This analysis suggests that the proceedings functioned as a stage upon which witnesses and members of Parliament performed outrage over sexual violence, equating specific harms perpetrated against specific victims with the pornography industry as a whole. Doing so implied that Pornhub, or, more specifically, its holding company MindGeek, should be responsible, not to its user base of content creators and consumers, but for a massive scope of sexist and abusive behaviour. This silenced relevant concerns raised around platform governance and excluded the single largest group that would be impacted by any changes to the platform: sex workers. The result is an incoherent policymaking process, one that prioritizes the “right kind” of victim and fails to protect all users from exploitation on the platform.

Background

Pornography regulation and labour conditions

Pornography production and consumption is legal in Canada, while creation and dissemination of non-consensual intimate media or child sexual abuse material (CSAM) is not. Canada has some of the most expansive laws in the world concerning CSAM, sexual consent, and image sharing. Part V of the Criminal Code renders all CSAM a federal offence,1 including fictional representation of minors in any medium. Part V of the Code also defines sexual consent in narrow and specific terms, making it illegal to publish or distribute “intimate images” of someone without their express consent (sec. 162.1). In 2011, Canada also introduced Bill C-22, An Act Respecting the Mandatory Reporting of Internet Child Pornography by Persons who Provide an Internet Service, requiring all internet service providers (ISPs) and platforms to disclose instances of CSAM to the Canadian Centre for Child Protection (C3P).
Pornhub, founded in Montréal circa 2007 by two Concordia University graduates, was one of the first popular video aggregator “tube” sites streaming free adult content. The site earns revenue, much like other dominant internet platforms, from advertising and affiliate marketing, and by refining massive amounts of browsing data gleaned from users on its sites (MacDonald, 2019). Pornhub’s parent company, MindGeek, owns dozens of websites and has been overwhelmingly successful in consolidating user attention through its interconnected network of porn web properties (McKee & Lumby, 2022).
Because of its scale, Pornhub is an important source of income for sex workers in Canada and abroad. Pornography is increasingly produced independently by performers (Berg, 2021; Pezzutto, 2019), who earn money from Pornhub in two ways. First, verified performers use the Modelhub program to generate income through ad revenue, custom content, video sales, affiliate programs, download and subscriber fees, contests, and tips. Second, tube sites function as an important source of advertising, directing potential customers to performers’ products and services (Berg, 2021; Pezzutto, 2019). They direct traffic internally to verified models and externally to fansites, live webcams, and phone sex and sexting operators, diversifying sources of income.
Pornhub is no stranger to controversy, thanks to a record of suspicious financial activity, accusations of piracy, and hosting of non-consensual material spanning its 15 years of operation. It is widely understood that Pornhub’s ascent to market dominance hinged on a strategy of providing access to pirated content stolen from paysites and uploaded for free by end users (Zhang, 2019). Though Pornhub has a legal obligation to respond to takedown requests under the U.S. Digital Millennium Copyright Act (DMCA, 1998), few studios or performers have the resources to routinely track their own stolen content and submit requests. Content oversaturation on porn platforms has devalued products in a race-to-the-bottom approach. The result is market conditions that coerce disenfranchised performers into engaging with the same tube sites that are making it difficult for them to earn a living by producing content. In upending prior production and revenue models by capturing and centralizing users’ attention, Pornhub has fundamentally altered the industry, affirming platforms’ trend toward precarious labour standards in the process (Ronson, 2017; McKee & Lumby, 2022).

Platform governance

Problems highlighted by Kristof’s 2020 New York Times opinion piece point to mounting and widespread issues in platform governance at large. MindGeek’s content moderation problems are not unique, and the same concerns around Pornhub are raised against countless platforms, pornographic or otherwise (McKee & Lumby, 2022). There are, broadly, three principal concerns with moderation: (1) liability, that is, whether, and to what degree, platforms are liable for content that users post; (2) jurisdiction, that is, who is responsible for governing global platforms; and (3) scale, that is, how moderation practices can keep pace with platforms’ exponential growth.
Pornhub’s ambiguous jurisdiction has allowed the site to exist in a legal grey zone, enabling a sort of regulatory arbitrage to the lowest possible standard. The United States’ influence on global platform governance supports Pornhub’s preferred minimal-intervention approach. In place since 1996, Section 230 of the U.S. Communications Decency Act (CDA) asserts that platforms are intermediaries, not publishers, of content published to their sites, and therefore not responsible for content that users circulate. The U.S. law impacts internet policy globally, with many platforms incorporated in the U.S. or otherwise abiding by U.S. legal standards. Section 19.17 of the Canada-United States-Mexico Agreement (CUSMA) codified the non-interventionist framework in North America (The agreement, 2020). Since Parliament ratified CUSMA in 2019, Canadian law must ensure similar immunity against liability for Canadian internet services (Goldman, 2021). That said, the borderless nature of the internet makes national policies difficult to enact and enforce, leaving governance largely in the hands of commercial actors. MindGeek has made it difficult to say precisely where it is based, with corporate offices in Montréal, other offices in Athens, Bucharest, and São Paulo, and a privacy policy that encourages people to contact its Cyprus location, which handles content standards and compliance (MindGeek, n.d.).
The massive volume of content shared on popular platforms has led to broad uptake of automated moderation mechanisms. These systems often rely on nascent machine learning techniques that lack accuracy, and so are undiscerning in content removal (McKee & Lumby, 2022). Internal verification systems grant certain users and types of content greater visibility than others, creating a tiered approach to governance that undermines the visibility and agency of other users (Caplan & Gillespie, 2020). Manual moderation work is frequently outsourced to underpaid labourers, often untrained in the complex sociocultural contexts surrounding the content they are assessing (McKee & Lumby, 2022).

Anti-porn influence

Content moderation poses complex problems for regulators, and these concerns have been strategically exploited by anti-porn antagonists who push for policy that harms sex workers, for example, by petitioning to remove online harm reduction tools to screen clients or reducing sex worker visibility by censoring acts and performers on social media entirely (Blunt & Stardust, 2021). The most devastating example of this tactic is 2018’s Allow States and Victims to Fight Online Sex Trafficking Act/Stop Enabling Sex Traffickers Act (FOSTA/SESTA), a U.S. bill ostensibly designed to prevent sex trafficking. The bill established a rare exemption to the aforementioned Section 230, making platforms responsible for user-generated content that “promotes or facilitates prostitution” (FOSTA/SESTA, 2018, sec. 2.1). Since definitions of such content can be interpreted broadly, many platforms globally adopted policies to limit and remove all manner of sexual materials or expression (Liu, 2020). Policies banning nudity and policing sexual expression asymmetrically increased surveillance and censorship of women and queer and trans people, broadly, and of sex workers, specifically (Duguay, 2016; Liu, 2020). These conditions make life perpetually difficult for sex workers, who struggle to exist equitably online (Blunt & Stardust, 2021). The 2018 FOSTA/SESTA bill was supported and applauded by anti-sex worker organizations including the National Center on Sexual Exploitation (NCOSE, formerly Morality in Media), a lobby group with the stated goal of eradicating sex industries populated by legal and consenting workers, particularly the pornography industry (NCOSE, 2021).
Pornhub is an attractive target for anti-porn campaigns since, through market dominance and brand recognition, the site is practically synecdochal for free pornography, despite offering dismal representation of the heterogeneous media industry. Along with fellow American Evangelical anti-porn group Exodus Cry2, NCOSE campaigned aggressively against Pornhub and MindGeek prior to the incendiary New York Times op-ed (Turner, 2021). In January 2020, NCOSE started a “Dismantle Pornhub” campaign urging payment processors and advertisers to divest from the company entirely (Mohan, 2020). Its “Law Center” solicits participants for a class-action lawsuit against MindGeek’s “exploitation empire” (NCOSE, n.d., para. 9). In February 2020, NCOSE was joined by Exodus Cry’s “Traffickinghub” campaign, which launched when its director Laila Mickelwait published an op-ed (Mickelwait, 2020) along with a petition to shutter Pornhub that quickly gained 2.5 million signatures. Mickelwait met with compliance officers at Visa and Mastercard, who subsequently retracted their payment processing services without warning, all within a week of Kristof’s New York Times piece (Celarier, 2021). This corporate skirmish dramatically impacted sex workers who depended on Pornhub for income (Cole, 2020). Pornhub responded by removing nearly half of the site’s content in an unprecedented purge of all material not uploaded by verified users, amounting to some 10 million videos.
The ETHI committee launched its investigation into Pornhub just after these events. The study was initiated by Member of Parliament Arnold Viersen (CPC, Peace River—Westlock, AB) who has a record of antagonism toward pornography and sex workers, as detailed further on. Selecting witnesses is highly political in committees, and the initial list of witnesses invited to testify excluded sex workers and sex work advocacy groups. Committee chair Chris Warkentin (CPC, Grande Prairie—Mackenzie, AB) said the committee “prioritized those who they have wanted to hear from initially” and encouraged sex worker groups to submit written briefs in lieu of testimony (Serebrin, 2021, para. 14). The committee eventually allowed three people representing sex worker interests to testify, out of 37 witnesses heard over the course of nine meetings (two in camera) held between February 1 and June 14, 2021. Other witnesses included four survivors of non-consensual porn or CSAM published to Pornhub along with one member of their legal counsel, 13 connected to law enforcement or public safety agencies, three witnesses representing anti-porn and anti-sex work campaigns, three representing child advocacy groups, and three MindGeek executives, with a few remaining witnesses from other categories (House of Commons, 2021a).
The committee heard from American religious group Exodus Cry, though Canadian sex workers familiar with and dependent on the platform in question were not invited. As this article demonstrates, hearings framed pornography moderation as an issue of sexual violence rather than recognizing moderation as a critical element of media production and digital labour. This framing influenced the selection of witnesses, impacted how the problem was defined, and resulted in proposed “solutions” that were couched in porn-prohibitionist terms, rather than centring users and workers’ rights on the platform.

Method

The analysis of the ETHI process sought to understand how the issue of moderating abusive content on Pornhub was framed, and to assess the outcomes of this framing. Carol Bacchi’s (2009) policy analysis method, “What is the Problem Represented to be?” (WPR), was applied. The premise of WPR is that “what one proposes to do about something reveals what one thinks is problematic” about the issue (Bletsas & Beasley, 2012, p. 21). Rather than treat policymaking as objective, Bacchi’s WPR method recognizes that many assumptions are embedded in the process that can lead to harmful and unjust effects in governance. The method is a prescriptive critical discourse analysis tool, asking six specific questions that reveal the boundaries of the issue:
1. What’s the “problem” … represented to be in a specific policy or policies?
2. What presuppositions and assumptions underlie this representation of the “problem?”
3. How has this representation of the “problem” come about?
4. What is left unproblematic in this problem representation? Where are the silences? Can the “problem” be thought about differently?
5. What effects are produced by this representation of the “problem?”
6. How/where is this representation of the “problem” produced, disseminated, and defended? How could it be questioned, disrupted, and replaced?
The researchers collected data from seven publicly available recorded parliamentary sessions totalling approximately 25 hours of video and representing testimony from 37 unique witnesses.3 The 68-page final committee report and the 50 briefs submitted by stakeholders (House of Commons, 2021a) were also reviewed. To contextualize the findings, the researchers included journalistic coverage of the CSAM/Pornhub issue and of the committee’s process, as well as social media responses to the issue by stakeholders.
Two authors [MM and VW] divided the data set and coded separately, meeting regularly to discuss each other’s work until consensus was reached. Coding was influenced by Ernesto Laclau and Chantal Mouffe’s notion of “nodal points” (2001). A nodal point is a “master-signifier” that assumes a “‘universal’ structuring function within a certain discursive field” (2001, p. xi). Nodal points demonstrate how the definition of foundational concepts (such as “sex,” “harm,” “safety”) influences the articulation of related ideas, precluding the possibility of defining the problem otherwise (in this case, if “porn” is understood to be fundamentally violent, any porn platform is rendered a perpetrator). Therefore, the authors coded not with the intention of enumerating themes, but with a goal of identifying when and how witnesses and MPs deployed these master signifiers. Critically, this analysis also emphasizes what goes missing in this arrangement, since discursive absence describes “the problem” as much as discursive presence (Joye, 2010). The coded data points were then organized according to which questions they addressed within the WPR framework.

Findings

1. How is the problem presented?

As noted, the initial choice of witnesses centred anti-porn organizations, framing the problem in terms of regulating sexual violence rather than addressing platform moderation or labour rights. Throughout the hearings, there was scant appreciation that pornography sites are places of employment or that regulation of them has implications for many thousands of workers.
Days before the ETHI investigation opened, Viersen tweeted, “Pornhub & its online scourge of exploitation will be brought to light. We will ensure survivors are heard & MindGeek receives the full scrutiny it deserves” (2021). Viersen suggests a targeted process of attrition against the company rather than simply ensuring protections for users and the public. Protected parties in this arrangement are survivors of abusive content, but not workers with an interest in improving privacy on the platform.
This absence is unsurprising, given that Viersen has routinely incited anti-porn government activity without consulting porn workers. In 2017, he called for M-47, a Standing Committee on Health study of the public health effects of viewing “online violent and degrading sexually explicit material” based on the assumption that pornography constitutes sexual violence (House of Commons, 2017, para. 1). Viersen later sponsored Bill C-302, the Stopping Internet Sexual Exploitation Act, which did not propose adding protections for victims of exploitation but would have burdened independent porn producers with unnecessary and privacy-violating record-keeping requirements (Webber, MacDonald, & Sullivan, 2021). Viersen also petitioned for Bill S-203, the Protecting Young Persons from Exposure to Pornography Act, which calls for age verification on pornography websites, ignoring numerous privacy and security concerns such technologies raise (Barnett, 2021). Across his campaigns, Viersen has never once sought insight from people who work in pornography production.

2. What assumptions underlie this presentation?

Two interrelated assumptions underlay the presentation of porn as violence throughout the hearings: first, that MindGeek/Pornhub is the source—rather than bearer—of the sexual violence under discussion, and second, that this violence is only valid when enacted against certain subjects.
In the first ETHI meeting, U.S. citizen Serena Fleites—featured in Kristof’s New York Times article—described the harassment she endured when peers saw a video she had sent to a boyfriend at age 13, which was subsequently posted to Pornhub without her consent. Boys at school followed her, asking for sexual acts, while strangers on social media threatened and abused her. This complex web of abuse in person and on social media was attributed to MindGeek alone, as Member of Parliament Charlie Angus (NDP, Timmins—James Bay, ON) declared, “You said how they hassled you, when you—as a child—were trying to get control of your life again, that they as a massive corporation [emphasis added] hassled you […] What should we make them do so they don’t hassle any other young women, anywhere in the world?” (ETHI, 2021, February 1, 13:29:00).
At times, members of Parliament noted that proceedings inappropriately focused on a single company but continued to favour the disproportionate allocation of responsibility. Member of Parliament Francesco Sorbara (LPC, Vaughan—Woodbridge, ON) chastised MindGeek executives: “We know child exploitation is an issue all over the world, not just with MindGeek or Pornhub […] but you folks have a very special responsibility” (ETHI, 2021, February 5, 14:17:30). Observations that circulating non-consensual material predates the internet or that the problem is not reducible to Pornhub were met with hostility. Member of Parliament Jacques Gourde (CPC, Lévis—Lotbinière, QC) admonished Minister of Canadian Heritage Steven Guilbeault (LPC, Laurier—Sainte-Marie, QC) for not moving forward decisively on internet legislation to “protect children.” Guilbeault retorted, “I want to point out that child pornography existed before 2015. Your party was in power and did nothing about it,” to which Gourde replied, “Please stop giving campaign speeches and tell us how you are going to help children!” (ETHI, 2021, June 7, 11:40:00).
Little to no mention was made of those who committed the inciting assaults. All were identified as men, and many as intimate partners, who made and published non-consensual recordings—a federal offence under Part V, Section 162.1 of Canada’s Criminal Code. Similar oversight met those who harassed survivors on and offline, a crime outlined in Part VIII, Section 264. In their stead, outrage was funnelled toward Pornhub and MindGeek, treated as scapegoats and instigators of harm rather than technological intermediaries operating within a culture of pervasive misogyny and abuse.
Committee work actively erased the voices of sex workers, especially when their presence complicated the sex-as-violence framing. For example, the brief submitted by Kate Sinclaire, a survivor of non-consensual image sharing who later consensually entered the industry to operate a porn studio and site, complicated the vilification of porn platforms. Sinclaire’s brief was excluded from the final report, which stated the committee only heard from five survivors. “Shame on all of you,” Sinclaire tweeted at the members of Parliament after the report was released. “I am a survivor, even if my story doesn’t fit whatever narrative you’re looking for” (Sinclaire, 2021).
This dichotomy between “good” and “bad” victims is reflective of the committee’s carceral feminist understanding of sexual and gender-based violence. This approach is cisheteronormative and exclusionarily gendered, and entails several interrelated presumptions: that pornography is fundamentally a form of sexual violence rather than a media form embedded within a broader culture of sexual violence, that women alone are victims of pornographic and sexual violence, and that women like Sinclaire who do not characterize their relationship to pornography as violence are either misguided or victims of false consciousness, and thus unreliable witnesses (Fawkes, 2005).
Lise Gotell (1997) describes this “reconfiguration of political responses to pornography” that occurred in the 1980s, whereby
pornography as an issue has been subsumed within the social problem of violence against women, such that the two are most often understood as inseparable in policy discourse … By subsuming pornography into violence and by inserting it into a broader law-and-order agenda, government actors have contributed to the construction of a sexual panic.
(pp. 69–70)
This carceral approach relies on punitive and paternalist state apparatuses to address harm, a reliance that inevitably reserves protection for those who are already most served by the state—white, cisgender, heterosexual, monogamous, non-sex-working women—rather than those who are marginalized and/or criminalized by it (Phipps, 2021). This orientation “withhold[s] womanhood and personhood from marginalised Others” (Phipps, 2021, p. 88), placing sex workers outside of, and in opposition to, women’s safety. Indeed, the committee failed to consider the consequences of non-consensual image sharing for workers on the platform. Discourse established some victims as more deserving of protection than others, a backdrop where the state’s concern is safety for girls and women, unless they engage in sex work.

3. How has this representation come about?

The representation of porn as violence has been accomplished in part through the strategic deployment of affective displays. “Sex panics,” writes Janice Irvine (2006, p. 82), “are fueled by emotional scripts—rhetoric strategically crafted to produce volatile emotional responses.” Emotion plays an important role in political posturing, partially because emotions are perceived as a neutral barometer of morality and “a site of truth and ethics” (p. 92). But emotions, while not entirely fabricated, are managed and can be performed toward political ends.
Throughout hearings, the affective appeals made by witnesses and emotional displays offered in response by committee members enabled conflation and oversimplification of the issues at stake, while drowning out more nuanced or dissenting voices. The authors are not problematizing the strong feelings present—these crimes are horrific, and outrage justified—but rather, that moral outrage was emphasized to the detriment of engaging meaningfully toward crafting solutions. For example, in addressing MindGeek executives, Member of Parliament Marie-Hélène Gaudreau (BQ, Laurentides—Labelle, QC) declared dramatically, “I will not be able to sleep well unless I ask this question. You said you are a father. I am a mother. We can talk about business, profit, consent … but I would like to hear what your conscience tells you, as a parent” (ETHI, 2021, overdubbed by translator, February 2, 13:29:45). MindGeek Chief Operating Officer David Tassillo began to respond, describing MindGeek’s efforts toward developing preventive technology and partnerships, when Gaudreau interrupted, “But you’re talking to me like a businessman. What if it was your child?” She later reiterated, “I don’t want to hear about you as managers, I want to hear about your conscience. How do you sleep at night!?” (13:49:40). Focus rested squarely on a rhetorical framing of moral character “as a parent,” demanding sympathy more than inviting informative responses. By repeatedly emphasizing their disgust at the problem, ETHI committee members prioritized moral stakes over gathering data on platform-specific policies and technical measures.
Affect was weaponized through emotionally charged language. Laila Mickelwait, director of Exodus Cry’s “Traffickinghub” campaign, made several misinformed assertions about moderation capacities in her testimony (e.g., that Pornhub only employs 10 moderators). She spent a substantial portion of her allotted time listing off porn titles with no connection to the case, seemingly to invoke disgust at industry standards. Rather than expertise, Mickelwait wielded moral authority to buttress claims about the porn industry that experts in legal jurisdictions, sex work, technology, or platforms could have easily refuted, were they present. Instead, Mickelwait was thanked and solicited for legal advice, despite no relevant expertise in that area.
In a subsequent hearing, director of the Canadian Centre for Child Protection Lianna McDonald described the internet as protecting “the privacy rights of adults at the expense of the safety and wellbeing of children” (ETHI, 2021, February 22, 12:08:39). Sex workers’ privacy is excluded from this statement, given the extensive identity and age verification processes currently required to sell pornography on Pornhub.4 The U.S. EARN IT Act (2022) and the U.K. Online Safety Bill (Bill 121, 2022) are only two of many proposed laws ostensibly designed to “protect children” that infringe on freedom of speech and create privacy risks for many (Harmon, 2020; Mullin, 2022).
Anti-porn feminists have a long history of mobilizing the affective, anecdotal style of testimony observed at the hearings. Stemming from consciousness-raising practices in the 1970s, first-person testimony was the key strategy used by Andrea Dworkin and Catharine MacKinnon in efforts to pass anti-pornography legislation in the 1980s. Supposedly promoting “public discussion over pornography’s role in social life” (MacKinnon, 1997, p. 3), anti-porn feminists selectively amplify only voices that affirm victimization by pornography (Sullivan & McKee, 2015), while ignoring other voices with a stake in the issue.

4. What is left out of this presentation?

Pornhub and other porn platforms are sites of labour, meaning any legislation altering their operations is, in effect, labour policy. Yet, sex workers who rely on the platform to earn a living were offered no space in the hearings to express needs or grievances. When sex worker organizations applied as witnesses, the committee responded that “sex workers are not relevant to this conversation” (Serebrin, 2021). The absence of sex worker voices was not mentioned until more than halfway through the proceedings, when Sorbara raised concern over the unprecedented number of submitted briefs highlighting potential harms to sex workers, stating, “it seems to me that we need to make sure we don’t drive work underground and that sex workers’ voices need to be listened to” (11:30:15). Only then were sex worker interest groups offered a last-minute opportunity to speak. Jennifer Clamen, director of the Canadian Alliance for Sex Work Law Reform (an alliance of 25 sex worker groups across the country), was unambiguous in her opening statement that “[It is the] duty of parliamentarians to take direction and leadership from sex workers who are really best placed [emphasis added] to speak to any policy or practice that may regulate online sex work or online porn” (ETHI, 2021, April 19, 11:07:56). Instead, she continued, “It is made clear that sex workers are not welcome at this table and are not considered valued participants. We were told outright that this committee didn’t concern us” (11:09:33). In response, Viersen reiterated his position that the hearings concern the “victims of Pornhub,” implying this does not include sex workers. Clamen replied:
Discrediting is a very common tactic … I do thank you, Mr. Viersen, for providing a really good example of the way conflation happens … mixing that story with stories of people actually working in the industry … weaving in and out of these tales and testimonies in a way that suggests that everything is exploitation. That’s the exact reason why this committee unfortunately is in large part failing … to actually ask the right questions.
(11:44:00)
As Clamen articulated, the hearings enthusiastically deployed “exploitation” to frame all sex work as sexual violence, but never addressed the economic exploitation of workers on platforms like Pornhub. On a few occasions, Angus reminded the committee that its mandate was not to question the right of adults to perform in or consume legal pornographic content. His concession was that “citizens have the right to watch weird things. People have the right to promote and show their consensual bedroom antics, if that’s what they like to do. Whether people like it or not, that is their right” (ETHI, 2021, February 22, 12:17:45). Gaudreau referred to sex worker concerns as “collateral damage” to be avoided, but not central to the policymaking process (April 19, 11:52:11).
Clamen and fellow sex work advocates were the only witnesses to have testimony cut short; their session was ended abruptly by Member of Parliament Brenda Shanahan (LPC, Châteauguay—Lacolle, QC), who motioned that the hearing be suspended to schedule an additional meeting in order to address this “very complex subject,” and to “hear from more balanced witnesses” rather than continue hearing from the ones present (12:13:50).
The absence of pornography workers was apparent. Witnesses and committee members alike demonstrated a lack of knowledge about pertinent subjects, including porn marketing strategies and moderation capacities. The only witnesses with any direct connection to Pornhub’s business activities were three MindGeek executives, but when they appeared at the second ETHI meeting, questioning was conducted with such singular hostility that minimal relevant industry information was communicated. In contrast, anti-porn witnesses were granted authority to assert blatant inaccuracies, for example, that video tags and titles accurately describe content. In fact, production conditions mean that a product’s content and its marketing or labelling are often entirely separate processes (McKee, 2015). Prominence of the term “teen” was cited repeatedly as evidence that Pornhub allows CSAM to proliferate across its sites, despite MindGeek executives’ attempts to explain the industry standard of using “teen” to market performers aged 18–25 years. Like most media sectors, porn does not claim truth in advertising. Had the committee been attentive to porn platform labour, content experts could have assisted in navigating common industry practices. Instead, ETHI directed questions about content to anti-porn advocates, child protection agencies, and law enforcement, who inaccurately described many website affordances.
Theoretically an interrogation of the platform, the ETHI process hardly engaged with technical, market, or even sociological insights regarding the platform’s operations. Working with victims of non-consensual porn to remove their content, privacy analyst Charles DeBarber was arguably the witness with the most apt technological understanding around non-consensual imagery circulation, prevention, and removal. He was the only witness other than sex worker advocates who identified porn platforms as a site of labour and addressed potential implications that poorly crafted policy could impose on sex workers. DeBarber stated that while he sought processes to hold Pornhub accountable, he intended “to make sure that it’s also not so cumbersome that sex workers [who] are free agents can’t operate without reasonable privacy” (ETHI, 2021, June 7, 12:13:00).
Child sexual abuse material and non-consensual material represent a proportionally small share of content on sites like Pornhub (NCMEC, 2022).5 Child sexual abuse material is the exception on Pornhub, whereas financially exploited, precarious workers are the ubiquitous rule. Both parties deserve the attention of policymakers, but only one group is granted it. Here, the authors observed a process that prized displays of emotion from one group while displacing the expertise and evidence of the other. Had the hearings not erased the latter, and had they framed the problem of content moderation the other way around, the recommendations could have been informed by workers with relevant industry expertise.

5 & 6. Where is this representation produced, and what are the effects?

Representation of this problem was produced through a parliamentary standing committee, whose speculative function, paired with a largely arbitrary sex-as-violence framing, resulted in feeble policy impact and left established frames unchallenged (Freedman, 2010). The role of parliamentary committees is to “examine, in small groups, selected matters in greater depth” (House of Commons, n.d.) and to report conclusions that the House of Commons may act upon. Meant to reduce partisanship, increase the agency of parliamentarians, and improve the machinery of government itself, “committees have a lot of latitude in how they organise their work” and determine schedules based on “their members’ interests” (House of Commons, n.d.). A former member of the Library of Parliament notes that “the influence of committee studies is much more modest than might be suggested by the dramatically increased volume of reports, recommendations and responses being generated since the mid-1980s” (Stilborn, 2014, p. 355). There is significant debate about committees’ efficacy and deliberative power, with their function often appearing more performative than productive.
Indeed, the bulk of the proceedings was spent impressing upon participants how harmful non-consensual porn is and bemoaning the lack of clear regulatory responses. Rather than this emphasis on grievance, the proceedings could instead have assumed the clear position that CSAM on Pornhub is harmful and unacceptable, then applied committee resources to determining how to manage this problem well. The issue of hosting CSAM could be more practically addressed by revisiting intellectual property (IP) protections or the recordkeeping requirements for platforms, bolstering antipiracy efforts, or even structurally interrogating Pornhub through the lens of ensuring fair market competition.
The Pornhub case could be addressed more productively by a governing entity such as the Canadian Radio-television and Telecommunications Commission (CRTC), which maintains jurisdiction over audio and visual streaming services. Bypassing the CRTC in this case reflects a pattern of government inattention to legitimizing the CRTC’s contested authority over internet regulation of media platforms (Luka & Middleton, 2017; Leavitt, 2021). Another possibility would be to integrate the case into recent processes, including Canadian Heritage’s 2021 Online Harms consultation or Bill C-11, Canada’s Online Streaming Act (Bill C-36, 2021; Bill C-11, 2022). With a shared focus on the harmful circulation of non-consensual images, the latter’s proposed expansion of the Mandatory Reporting Act would require service providers and platforms to assume greater responsibility for content. These present more appropriate institutional routes to address moderation issues than the ineffectual volunteer-based ETHI committee.
The ETHI report (House of Commons, 2021b) offered underwhelming policycraft. Of the 14 recommendations in the final report, some are pragmatic: four (Recommendations 4, 5, 6, 7) call for strengthening compliance with existing mandatory reporting laws, two (Recommendations 2, 9) address enforcement of existing age verification standards, one (Recommendation 10) calls for proactive enforcement of existing Canadian law, and two (Recommendations 11, 12) deal with supports for CSAM survivors seeking responsive moderation mechanisms and call for further study of content depicting sexual violence.
Other recommendations were troubling. Recommendation 8, requiring that uploaders provide “proof of valid consent,” is difficult to verify or even define, given that “consent” is a complex legal framework barely addressed by these proceedings. It is alarming to suggest that such a difficult, revocable legal standard be codified for platforms to assess and manage. Three recommendations (1, 13, 14) call for invoking platform and ISP liability precedents that diverge significantly from the legal norms set out in CUSMA Section 19.17. These recommendations are conspicuous, given that managing internet operations at large is well beyond the scope of this committee study.
The recommendations in ETHI’s final report suggest a performative intervention, are vague on particulars of the issue, and lack substantive contributions regarding structural points of intervention. Tellingly, the remaining ETHI recommendation (3) suggests future studies ought to “consult with survivors, child advocacy centres, victim support agencies, law enforcement, web platforms and sex workers prior to enacting any legislation or regulations relating to the protection of privacy and reputation on online platforms” (House of Commons, 2021b).

Conclusion

The ETHI hearings were purportedly concerned with content moderation failures by Pornhub that led to the circulation of CSAM and non-consensual images. This WPR analysis demonstrates that the proceedings did not successfully engage with the technical and policy mechanisms, capacities, and challenges around CSAM content moderation, or account for the implications that existing affordances have for users on the platform. Instead, the hearings—influenced as they were by anti-porn campaigners and carceral feminist prohibitionist sentiments—were structured on the premise that pornography is sexual violence rather than a form of media or category of labour.
This framing shaped who was and was not invited to testify, and how those perspectives were valued. Supposedly concerned with operations of MindGeek, ETHI mostly received testimony from witnesses who do not work with the company in question. Witnesses openly adversarial toward the porn industry were invited to testify, and a single company was cast as disproportionately responsible for the widespread, insidious problem of CSAM and non-consensual material being circulated online. Affective displays of outrage were granted authoritative weight while pornography’s function as a legal, regulated, and substantial media industry filled with heterogeneous producers was disregarded, an omission achieved by excluding sex workers from the process.
Pornhub’s business model has irreversibly altered working conditions across the porn industry. The company holds oligopolistic sway over countless stigmatized and precarious workers. Governing institutions have historically failed to protect, or even consider, labour conditions in the porn industry, and the ETHI hearings were no exception. Workers who produce sexual content online are intimately familiar with the industry landscape, have unique insights into platform operations, and stand to benefit most from a safe, functioning, and stable platform ecosystem. Engaging with sex worker expertise could ensure protections that porn workers have long advocated for, including higher standards in safety, recordkeeping, representation, and IP protections. These appeals align well with ETHI’s stated mission of better moderating CSAM and non-consensual material circulating on porn sites. As Clamen testified:
As the people on the frontlines of this industry, sex workers are best placed to help shape any existing or proposed regulations. Any approach that fails to consider the needs of sex workers will harm sex workers, I promise you. Sex workers are systematically ignored in policy that impacts on our lives.
(ETHI, 2021, April 19, 11:13:58)
Like many industries, porn is transformed under the centralizing influence of platforms, which remain deeply misunderstood by policymakers. Users on these platforms—both audiences and workers—deserve the same regulations and protections supporting any other culture industry. Despite being maligned by anti-porn organizations, Pornhub is not the primary channel for CSAM circulation online but merely one of many platforms failing to moderate objectionable content effectively. Informed oversight is required to ensure more responsible corporate practices. All users and workers relying on platforms have a stake in such proceedings. It is a failure of public regulatory institutions that complex and heterogeneous user concerns are overwritten to suit moralizing policy narratives. So long as sex workers remain excluded from regulatory processes relevant to their livelihoods, governments will fail to benefit from their expertise, producing ineffective policy and perpetuating structural harms against their own constituents in the process.

Footnotes

1.
See Sections 153, 162, and 163.
2.
Exodus Cry and NCOSE’s anti-pornography position is part of a larger agenda to suppress any expressions of gender and sexuality that violate fundamentalist Christian cisheteronormativity. Both groups have a long history of fighting LGBTQ+ and abortion rights (O’Hara, 2021; Provost & Whyte, 2018).
3.
ETHI committee documents state there were 40 witnesses, but Bowe, Lukings, and Wong made repeat appearances.
4.
To earn income on Pornhub, performers must provide government-issued photo ID, submit to a live biometric face scan, provide a current address, and disclose personal banking information for payment. These are rigorous measures compared with other social media or video platform requirements (Pornhub, n.d.).
5.
The National Center for Missing and Exploited Children (NCMEC) operates the CyberTipline; all U.S. electronic communication service providers are legally required to report cases of suspected CSAM, and NCMEC is then responsible for investigating and informing law enforcement when necessary. NCMEC reports that, in 2021, they sent three notifications for suspected CSAM to MindGeek (listed on the NCMEC report as “MG Freesites [Pornhub]”) (NCMEC, 2022, p. 6) and that, on average, it took Pornhub 0.2 days to remove the reported content.

References

The agreement: Canada-United States-Mexico Agreement (CUSMA)/United States-Mexico-Canada Agreement (USMCA). (2020). The Secretariat. URL: https://can-mex-usa-sec.org/secretariat/agreement-accord-acuerdo/index.aspx?lang=eng [September 8, 2022].
Allow States and Victims to Fight Online Sex Trafficking Act/Stop Enabling Sex Trafficking Act (FOSTA/SESTA), H.R.1865, 115th Congress (2018). URL: https://www.congress.gov/bill/115th-congress/house-bill/1865 [September 8, 2022].
Bacchi, Carol. L. (2009). Analysing policy: What’s the problem represented to be? London, UK: Pearson.
Barnett, Daly. (2021, April 23). Canada’s attempt to regulate sexual content online ignores technical and historical realities. Electronic Frontier Foundation. URL: https://www.eff.org/deeplinks/2021/04/canadas-attempt-regulate-sexual-content-online-ignores-technical-and-historical [September 8, 2022].
Berg, Heather. (2021). Porn work: Sex, labor, and late capitalism. Chapel Hill, NC: UNC Press Books.
Bill 121. (2022). Online Safety Bill, UK Parliament. URL: https://bills.parliament.uk/bills/3137 [September 8, 2022].
Bill C-11. (2022). An Act to amend the Broadcasting Act and to make related and consequential amendments to other Acts, 44th Parliament, 1st session. URL: https://www.parl.ca/legisinfo/en/bill/44-1/c-11 [September 27, 2022].
Bill C-36. (2021). An Act to amend the Criminal Code and the Canadian Human Rights Act, 43rd Parliament, 2nd session. URL: https://parl.ca/DocumentViewer/en/43-2/bill/C-36/first-reading [September 27, 2022].
Bletsas, Angelique, & Beasley, Chris. (2012). Engaging with Carol Bacchi: Strategic interventions and exchanges. Adelaide, SA: University of Adelaide Press.
Blunt, Danielle, & Stardust, Zahra. (2021). Automating whorephobia: Sex, technology and the violence of deplatforming. Porn Studies, 8(4), 350–366.
Caplan, Robyn, & Gillespie, Tarleton. (2020). Tiered governance and demonetization: The shifting terms of labor and compensation in the platform economy. Social Media + Society, 6(2).
Celarier, Michelle. (2021, June 16). Bill Ackman sent a text to the CEO of Mastercard. What happened next is a parable for ESG. Institutional Investor. URL: https://www.institutionalinvestor.com/article/b1s9f698vwhczr/Bill-Ackman-Sent-a-Text-to-the-CEO-of-Mastercard-What-Happened-Next-Is-a-Parable-for-ESG [September 8, 2022].
Cole, Samantha. (2020, December 11). “War against sex workers:” What Visa and Mastercard dropping Pornhub means to performers. Vice. URL: https://www.vice.com/en/article/n7v33d/sex-workers-what-visa-and-mastercard-dropping-pornhub-means-to-performers [September 8, 2022].
Communications Decency Act of 1995, S.314, 104th Congress (1995). URL: https://www.congress.gov/bill/104th-congress/senate-bill/314 [September 8, 2022].
Digital Millennium Copyright Act of 1998 (DMCA), S.2037, 105th Congress (1998). URL: https://www.congress.gov/bill/105th-congress/senate-bill/2037 [September 8, 2022].
Duguay, Stefanie. (2016). Lesbian, gay, bisexual, trans, and queer visibility through selfies: Comparing platform mediators across Ruby Rose’s Instagram and Vine presence. Social Media + Society, 2(2).
EARN IT Act of 2022, S.3538, 117th Congress (2022). URL: https://www.congress.gov/bill/117th-congress/senate-bill/3538 [September 8, 2022].
ETHI [Standing Committee on Access to Information, Privacy and Ethics]. (2021). Protection of privacy and reputation on platforms such as Pornhub (meeting transcripts). URL: https://www.ourcommons.ca/Committees/en/ETHI/StudyActivity?studyActivityId=11088039 [September 8, 2022].
Fawkes, Janelle. (2005). Sex working feminists and the politics of exclusion. Social Alternatives, 24(2), 22–23.
Freedman, Des. (2010). Media policy silences: The hidden face of communications decision making. The International Journal of Press/Politics, 15(3), 344–361.
Goldman, Eric. (2021, June 21). Five things to know about section 230. Centre for International Governance Innovation. URL: https://www.cigionline.org/articles/five-things-to-know-about-section-230/ [September 8, 2022].
Gotell, Lise. (1997). Shaping Butler: The new politics of anti-pornography. In B. Cossman, S. Bell, L. Gotell, & B.L. Ross, (Eds.), Bad attitude/s on trial: Pornography, feminism, and the Butler decision (pp. 48–106). Toronto, ON: University of Toronto Press.
Harmon, Elliot. (2020, January 31). Congress must stop the Graham-Blumenthal anti-security bill. Electronic Frontier Foundation. URL: https://www.eff.org/deeplinks/2020/01/congress-must-stop-graham-blumenthal-anti-security-bill [March 1, 2022].
House of Commons. (n.d.). List of committees. House of Commons. URL: https://www.ourcommons.ca/Committees/en/Home [January 12, 2022].
House of Commons. (2017). Public health effects of online violent and degrading sexually explicit material on children, women and men. House of Commons. URL: https://www.ourcommons.ca/Committees/en/HESA/StudyActivity?studyActivityId=9311761 [September 8, 2022].
House of Commons. (2021a). Ensuring the protection of privacy and reputation on platforms such as Pornhub Information. House of Commons. URL: https://www.ourcommons.ca/Committees/en/ETHI/StudyActivity?studyActivityId=11088039 [September 8, 2022].
House of Commons. (2021b, June 14). Ensuring the protection of privacy and reputation on platforms such as Pornhub: Committee Report No. 3—ETHI (43–2). URL: https://www.ourcommons.ca/DocumentViewer/en/43-2/ETHI/report-3/ [January 11, 2022].
Irvine, Janice M. (2006). Emotional scripts of sex panics. Sexuality Research & Social Policy, 3(3), 82–94.
Joye, Stijn. (2010). News discourses on distant suffering: A critical discourse analysis of the 2003 SARS outbreak. Discourse & Society, 21(5), 586–601.
Kristof, Nicholas. (2020, December 4). The children of Pornhub. New York Times. URL: https://www.nytimes.com/2020/12/04/opinion/sunday/pornhub-rape-trafficking.html [September 8, 2022].
Laclau, Ernesto, & Mouffe, Chantal. (2001). Hegemony and socialist strategy: Towards a radical democratic politics. London, UK: Verso.
Leavitt, Kieran. (2021, May 20). Is this porn Canadian enough? That could be a question the CRTC starts asking. Toronto Star. URL: https://www.thestar.com/politics/federal/2021/05/20/is-this-porn-canadian-enough-that-could-be-a-question-the-crtc-starts-asking.html [March 1, 2022].
Liu, Jody. (2020). The carceral feminism of SESTA-FOSTA. Queer sites in global contexts: Technologies, spaces, and otherness. London, UK: Routledge.
Luka, Mary E., & Middleton, Catherine. (2017). Citizen involvement during the CRTC’s Let’s Talk TV consultation. Canadian Journal of Communication, 42(1), 81–98.
MacDonald, Maggie. (2019). Desire for data: PornHub and the platformization of a culture industry [master’s thesis]. Montréal, QC: Concordia University.
MacKinnon, Catharine. (1997). In harm’s way: The pornography civil rights hearings. Cambridge, MA: Harvard University Press.
McKee, Alan. (2015). Methodological issues in defining aggression for content analyses of sexually explicit material. Archives of Sexual Behavior, 44(1), 81–87. Medline:24609608
McKee, Alan, & Lumby, Catharine. (2022). Pornhub, child sexual abuse materials and anti-pornography campaigning. Porn Studies, 9(4), 464–476.
Mickelwait, Laila. (2020, February 9). Time to shut Pornhub down. Washington Examiner. URL: https://www.washingtonexaminer.com/opinion/time-to-shut-pornhub-down [September 8, 2022].
MindGeek. (n.d.). Careers. MindGeek. https://www.mindgeek.com/careers/ [September 26, 2022].
Mohan, Megha. (2020, May 8). Call for credit card freeze on porn sites. BBC. URL: https://www.bbc.com/news/world-52543508 [September 8, 2022].
Mullin, Joe. (2022, August 5). The UK Online Safety Bill attacks free speech and encryption. Electronic Frontier Foundation. URL: https://www.eff.org/fr/deeplinks/2022/08/uks-online-safety-bill-attacks-free-speech-and-encryption [September 8, 2022].
NCMEC [National Center for Missing and Exploited Children]. (2022). 2021 notifications sent by NCMEC per electronic service providers (ESP). NCMEC. URL: https://www.missingkids.org/content/dam/missingkids/pdfs/2021-notifications-by-ncmec-per-esp.pdf [February 20, 2023].
NCOSE [National Center on Sexual Exploitation]. (2021, January 8). An end to the internet pornography industry as we know it. NCOSE. URL: https://endsexualexploitation.org/articles/an-end-to-the-internet-pornography-industry-as-we-know-it/ [September 8, 2022].
NCOSE [National Center on Sexual Exploitation] Law Center. (n.d.). What to do if you’ve been exploited on Pornhub. NCOSE. URL: https://sexualexploitationlawsuits.com/get-help/pornhub/ [September 8, 2022].
O’Hara, Mary Emily. (2021, September 10). From eBay to OnlyFans, Trump’s anti-sex internet crusade is silencing LGBTQ culture. NBC. URL: https://www.nbcnews.com/think/opinion/ebay-onlyfans-trump-s-anti-sex-internet-crusade-silencing-lgbtq-ncna1278927 [September 8, 2022].
Pezzutto, Sophie. (2019). From porn performer to porntropreneur: Online entrepreneurship, social media branding, and selfhood in contemporary trans pornography. About Gender—International Journal of Gender Studies, 8(16), 30–60.
Phipps, Alison. (2021). White tears, white rage: Victimhood and (as) violence in mainstream feminism. European Journal of Cultural Studies, 24(1), 81–93.
Pornhub. (n.d.). Model program. Pornhub. URL: https://www.pornhub.com/partners/models [September 14, 2022].
Provost, Claire, & Whyte, Lara. (2018, May 10). Revealed: the US ‘Christian fundamentalists’ behind new Netflix film on millennial sex lives. Open Democracy. URL: https://www.opendemocracy.net/en/5050/revealed-christian-group-netflix-spring-break-sex/ [September 14, 2022].
Ronson, Jon. (2017). The butterfly effect [audio series]. Audible. URL: http://www.jonronson.com/butterfly.html [September 14, 2022].
Serebrin, Jacob. (2021, March 14). Sex workers call to be included in House of Commons committee studying Pornhub. Global News. URL: https://globalnews.ca/news/7695681/commons-committee-pornhub/ [September 8, 2022].
Sinclaire, Kate [@MsKateSinclaire]. (2021, June 25). “@chriswarkentin @BShanahanLib @MHGaudreauBQ @CharlieAngusNDP @MikeBarrettON @ColinCarrieCPC @handongontario @GregFergus @Gilbermilo @patricialattan3 @fsorbara Shame on all of you. I am a survivor …” [Tweet]. Twitter. URL: https://twitter.com/MsKateSinclaire/status/1408482500876275712 [March 1, 2022].
Stilborn, Jack. (2014). The investigative study role of Canada’s House Committees: Expectations met? The Journal of Legislative Studies, 20(3), 342–359.
Sullivan, Rebecca, & McKee, Alan. (2015). Pornography: Structures, agency and performance. Cambridge, UK: Polity Press.
Turner, Gustavo. (2021, January 2). The new war on porn: How moral crusaders, mainstream media and politicians are gunning for XXX. XBIZ. URL: https://www.xbiz.com/news/256588/the-new-war-on-porn-how-moral-crusaders-mainstream-media-and-politicians-are-gunning-for-xxx [January 2, 2022].
Viersen, Arnold [@ArnoldViersen]. (2021, January 29). “BREAKING: Monday Canada’s Ethics Committee will investigate Pornhub & it’s online scourge of exploitation will be brought to light. We …” [Tweet]. Twitter. URL: https://twitter.com/ArnoldViersen/status/1355298452679307272?s=20&t=SPJiBaWfudlAs6GChUpzxg [April 29, 2022].
Webber, Valerie, MacDonald, Maggie, & Sullivan, Rebecca. (2021, July 4). Eradicating sexual exploitation in porn should not be at the expense of sex workers. The Conversation. URL: https://theconversation.com/eradicating-sexual-exploitation-in-porn-should-not-be-at-the-expense-of-sex-workers-163064 [September 8, 2022].
Zhang, Muqing M. (2019, June 20). If PornHub wants to support a cause, start with sex worker rights. The Outline. URL: https://theoutline.com/post/7582/pornhub-latches-on-to-causes-support-sex-workers?zd=1&zi=xft3yveb [March 1, 2022].

Information & Authors

Published In
Canadian Journal of Communication
Volume 48, Number 2, June 2023
Pages: 381–404

History

Received: 10 June 2022
Revision received: 28 September 2022
Accepted: 26 October 2022
Published in print: June 2023
Published online: 1 August 2023

Keywords:

  1. discourse analysis
  2. platform governance
  3. pornography
  4. sex work
  5. labour rights

Mots clés : 

  1. analyse du discours
  2. gouvernance des plateformes
  3. pornographie
  4. travail de l’industrie du sexe
  5. droit du travail

Authors

Affiliations

Valerie Webber
Biography: Valerie Webber is a postdoctoral fellow at the SHaG Lab, Dalhousie University. Email: [email protected].
Dalhousie University, Halifax, Nova Scotia, Canada
Maggie MacDonald
Biography: Maggie MacDonald is a PhD candidate at the Faculty of Information, University of Toronto. Email: [email protected].
University of Toronto, Toronto, Ontario, Canada
Stefanie Duguay
Biography: Stefanie Duguay is Associate Professor at the Department of Communication Studies, Concordia University. Email: [email protected].
Concordia University, Montréal, Québec, Canada
Fenwick McKelvey
Biography: Fenwick McKelvey is Associate Professor of Information and Communication Technology Policy at the Department of Communication Studies, Concordia University. Email: [email protected].
Concordia University, Montréal, Québec, Canada

Notes

Data Accessibility: The study data will not be made publicly available. Researchers who require access to the study data may contact the corresponding author for further information.
Funding: No funding was received for this work.
Disclosures: The authors of this paper wish to acknowledge a potential after-the-fact conflict of interest. As of March 15, 2023, two of the authors (MacDonald and Webber) were appointed as independent advisors to Ethical Capital Partners, the private equity firm that acquired MindGeek and all its holdings, including Pornhub. This advisory role includes contractual protection of their academic freedom. The research, writing, and peer-review of this manuscript was conducted before their advisory positions were established. In addition, this manuscript was written with the intent of advancing public knowledge and understanding, and contributing to sound public policy.
