1. How is the problem presented?
As noted, the initial choice of witnesses centred anti-porn organizations, framing the problem in terms of regulating sexual violence rather than addressing platform moderation or labour rights. Throughout the hearings, there was scant appreciation that pornography sites are places of employment or that regulation of them has implications for many thousands of workers.
Days before the ETHI investigation opened, Viersen tweeted, "Pornhub & its online scourge of exploitation will be brought to light. We will ensure survivors are heard & MindGeek receives the full scrutiny it deserves" (2021). Viersen suggests a targeted process of attrition against the company rather than simply ensuring protections for users and the public. Protected parties in this arrangement are survivors of abusive content, but not workers with an interest in improving privacy on the platform.
This absence is unsurprising, given that Viersen has routinely incited anti-porn government activity without consulting porn workers. In 2017, he called for M-47, a Standing Committee on Health study of the public health effects of viewing "online violent and degrading sexually explicit material" based on the assumption that pornography constitutes sexual violence (House of Commons, 2017, para. 1). Viersen later sponsored Bill C-302, the Stopping Internet Sexual Exploitation Act, which did not propose adding protections for victims of exploitation but would have burdened independent porn producers with unnecessary and privacy-violating record-keeping requirements (Webber, MacDonald, & Sullivan, 2021). Viersen also petitioned for Bill S-203, the Protecting Young Persons from Exposure to Pornography Act, which calls for age verification on pornography websites, ignoring numerous privacy and security concerns such technologies raise (Barnett, 2021). Across his campaigns, Viersen has never once sought insight from people who work in pornography production.
2. What assumptions underlie this presentation?
Two interrelated assumptions underlay the presentation of porn as violence throughout the hearings: first, that MindGeek/Pornhub is the source—rather than bearer—of the sexual violence under discussion, and second, that this violence is only valid when enacted against certain subjects.
In the first ETHI meeting, U.S. citizen Serena Fleites—featured in Kristof's New York Times article—described the harassment she endured when peers saw a video she had sent to a boyfriend at age 13, which was subsequently posted to Pornhub without her consent. Boys at school followed her, asking for sexual acts, while strangers on social media threatened and abused her. This complex web of abuse in person and on social media was attributed to MindGeek alone, as Member of Parliament Charlie Angus (NDP, Timmins—James Bay, ON) declared, "You said how they hassled you, when you—as a child—were trying to get control of your life again, that they as a massive corporation [emphasis added] hassled you […] What should we make them do so they don't hassle any other young women, anywhere in the world?" (ETHI, 2021, February 1, 13:29:00).
At times, members of Parliament noted that proceedings inappropriately focused on a single company but continued to favour the disproportionate allocation of responsibility. Member of Parliament Francesco Sorbara (LPC, Vaughan—Woodbridge, ON) chastised MindGeek executives: "We know child exploitation is an issue all over the world, not just with MindGeek or Pornhub […] but you folks have a very special responsibility" (ETHI, 2021, February 5, 14:17:30). Observations that the circulation of non-consensual material predates the internet, or that the problem is not reducible to Pornhub, were met with hostility. Member of Parliament Jacques Gourde (CPC, Lévis—Lotbinière, QC) admonished Minister of Canadian Heritage Steven Guilbeault (LPC, Laurier—Sainte-Marie, QC) for not moving forward decisively on internet legislation to "protect children." Guilbeault retorted, "I want to point out that child pornography existed before 2015. Your party was in power and did nothing about it," to which Gourde replied, "Please stop giving campaign speeches and tell us how you are going to help children!" (ETHI, 2021, June 7, 11:40:00).
Little to no mention was made of those who committed the inciting assaults. All were identified as men, and many as intimate partners, who made and published non-consensual recordings—a federal offence under Part V, Section 162.1 of Canada’s Criminal Code. Similar oversight met those who harassed survivors on and offline, a crime outlined in Part VIII, Section 264. In their stead, outrage was funnelled toward Pornhub and MindGeek, treated as scapegoats and instigators of harm rather than technological intermediaries operating within a culture of pervasive misogyny and abuse.
Committee work actively erased the voices of sex workers, especially when their presence complicated the sex-as-violence framing. For example, the brief submitted by Kate Sinclaire, a survivor of non-consensual image sharing who later consensually entered the industry to operate a porn studio and site, complicated the vilification of porn platforms. Sinclaire's brief was excluded from the final report, which stated the committee only heard from five survivors. "Shame on all of you," Sinclaire tweeted at the members of Parliament after the report was released. "I am a survivor, even if my story doesn't fit whatever narrative you're looking for" (Sinclaire, 2021).
This dichotomy between "good" and "bad" victims reflects the committee's carceral feminist understanding of sexual and gender-based violence. This approach is cisheteronormative and exclusionarily gendered, and it entails several interrelated presumptions: that pornography is fundamentally a form of sexual violence rather than a media form embedded within a broader culture of sexual violence; that women alone are victims of pornographic and sexual violence; and that women like Sinclaire, who do not characterize their relationship to pornography as violence, are either misguided or victims of false consciousness, and thus unreliable witnesses (Fawkes, 2005).
Lise Gotell (1997) describes this "reconfiguration of political responses to pornography" that occurred in the 1980s, whereby
pornography as an issue has been subsumed within the social problem of violence against women, such that the two are most often understood as inseparable in policy discourse … By subsuming pornography into violence and by inserting it into a broader law-and-order agenda, government actors have contributed to the construction of a sexual panic.
(pp. 69–70)
This carceral approach relies on punitive and paternalist state apparatuses to address harm, a reliance that inevitably reserves protection for those who are already most served by the state—white, cisgender, heterosexual, monogamous, non-sex-working women—rather than those who are marginalized and/or criminalized by it (Phipps, 2021). This orientation "withhold[s] womanhood and personhood from marginalised Others" (Phipps, 2021, p. 88), placing sex workers outside of, and in opposition to, women's safety. Indeed, the committee failed to consider the consequences of non-consensual image sharing for workers on the platform. Discourse established some victims as more deserving of protection than others, against a backdrop where the state's concern is the safety of girls and women, unless they engage in sex work.
3. How has this representation come about?
The representation of porn as violence has been accomplished in part through the strategic deployment of affective displays. "Sex panics," writes Janice Irvine (2006, p. 82), "are fueled by emotional scripts—rhetoric strategically crafted to produce volatile emotional responses." Emotion plays an important role in political posturing, partially because emotions are perceived as a neutral barometer of morality and "a site of truth and ethics" (p. 92). But emotions, while not entirely fabricated, are managed and can be performed toward political ends.
Throughout the hearings, the affective appeals made by witnesses and the emotional displays offered in response by committee members enabled conflation and oversimplification of the issues at stake, while drowning out more nuanced or dissenting voices. The authors are not problematizing the strong feelings present—these crimes are horrific, and outrage is justified—but rather observing that moral outrage was emphasized to the detriment of meaningful engagement in crafting solutions. For example, in addressing MindGeek executives, Member of Parliament Marie-Hélène Gaudreau (BQ, Laurentides—Labelle, QC) declared dramatically, "I will not be able to sleep well unless I ask this question. You said you are a father. I am a mother. We can talk about business, profit, consent … but I would like to hear what your conscience tells you, as a parent" (ETHI, 2021, overdubbed by translator, February 2, 13:29:45). MindGeek Chief Operating Officer David Tassillo began to respond, describing MindGeek's efforts toward developing preventive technology and partnerships, when Gaudreau interrupted, "But you're talking to me like a businessman. What if it was your child?" She later reiterated, "I don't want to hear about you as managers, I want to hear about your conscience. How do you sleep at night!?" (13:49:40). The exchange squarely framed the issue as one of moral character "as a parent," demanding sympathy more than inviting informative responses. By repeatedly emphasizing their disgust at the problem, ETHI committee members prioritized moral stakes over gathering data on platform-specific policies and technical measures.
Affect was weaponized through emotionally charged language. Laila Mickelwait, director of Exodus Cry's "Traffickinghub" campaign, made several misinformed assertions about moderation capacities in her testimony (e.g., that Pornhub employs only 10 moderators). She spent a substantial portion of her allotted time listing porn titles with no connection to the case, seemingly to evoke disgust at industry standards. Rather than expertise, Mickelwait wielded moral authority to buttress claims about the porn industry that experts in law, sex work, technology, or platforms could easily have refuted, had they been present. Instead, Mickelwait was thanked and solicited for legal advice, despite having no relevant expertise in that area.
In a subsequent hearing, Lianna McDonald, director of the Canadian Centre for Child Protection, described the internet as protecting "the privacy rights of adults at the expense of the safety and wellbeing of children" (ETHI, 2021, February 22, 12:08:39). Sex workers' privacy is excluded from this statement, given the extensive identity and age verification processes currently required to sell pornography on Pornhub.
4 The U.S. EARN IT Act (2022) and the U.K. Online Safety Bill (Bill 121, 2022) are only two of many proposed laws ostensibly designed to "protect children" that infringe on freedom of speech and create privacy risks for many (Harmon, 2020; Mullin, 2022).
Anti-porn feminists have a long history of mobilizing the affective, anecdotal style of testimony observed at the hearings. Stemming from consciousness-raising practices in the 1970s, first-person testimony was the key strategy used by Andrea Dworkin and Catharine MacKinnon in efforts to pass anti-pornography legislation in the 1980s. Supposedly promoting "public discussion over pornography's role in social life" (MacKinnon, 1997, p. 3), anti-porn feminists selectively amplify only voices that affirm victimization by pornography (Sullivan & McKee, 2015), while ignoring other voices with a stake in the issue.
4. What is left out of this presentation?
Pornhub and other porn platforms are sites of labour, meaning any legislation altering their operations is, in effect, labour policy. Yet sex workers who rely on the platform to earn a living were offered no space in the hearings to express needs or grievances. When sex worker organizations applied as witnesses, the committee responded that "sex workers are not relevant to this conversation" (Serebrin, 2021). The absence of sex worker voices was not mentioned until more than halfway through the proceedings, when Sorbara raised concern over the unprecedented number of submitted briefs highlighting potential harms to sex workers, stating, "it seems to me that we need to make sure we don't drive work underground and that sex workers' voices need to be listened to" (11:30:15). Only then were sex worker interest groups offered a last-minute opportunity to speak. Jennifer Clamen, director of the Canadian Alliance for Sex Work Law Reform (an alliance of 25 sex worker groups across the country), was unambiguous in her opening statement that "[It is the] duty of parliamentarians to take direction and leadership from sex workers who are really best placed [emphasis added] to speak to any policy or practice that may regulate online sex work or online porn" (ETHI, 2021, April 19, 11:07:56). Instead, she continued, "It is made clear that sex workers are not welcome at this table and are not considered valued participants. We were told outright that this committee didn't concern us" (11:09:33). In response, Viersen reiterated his position that the hearings concern the "victims of Pornhub," implying this does not include sex workers. Clamen replied:
Discrediting is a very common tactic … I do thank you, Mr. Viersen, for providing a really good example of the way conflation happens … mixing that story with stories of people actually working in the industry … weaving in and out of these tales and testimonies in a way that suggests that everything is exploitation. That’s the exact reason why this committee unfortunately is in large part failing … to actually ask the right questions.
(11:44:00)
As Clamen articulated, the hearings enthusiastically deployed “exploitation” to frame all sex work as sexual violence, but never addressed the economic exploitation of workers on platforms like Pornhub. On a few occasions, Angus reminded the committee that its mandate was not to question the right of adults to perform in or consume legal pornographic content. His concession was that “citizens have the right to watch weird things. People have the right to promote and show their consensual bedroom antics, if that’s what they like to do. Whether people like it or not, that is their right” (ETHI, 2021, February 22, 12:17:45). Gaudreau referred to sex worker concerns as “collateral damage” to be avoided, but not central to the policymaking process (April 19, 11:52:11).
Clamen and fellow sex work advocates were the only witnesses to have testimony cut short; their session was ended abruptly by Member of Parliament Brenda Shanahan (LPC, Châteauguay—Lacolle, QC), who moved that the hearing be suspended to schedule an additional meeting to address this "very complex subject" and to "hear from more balanced witnesses" rather than continue hearing from the ones present (12:13:50).
The absence of pornography workers was apparent. Witnesses and committee members alike demonstrated a lack of knowledge about pertinent subjects, including porn marketing strategies and moderation capacities. The only witnesses with any direct connection to Pornhub's business activities were three MindGeek executives, but when they were called at the second ETHI meeting, questioning was conducted with such unique hostility that minimal relevant industry information was communicated. In contrast, anti-porn witnesses were granted authority to assert blatant inaccuracies, for example, that video tags and titles accurately describe content. In fact, production conditions mean that a product's content and how it is marketed or labelled are often disconnected processes (McKee, 2015). The prominence of the term "teen" was cited repeatedly as evidence that Pornhub allows child sexual abuse material (CSAM) to proliferate across its sites, despite MindGeek executives' attempts to explain the industry standard of using "teen" to market performers aged 18–25 years. Like most media sectors, porn does not claim truth in advertising. Had the committee been attentive to porn platform labour, content experts could have assisted in navigating common industry practices. Instead, ETHI directed questions about content to anti-porn advocates, child protection agencies, and law enforcement, who inaccurately described many website affordances.
Though theoretically an interrogation of the platform, the ETHI process hardly engaged with technical, market, or even sociological insights regarding the platform's operations. Privacy analyst Charles DeBarber, who works with victims of non-consensual porn to remove their content, was arguably the witness with the most apt technological understanding of non-consensual imagery circulation, prevention, and removal. He was the only witness other than the sex worker advocates who identified porn platforms as a site of labour and addressed the potential implications that poorly crafted policy could impose on sex workers. DeBarber stated that while he sought processes to hold Pornhub accountable, he intended "to make sure that it's also not so cumbersome that sex workers [who] are free agents can't operate without reasonable privacy" (ETHI, 2021, June 7, 12:13:00).
Child sexual abuse material and non-consensual material represent a proportionally small share of content on sites like Pornhub (NCMEC, 2022). Child sexual abuse material is the exception on Pornhub, whereas financially exploited, precarious workers are the ubiquitous rule. Both parties deserve the attention of policymakers, but only one group is granted it. Here, the authors observed a process that prized displays of emotion from one, while displacing expertise and evidence from the other. Had the hearings not erased the latter, and had they inverted their framing of the content moderation problem, the recommendations could have been informed by workers with relevant industry expertise.
5 & 6. Where is this representation produced, and what are the effects?
Representation of this problem was produced through a parliamentary standing committee, whose speculative function, paired with a largely arbitrary sex-as-violence framing, resulted in feeble policy impact and left established frames unchallenged (Freedman, 2010). The role of parliamentary committees is to "examine, in small groups, selected matters in greater depth" (House of Commons, n.d.) and to report conclusions that the House of Commons may act upon. Meant to reduce partisanship, increase the agency of parliamentarians, and improve the machinery of government itself, "committees have a lot of latitude in how they organise their work" and determine schedules based on "their members' interests" (House of Commons, n.d.). A former member of the Library of Parliament notes that "the influence of committee studies is much more modest than might be suggested by the dramatically increased volume of reports, recommendations and responses being generated since the mid-1980s" (Stilborn, 2014, p. 355). There is significant debate about the committee system's efficacy and deliberative power, with its function appearing more performative than productive.
Indeed, the bulk of the proceedings was spent stressing how harmful non-consensual porn is and bemoaning a lack of clear regulatory responses. Rather than emphasizing grievance, the proceedings could have assumed the clear position that CSAM on Pornhub is harmful and unacceptable, then applied committee resources to determining how to manage this problem well. The issue of hosting CSAM could be more practically addressed by revisiting intellectual property (IP) protections or the record-keeping requirements for platforms, bolstering anti-piracy efforts, or even structurally interrogating Pornhub through the lens of ensuring fair market competition.
The Pornhub case could be addressed more productively by a governing entity like the Canadian Radio-television and Telecommunications Commission (CRTC), which maintains jurisdiction over audio and visual streaming services. Bypassing the CRTC in this case reflects a pattern of government inattention to legitimizing the CRTC's contested authority over internet regulation of media platforms (Luka & Middleton, 2017; Leavitt, 2021). Another possibility could be to integrate the case into recent processes, including Canadian Heritage's 2021 Online Harms consultation or Bill C-11, Canada's Online Streaming Act (Bill C-36, 2021; Bill C-11, 2022). With a shared focus on the harmful circulation of non-consensual images, the latter's proposed expansion of the Mandatory Reporting Act would require service providers and platforms to assume greater responsibility for content. These present more appropriate institutional routes to address moderation issues than the ineffectual volunteer-based ETHI committee.
The ETHI report (House of Commons, 2021b) offered underwhelming policycraft. Of the 14 recommendations in the final report, some are pragmatic: four (Recommendations 4, 5, 6, 7) call for strengthening compliance with existing mandatory reporting laws; two (Recommendations 2, 9) concern enforcing existing age verification standards; one (Recommendation 10) calls for proactive enforcement of existing Canadian law; and two (Recommendations 11, 12) deal with supports for CSAM survivors seeking responsive moderation mechanisms and call for further study of content depicting sexual violence.
Other recommendations were troubling. The requirement that uploaders provide "proof of valid consent" (Recommendation 8) is difficult to verify or even define, given that "consent" is a complex legal framework barely addressed by these proceedings; it is alarming to suggest that such a difficult, revocable legal standard ought to be codified for platforms to assess and manage. Three recommendations (1, 13, 14) call for invoking platform and ISP liability precedents that diverge significantly from legal norms set out in CUSMA Section 19.17. These recommendations are conspicuous, given that managing internet operations at large is well beyond the scope of this committee study.
The recommendations in ETHI's final report suggest a performative intervention, are vague on the particulars of the issue, and lack substantive contributions regarding structural points of intervention. Tellingly, the remaining ETHI recommendation (3) suggests future studies ought to "consult with survivors, child advocacy centres, victim support agencies, law enforcement, web platforms and sex workers prior to enacting any legislation or regulations relating to the protection of privacy and reputation on online platforms" (House of Commons, 2021b).