Full Swarm Symposium 2019 line-up announced
We’re excited to reveal the full program for the 2019 Swarm Symposium: platform governance for safer communities.
The Swarm Symposium is part of what makes the annual Swarm conference unique – it brings together researchers, policymakers, and academics with working online community professionals from across sectors to dig into the big issues affecting us all.
This year’s theme is platform governance – our relationships with the platforms that host our communities, and how their governance infrastructure supports or fails our efforts at healthy community stewardship.
Curated by Dr. Fiona Martin of the University of Sydney, the program features a keynote from Professor Nic Suzor and presentations from scholars on topics ranging from hate-inducing platform design to the emotional labour of online activism.
Swarm Symposium 2019 program
University of Sydney, Tuesday, August 20
9.15 am – 12.45 pm in the Law Lounge, New Law Building, University of Sydney
Building cohesive, productive online communities depends on good governance relationships – particularly with the platforms that host our conversations and content. Yet transnational communications platform providers are increasingly portrayed as reluctant to control the violence and misinformation they host, and slow to provide moderation tools that help community managers minimise harmful content and deal with bad behaviour.
In light of the Christchurch Call, the 2019 SWARM Symposium investigates how we can govern online communities more effectively, for safer sociality. It’s sponsored by the Department of Media and Communications and the Faculty of Arts & Social Sciences, and is inspired this year by a joint ARC Discovery grant with colleagues from QUT, Platform Governance: Rethinking Internet Regulation as Media Policy (DP190100222).
9.30 – 9.35 Welcome and Acknowledgement of Country
Dr. Fiona Martin, Research Director, Dept. of Media & Communications
9.35 – 9.50 Keynote: Professor Nicolas Suzor, Law School, Queensland University of Technology
A new social contract for the digital age: the responsibilities of platforms
Rampant abuse, hate speech, censorship, bias, and disinformation – our internet has problems. It is governed by technology companies – search engines, social media platforms, content hosts, and infrastructure providers – whose rules influence what we are allowed to see and say. These companies govern our digital environment, but they are also subject to pressure from governments and other powerful actors to censor and control the flow of information online. As governments around the world grapple with how to regulate digital media platforms, it’s clear that big changes are coming. We are now at a constitutional moment – an opportunity to rethink the basic rules of how the internet is governed. Can we build a vibrant, diverse, and flourishing internet that promotes fundamental human rights? I argue that, if we care about the future of our shared social spaces, we need a new constitutionalism: real limits on how power is exercised online.
Bio: Professor Nicolas Suzor studies the regulation of networked society, including the governance of the internet and social networks, digital copyright, and knowledge commons. Nic is also the Chapter Lead of the Creative Commons Australia project and the deputy chair of Digital Rights Watch, an Australian non-profit organisation whose mission is to ensure that Australian citizens are equipped, empowered, and enabled to uphold their digital rights. He is the author of Lawless: The Secret Rules That Govern Our Digital Lives (Cambridge University Press, 2019).
Nic teaches intellectual property and technology law at QUT. He is an award-winning educator, receiving QUT’s David Gardiner Teacher of the Year medal in 2016 and a national Australian Awards for University Teaching Citation for Outstanding Contributions to Student Learning in 2017 for his engaging and innovative teaching.
9.55 – 10.15 Mr. Luke Munn – Western Sydney University
Angry by Design: Technical Affordances and Toxic Communication
Hate speech online is on the rise. Recent studies describe this rise statistically (Safehome 2017; Hango 2016), but stop short of analyzing its underlying conditions. Harm reduction on platforms seems heavily focused on improving automated systems (Pavlopoulos et al. 2017) or human content moderators (Gillespie 2017). Mainstream literature, for its part, often blames a toxic individual, someone with a predilection for hating or bullying, racism or sexism (Jennings-Edquist 2014). In contrast, this study seeks to understand how hate emerges from hate-inducing architectures. Just as the design of urban space influences the practices within it, the design of platforms, apps, and technical environments shapes our behaviour in digital space. How does the design of technical environments promote toxic communication?
In the last few years, technical designers have admitted that their systems are addictive (Bosker 2016) and exploit negative “triggers” (Lewis 2017). Others have spoken about their tools “ripping apart the social fabric of how society works” (Vincent 2017). Facebook’s design privileges base impulses rather than considered reflection (Bosker 2016). Social media functionality enables negative messages to be distributed farther and faster (Vosoughi et al. 2018), while anger spreads contagiously (Fan et al. 2016). The “incentive structures and social cues of algorithm-driven social media sites” amplify the anger of users over time until they “arrive at hate speech” (Fisher & Taub 2018). Indeed, such gradual amplification of hate creates a potential pipeline for alt-right radicalization (Munn 2019a; Munn 2019b). In warning others of these negative social effects, designers have described themselves as canaries in the coal mine (Mac 2019).
Very recently, a new wave of designers and technologists has begun thinking about how to redesign platforms to foster calmer behaviour and more civil discourse. How might design create ethical platforms that enhance users’ wellbeing (Han 2019)? Could technology be designed in a more humane way (Harris 2018)? And what would the core principles and processes of such design look like (Yablonski 2019)? Identifying a set of hate-promoting architectures would allow designers and developers to construct future platforms that mitigate communication used to threaten, harass, or incite harm.
“Angry by Design,” recently funded by Netsafe, picks up on this nascent work, tracing the relationship between technical architectures and toxic communication. Three distinct platforms are examined: Facebook, Twitch, and 4chan. How does Facebook’s privileging of metrics influence the intensity of the content that gets shared? What kinds of features support Twitch’s “gamer bro” culture of misogynistic trolling? And how does message board design encourage memes that normalize hate against marginalized communities? This paper will survey the terrain of platform design and hate speech, introduce some early findings, and suggest promising directions for future research.
Bio: Based in Tāmaki Makaurau, Aotearoa New Zealand, Luke Munn uses both practice-based and theoretical approaches to explore the intersections of digital cultures, investigating how technical environments shape the political and social capacities of the everyday. He is currently completing a Ph.D. at Western Sydney University on algorithmic power.
10.20 – 10.40 Ms. Jenna Price – University of Technology Sydney/University of Sydney
The emotional labour of online activism
As activists work on digital campaigns, they struggle with detractors, negotiate with other activists, and come face-to-face with the perpetually demanding participatory nature of online activism. In other words, they invest emotionally in their online labour. Managing feelings in this setting is a requirement in much the same way as it is in paid work: activists must manage their own feelings, their feelings about each other, and their feelings about the impact of both campaigning and campaigns in order to achieve their end goals. Digital activism is also outward-facing, conducted in the public sphere, making the job of exerting control over emotional states more pressing than it is in intimate surroundings. This presentation discusses the impact of emotional labour on the moderators and administrators of a feminist activist Facebook group. It’s not all bad news, either.
Bio: Jenna Price is a senior lecturer in journalism at UTS and a Ph.D. candidate at the University of Sydney. She is a columnist for the Sydney Morning Herald and a co-founder of the feminist action group Destroy the Joint.
10.45 – 11.05 Dr. Lukasz Swiatek (UNSW Sydney) and Chris Galloway (Massey University)
Platform Governance, AI & Boundary Spanning: New Approaches for PR Managers
As harmful online content and behaviour have an increasingly negative impact on online communities, and as concerns mount about international platform providers’ inadequate regulation of such content and behaviour, the role of public relations (PR) managers becomes more and more important. This paper examines the vital ways in which PR managers – those with management rather than ‘technician’ roles (Grunig & Grunig, 1992) – can contribute to platform governance. Specifically, it examines the contributions they can make as boundary-spanners: operating across organisational boundaries (Grunig & Hunt, 1984) and facilitating communication flow between different departments, staff, and other internal stakeholders.
The paper makes a novel contribution to theory and practice in platform governance (as well as PR) by examining the ways in which boundary-spanning by PR managers can help build and maintain safer online communities in an era of proliferating artificial intelligence (AI). To the best of the authors’ knowledge, based on an extensive review of the literature, this approach has not been considered before, with PR-AI possibilities only recently beginning to be investigated thoroughly in scholarly literature (see, for example, Tilson, 2017; Yaxley, 2018; Galloway & Swiatek, 2018). The paper argues that boundary-spanning by PR managers for more effective platform governance needs to take into account the affordances of AI technologies, as well as the dilemmas that they entail. It presents a new practice-based framework for boundary-spanning that includes both synchronous and asynchronous communications monitorable by AI technologies.
The paper’s case study is Facebook and its ‘policy team’ (comprising public relations professionals, crisis management practitioners, and lawyers), its 7,500 human moderators, and its technologists, among other organisational groups (Koebler & Cox, 2018). The paper speaks directly to the symposium theme, and two specific sub-themes: (1) AI and its impacts on community development, and (2) regulating live streaming and synchronous chat.
11.05 – 11.15 MORNING TEA
11.20 – 11.40 Mr. Tim Koskie – Centre for Media Transition, UTS
Insert culture here: Culturally intermediating online communities
The legal and ethical challenges facing organisations that choose to host user comments on their websites are increasingly visible. It is less clear, however, what comment moderation and community management practices organisations employ to achieve their goals. This research project investigated the work of the cultural intermediaries who staff the comment sections of online newsrooms. Through observations and interviews at Fairfax Media and The Conversation with the staff who watch, moderate, manage, and shape user comments on news stories, the research uncovered the influences guiding their work, as well as some of the distinct tasks and practices they employ to cultivate the culture they want to see in the discussions below the articles. Employing newsroom ethnographic techniques, the study combined participant observation with deep, unstructured interviews with nine participants, including journalists, editors, a web developer, and, most prominently, the key staff dedicated to comment moderation and community management. It found that these workers often grounded their choices in the journalistic field that surrounded them and that they themselves inhabited: while participants did not value comments equally, their practices and judgments related strongly to their backgrounds and the context in which they operated, which ultimately shaped their cultural intermediation work. It also found that, through the way they prioritise their work, these staff can significantly influence how comment sections develop. These results show that organisations need to consider how their culture and the backgrounds of their staff shape the comment sections they host, and they reveal subtle but crucial tasks and challenges facing the staff who engage in this work.
Bio: Timothy Koskie is a researcher, government consultant, and doctoral student at the Centre for Media Transition, UTS, as part of the joint Media Pluralism Project with the University of Sydney. His academic interests are user-generated content, the culture of media work, and media pluralism, with his current project investigating how user-generated content plays a part in pluralistic media ecosystems.
11.45 – 12.05 Dr. Fiona Martin & Ms. Venessa Paech, University of Sydney
Working with platforms: the parameters of community governance relationships
While media and communications research into platform content regulation has focused on the problem of moderation (Roberts, 2016; Gillespie, 2018) and on everyday user experience of platforms’ regulatory regimes (Crawford & Gillespie, 2014; Sarikakis, 2017; Gerrard, 2018; Tan, 2018), little study has been undertaken of the governance relationships developing between platforms and community managers. Based on preliminary data from the 2019 Australian Community Managers Career survey, this paper explores the scope of community manager relationships with platform providers and the challenges they identify in governing groups on these communications infrastructures. Using a nodal governance framework (Holley & Shearing, 2017), it pinpoints areas for further research into the uneven dynamics and power inequities between platform companies and professional community managers.
The study finds that Australian community managers worked with over 20 different online communications platforms, specialist and non-specialist, to build their communities, with Facebook the most used non-specialist application for this purpose. However, in contrast to the diversity of hosting architectures reported, the challenges community managers face in negotiating their platform relationships were more homogeneous, concentrating on functionality and usability, with data privacy, service regulation, and content regulation as additional concerns. In exploring how they would like to improve relationships with their platform providers, community managers’ responses highlight the barriers they face in achieving transparent, timely, relevant, and consistent responses to the issues they raise.
Bio: Dr. Fiona Martin researches digital journalism and dialogic technologies, as well as the uses, politics, and regulation of online media (internet, web, mobile, and social media) and the implications of these technologies for media industry change. She is the co-author, with Tim Dwyer, of Sharing News Online (Palgrave Macmillan, 2018) and the author of Mediating the Conversation (Routledge, 2020). She is a co-investigator on the ARC Discovery project Platform Governance: Rethinking Internet Regulation as Media Policy (2019–2022) and on the Facebook Content Policy Research on Social Media Platforms award: Regulating Hate Speech in the Asia Pacific.
Ms. Venessa Paech is an internationally regarded community builder, manager, and strategist. She has been engaged as a community principal, consultant, and strategist by numerous organisations, including Lonely Planet, REA Group, Envato, and Australia Post. She is the co-founder of the SWARM conference and the Australian Community Managers network, and a founding member of the Global Community Management Leadership Group (with fellow community leaders from five nations), working closely with industry, government, and researchers to grow community management practice in the Asia Pacific region. Venessa is a Ph.D. student at the University of Sydney, studying the impact of automation and AI on community building and governance, and is a member of the Socio-Technical Futures (STuF) Lab.
12.05 – 13.00 Panel – Evaluating Australia’s platform governance strategy
Moderator: Fiona Martin. Panellists: Nic Suzor, Andre Oboler, Venessa Paech
With the Abhorrent Violent Material Act and the ACCC’s Digital Platforms Inquiry, the Australian federal government has shown an appetite for regulating social media platforms. How effective are the approaches it’s favouring, and what other directions might we take in governing online communities and content? We welcome to the discussion Dr. Andre Oboler, CEO of the Online Hate Prevention Institute.
Bio: Dr. Andre Oboler is a Senior Lecturer in the La Trobe Law School and CEO of the Online Hate Prevention Institute. He serves on the IEEE’s Global Public Policy Committee and on the Australian Government’s delegation to the International Holocaust Remembrance Alliance. He holds a Ph.D. in Computer Science from Lancaster University (UK), and an LLM (Juris Doctor) and Honours in Computer Science from Monash University.