Professor Jean Burgess, Associate Director of the ARC Centre of Excellence for Automated Decision-Making and Society, speaks to Kate about how research is helping us understand the social implications of digital media technologies, platforms, and cultures. Kate and Jean also get heated about content moderation on social media and the Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code). Jean thinks it’s just bad code on both sides.
00:00:08 Kate Joyner
Welcome to QUT ExecInsights, brought to you by QUTeX Professional and Executive Education for the real world. With me today is Jean Burgess. Jean is Professor of Digital Media and Associate Director of the ARC Centre of Excellence for Automated Decision-Making and Society, in the School of Communication, QUT.
00:00:27 Kate Joyner
Jean was nominated by The Australian as one of Australia’s top researchers and a leader in the field of communication. Professor Burgess’s research focuses on the social implications of digital media technologies, platforms and cultures, as well as new and innovative digital methods for studying them. Her most recent publication is Twitter: A Biography, with co-author Nancy Baym. Welcome, Jean. Did I get that right?
00:00:53 Jean Burgess
You did, yeah.
00:00:54 Kate Joyner
There’s a lot in those titles, isn’t there? Yeah.
00:00:53 Jean Burgess
Thanks for having me.
00:00:58 Kate Joyner
It’s so good to have Australia’s field leader, yeah. That’s amazing. In fact, QUT’s got a few leaders in different fields, so we’ll interview them as the year progresses. So, Jean, you’ve researched two digital platforms in some depth, both YouTube and Twitter. So, when you look at these platforms, what kind of questions do researchers such as yourself look at?
00:01:24 Jean Burgess
Yeah, the questions have changed over time. I first started looking at both of those platforms when they were really quite new, and there was just sort of a bit of buzz surrounding them. And I guess we were really just trying to understand what role these new platforms might be playing in the way our media and communication system works.
00:01:48 Jean Burgess
And why they were interesting at the beginning, was that they were bringing a whole new population of participants into the media system. So, ordinary people, or bloggers, or amateur aspiring creatives were playing a role in public communication, in our cultural life through their participation on these platforms. So that’s what that’s what interested us at the beginning.
00:02:12 Jean Burgess
But of course, as they’ve grown and matured and gone mainstream, they’ve become both much more significant to society, and much more problematic as well in different ways.
00:02:23 Kate Joyner
So yes, they have changed, so there’s you know some wonderful social things that have blossomed from our participation. But there’s like everything, that’s complex. There’s a dark side.
00:02:34 Kate Joyner
So, engagement in these platforms, I suppose it’s different for YouTube and Twitter because they’re different kinds of media and one’s visual. I suppose Twitter is visual too. So how is the general population engaging in social media? How do you think, in a big-picture sense, that’s changed the nature of information flows and how we get our information?
00:02:59 Jean Burgess
Well, yeah, these are really, really big questions, aren’t they?
00:03:04 Jean Burgess
On the one hand, what motivated me originally was thinking how, frankly, how wonderful it was that we were able to connect with people on the other side of the world, just through sharing aspects of our everyday life with them and the really interesting way that we could sort of combine our roles as citizens of the world, consuming news and talking about news with our roles as, you know, as human beings, as friends as family. So that was wonderful.
00:03:31 Jean Burgess
As the platforms have matured and as their business models have matured, and particularly as journalists or marketing people and politicians have come to see that they could use these platforms for their own purposes, the stakes have gotten much higher. And so, as you’re kind of hinting at, sort of malign or less well-intentioned actors have found ways to exploit the same affordances or features of those platforms that enable ordinary people to just share baby photos and have a bit of a laugh with their friends, which is really what they were designed for at the beginning.
00:04:10 Kate Joyner
Yes, it is fascinating, isn’t it? I suppose what we were alluding to is the fact that a US leader can engage directly with the populace in the same way that we can, so that we’re on the same platform as the President of the US.
00:04:28 Jean Burgess
Yeah, I don’t think anyone would have thought at the beginning of Twitter, that a Head of State would be virtually declaring war, like…
00:04:35 Kate Joyner
Yes. He did, yeah.
00:04:36 Jean Burgess
You know, when you’re the President of the United States your words can have great import, so it’s a bit more than just talking directly to your voting base. These platforms can actually be the stage on which…
00:04:52 Kate Joyner
I mean, it was ever thus, I suppose. An authoritarian leader would find a way to use media to speak directly to the people without any disintermediation. You know, so…
00:05:05 Jean Burgess
Absolutely. Or you just make really good friends with a really powerful media mogul and make sure their newspapers reflect your perspective, as we see in Australia.
00:05:14 Kate Joyner
Yes, but we who don’t have friends in high places, we too can participate. And I must say, you’re good on social. There’s an art to it, isn’t there, now that we’re all engaging in Twitter as universities. So, most university leaders will be engaged in some way on Twitter. I have to say I engage as the unreconstructed middle-aged academic that I am. So, you know, I write in full sentences.
00:05:38 Kate Joyner
And there are no hashtags, and it’s not terribly social, because I don’t necessarily reference other people’s work, you know, so I’m learning on that scale. But you’re really good at it. So, what makes someone effective on these platforms so their message cuts through, as opposed to, you know, someone like myself?
00:05:58 Jean Burgess
It’s, well, it’s funny that you say I’m good at it because I don’t do what’s in the marketing communication manuals at all.
00:06:03 Kate Joyner
Communication, about how to engage, yeah.
00:06:07 Jean Burgess
Not at all. And you know, one of the things that we look at in our book, Twitter: A Biography, is how the idea of what it means to be good at social media, good at Twitter, has changed over the life of the platform.
00:06:13 Kate Joyner
Oh, okay, tell me.
00:06:17 Jean Burgess
Really, what I do is what I did at the beginning, which is to combine silliness, you know, really deep personal stuff with my work. Because a lot of us came to Twitter from blogs. Blogs that were partly personal. You know, we were early career academics, so we were trying to figure out what our thesis was about and make friends, and that’s what we carried over into social media. It’s very much what Twitter was designed for. So, it’s both maybe that I’ve kind of grown up with the platform. But also, I’m like, I have a really strong explicit commitment to continuing to practise social media in that way.
00:06:59 Kate Joyner
To practise social, so probably yeah….
00:07:00 Jean Burgess
Despite all the pressures from being in a workplace that has all kinds of policies about…
00:07:06 Jean Burgess
And it can be a fine line to tread where you want to be an authentic person, and you might have a political point of view, for example. So you have to be careful, but…
00:07:14 Kate Joyner
We in universities are allowed to express. Apparently there’s a rule.
00:07:17 Jean Burgess
Free speech is very important in universities.
00:07:21 Kate Joyner
It is, yes. So, it is that. So, that’s really interesting, so that you do combine yourself with your work so that when you speak on Twitter, you would combine so people know who Jean Burgess is as well as Jean Burgess’s work, yeah.
00:07:33 Jean Burgess
Well, it’s actually a pretty deep thing that’s actually important to the discipline that I, the scholarly discipline of Cultural Studies.
00:07:38 Kate Joyner
Yeah, that’s what I thought. Yeah, I thought you were good at it because you know that was kind of your expert field as well.
00:07:42 Jean Burgess
Yeah, you know, there’s a strand of humanities and social science called Cultural Studies that was really built on the study of everyday life and how that’s the stage on which politics really plays out for people. That’s where all kinds of policies and economic realities, you know, that’s where the rubber hits the road for people, and so that’s where I kind of focus my research. So, it’s important that I’m authentic about my own life.
00:08:13 Kate Joyner
You’re authentic, yes. So, speaking about, I meant to start with this, speaking about being authentic, I understand that you were a musician to start with, and a flute player.
00:08:22 Kate Joyner
See, I’ve done my research.
00:08:23 Jean Burgess
You have done your research. My first degree was a Bachelor of Music at UQ.
00:08:26 Kate Joyner
Oh, me too. Oh, and I thought “I bet you went to UQ.”
00:08:30 Jean Burgess
I went to UQ and The Con, and so I graduated into the recession that we had to have with a degree in flute performance. Which was useful for flute performance. And I did that for ten years and then went back to uni as a mature age student, actually.
00:08:42 Kate Joyner
Yeah. Okay, yeah, and that’s pretty much my story too. So there you go, hey, yeah?
00:08:47 Jean Burgess
It’s a common story.
00:08:48 Kate Joyner
Yeah, starting with something and progressing. But it does add depth, doesn’t it? I think.
00:08:54 Jean Burgess
I tell you what, also being a musician, there are a lot of transferable skills that aren’t recognised. Playing in a band is the best kind of teamwork you can imagine.
00:09:00 Kate Joyner
I think so, too. Yeah, social. Did you play in a band?
00:09:04 Jean Burgess
I played in a band, played in orchestras. A lot of teaching, yeah.
00:09:10 Kate Joyner
Oh, well done. So yeah, I went to The Con, which at that time, see, I’m older than you, was actually here on the QUT Gardens Point campus.
00:09:31 Jean Burgess
I remember. No, it was here when I went there too.
00:09:18 Kate Joyner
Oh, was it? Yeah. I can’t imagine how we were all in that kind of small space. And in a time before any social media. It was a hothouse anyway, so add social media to that and I think it would have just exploded.
00:09:32 Jean Burgess
I’m sure we’ve both got lots of stories.
00:09:35 Kate Joyner
But talking about exploding, you said that there are all kinds of benefits from us engaging in media, but there are also downsides. I mean, I’m imagining that bad social behaviour has been with us for all time, but social media tends to amp it up, I think. So, what do we know about that phenomenon of trolling, I suppose you’d say, and dysfunctional, I’ll call it dysfunctional, social behaviour on media platforms? What have we observed about that kind of behaviour?
00:10:09 Jean Burgess
Well, as you say, bad or harmful social behaviour is not new and as you also said, social media platforms amplify everything. Trolling originally meant just deliberately kind of messing with the online community. Playfully, mischievously, just trying to, kind of trying to get a rise out of people. That’s what trolling means, I think. Well, some of what we’re seeing now, particularly with white supremacism and alt-right politics is a deliberate weaponisation of the internet’s natural kind of affordances for amplification and for trolling, and also a deliberate exploitation of an aspect of Internet culture where you can always say you are just kind of joking.
00:11:02 Jean Burgess
So this is kind of baked into the internet’s culture. Some scholars talk about the ambivalence of internet culture, where trying to figure out whether something is trolling or abuse, or parody or satire, or actually deadly serious hate speech is one of the biggest challenges for these platforms now. And due to the network effects, where things can be amplified and spread very far beyond the original speaker or the original poster, that makes the stakes much higher.
00:11:39 Kate Joyner
It does make the stakes much higher, and people can find themselves the target of shame, you know. I was just listening to Tim Minchin’s song this morning, you know, Fifteen Minutes of Shame, when we’ll all be unforgivable. So the amplification has meant that you could be the target of a global hate campaign, really, which in analogue media probably was not the case so much.
00:12:02 Jean Burgess
Yeah, we could talk about similarities I suppose to gossip. Gossip in small villages, for example.
00:12:11 Kate Joyner
Like the gossip column?
00:12:12 Jean Burgess
Well, just a small village, right, where your world is that big. And so, if the whole village is shunning you, it’s just as profound, isn’t it, as if you live in a global village? So, I think the dynamics are the same. But yeah, the role of media has changed.
00:12:26 Kate Joyner
On a larger scale. So, the village would have been your whole world, so the shame would have been of a similar magnitude, I suppose.
00:12:35 Jean Burgess
Yeah, it’d ruin your life in just the same way, I think.
00:12:38 Kate Joyner
And I guess, you know, when you talk about, it could be ridicule, it could be you know, poking fun, so if you’re the target of a social media troll it’s always a bit of a dilemma about what you do with it. You know, do you play along, or do you arc up? You know, and put that troll on the front page of your Instagram feed. So, is there received wisdom about what we should do if we find ourselves…
00:13:07 Jean Burgess
Well, as you probably know, the received wisdom is “Don’t feed the trolls.”
00:13:09 Kate Joyner
Don’t feed the trolls.
00:13:14 Jean Burgess
So if it appears that people are just trying to get a rise out of you, trying to get a reaction and trying to mobilise other people to get involved in piling on, then just blocking and reporting and denying oxygen is far more effective. However, if, I don’t know, you’re a public figure and another public figure is apparently mobilising people to engage in a sneaky hate campaign against you based on some aspect of your identity, it might be in the public interest for you to actually call that out and get media coverage of it. So, like everything, it depends, and it’s complicated.
00:13:53 Kate Joyner
I know, that makes it way more complex. Fortunately, because I have led a low-profile life, it hasn’t… oh actually, yeah, it does happen even in small social situations, like being Chair of a nonprofit organisation and finding that someone on the committee, you know, has a vendetta; then they can use social media very effectively, I think.
00:14:17 Jean Burgess
Things can go horribly wrong on little neighbourhood Facebook groups as well, as we all know.
00:14:20 Kate Joyner
Yes, I know. That video that just became popular last week. The Village Church Committee, did you see that one?
00:14:26 Jean Burgess
Yeah, it was brilliant. In my neighbourhood Facebook Group, it’s always about stray dogs.
00:14:33 Kate Joyner
Stray dogs. And break-ins.
00:14:34 Kate Joyner
Yeah, but in terms of the ability of the large social platforms to respond, I saw that, I think it was Facebook who tried artificial intelligence as a mechanism? And they had very grandly said that AI would just about solve the problem, and they would be able to use algorithms that would remove hate speech. But not so much, I think?
00:15:05 Jean Burgess
Yeah. All of the platforms use automation and AI to some extent, and they will continue to do that. But sort of hidden behind the scenes is a huge army of quite poorly paid manual labourers who have to look at awful things.
00:15:21 Jean Burgess
Awful things, so that we don’t have to. The other thing that makes it really tricky, and where the platforms are a little bit slow to come on board, is that you can’t necessarily diagnose hate speech, or abuse in fact, from a single piece of content. So we know, for example, with people in domestic violence situations, maybe they get a text every single night from their ex that says, “How’s the dog?”
00:15:46 Jean Burgess
Right, so that individual piece of content could look innocuous on its own. You have to understand…
00:15:50 Kate Joyner
So an algorithm wouldn’t pick it up.
00:15:52 Jean Burgess
Yeah, so the person who is experiencing harm has to be involved as well in reporting and in diagnosing and explaining what’s going on. So, there’s going to be a role for machine learning and for AI, but there’s going to be a role for more respectful engagement with the community as well.
00:16:09 Kate Joyner
I guess that issue about the poorly paid people behind the scenes who have to deal with this hate speech kind of brings up a larger issue about employment in the digital sphere. You know, they shouldn’t be poorly paid really, should they, playing such a vital function. They should be getting danger money, I think, really.
00:16:35 Jean Burgess
It’s a pretty big, broad issue for our planet as a whole, I think: the extreme wealth disparity that’s emerging, the way that these digital platforms have made huge profits during the pandemic, and then the treatment of workers throughout their value chain. Yeah, yep.
00:16:54 Kate Joyner
Yeah, and we should continue that. I’d like to have a whole series, actually, on that phenomenon, because I think the pandemic and the response to the pandemic have just amplified those kinds of tendencies. And sometimes we have deliberately stayed away in the podcast from critical observation, but I think we probably shouldn’t.
00:17:16 Jean Burgess
Yeah, I reckon there’s a big seismic shift underway, though. I think you see a growing realisation and consensus among most people, most states, most governments on either side of politics in the world, that these platform companies are probably too large and probably need both more oversight and regulation, and to be broken up.
00:17:35 Kate Joyner
And to be broken up, yeah, and they’ve always got good reasons why that’s not in the, you know, the public interest. But did you have any thoughts or comments on Australia’s proposed legislation to ask Facebook to contribute to the revenue of our traditional media?
00:17:35 Jean Burgess
Oh, yes. So, much of the reporting around that proposed scheme sort of pits Facebook against Australia, with Australia fighting on behalf of local citizens’ ability to access news, and of local news companies. But it’s a bit more complicated than that. It’s really about Australia, on behalf of Rupert Murdoch, trying to get a slice of Facebook’s revenue for established commercial media players.
00:18:31 Jean Burgess
They might have dragged the ABC into it, which would be a bit tricky because it’s very important that the ABC’s precisely not a commercial media operation, and it wouldn’t have helped the huge number of local, rural, regional, community and minority news organisations, so I think it’s really just honestly, it’s just a bunch of very powerful white men writing bad code at each other, honestly.
00:18:59 Kate Joyner
Writing bad code at each other.
00:19:00 Jean Burgess
Well, whether it’s legal code or whether it’s, ’cause Facebook’s sort of posturing and saying, “Right, well, we’ll turn off the news again.” They did it in a very unsophisticated way, purely demonstrating how hard it is to make determinations about what a news organisation actually is. So, they were doing it to prove that the proposed legislation was written badly, but they just proved that actually they’re playing a major role in the news and media environment of the world and don’t really have a thoughtful approach to doing that.
00:20:02 Kate Joyner
So, is it that their own sort of organisational algorithm doesn’t allow for it? I suppose to me their algorithm as an organisation is anything that makes a profit, and to put themselves forward as social actors is not part of their modus operandi, I think.
00:20:04 Jean Burgess
It wasn’t early on in their DNA, but it’s incontrovertible that Facebook is playing a major role in mediating political processes. And you know, the information ecosystem around a global pandemic, for example. So, it’s unavoidable, and very complex.
00:20:25 Kate Joyner
It is very complex, but we did learn with Twitter that they said originally that they couldn’t stop President Trump and his hate speech, but they could. In the end they did remove President Trump’s account, and apparently 70% of that kind of disinformation evaporated with him. So, it is possible if they choose.
00:20:49 Kate Joyner
Which kind of brings us to the question, I guess, about some of the questions that will guide your research and the research of your colleagues into the future. So you are involved in the ARC Centre of Excellence for Automated Decision-Making and Society. So what questions are you pursuing in that sense?
00:21:09 Jean Burgess
Yeah, that Centre is really taking our digital and social media research much more deeply into these questions about what kinds of platform governance, content curation and content distribution are being automated, and with what implications. And some of the most exciting and challenging areas of research are in trying to figure out what methods we can use to actually get access.
00:21:34 Kate Joyner
What methods can you use?
00:21:36 Jean Burgess
Well, traditionally, we’ve found all kinds of ways to access the digital traces of activity on those platforms. So, you know, gathering lots and lots of tweets to try to map patterns of public communication around the Queensland floods was one of the first projects that we did. But if you want to understand what the platforms are actually doing in terms of what content they leave up and what content they take down, even more creative methods are required. At the same time, platforms are increasingly kind of locking researchers out from accessing data directly from their platform.
00:22:11 Kate Joyner
So you can’t scrape or trace.
00:22:14 Jean Burgess
Yes, there are things you can do, but also we’re…
00:22:18 Kate Joyner
What would be the motivation for doing that?
00:22:21 Jean Burgess
A lot of it is very sensible commercial motivation. There are little gizmos called APIs that allow one piece of software to talk to another piece of software and get data from it, and they are open to misuse by bad actors who want to mess with the platform. The platforms also have rules in place to prevent people just scraping all of the content from the platform and republishing it somewhere else, which would be bad.
00:22:47 Jean Burgess
But I think it’s fair to say there’s been a tense relationship in terms of oversight, in that the platforms would generally prefer to launch their own little funding schemes for research and handpick the researchers who get to do that research, rather than having real public oversight. But a lot of our research projects at the moment are pushing towards involving the public quite broadly in donating their data: using various kinds of software to enable people, if they want to, to share with the researchers what search results they get, for example, or what YouTube recommendations they get. So this is a really exciting new area of methods development for us.
00:23:27 Kate Joyner
And it shows, I think, what an intensely boring person I am. Would that be revealed on the basis of what my search profile looks like?
00:23:39 Jean Burgess
Well, I’m sure you’re not boring at all.
00:23:42 Kate Joyner
Yeah, no, I should stop that. It is a week where we celebrate International Women’s Day, so women’s tendency for self-deprecation is something we’ve got to monitor.
00:23:49 Jean Burgess
Yeah, don’t put yourself down!
00:23:42 Kate Joyner
But what about you individually, Professor Burgess? What research are you currently thinking of yourself? So, is it going to be Instagram: A Biography?
00:24:05 Jean Burgess
I’m working on a new book with colleagues called Everyday Data Cultures. That’s really about how datafication, you know, all of the different aspects of everyday life that somehow get rendered into data and aggregated and used for various kinds of commercial processes, how that’s actually experienced across all sorts of aspects of our everyday life, but also the role that we have as citizens in negotiating that, in resisting it, and in educating ourselves and others. So that’s one big thing on my plate, and I’m involved in a lot of these projects that use new kinds of computational methods to figure out what platforms are actually doing in terms of their own automation of governance decisions and moderation and so on.
00:24:52 Kate Joyner
Just one last question; I haven’t fed this to you before, so I apologise for that. It had occurred to us that the use of social media has changed the way that we provide learning for our students, so that the way we traditionally would teach, our methods, our design for engaging or sharing information and having students engage with information and content, we need to shift that too, particularly as the new generation comes and will expect new things. So is your School ahead of the curve? I’d like to think the School of Communication would be ahead of the curve on that kind of thing. But…
00:25:23 Jean Burgess
Well, this is an opportunity to plug our Master of Digital Communication, which is specifically designed around really skilling people up to operate as skilled communicators, but also in a range of knowledge professions that require a sophisticated understanding not only of social media but of data science as well. And so it’s very hands-on and kind of immersed in the digital media environment, learning computational methods as you learn to analyse what’s going on with social media.
00:26:02 Kate Joyner
So in fact we in Higher Education should probably undertake that Master’s.
00:26:06 Jean Burgess
I think everyone should undertake that Master’s.
00:26:08 Kate Joyner
It sounds like yeah, we’ll have to develop a little bit of a professional education component of that, Jean, do you think? And maybe QUTeX can promote that. Yeah, so thanks Professor Burgess, that was fabulous and thank you very much for your time.
00:26:24 Jean Burgess
Thanks for having me.