Brought to you by Bristol University Press and Policy Press, the Transforming Society podcast brings you conversations with our authors around social justice and global social challenges. We get to grips with the story their research tells, with a focus on the specific ways in which it could transform society for the better.
In this episode, Richard Kemp speaks with Dan McQuillan, author of Resisting AI: An Anti-fascist Approach to Artificial Intelligence, about what artificial intelligence really is.
They discuss how artificial intelligence damages society in ways that are not easily fixed and why it needs to be restructured from the ground up, as well as how these fundamental changes to AI can help create a better society for everyone.
Resisting AI: An Anti-fascist Approach to Artificial Intelligence by Dan McQuillan is available on the Bristol University Press website. Order here for £19.99.
Bristol University Press/Policy Press newsletter subscribers receive a 25% discount – sign up here.
Image Credit: Adobe Stock / Dmitry
Transcript:
00:00:03 Jess Miles
The Transforming Society Podcast is brought to you by Bristol University Press and Policy Press. In episodes covering a wide range of social issues, we speak to authors and editors about their books and journals to get to grips with the story their research tells, and look at specific ways in which it could transform society for the better.
00:00:21 Jess Miles
I’m Jess Miles, and in this episode my co-presenter Richard Kemp speaks to Dan McQuillan about his book Resisting AI.
00:00:28 Jess Miles
What is artificial intelligence, really? How does it erode our autonomy and perpetuate injustice? And how can we change it so that it works for the benefit of everyone?
00:00:39 Jess Miles
More information about Dan’s book is available from our website, bristoluniversitypress.co.uk.
00:00:46 Richard Kemp
We all live with some form of artificial intelligence in our daily lives. For many of us, this means search engines, GPS navigation, next-day delivery and the personalized recommendations of Netflix, Disney Plus and Amazon Prime.
00:01:00 Richard Kemp
For others, AI means the removal of dignity, of livelihoods and sometimes even lives.
00:01:07 Richard Kemp
In his new book, Resisting AI: An Anti-fascist Approach to Artificial Intelligence, Dan McQuillan, Lecturer in Creative and Social Computing at Goldsmiths, calls for a total restructuring of AI.
00:01:21 Richard Kemp
He argues that it causes damage to society in ways that cannot be reversed with easy, quick fixes, and that it needs to be structurally undone and rebuilt from the foundations.
00:01:30 Richard Kemp
Dan proposes an anti-fascist approach to AI that flips the focus from exclusion and violence to community, caring and freedom. Dan McQuillan, thank you so much for coming on the Transforming Society podcast.
00:01:42 Dan McQuillan
Yeah, very glad to meet you.
00:01:44 Richard Kemp
And thanks so much for sweating it out in this unbelievable heat wave we’re having.
00:01:48 Dan McQuillan
It’s really tricky in London.
00:01:50 Richard Kemp
So let’s get right into the book, if that’s OK with you.
00:01:56 Richard Kemp
So your book is about how to take an anti-fascist approach to AI. At the top of the episode I just mentioned some forms of artificial intelligence, but there are others: machine learning, nanorobotics, smart watches, smart homes, self-driving cars. AI has connected us globally in ways that anyone born before the year 2000, like myself, couldn’t even have imagined. Could you explain what you’re referring to in your book when you talk about artificial intelligence, and how can something so revolutionary and connecting need an anti-fascist resistance?
00:02:29 Dan McQuillan
OK. Well, I’ll start with the first one: what do I mean by AI? Which is a pretty big question, because AI has a long history, pretty much since the Second World War, so there are lots of kinds of AI. And actually, the kind that’s really big now, which is sometimes called connectionist AI, is the neural network stuff, and the first working neural network was built in about 1948.
00:02:51 Dan McQuillan
So this stuff’s got a long history. You would never guess, actually. But for most of that time it didn’t mean that; it meant much more what we might think of as AI, what I think most people would think of, especially from movies and so forth, which is something that actually thinks. A lot of the time, people building AI want to say: OK, how is it that people think?
00:03:11 Dan McQuillan
How does human intelligence operate? How can we embed that in some kind of computer or computational operation? And actually, the stuff that’s happening right now has really taken over in the last...
00:03:24 Dan McQuillan
decade, really, and it’s the resurgence of that stuff from a long time ago which doesn’t try to imitate how people think; it tries to imitate how brains work, hence the neural networks. And it was a kind of computing that was very hard to do for a long time, because it needs a lot of computing power.
00:03:45 Dan McQuillan
A lot of tiny, tiny calculations over and over and over again to make it work. And these neural networks are very deeply connected; it’s very, very demanding. And before, it was useless, basically, because it just took weeks to train anything to do anything, and by the time we’d trained it, you know, life had moved on.
00:04:03 Dan McQuillan
So the connectivity you talked about, you know, the Internet, social media, all the stuff that, as you say, pervades everyday life: this in a way was part of the fuel for the AI we now have, which is really neural networks, deep learning. This is the stuff of actual operating AI, and it was partly being fueled by just all the data, because you don’t just need a lot of computing power for this kind of AI.
00:04:33 Dan McQuillan
You need really a ton of data. It’s an imitative process: it learns to imitate, and it does so quite slowly, in a way. It just has to repeat, repeat, repeat, and to do that it needs almost unimaginable amounts of data, which wasn’t there before, until along came the Internet, and then along came social media, and lo and behold, everybody was generating loads of data to act as the raw material for this new kind of AI. So then we had the takeoff, like ten or twelve years ago.
00:05:09 Dan McQuillan
Suddenly object recognition, image recognition, facial recognition started to really work. And since that time, as you said, we’ve had self-driving cars. OK, they’re not on the roads yet, which I think is a pretty good thing, and if there are ever self-driving cars in any city I’m in, I’m hoping to move.
00:05:28 Dan McQuillan
But they are uncanny, right? They’re an unimaginably clever technology inside, and a lot of this stuff is very impressive. And that’s only in the last decade. So you’ve had self-driving cars, autonomous systems of various kinds, and now language. Right as we’re speaking today, in 2022,
00:05:50 Dan McQuillan
language is the big area of breakthrough with AI; it suddenly seems to have cracked, you know, human communication, understanding and reproducing language. So yeah, it’s an incredible technical innovation, and the rapid rise of AI is incredibly disruptive; it’s shooting through the ecosystem. I mean, the examples you were talking about, sort of ranking algorithms and recommender algorithms on YouTube, that’s the tip of the iceberg. This stuff is everywhere.
00:06:22 Dan McQuillan
And becoming everywhere in everyday life. And so that’s really what I also mean by AI, if I’m not to ramble on too long about it.
00:06:30 Richard Kemp
Please do.
00:06:33 Dan McQuillan
Be careful what you wish for! It is this stuff, very specifically this stuff. It’s this kind of computation that takes a big pile of data and, very laboriously, with enough examples, works its way through to a sort of imitation: an imitation of recognizing a face, an imitation of driving, whatever it is. But it’s also...
00:06:56 Dan McQuillan
institutions. I mean, this stuff doesn’t do anything by itself, right? Even Netflix is an institution, but also the welfare system is an institution, the military. This stuff has a particular way of interacting with institutions, and it also carries with it, I would say, and that’s what I’m trying to write about in the book as well, a set of social assumptions. So you’ve got...
00:07:15 Dan McQuillan
this kind of stack. You’ve got an actual kind of computing. You’ve got the ways it gets pushed into the world through actual institutions and organizations that have their own agendas, of course. And then you’ve got...
00:07:26 Richard Kemp
Of course.
00:07:28 Dan McQuillan
the assumptions that it’s built on, basically. And those things together make it very powerful.
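Dan's description of this kind of AI, a computation that very laboriously, over many repetitions of a big pile of examples, learns to imitate them, can be sketched in a few lines. This is a purely illustrative toy, not anything from the book: a one-weight linear model stands in for a network, and the data, rule and learning rate are all invented.

```python
import numpy as np

# Invented training data: the "pile of examples" the model must imitate.
# The underlying rule (y = 2x + 1) is never told to the model.
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0

w, b = 0.0, 0.0   # the model starts knowing nothing
lr = 0.5          # learning rate: how big each tiny correction is

# "Repeat, repeat, repeat": thousands of tiny corrective updates.
for _ in range(2000):
    pred = w * x + b
    err = pred - y
    # Gradient of mean squared error with respect to w and b.
    w -= lr * 2.0 * np.mean(err * x)
    b -= lr * 2.0 * np.mean(err)

# After enough repetition the model imitates the data almost exactly.
print(round(w, 3), round(b, 3))   # prints: 2.0 1.0
```

The model ends up reproducing the pattern in its data and nothing more, which is the sense in which this kind of AI imitates rather than understands.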
00:07:39 Richard Kemp
Yeah, thank you so much for explaining all that, Dan, and for bringing more focus to the fact that, sure, these are great advancements, but look at all these concerns that are popping up as a result, advancements that are sometimes coming through the guise of: look at how we’re going to make your lives better, and all we need from you is absolutely everything from your entire life, just plug it all in, please.
00:08:11 Dan McQuillan
Definitely, and those are the visible bits, right. We know that our lives are touched by AI in those ways, but that sort of offer is also being made, if you like, to other entities. I think a very significant...
00:08:27 Dan McQuillan
thing that occurs alongside this sudden explosion of AI is the fact that it occurred in the period after 2008, a period of austerity. Another offer AI systems make is to institutions, to governments, to corporations: give us your tricky problems of rationing, essentially, of doing more with less, of dealing with situations where scarcity is built in. That’s what this technology is really good at. It doesn’t do anything, it doesn’t produce anything, but what it’s really good at is that kind of...
00:08:50 Richard Kemp
Right.
00:08:59 Dan McQuillan
allocating stuff, or allocating and withholding stuff, at scale. That’s what it does, right? It handles recommendations on YouTube at scale. It can also handle recommendations on who should get a welfare...
00:09:02 Richard Kemp
Yeah.
00:09:11 Dan McQuillan
benefit, at scale.
00:09:13 Richard Kemp
Right.
00:09:14 Dan McQuillan
And this stuff is currently, I would say, a lot more invisible.
00:09:19 Richard Kemp
It’s certainly something I wasn’t aware of at all before reading your book: that there’s more to this AI situation than just you giving up your privacy. Of course, giving up your privacy is a huge thing, but I didn’t realize how much power it had on a societal level. Not AI specifically, but those who are creating the AI for their own benefit and what appears to be our benefit. There’s so much going on at a societal level is what I’m trying to say, and that’s something I wasn’t aware of at all before reading your book.
00:09:58 Dan McQuillan
I think, well, if the book has any kind of impact or effect, I hope that’s one of them: to help people look at it differently. The book in that way is well timed, because we are all becoming a lot more aware of this thing called AI, and those people who are paying a bit of...
00:10:18 Dan McQuillan
attention, I think, are also starting to become pretty aware that it’s maybe not all good news. They’re going: oh yeah, this stuff seems to be redoing the whole racism thing, it seems to have some problems. So there’s definitely bias and discrimination going on.
00:10:33 Dan McQuillan
And that’s all true; it’s actually a pretty core problem, not easy to wave away. But I think there’s something more going on.
00:10:47 Dan McQuillan
You know, my reading of AI is not trying to be deterministic. It’s not trying to say AI caused this, or AI by itself changes the world in this way. What we need to look at with any technology, I would say, is what else is going on at the same time. You said it makes offers to us, right, which it does at the consumer end.
00:11:07 Dan McQuillan
You know, you can unlock your phone with your face and this kind of stuff, and we think it’s incredible, but it’s also pretty superficial. It’s making a lot of other offers in a lot of other ways, at a time when a lot of organizations, institutions and sort of powerful people are in the position of having to face some challenges which they didn’t face before, along different fronts. And you can see that that kind of pressure is resulting in quite a lot of changes in what we think of as democratic government. What we think of as liberal values in practice seems to be being chucked overboard almost faster than we can watch at the moment.
00:11:52 Dan McQuillan
And I just think it’s important to consider what AI does alongside that.
00:12:01 Dan McQuillan
In what way it makes an offer to those kinds of operations; in what way it might even amplify or intensify those operations. That’s really the answer, the very belated answer, to your second initial question: why an anti-fascist approach?
00:12:18 Dan McQuillan
Or why even conceive of it as something that needs what sounds like a kind of extreme response. I mean, that sounds very over the top: you’re saying AI’s fascist? It might have a few problems, and maybe it discriminates, but, you know, steady on. Basically, what I’m trying to say is that...
00:12:36 Dan McQuillan
there’s, you know, an apparatus here, which is the AI, but it’s also the institutions using AI, and the kind of politics that looks to AI.
00:12:49 Dan McQuillan
It’s that two-way thing. AI is offered as a solution, but the technology doesn’t actually solve anything. In fact, what it does, because of the way it works, is tend to gloss over the underlying structural problems.
00:13:04 Dan McQuillan
It’s basically a kind of quick-fix technology. Here’s a problem? OK, let’s get enough data, we’ll throw some algorithms at it, we’ll find a way of producing a kind of automated or semi-automated...
00:13:17 Dan McQuillan
way of dealing with the situation. It is offering a tech solution to things that really don’t have a tech solution, and right now, whether you look at refugee flows or climate change, that’s not the kind of solution we need, basically, in my opinion.
00:13:36 Richard Kemp
Yeah, thank you so much, Dan. It’s a huge topic. Not only did I not realize how much I didn’t know, I didn’t realize how huge it was either.
00:13:48 Dan McQuillan
Well, I’m hoping the book has that effect as well, you know, to get this stuff on people’s radar, because people are concerned about this stuff. Obviously I work in and around the area, so it’s my daily bread to some extent, but...
00:14:05 Dan McQuillan
I think the awareness of AI is there in society, particularly around things like facial recognition, and that’s an important area.
00:14:12 Richard Kemp
Sure.
00:14:12 Dan McQuillan
But I think it’s much more to do with understanding its role in a generalized transformation at the moment, which is fairly crisis-driven, and crisis-driven transformations carry their own risks, because of the tendency, which is human and social and political as well as technical, to panic, to reach for a quick fix. Especially if your motivation is to keep things the way they are, not to make any fundamental transformation, certainly not to lead to any sort of widespread redistribution of who has a say in society. If you really just mainly want to keep things where they are, but you’re struggling with the general breakdown of authority, of belief in the way things have been done before...
00:14:59 Dan McQuillan
you’re just going to reach for the stuff that’s to hand, whatever might sort of fix it. And right now AI is one of those things. AI is very much to hand, if you’re big enough and powerful enough.
00:15:12 Richard Kemp
That’s the thing too. Yeah, there was a point in the book that I found very interesting as well, about how the main focus for AI is always to preserve the status quo. It has something to do with the fact that all of the knowledge that AI could possibly ever have is based on the past, and the past as we know it, up until today, hasn’t necessarily always been so equal. And so AI is going to keep continuing that status quo.
00:15:47 Dan McQuillan
I think so. I’d argue that that’s, in some ways, straightforwardly the case, and it actually goes right down to the roots of how AI approaches making its classifications. I wouldn’t say decisions, but making its classifications, drawing its boundaries. It can’t do anything else: it can only learn from the data we already have.
00:16:12 Dan McQuillan
Now, there are some pretty cool things around. I mean, I personally love playing with some of the creative AI tools, the sort of image-generating tools and things like this that have a kind of creative aspect to them, definitely. But that’s not a sort of innate creativity; it’s more a sort of mix and match, a kind of collaging of what came before. It can be very funny, but it’s not original in the sense that having an original solution to a problem, by looking at it utterly differently, can be.
00:16:48 Dan McQuillan
AI is all about correlations, and that’s one of the things that also makes it fundamentally problematic, because it’s not even built on a causal model. It’s not even looking at why something happens; it’s just looking at what tends to happen at the same time that something happens, or who happens to be in the room when something happens, or who happens to be friends with somebody when something happens. And we all know, for a start, that that’s quite problematic. That’s kind of guilt by association, essentially, in all kinds of dimensions.
00:17:19 Dan McQuillan
But the biggest correlation that it has, I think, is, as you say, with the status quo. It’s just a reproductive mechanism for the structures that we already have, and one might argue, as I definitely would, that that’s not adequate right now.
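Dan's point that AI learns correlations rather than causes is easy to demonstrate numerically: two quantities that never influence each other will still correlate almost perfectly if both happen to follow the same background trend. The series below are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 200)   # a shared background trend (e.g. time)

# Two unrelated quantities, each driven by the trend plus its own noise.
# Neither one causes the other.
a = t + rng.normal(0.0, 0.5, t.size)
b = 3.0 * t + rng.normal(0.0, 0.5, t.size)

# Pearson correlation between the two series.
r = np.corrcoef(a, b)[0, 1]
print(round(r, 2))   # close to 1: a strong correlation with no causal link
```

A model that only sees `a` and `b` has no way to tell this apart from a genuine causal relationship, which is the "guilt by association" problem in miniature.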
00:17:36 Richard Kemp
Right, right.
00:17:37 Richard Kemp
Yeah, I wanted to focus a little bit on a point early on in your book where you talk about the difference between real intelligence and the performance of intelligence. You mentioned it a bit at the beginning of this episode today, and I think when I first read that I thought: oh yeah, I experience that a lot, like when I try to get my Alexa to do something and she doesn’t understand me, so I have to figure out the most clunky way of saying something, and she’ll finally do it.
00:18:02 Richard Kemp
Or I just don’t ask her to do those things anymore, because I know that’s not in her wheelhouse, so I’ll get my phone out instead. But you give way scarier examples of this. There was an image algorithm that you described, I think it was called ImageNet, and you say something about how the way AI works is through categorization. This image algorithm, sorry for probably not using the terms correctly, was categorizing these different images, and some examples you gave were, for example, people escaping a flood categorized as ‘people on a beach’,
00:18:44 Richard Kemp
a crashing plane categorized as ‘a plane’, and a soldier holding down a young boy as their family fights back labeled as ‘people sitting on a bench together’.
00:18:55 Richard Kemp
Those were harrowing examples, and my first knee-jerk reaction was: surely this is an anomaly. But I’m afraid to think that maybe it’s not, and I guess I’m wondering how greatly concerned we should be about this sort of performed intelligence over genuine intelligence.
00:19:17 Dan McQuillan
Sure. Well, the short answer is: probably very. I mean, I can expand on that a bit. One thing I would say...
00:19:25 Richard Kemp
Please.
00:19:27 Dan McQuillan
is that the book I wrote is not an exploration of the idea of intelligence as such, I would say. Or then maybe it has a kind of social intelligence agenda, in the sense that I am arguing for...
00:19:42 Dan McQuillan
ways of doing things that would, on the whole, be better for the common good. So maybe that’s the kind of intelligence in it, a kind of emotional intelligence. I mean, I put a lot of emphasis on care, which...
00:19:54 Dan McQuillan
we could talk about. But I’m not really speculating in a philosophical sense, as people do like to do quite a lot, about whether a machine is really intelligent or not, Turing tests and everything like that. I’m not really interested, because my interest is in what this stuff does in the world, what effects it has. Now, that is relevant to your question. I mean, those images, which are examples that other people have highlighted: for starters, what they’re doing is really reflecting this sort of parroting effect, this imitative activity of AI. It has fundamentally no idea about the world. It really doesn’t; it doesn’t have any ideas, full stop. It just has patterns derived from the world. There’s no understanding, not just of causality, but of context.
00:20:41 Dan McQuillan
It looks at something as vivid as that famous image of a girl running away from napalm in Vietnam, the sort of classic image from the Vietnam War, and classifies it as, you know, ‘undressed girl running down the road’ or something. It has no understanding of context. But, you know, that’s from the point of view of the people...
00:20:57 Richard Kemp
Oh.
00:21:02 Dan McQuillan
who built it, 99.8% accurate.
00:21:07 Richard Kemp
In that way, it’s a success, yeah.
00:21:09 Dan McQuillan
Exactly, in that way it’s a success. Absolutely. And just to go back to your original point about this kind of performance idea, because what I think you were describing, when you were talking about the Alexa thing, was in a way how Alexa was altering your behavior...
00:21:24 Dan McQuillan
to fit in with the Alexa device itself, or the Alexa operations. Right. And what I’ve actually written about in the book as performativity is this broader tendency for this kind of stuff to produce the subjects it expects, you know?
00:21:28 Richard Kemp
Definitely.
00:21:44 Dan McQuillan
So through its sort of pervasive capacity to do this kind of classification, it shapes our daily life, and obviously we start to behave in the way that it expects. Take an Uber driver; this is well documented. Uber drivers have all sorts of performative behaviors that they understand the algorithm needs to see, even if that’s not how they would normally drive, or how they would normally react to passengers, whatever it is. They understand very well how they’re being judged and classified in an automated and ongoing way, and so they alter their behaviour to suit. But what I think is even more...
00:22:25 Dan McQuillan
important, or what you can’t really disentangle from that, is that if the experience of it is so pervasive, it also becomes your actual lived experience. You end up not just performing, but experiencing life as the subject that the AI expects. Very concretely: Amazon...
00:22:48 Dan McQuillan
delivery drivers in the States have to have this thing called a Netradyne camera array in the van. The Netradyne is all that kind of very clever stuff: an AI camera system that...
00:23:01 Dan McQuillan
points backwards along the van, points forwards, and so on and so forth, and it analyzes in real time what’s going on in and around the van. It’s sold to the drivers as: you’ll get extra credit for driving safely, and you won’t have to worry about false accusations of stealing parcels, because the AI’s got an eye on it, so don’t worry.
00:23:20 Dan McQuillan
You know, all this kind of stuff. Of course, the drivers aren’t fooled that anybody introduced this for their benefit, right. But what does happen is that, a bit like the self-driving cars, which it turns out have a tendency to kill people...
00:23:33 Dan McQuillan
these algorithms have quite strong limitations, like your Alexa. And one of the things the Netradyne algorithm has to do in its programming is define what safe driving is, and therefore, how would you say, inhibit drivers’ tendency perhaps to accelerate or pull up too close to the vehicle in front. But in a situation where somebody else pulls right in in front of you...
00:24:02 Dan McQuillan
and this is what the drivers themselves report, they then get reported for unsafe driving. So you’re driving along, doing what you do, some idiot basically pulls in in front of you, and the AI goes: you’re a bad driver.
00:24:15 Dan McQuillan
Now, of course, they know this is rubbish, and they complain about it. But in the way these systems work, the Netradyne algorithm is seamlessly integrated into the bonus payment system for drivers.
00:24:27 Richard Kemp
Ohh, I see. Yeah, I can see where this is going.
00:24:30 Dan McQuillan
Exactly, exactly right. So it’s all very automated, and you see all these words that people who promote systems like AI would elevate as values: it works seamlessly, it works at scale, it’s very efficient in the sense that it works according to its own metrics very effectively. And so what happens in the real world, for those of us inhabiting the real world, is that drivers get their bonus payments docked in ways that they then find very hard to appeal...
00:25:04 Dan McQuillan
because this system is not really built with, let’s say, democracy in mind. It’s built with optimising the efficiency of deliveries in mind, and if some drivers end up as collateral damage in that...
00:25:08 Richard Kemp
Right.
00:25:15 Dan McQuillan
...there are always other drivers; like there's always another Amazon worker. So, yeah. And the thing I'm really trying to get at there, I mean, we can see that that's not necessarily fair or just. We can see that it starts to create, in the same way as with Uber drivers, a kind of algorithmic miasma around you, which might start to make you feel a little bit...
00:25:19
Mm-hmm.
00:25:36 Dan McQuillan
...annoyed about what you do and how you do it, definitely.
00:25:39 Dan McQuillan
Yeah. But the thing, I think, really, is that you will end up with that as your actual lived experience. It doesn't matter what you think about your driving: if the AI thinks you're a bad driver, in effect you are a bad driver, and everyone else will react to you as if you're a bad driver, so you will start to experience it as if you are a bad driver. You know, this becomes...
00:25:57
Yeah.
00:25:58 Dan McQuillan
...this becomes actually constitutive of, as they like to say in academia land, constitutive of our subjectivities: how we experience life. And that's just one amongst many examples of how this stuff is filtering out into daily life.
00:26:12 Richard Kemp
Yeah. Well, I saw in your book you give quite a lot of examples of that exact kind. You paint a picture of workers who are living an anxious, exhausted life, with a kind of data-hungry panopticon always around them: always watching, always penalizing. And also, because you're talking about an anti-fascist resistance, about resisting the veer into authoritarianism or even fascism: when I started reading those stories, about the always exhausted, always anxious workforce, that was the first time I started seeing, oh, if this is allowed to continue to progress, this is where it appears to veer into authoritarianism, or maybe fascism. I guess I was just wondering if you could talk to us a bit about that: whether that's a sign, or even just a building block, of going into it.
00:27:30 Dan McQuillan
Yeah, exactly: building blocks. I'm glad to hear that it has. Obviously we've been focusing a bit on the more dystopic side of things so far in the conversation, and I think that's fair enough; this book is meant to be a sort of warning as much as a rallying cry. And I make the same association as you. I look at the effect on workers and workplaces, both in the production of AI itself and, more broadly, in what it's actually used for, and I say: yeah, that's pretty bad. And I think it's, well, it should be a source of regret to all of us that our time's most advanced technology, which I think AI has a good claim to be, actually: it's very sophisticated, it's very clever in that sense, and it certainly uses a lot of resources. So why have we ended up in this situation where our most advanced technology, let's just think about work for a second, is enabling a reversal of working conditions by a hundred years, to a time before all of the protections that people had to fight so hard for in terms of their conditions and their working lives?
00:28:31
Mm-hmm.
00:28:40 Dan McQuillan
Why is it that this incredibly advanced, futuristic technology, which in the movies is portrayed as the sci-fi future, is actually a time machine into Victorian working relations? That, at least, is worth questioning. And on your specific question about whether this is a kind of symptom or indicator: I would say it's a direct indication of a slide towards authoritarianism in one simple sense, which is that historical fascism has always been used in different ways by the powers that be. There are a few occasions when fascism itself, fascist movements, fascist political parties, actually took over, and we're actually living in those times again, right. I think if you look at governments in places like Hungary, Poland, maybe Brazil as well, increasingly you would say: OK, these governments have actual fascistic elements in them.
00:29:39 Dan McQuillan
But mostly, fascism is a kind of foot soldier of the status quo.
00:29:45
Right.
00:29:46 Dan McQuillan
And this is brought into play under certain conditions: when things are coming apart a bit, and it's hard to maintain control using the usual, democratically legitimized methods...
00:29:58
Hmm.
00:30:02 Dan McQuillan
...when the elements of the status quo feel that they need to reach for something a little bit extracurricular. That's where fascism comes in. And it's always been used against the workers, right? Historical fascism is always anti-worker. So, at the very least, if you're looking at a technology that seems to be incredibly anti-worker, you might say: well, that's an affordance, that's a capacity it has to align with other agendas. But I just want to short-circuit for a moment, because it came up in my mind as we were talking, just to broaden it momentarily beyond the...
00:30:31 Richard Kemp
Yeah, of course.
00:30:36 Dan McQuillan
...beyond the workplace, which I'm sure we'll get to anyway.
00:30:37
Mm-hmm.
00:30:38 Dan McQuillan
...and make a direct link with what everybody would probably recognize as the reality of far-right agendas. Let's look at the recent Dobbs decision, the overturning of Roe versus Wade in the USA: the making illegal of the right to abortion in the USA.
00:30:52 Richard Kemp
Right.
00:30:58 Dan McQuillan
And that's clearly a far-right agenda, clearly a long-term result of planning by political elements on the far right. But put that alongside the capacities of AI. If we're talking about, OK, to be a bit jargony, constructed subject positions: who we are, who we're seen as, who the system sees us as. And then look at the capacities of systems like AI, which absolutely specialize in inference. They do prediction. They look at you, Richard Kemp, and they say: are you likely to be the kind of person that leaves your workplace within the next six months? That's what human resources systems using AI now do, right? Put everyone's data through a system and look for the weakest link, look for the people that are likely to leave, or something like that.
00:31:47
Yeah, yeah.
00:31:48 Dan McQuillan
Check, I'd say: check your HR system. And in AI time this stuff is as old as the hills; it was almost one of the first applications of the thing. So again, it's another one that's fairly invisible to most of us: we don't know we're being rated and assessed.
00:32:03
Yeah.
00:32:06 Dan McQuillan
And I mean, the poor young people, they know it, because when they apply for a job they usually have to do it through some kind of AI system: they submit a video to some stupid AI-powered app to even get the chance to have an interview. And again, they have to perform in certain ways, blah blah blah. It's all very much part of the same thing. But just to go back to the discussion about abortion: you've got a situation in the States now where you can be inferred to be pregnant, not because you've reported a pregnancy to anyone. Twenty years ago the Target chain store was able to predict this using data analytics, pre-dating the kind of AI we've got now, which is a hundred times more powerful. They were able to identify women who were pregnant from patterns of sales of things like cotton buds and lotion and what have you.
00:32:52 Dan McQuillan
And that was very early. The incredibly powerful AI we have now, which like you said at the beginning is in every aspect of your life, will have no problem a lot of the time in coming up with an inferred category of pregnancy. Now, whether or not that inference is accurate doesn't matter: it becomes a category in itself. And the point I'm trying to get to, in a really long-winded way, is that that is now a potential crime scene in the USA. Being pregnant is a sort of state of pre-crime in the States, if your pregnancy doesn't come to term in a way that also gets recorded by the system.
00:33:31 Dan McQuillan
Yeah. So that implication of AI is wholly embedded with the politics of the USA, with this Christian far-right resurgence.
00:33:44 Dan McQuillan
Which is why, I guess, I'm writing a book with a fairly intense-sounding subtitle: an anti-fascist approach. I'm saying we not only can't ignore these kinds of implications, whether in work or outside of work; we should really be doing something about it.
00:34:00 Richard Kemp
Yeah, definitely. That's such a powerful phrasing of it as well: that's a crime scene. Because if your pregnancy doesn't come to term, for whatever reason (and people should be able to do whatever they want or need), then in the terms of the AI, which is a tool for the state, the institution which has control of the AI, you're not doing that process correctly, and therefore you will be penalized, or worse.
00:34:40 Dan McQuillan
You'll be penalized either there, or, let's look at the Uber driver data: did somebody take you across state lines? I mean, the implications are...
00:34:45
Right.
00:34:47 Richard Kemp
Oh, you create a web of criminals, possibly, in your wake. Yeah. Wow.
00:34:48 Dan McQuillan
Later, for one. Yeah, yeah.
00:34:51
OK.
00:34:56 Richard Kemp
I also wanted to talk about the hopeful side, because you spend a good chunk of the second half of your book talking about what can be done, and also what is being done right now. For example, you use the example of workers...
00:35:00 Dan McQuillan
Yeah, let's do that.
00:35:15 Richard Kemp
...who are rising up against the injustices within their own institutions. I was wondering if you could talk to us about that, please.
00:35:23 Dan McQuillan
Sure, thanks. Well, thank goodness we got to the bit about what we can do about it; that's a big part of the book as well. You know, I draw most of my thinking from reading other thinkers, if you know what I mean. Most of those are in academia one way or another, and I rely on people's insights and critique, which is what we'd have to call it now. But where I kind of part company is that a lot of that critique is very sophisticated, and I think it has genuine insights, but it often tends to stop at the bit where it comes to: OK, what are we going to do about it in practical terms? How should we push back? I mean, it's something about academia, something about the way the whole thing is structured: you usually get to the end of the paper, or even a book, and, not that I'm trying to point at anyone in particular, the text itself will often say something like, well, this really needs to be studied further. Or, you know, this is clearly wrong and doesn't fit with liberal values, or whatever. I was very concerned for this book not to be a dystopian book, not just to be pure critique; really, I wanted to talk about, OK, what do we do about it? And as you say, that's where the second half of the book comes in. I tried to break it down into a few stages, really, because I think that AI has deep roots. AI is something we could look at as a kind of mirror, in a way, a focusing mirror, like the kind of dish that focuses a radio signal or focuses the sun: it takes in everything that's there...
00:36:57 Richard Kemp
Right.
00:36:58 Dan McQuillan
...and focuses it down into a kind of laser-like spot. So we can't just look at the focusing mirror; we have to look at where all that stuff is coming from.
00:37:07
Right.
00:37:08 Dan McQuillan
And you can trace AI back; I tried to do that a bit and say, well, why do we even give this stuff any credibility, given its obvious flaws, which I do work through at some length in the first half of the book?
00:37:20 Richard Kemp
I'd say it's important to do that, though, to give us not only what AI is, but what the problems are, before we start talking about the hopes and solutions.
00:37:28 Dan McQuillan
Thanks, yeah. I tried to make it that way because, also, I'm trying to draw some particular lines. I'm not trying to be angry about it; I'm trying to say: OK, this is exactly how it works, this is exactly how people are using it, and therefore that's why we need this kind of intervention. It's not saying, OK, we need a fair society, blah blah blah. It's, OK, this is specifically the way it works. So, for example, AI derives some of its claims to authority from science. It's not a science, it's not scientific, but it does do some of...
00:38:05 Richard Kemp
As in, AI isn't scientific? Well, yeah.
00:38:08 Dan McQuillan
Most practitioners would believe that they're doing at least computer science, OK, which is a particular term. But they would also believe that the work being done, OK, maybe it's not fully objective, I think people recognize that, but they would say you can calibrate its subjectivity to some extent. It is empirical; it's working on data; you can make estimates of accuracy. In fact, quite a lot of practitioners would say that the fact that there are biases in AI actually gives us a chance to statistically correct those biases. So some people would say that the empirical approach of AI is actually more progressive, in some way, than the status quo. That's one particular kind of claim. But I would say, in a kind of broader sense, we're all very inculcated with the idea that something that is based on data, something that uses statistics for analysis, something that is done by experts and draws its language a lot of the time from the language of science, has some kind of authority. What it's sold as is: OK, you and me, we might see certain phenomena, we might see certain things happening, but AI is drawing in this vast amount of data and is able to see patterns that we can't see.
00:39:33 Richard Kemp
Right.
00:39:33 Dan McQuillan
It's able to provide insights at a level that individuals, or maybe even institutions, can't. That's incredibly powerful, and those are the same kinds of claims to authority that science makes. Now, I do think science has problems, but AI certainly isn't a science, and those claims are, like, 100% spurious; nevertheless, it gets some traction off them, right. But here's the good bit. What we can do is say: OK, when people are questioning that kind of technical authority, when people's own lived experience, their own understanding of their own lives, is overridden without question, epistemically, you know, there are plenty of examples of how people have tackled that. The feminist movement is a great example: feminism was developed in a society where women's word literally didn't count, where the male worldview was the only view that counted. And with this understanding of epistemic injustice, we can just take that and say...
00:40:28
Mm-hmm.
00:40:33
Right.
00:40:42 Dan McQuillan
OK, well, actually, AI is doing an epistemic injustice. So let's look a little bit at what the feminist movement did to tackle that; let's look at how they came together to collectively validate an alternative understanding of particular situations, in ways that asserted their own right to have some say over the conditions of their own lives. And it's really the same; that's a generalized statement. So my first way of trying to tackle AI is to look at different ways of saying: how can we look at the specific, not just injustices, actually, but the specific kinds of violence that this apparatus is prone to, and look at historical parallels, and ask how people have dealt with these kinds of generalized injustices and violences in the past? And there is a really good worked example, actually, very closely related to technology...
00:41:39 Dan McQuillan
...which I quote in the book and which is very well known: the Lucas Plan. Lucas was a really, really big arms company in the UK for decades, but it started to implode back in the 70s, and the workers there, faced with imminent unemployment, said: OK, wait a minute. We have loads of skills, we have some really good tech. What else could we do with it?
00:42:08 Dan McQuillan
And so they got together in what I would call workers' councils: basically bottom-up, self-organized. And not just that: they took what they knew and tried to construct something profoundly different. They came up with their own innovations based on the sorts of technologies they had. And it's really, not weird, but very forward-looking, because the kinds of things they came up with are a lot of the things people talk about now in terms of solar power and hybrid electric vehicles, and this is back in 1976, I think.
00:42:40 Richard Kemp
Well, that was, yeah. Forward thinking.
00:42:42 Dan McQuillan
Forward thinking, yeah. But maybe it's not such a surprise. That time was the beginning of the environmental movement; it was the time of, whatever, second-wave feminism, and there were a lot of ideas around that these workers were picking up on, saying: you know what, this stuff we have has mainly been used in an arms industry, but we could use it differently. But then we have to approach things completely differently; how we define what the problems even are...
00:43:10 Dan McQuillan
...has to come from a different place. And that's really where I'm getting to with the "OK, what do we do about it?" In the same way, you can't just take a fighter plane and say: OK, how do I use this fighter plane for good? Could you use it to sow seeds very quickly across fields? I'm not sure, but basically it's not great, is it? So let's not just take the fighter plane. Let's take a step back, not necessarily abandoning all the stuff we've learned through doing this, and say we need to start from a completely different place in how we construct the problems, who has a say, and how we account for what's going on.
00:43:46 Dan McQuillan
And really, to have that accounting, and to have the idea that when we look at technology, when we look at implementing technology, at the forefront of what we're thinking about is: what social effects is this going to have? Not how efficient it is, not who it makes money for, not...
00:44:01 Richard Kemp
Right.
00:44:01 Dan McQuillan
...not what problem it alleges to solve, but simply: what effect is it going to have on the majority of people?
00:44:08 Richard Kemp
Right.
00:44:09 Dan McQuillan
And the structures we need to start with are of the same kind: the feminists came together collectively, as groups of people, to figure out their own problems, and so did the workers in the Lucas arms company. I'm suggesting that same kind of model.
00:44:27 Richard Kemp
That sounds like a much better future ahead of us. We have this technology; we've created this technology, a lot of the time off the back of the workers, and off the back of marginalized people. We say we don't understand a subsection of people, they get marginalized over and over and over, and the fix we have for that is more data surveillance, more privacy infringement, more dignity infringement. And here you're saying, in the second half of the book all the way to the conclusion, that the people who are being left behind, the people who are being boxed out of progression, those are the people we should be bringing in, with people's councils, with workers' councils. Both for work, but also institutionally: improving our cities, improving our countryside, improving the healthcare of the subsections of people who always get left behind, or who have assumptions about their healthcare forced upon them. All of those things require, at the very root, that the people who are being affected have a seat at the table, as is the phrase of our day.
00:46:01 Dan McQuillan
Yeah, I think you've absolutely nailed it. They need a seat at the table, and what's more, what they have to say is a vital corrective. It's not that we're expecting people who've had all the experience you laid out, and that kind of experience is so damaging in its own right, to sort it all out; we're not going to dump the responsibility on people and say: well, hey, you're marginalized, now you can fix society straight away. It's not like that. It's that having all of these different experiences as equally valuable parts of a discussion that is direct and democratic and inclusive, and so on, is a starting point. And again, that's the message from the feminist movements; it's the message, I think, of the anti-racist movement. It's the message, actually...
00:46:51 Dan McQuillan
...of those movements that have, from a very sensible and considered position, critiqued the way science and technology has been done. They've all said something very similar. Now, I try to connect that very directly with AI specifically, and say: OK, why is this very relevant to AI, not just nice utopian thinking, which I believe it is, and there's nothing wrong with that, but what does it say specifically to AI? And I think, well, again, in the kind of neat summary you just came up with: one of AI's unfortunate tendencies, to me, would be its tendency to essentialize groupings, the way it solves problems by assigning things into rather fixed groups. That's an abstract thing it does because it's just a form of computing, but it's likely to click very neatly with people who are looking to perpetuate problems, really, rather than solve them, by fitting entities and people into in-groups and out-groups.
00:47:59 Dan McQuillan
And that is really something we should all be extremely alarmed about. You don't even need to see the idea of refugees being deported to Rwanda to understand that in-groups and out-groups are now a matter of life and death in our society. You know, people who have pre-existing conditions and therefore should basically be allowed to die from COVID: that's a very bad out-group to be in. There are so many injustices enrolled in that, and it's very hard, at that moment, to enrol AI as a system that cares about fixing that stuff. It doesn't care about anything; it's just set to optimize for the interests of the people who defined what the problem was. So I'm really just doing a long-winded footnote to what you said, and the important thing is...
00:48:44 Richard Kemp
Well.
00:48:44 Dan McQuillan
...the interests you're talking about are part of an inclusive process, yes, but I would say, if I were to characterize what AI does: the stuff that's already there, the stuff it intensifies, is a carrying-on of what is a very, very long process of enclosure.
00:49:07 Richard Kemp
Right.
00:49:07 Dan McQuillan
It's carrying on that long process of grabbing, holding on to, and pushing out, which has shaped pretty much everything about the societies we currently have, also on a global level in different ways. And so I think the potential is there to start from a different perspective, to approach technology in a different way, and to ask: OK, can technology be part of the process of putting things back into the common good?
00:49:33
Mm-hmm.
00:49:39 Dan Macquillan
Or just this idea of the Commons. Can technology, rather than being an extractive thing rather than being a thing based on exclusion and and so the zoning and boundary and redlining things, can we construct our technology systems alongside our?
00:49:54 Dan Macquillan
Our people systems, a way of being together. Can we put these things together in?
00:49:57 Dan Macquillan
A way that acts.
00:49:58
The.
00:49:59 Dan Macquillan
Extends the amount of things that we do to support the interdependency. The vulnerability, that is what we people really depend on all of the stuff that goes on all of this technology and the science and the business and everything else all depends on this invisible and cared for our mostly mostly targeted layer of people.
00:50:20 Dan McQuillan
People who, at some point or another, look after each other in their most vulnerable moments, and it is mostly other people doing that, above and beyond what the system provides for. That’s really the base layer of society. And I’m suggesting an approach to technology that acknowledges that and says, well, actually.
00:50:37 Dan McQuillan
Care, you know, mutual care, interdependence.
00:50:42 Dan McQuillan
Put sort of tactically, I would call it solidarity. But these things are, I would say, the valuable countermeasures to where things are currently heading. And I am raising the question; I also finish with a lot of questions. I don’t have the answers, by a long chalk; I wouldn’t claim to.
00:51:02 Dan McQuillan
So I think my question to technology is: what can you do for that? How can you help that? Can AI help a social arrangement based on the common good? And at the moment, in the book, I’m saying the way we’ve got AI constructed absolutely doesn’t. It cannot. The way it’s built, it won’t do that. So let’s think again.
00:51:22 Richard Kemp
I love that. I think that’s a lovely note to end on as well, Dan. So thank you so much for talking with me today, for coming on the Transforming Society podcast. It’s been an absolute pleasure to talk to you.
00:51:26 Dan McQuillan
Cool.
00:51:36 Dan McQuillan
Well, thanks for asking such great questions, and also summing up the agenda of the book there so neatly towards the end. And actually, you know, the book is about AI, but it is also about transforming society. So it seems like a good conversation to have. Yeah.
00:51:46 Richard Kemp
It absolutely is, yeah. And where can we find you? Presumably online.
00:51:52 Dan McQuillan
I mean, well, I’ve got a website where I tend to post stuff sporadically, at danmcquillan.io.
00:51:58 Dan McQuillan
But generally I’m on Twitter, sort of, you know, ranting about various things related to AI and transforming society.
00:52:06 Richard Kemp
That’s also at @danmcquillan?
00:52:08 Dan McQuillan
That’s it, yes, exactly, @danmcquillan.
00:52:10 Richard Kemp
Great. OK, I’ll give you a follow directly after this and I should also let people know where to find the book so.
00:52:14 Dan McQuillan
Awesome.
00:52:18 Dan McQuillan
Yes.
00:52:19 Richard Kemp
Resisting AI: An Anti-fascist Approach to Artificial Intelligence by Dan McQuillan is available at Bristol University Press. And please, it’s a brilliant book. It’s so interesting, and I loved reading it, and I particularly loved talking to you about it today, Dan. So please, yeah, go pick up a copy.
00:52:41 Richard Kemp
And learn, and have your mind blown.
00:52:43 Dan McQuillan
And let’s continue the conversation. I’m putting this stuff out there because I really, really want to connect to people who care about this stuff and hear what other people have to say, because it’s only together that we’re going to come up with any ways to improve the situation. So I’m out there; it’s not hard to come and find me. Let’s talk about this.
00:53:02 Richard Kemp
That’s great. Thank you so much, Dan.
00:53:03 Dan McQuillan
Thank you very much.