Ars Electronica Q&A with Lauren McCarthy about her Transmit³-Residency: The Changing Room.
Designer, artist and programmer Lauren McCarthy explores the structures and systems of social interaction, identity and self-presentation, and how technology can influence the dynamics between them. During her Transmit³ residency she set out to explore the influence of Big Data on togetherness in front of The Cube.
Being one of the world’s largest digital interactive learning and display spaces, The Cube is perfect for social experiments based on the technological manipulation of human beings. For Lauren McCarthy, it functions as a “smart social environment” that reacts to the surrounding mood. Moreover, it influences the people gathered there – at the flick of a switch, so to speak.
Renowned for raising delicate questions about interpersonal affairs in virtual spheres, she based her latest experiment on a utopian premise: people who have lost their human touch try to regain feelings through technological devices, following a “make a wish” principle – this time not only for themselves, but for an entire surrounding.
Lauren McCarthy gave a thorough account of the state of her Transmit³ residency and of how students and staff at QUT react to the “provocations” of the display.
How did you get the idea to consider Big Data in the context of social interplay?
Lauren McCarthy: We hear companies and politicians talk about “Big Data” in a way that is abstract and opaque. I think it’s difficult to understand what all that really means to us. But for me, this is crucial. Our personal data is deeply related to our identity – it describes us, it reveals insights into our lives and personalities. It also shapes us. The systems and structures we build and use for gathering, storing, and interacting with our data define the way we think about ourselves. The models of ourselves that we create with this data determine the actions we take. So I am wondering: can we do something more interesting than counting our steps, tracking our calories, collecting likes on our Facebook posts? What happens if instead we look at what this data means in the context of our social relationships? Are there possibilities for improvement, or is it something we should be critical of and questioning?
Help or “HELP!!!”: Take a look at this Google+ Hangout app which analyses voice and facial expressions to ease the conversational flow.
Can you please introduce us to the set-up around The Cube at QUT?
Lauren McCarthy: As you might know, the set-up of the sensors around The Cube makes emotion tracking possible, but we enhanced it so that we can track people walking around the space, their facial expressions, their audio and their voice pitch. People walking into this area face a question: “How do you want to feel?” There is an interface presenting several hundred emotions, and when you pick one, you obviously are not picking an emotion only for yourself, but for the entire space. Say you pick “nostalgic” or “envious”: The Cube responds in a way that makes everyone else in this place feel the same way – by changing the ambience via color, sound, light, and so on.
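The core mechanic she describes – one visitor’s chosen emotion driving the ambience for the whole space – can be sketched roughly as follows. This is a hypothetical illustration only: the emotion-to-ambience mappings and function names below are invented for the example and are not the actual software running in The Cube.

```python
# Hypothetical sketch: one chosen emotion sets the ambience for the
# entire shared space. All mapping values here are illustrative.

AMBIENCE = {
    "nostalgic": {"color": "#d9a86c", "sound": "soft piano, vinyl crackle", "light": 0.4},
    "envious":   {"color": "#3f7a3f", "sound": "low dissonant drone",       "light": 0.6},
    "joyful":    {"color": "#ffd24d", "sound": "bright upbeat chimes",      "light": 0.9},
}

def apply_emotion(emotion: str) -> dict:
    """Return the ambience settings the whole room switches to."""
    settings = AMBIENCE.get(emotion)
    if settings is None:
        raise ValueError(f"emotion '{emotion}' is not on the menu")
    # In the installation this would be broadcast to the lights and
    # speakers of the space; here we simply return the shared settings.
    return settings

# One person picks "nostalgic" – everyone present gets this ambience.
print(apply_emotion("nostalgic")["sound"])  # prints "soft piano, vinyl crackle"
```

The point of the sketch is the asymmetry McCarthy describes: the interface takes a single individual choice as input, but the output is applied to everyone in the room at once.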
Why do you give one person the power to choose a mood for the surrounding people?
Lauren McCarthy: Sometimes I get on a train and face someone who is crying. Then I think to myself: “Why aren’t there more people crying? It can’t be that nobody else in this place feels sad. Or they could maybe support that person by crying over his or her problem too.”
That would be a counterbalance to the way we are raised, which is not to show any unpopular emotion – something that goes along with a lot of fear. But it would mean you don’t instruct people toward a uniform society but rather the other way round: a more varied society, by allowing people to show more shades of feeling.
Lauren McCarthy: Yeah, the question I had for weeks was what the instructions ought to do. Are they designed to make you feel good or to behave well? Maybe we’ll leave that goal up to the people, let them pick whatever they want to experience and not weight the options. But it’s not like buying train tickets! There is no round trip, so there is no guarantee that you’ll come back to where you started from.
How can you ensure that people are not flipping the switch the way they zap through TV programs, so that there is no real investment in the feeling?
Lauren McCarthy: We are still tweaking that, trying to make sure that it won’t be just sporadic.
Since there won’t be an indefinite number of emotions on offer, how do you decide which ones are desirable?
Lauren McCarthy: We have a few hundred, and yeah, we can’t have everything, but there is a vast range. Looking ahead, I have considered this installation for homes as well as public spaces. So when you return from work, maybe you don’t switch on the TV but instead put yourself into different types of emotions. Or you’ll have a dinner party and instead of turning on music, you’ll engage feelings. What is nice in public is that you interlink total strangers on the spot. A friend of mine was on a train and witnessed an accident where someone was badly hurt; he could not believe that instead of helping that person, everyone kept staring at their phones. I asked him: how could the other people know how upset he felt, when he was on his phone too, writing about how he felt? So my approach might be a way of knowing what the person next to you feels like.
Do you think you get a more honest impression of how people feel via such a device than, for example, on Facebook, where you can choose whatever you want to be for others?
Lauren McCarthy: THAT is one of the implications of this work. There was that Facebook study some years ago that everyone was really upset about: they found out that Facebook had experimented on users by showing them more positive content and eliminating the negative posts from their feeds, to see whether they would become happier. Everyone focused on how unethical that Facebook experiment was – but I was wowed and wondered: what would you make of turning on Facebook and being able to pick how excited or happy you would like to feel over the day? And what are the implications of that? Would it mean you have more control over your own or other people’s emotions? Or are you in danger of missing something? If you want to be happy all the time, aren’t you lacking something deeply ingrained in human beings?
Which is also very disturbing, because you are shaped in a certain way by technology and then try to get the human factor back, again via machines….
Lauren McCarthy: Yes, that is a kind of dystopian thought: that in the near future we could have lost all human touch or feelings, so that we would have to use a device in order to get our daily dose of sadness, melancholy, happiness and so forth.
Isn’t it also very 1984-ish to imagine displays telling you how to behave and react?
Lauren McCarthy: Of course it is a utopia that makes us think we are in total command, walking into a room and telling a machine how we want to feel – but then again, it is the machine which is in control, and that is what machines already do. We may think we own a smartphone and possess the power of that tool, whereas it is the other way round: a big company is imposing a whole system on us. They keep you under surveillance and control, just to find out what you are doing all of the time. The nature of this work is to juxtapose those two positions: on one side, selecting emotions feels fun, it adds freedom and lets you express yourself; on the other side, you get that dystopian edge.