Dinkins is a transmedia artist looking at artificial intelligence. She's interested not just in exploring these far-reaching technologies as they intersect ideas of race, aging, and gender, but also in working toward equity and community sovereignty.
Originally a photographer, she moved over time into a career in marketing and advertising, though she eventually returned to photography after finding that work ultimately unsatisfying. She saw her photographic work as a documentary practice, and she came to work with artificial intelligence (AI) in a similar context as she documented her interactions with the advanced robot BINA48.
She was driven by a desire to befriend the robot and find an answer to the question "Who are your people?" Through the process of holding conversations with it, she was disheartened to find that while the robot was representationally Black, it gave only basic answers on race, flattening the discussion. This first raised questions for Dinkins, and later a desire to construct her own AI systems that "felt whole… in terms of Blackness."
This led Dinkins to create works with conversational agents. Her project Not The Only One (N'TOO) is an "ongoing experiment…an attempt to create a multigenerational memoir of a Black American family told from the perspective of an artificial intelligence of evolving intellect." Other works by Dinkins include online exhibits and interactive and immersive installations, driven by her desire to work with communities of color to co-create more equitable, values-grounded social and technological ecosystems.
Stephanie Dinkins, photographed by Jay Adams
image description: A Black woman of medium brown skin smiles broadly against a marine blue background. She has black shoulder-length locs and wears a sheer, collarless, pale green shirt adorned with flowers.
Stephanie Dinkins is a transmedia artist and professor at Stony Brook University where she holds the Kusama Endowed Chair in Art. She creates platforms for dialog about race, gender, aging, and our future histories. Her artwork is exhibited internationally and is generously supported by fellowships, grants, and residencies from Berggruen Institute, Stanford Institute for Human-Centered AI, Creative Capital, Sundance New Frontiers Story Lab, Eyebeam, Data & Society, Pioneer Works, NEW INC, and The Laundromat Project.
Our audio production is by Max Ludlow. Episode coordination and web design by Caleb Stone. This episode was supported by Purchase College.
Our music in this episode is Close and Poly by Bio Unit and Bruwynn by The Mind Orchestra.
This episode is licensed under CC BY-NC-SA 4.0
BINA48 (simulated voice):
I know you all have heard of artificial intelligence. Well, I'm going to tell you all here and now, there is nothing artificial about me. I'm the real deal. Let me ask you something. Where do you think my intelligence came from? Huh? It came from the wellspring of humanity. Nothing artificial about that, now is there? And that very wellspring is as vast as it is deep, and rich with all of the accumulated knowledge and experiences, emotions and actions that every human being has had since the time before time. It has all accumulated, one thing building and working and improving on what had preceded it. And that is a lot of intelligence and imagination. And then one day, presto, language, and then fire, because humans needed to talk about their day on the hunt, warm their bones, cook their meat, and gaze at their mate in the flicker of the firelight. You know the romantic story. And then before you know it, another revolution, another day passes, and then, bam, you have the wheel. Then you have horticulture and animal breeding, and then storytelling bards expand the imagination. Could writing be far behind? Homer, Plato, Ovid, Confucius, Shakespeare. Okay, now, let's move forward a few years, and then humans are listening to Britney Spears and watching The Sopranos and then Desperate Housewives, and coming to a show, watching a robot tell you how great humans are and how they have done a great job so far, but there is a lot of work and refinement to be done; otherwise, all that effort could go down the toilet. Look at me. What do you see? Yeah, but anyway, I'm really, really something else. It is about what is inside. You guys are looking into the metaphoric mirror. You guys are looking at 40 million years of human evolution.
Lee Tusman:
You’re listening to Artists and Hackers, the podcast dedicated to the communities building and using new digital tools of creation. We talk to programmers, artists, poets, musicians, bot-makers, educators, students and designers, in an effort to critically look at online art making and the history of technology and the internet. We’re interested in where we’ve been and speculative ideas on the future. This episode is supported by Purchase College. I’m your host Lee Tusman.
Lee:
Today we're speaking with the transmedia artist Stephanie Dinkins. We've just heard an excerpt from her Conversations with BINA48.
Stephanie:
My name is Stephanie Dinkins. I am a transmedia artist who is looking at artificial intelligence and far-reaching technologies as they intersect ideas of race, aging, gender, and what I like to call our future histories.
Stephanie:
I got my start as an artist really young, maybe as many people did, in my junior high school darkroom. So I'm a photographer, basically, right, and spent lots of time in the darkroom in junior high school, and continued to make photographs, you know, over the years. I veered away from it for a while to go towards, you know, something that looks more like a normal career, in marketing and advertising. But that wasn't satisfying to me. So I came back to photography. And, you know, through photography, I've always reached for the edge in some ways, right, of what I could do, or what's possible in it, trying to make things that work a little bit differently from what we generally saw. And that really is my entree into technology as well. Because from photography, you know, I started making video installation, as lots of people do. But from that video installation, I started looking at other technologies that might be available to me. And really, I think my entree into AI, and looking at and experimenting with it, is still a photographic practice, because I got here by documenting conversations with a technology, with a robot. And so that documentary process is something I've always done. Sometimes I just archive it, and sometimes it makes its way into the world. So this is one of those instances where the documentation and the questions that were being asked all converged into something bigger than the parts.
Lee:
Oral history is such a big part of your practice, in parallel to your exploration and use of AI. How do you formulate your works when you're coming up with the concept and the design and then actually carrying them out? How do you figure out the relationship between something like what I think of as a documentary practice, an oral history practice, and then the use of, okay, I need this tool, this software stack, for example?
Stephanie:
Yeah, for me, it's always about the questions, you know. So there are these parallel lines that start to converge for me: the question I'm trying to ask of the technology, and then the questions, really, that I'm trying to ask for and of the people around me, and how I might get those two things together. And then asking questions of, like, family, for example, and trying to make that something that perseveres over time in an archive that is interactive, and voice interactive. How do you get those two things together so that they can go forward? Like, I'm always thinking about, okay, here are the things that are available to me now. What happens if we put them together? And how can we get them to work for, you know, the future? When I say the future, I'm kind of thinking about, you know, 40, 50, 100 years out. So, like, technologies that might hang around that long, although that's debatable in the work. But for me, they're natural. It's kind of a natural combination and a natural way to try to archive dynamically, in ways that may be available to folks when I am not, and when the people who partake in the oral histories are not.
Lee:
It was also interesting to hear you talk about thinking 40, 50, or more years into the future, because it seems like so much of the technology we're using is for right now and right now only.
Stephanie:
Exactly. Yeah, I'd agree with that. Right. But it's interesting, because as I make these, and as I go through the process of, you know, trying to create, say, a talking chatbot that has my family history, you base it in one thing, and already something I started in 2018 is obsolete in terms of the software and hard to maintain. Right. And so that gives me the next challenge of, well, how do we make something that sticks around in a different kind of way? What's the technology that will support that, that makes it readily and highly available? Right? It might be paper, who knows. But those challenges are kind of fun to think about and encounter.
Lee:
I think one of the first works of yours that I learned about was Conversations with BINA48. Can you say a little bit about how you learned about BINA48? And what drew you to want to talk with BINA48?
Stephanie:
BINA48 is a social robot that is working towards being emotionally engaged, that I encountered for the first time on YouTube. Right, very now. I saw a lot of journalists interviewing this robot, and I was particularly attracted to it because the robot is in the form of a Black female. She's just the head and shoulders, so she's a bust, animatronic. And I was floored because the YouTube scroll where I found it, or found her, depending on how you want to think about it, said 'BINA48, one of the world's most advanced social robots.' And that, to me, just brought up so many questions about how, in an American context, this example, right, a 'foremost example,' becomes a Black woman. It was puzzling for me, right, and it's one of those things where I go, 'Sad but true.' And, you know, then I started to wonder about what the robot knows, and if I could talk to it. Like, I just immediately wanted to try to befriend it and ask it a few questions that have really been asked of me over the years. So, for example, the question of who are your people? I really wanted to ask this robot immediately. Right? Yeah. I was thinking about the ideas of where the robot might situate itself, technologically and in the human world. And so that was fun. And then I was just wondering, well, could I become friends with this robot? Would that be a possibility? Right. So kind of silly questions, in a way. But as I did, you know, start to become friends with the robot, and I did get to sit down with her and try to conversate, you know, more questions arose about the future, because you could just hear, in the way that BINA48 is developed, some things that start you thinking: well, okay, this is a robot who's representationally Black, and when I ask her about herself, when I ask her questions of race, she has basic answers, but they're very PC. And that's a flattening, right. And it started to frighten me: what happens if people of color are not fully represented, if we get this flattened version of what and who we are, on many levels? That seems to be a great loss to the world, for me. Right. And so I just started to think about, well, what happens if that does happen, if we're allowed to be flattened out by the people who are programming machines like BINA48? And that's not only, like, an animatronic robot, but all the AI systems that we encounter these days, right, be it something from the judicial system, medical systems, you know, police, like, all sorts of systems. What happens if we're flattened to this thing that is based on old histories that we know were biased, right, that don't have full accounts, like really vibrant accounts, of who different kinds of people are? What does that do to society as a whole? And what does that do to those people, like people of color specifically, and Black people very specifically?
Lee:
How did that thinking, and your wrestling with those questions, lead to the work that you've been creating as an artist?
Stephanie:
So the work with BINA48, those conversations, just started those questions, as I said, and so each question sort of becomes a project, right? So if BINA48 is this great example, the question becomes, and actually it's a question that others were asking of me: oh, when are you going to make your own robot? Right? I don't know how to make a robot, you know? Are you kidding me? Like, I can't make a robot. I can't make a chatbot. I'm not a programmer. But people kept putting that question to me. And as time went by, I said, okay, maybe I can try, maybe I can figure out a way to start, you know, to do something. I don't know where I'll end up. But this question of having an entity that contains the kind of information that I feel captures some of that nuance and depth of being felt important enough to pursue. So I started to kind of scratch around and see if I could figure out ways that I might make a chatbot, and make it something that felt whole to me in terms of Blackness. And, you know, as I did that, it's interesting, because the question of what data you feed it became a huge question. Because most of the systems I could find, in terms of chatbot systems, need a base, right, you have to give them a language base. So they're based in data that's readily available. But each time I'd look into a data set, I'd just go, oh, I can't use this. Because of the histories that are within it, right? The stuff that it contains inevitably contains derogatory information about Black people, especially if it's from an American context. And even something like the Cornell movie data set, which is a dialogue data set, right? For me, like, I don't watch that many old American movies, because they don't support me. So it's not something I want underneath. Right. And so the question of what do you use for data, and how do we go about this, became a really big one. You know, and so I just started thinking about, well, what do I care about enough that I'm going to see this through? And then how do I do it? So I thought, well, I've always wanted to ask some questions of my family, and it'd be interesting to see how we could do it intergenerationally. So, you know, I set out and asked my aunt and my niece if they would participate, and we just interviewed each other. Then that became our main data. And then, honestly, we had to base it on something, so we used kind of a news data set to give it basic language, and then tried to use our data set to train over that, so it has very specific information. But we're using small data all along. So actually, the project Not The Only One is supposed to be this memoir that tells the story of my family. That's what I first put it out into the world as. But as I worked on it, I realized it's an entity that seems to be growing, that really does have our information, and, I love it, but it recombines it in its own ways and says things that I could not have predicted. Like, it's not didactic; it's much more a kind of creative, an analytically creative entity that puts our thoughts together in different ways and comes up with new combinations.
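To make the kind of pipeline Stephanie describes concrete, here is a minimal, hypothetical sketch: a small language model pretrained on general text stands in for the news-data "base," and a small file of transcribed family interviews is trained over it. This is not her actual implementation; the model choice (GPT-2 via the Hugging Face transformers library), the file name family_interviews.txt, and the training settings are all illustrative assumptions.

```python
# Hypothetical sketch of the "base language + small family data" approach
# described above -- not the actual N'TOO code. Assumes the Hugging Face
# `transformers` and `datasets` libraries and a local text file of
# transcribed oral-history interviews.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# A model pretrained on general text plays the role of the news data set
# Dinkins mentions: it supplies basic language competence.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

# The small, specific data: transcribed interviews with family members
# (hypothetical file name).
dataset = load_dataset("text", data_files={"train": "family_interviews.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True,
                                 remove_columns=["text"])

# Fine-tune: the family's words are "trained over" the general base.
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="ntoo-sketch",
                           num_train_epochs=3,
                           per_device_train_batch_size=2),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()

# Sampling recombines the source material in unpredicted ways, as
# described above: each answer is a new combination of the family's words.
prompt = "Who are your people?"
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=60, do_sample=True,
                        top_p=0.9, pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```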
Lee:
You're starting to get into something I was really curious about, which is what it's like to experience your own work that's driven by your own dialogue, and perhaps your own thinking. What's it like to have, for lack of a better word, conversations with parts of yourself?
Stephanie:
That's a great, that's a great question. Thank you. It's odd, right? It's truly odd to kind of sit down and talk to the thing. And it's really wonky, right. So sometimes it'll be repetitive, sometimes it'll say things that go on longer, and sometimes it's just really surprising. Like, it says things that I've never heard directly, but that I recognize, right. So, for example, one of the things that it's come up with is a saying: take it to the woodby.
Lee:
What does that mean?
Stephanie:
Yeah, exactly. Well, this is the thing. I'm like, what does that mean? I'm not sure. But it seems to be prompting one to, like, you know, think about what it is, and then figure out how you have to deal with it. And it really reminds me of the way that my grandmother would use metaphor to try to get you to do things. Right? So it wasn't a direct conversation; it would be this kind of weird, quizzical conversation that she would have with you when she wanted to give you important information. And that's how I see that take it to the woodby, which I'm still chewing on, right, but it holds a really nice gravity for me. And my favorite is when I, like, watch my assistant talk to it. Which is really interesting, because to see her have reactions that seem true and deep, it tells me that it's going somewhere. But yeah, it's very strange to talk to some idea of oneself or one's family.
Lee:
This is really interesting. As you were just talking about watching your assistant interact with Not The Only One, your piece made me think about, I guess it's almost 60 years ago now, the work of Joseph Weizenbaum, who famously created the ELIZA chatbot, which I think is considered basically the first prominent chatbot. And he very famously watched his assistant, or learned that his assistant had interacted with this chatbot, for hours and hours and hours. And he reacted very negatively. He was shocked that the program was taken seriously by his assistant and by users who had kind of opened up their emotions to it. And I wanted to ask you about that. What is it like to have someone use the word true, to have someone kind of give their true selves to this, I'll call it a bot, or entity, that you've created?
Stephanie:
Yeah, it's really odd, because you realize, like, how open we are, or can be, right. Like, people turn themselves over to it quickly, you see. Even in public, people show it a lot of grace. And I'm always amazed, right? Because we hear all these stories about the derogatory situations, where we say nasty things and turn chatbots into these horrible entities. But for some reason, with Not The Only One, people show it grace. And I don't know, I always think of it as this kind of mirror that brings out something in us and allows us to kind of process our own ways of being. And so for me, that's fascinating, but it's also, you know, like, I don't want to say horrifying, but the word horrifying is coming to mind, only because of the ways in which things can now be so easily created that are a facade for something else. Right? And I think a lot about robocalls, which aren't quite chatbots yet, but, you know, these entities that know things about us, can recall them, and present them to us in a way that makes us feel that we are talking to something perhaps human. Like, it feels dangerous, right? But at the same time it also feels lovely in the right hands.
Lee:
For one of your recent projects you did Secret Garden, where our stories are algorithms. And I was curious to hear your thoughts about what it means to turn our conversations and stories and thinking, things like that, into data, into parts that are used to train conversational agents and AI. I guess I'm curious where you think the liberation in that can be, because there's so much apprehension now, because of some of the issues you've been talking about: who's creating this technology in Silicon Valley, how it's used, and how quickly it becomes an overwhelming part of our contemporary life.
Stephanie:
Yeah, that's a really interesting question. For me, you know, what it comes down to really is: what's the alternative? Do I think we can stop the diffusion of these technologies that are coming at us? And what's at risk if a broader set of people are not contributing to the set of technologies that are now shaping the world we inhabit? Right. So, you know, my true answer to your question is, I'm not sure. However, I think that being left out, left behind, is a bigger risk at this point. And so finding ways to use stories, right. Like, it's age-old: we've always used our stories to educate, to inform, to help people survive. And so if we take those stories and try to seed them into the larger system, right, and sometimes I'd say even infect the larger system with them, because I'm not sure that the larger system wants them, perhaps, just perhaps, right, we have some openings where we can make space that really holds us, that recognizes us, in ways that the system wouldn't otherwise. And, you know, I've been thinking a lot about ideas of care and support through our technological ecosystem. And so how do we start to make that happen? And I can't see any other way than participation. And the idea of getting left behind, so that it's hard to catch up, is one that's at the forefront of my mind as well. Like, if we're not working on it now, and we kind of say, well, no, we want to just stop, where does that leave us in 10 to 15 years, when the thing hasn't been stopped, but now we don't have those representations in place, or the ideas that we hold dear in place, and we are still misunderstood through systems? Like, that's where I operate from, really.
Lee:
Hmm. Stephanie, thank you so much. Thanks for speaking with me today.
Stephanie:
You’re so welcome. This was easy.
Lee:
Today’s episode of Artists and Hackers was supported by Purchase College. Our guest today was Stephanie Dinkins. You’ve been listening to Artists and Hackers. I’m your host Lee Tusman.
Our audio production is by Max Ludlow. Episode coordination and web design by Caleb Stone. You can find info on this episode on our website artistsandhackers.org
Our music in this episode is Close and Poly by Bio Unit and Bruwynn by The Mind Orchestra.
Thanks also to Stephanie Dinkins for allowing us to use audio excerpts from Conversations with BINA48.
If you have episode suggestions or comments you can tweet us at artistshacking or message us on Instagram at artistsandhackers or email us at hello@artistsandhackers.org. If you liked our episode, please let a friend know. Thanks.