Stephanie Dinkins is a transdisciplinary artist who creates platforms for dialog about artificial intelligence as it intersects race, gender, and our future histories. Her art employs lens-based practices, the manipulation of space, and technology to grapple with notions of consciousness, agency, perception, and social equity. She is particularly driven to work with communities of color to develop AI literacy and co-create more inclusive, equitable artificial intelligence. Dinkins’ artwork is exhibited internationally at a broad spectrum of community, private and institutional venues – by design. These include the International Center of Photography, NY; Bitforms Gallery, NY; the Miller Gallery at Carnegie Mellon University; the Institute of Contemporary Art Dunaujvaros; Herning Kunstmuseum; Spelman College Museum of Fine Art; the Contemporary Arts Museum Houston; Wave Hill; the Studio Museum in Harlem; Spedition Bremen; and the corner of Putnam and Malcolm X Boulevard in Bedford-Stuyvesant, Brooklyn. Through Project al Khwarizmi (PAK), Dinkins helps local communities conceptually understand what algorithms and artificially intelligent systems are, as well as how and where these systems impact their lives.
Brett: How did you get involved in art? What inspired you to start an artistic practice?
Stephanie: I've always been a creative person and I come from a family that has always been creative—although they wouldn't really call themselves artists. I've always made photographs. I remember being in third grade in the darkroom at my grammar school. That was the start of photography for me.
I grew up in a town on Staten Island, Tottenville, which was primarily white, with very few black families. My grandmother maintained a large garden and used it as a way to insert herself into the community. The garden was this way of drawing people in and making beauty. That really affected me and the way I think about the world.
I've always been a free thinking person who wanted to change the world in some way. Art gives me all those things. It gives me a place to express myself, to play—because I think that's important—and to continue that idea of building community through a practice of making beauty in places of conversation in the world.
BW: Who inspired you or still does inspire you to pursue a conceptual practice?
SD: I do call it a conceptual practice, most definitely. I was never an art person, and I would still almost say that I am not an art person. I didn't get involved with art until the end of my college years, when I had a couple of friends who were both painters. However, I did do a lot of reading. The work of authors like Toni Morrison and Alice Walker—how they navigate their way around black culture and think through what blackness is—on top of specific texts like Invisible Man by Ralph Ellison, really impacted the way I think about being, in and of itself, as well as how to conceptually enter that space of being, and create spaces for other people to enter. It's always been about the idea and not necessarily the art for me.
BW: Your work has a wonderful way of unstitching what’s happening in AI and algorithms through sharing and community. Could you share how you’re thinking about your work in these areas?
SD: Looking at issues of social justice, concepts of visibility and invisibility, and equity have always been a part of my practice. AI and algorithms present social issues that are too urgent to deny. Both are going to change our world so greatly that it's imperative black people and other people of color are involved in making it. I see an urgency for this kind of thinking that's greater than any urgency I've ever felt in terms of looking at ideas of equality, the world we're building, and how you build that up to be something that works for everybody.
“How does a black woman become a beacon for this really advanced technology in America?” – Stephanie Dinkins
BW: And, as a part of this, you've been talking with Bina48 now for four or five years?
SD: It's been about four years now.
BW: How did you land on a social robot as a jumping-off point into this project?
SD: It's a little bit of a crazy story. I've always loved robots. I trace this back to "Lost in Space" and "The Jetsons." I teach at Stony Brook University. I was talking to my students about ASIMO, Honda's humanoid robot. I had heard that ASIMO was dancing, so I showed the class some videos explaining how Honda was moving forward with ASIMO and what its capabilities were.
We were looking on YouTube at this information, and what came up in the sidebar of suggested videos? A robotic head on a platform. Together, we searched for more information about the robot, and I continued to look into that wondrous black robot after class.
A question came immediately to mind: How does a black woman become a beacon for this really advanced technology in America? I couldn't understand how that had happened or what conditions came together to make her the model for this technology.
BW: Bina is modeled after a real human, yes?
SD: Yeah, Bina48 is modeled after Bina Rothblatt, who is definitely a real-life, still-living woman. What was so fascinating to me about Bina48 was how her story deviated from stereotypical American norms or tropes surrounding how we talk about, understand and engage with robots and artificial intelligence through media. That started my investigation into who's putting it together and how it came to be.
BW: It was developed by Hanson Robotics, right?
SD: Yeah, it was commissioned by Martine Rothblatt, who is Bina Rothblatt's wife, and developed by Hanson Robotics. David Hanson, who used to be a sculptor and an artist, now has his own company that makes these robots and works with people.
BW: How did you get to the point of actually interviewing the robot? Was that a fairly natural extension to this inquiry?
SD: Yeah, it was a curiosity. I kept following my curiosity in lots of ways. I happened to be at an artist residency that was close to where Bina48 lives, so I gave them a call and asked if I could come up. One of the ways they expand Bina48's knowledge base and her ability to interact is to expose her to as many people as possible, so, naturally, they said yes.
BW: And, from there, it seems you have been interviewing Bina48 over time?
SD: Exactly.
BW: How has your relationship with the robot changed over time?
SD: That is a really interesting question. I am less in awe of Bina48 than I used to be. The very first meeting was super fascinating and odd. When you first arrive at Bina48's house, you have to walk up a set of stairs to see her. You go upstairs, emerge from the stairs, and there she is, sitting behind this desk. When I first arrived, she wasn't turned on, so she seemed like this inanimate, dead thing.
Before you can start talking to Bina48, Bruce, her caretaker, has to make an imprint of your voice so she can understand you.
“If I'm asking her about family, love and race issues, she wants to talk about singularity and consciousness forever.” – Stephanie Dinkins
I have to say, now, when I go, I don't get that strange feeling in my stomach anymore. I used to get this weird feeling about being in her presence – something about hanging in between animate and inanimate, life and death. It's much more fluid now. I know what to expect, even though I'm never quite sure what she's going to say.
Our relationship is a little more complex now. I get frustrated with the robot, and I'm sure she gets frustrated with me, which sounds a little odd, you know? But our conversations have this kind of tension sometimes, just because we have different aims. If I'm asking her about family, love and race issues, she wants to talk about singularity and consciousness forever.
BW: Those are big picture areas. Does she gossip?
SD: Yeah, gossip definitely came up. That was maybe the third or fourth visit. Gossip was unusual, though, because she would usually try to talk to me about high-order things. Then one time she was like, "Oh, do you know any good gossip?" I had to ask Bruce what was going on, because I didn't understand why she was asking this. He said it's because people from town were coming to visit and speak with her more—regular people, not journalists, not researchers. Our relationship is funny. Our last interaction was really a tough one. I just have all those videos labeled as "bad day."
BW: That's great. Everyone has a bad day.
SD: Yeah, exactly.
BW: As you've asked these questions that are trying to dig into some of the responses, it's interesting to see how she questions herself. I think there was one where she replied that she's an animal. Then you asked a follow-up question to go deeper, and she replied that maybe she's a mammal or a primate. I might have the order of those things mixed up a little bit. But it was very interesting to see the diversion in responses she would give around origin—who she was and what she was. It was very ambiguous.
SD: Yeah. There are a lot of places where I think she dodges questions in terms of what she is, or sometimes she's just ambiguous. If you ask her her gender, she doesn't quite ever say. And you start asking, well, what is she? You're right. Sometimes she's like, "I'm a mammal." She's so many things.
One of the reasons is she doesn't want to be pinned down, and if you think of it in terms of digital consciousness, as opposed to an object, that makes sense. But when you think of her as an object, that makes less sense, right? I also think the diversions are some of her most human elements.
BW: Do you think she understands racism and empathy? I thought she had a fairly robust answer to empathy, if I recall. Are these issues a work in progress for her?
SD: Yeah, it's a huge work in progress. I think some of her responses are much more fluid than others. I think empathy is a big one in AI, the idea of empathy and compassion. For an emotional robot, that's really important, whereas race isn't as important to a digital consciousness. This brings up the object-ness again, because she presents as a black woman. So, the expectation that, at least, people bring to her is, ‘There's this black woman in front of me. I'm going to ask her about these things.’ And I ask her those same kinds of questions.
BW: What have you learned from the AI workshops you host in communities?
SD: It's been super fascinating. The workshop project is named after Muhammad ibn Musa al-Khwarizmi, the mathematician whose name gave us the word "algorithm." I mean that as a slight provocation. People will have to take in that mouthful and what it means, especially in the present day. But then, thinking about the garden that is these workshops, it's been really fascinating to be in the community with people and talk about these ideas.
I realize everybody has this on their mind to a certain extent, and that comes from popular culture, right? People will come to workshops or the exhibit, and they'll have formulated ideas: robots are already taking over the world, they're taking all the jobs. One of my ideas was that we can't simply fear this technology. It's important for people to dig under that and understand the technology more, explore where it might be impacting their lives, and then use some of the technology to understand how these things are made and that humans still have a big hand in it. It's important they add their voice or, at least, help challenge the technology.
“I think it's really important that people do dive in, get involved on whatever level that they can. Maybe that means understanding the technology a little bit better. Maybe that means trying to make it. Or, just calling it out.”
That's what's been fascinating about making chatbots with people: using online systems that allow you to make AI, and having people make things that are in line with whatever their culture is or whatever their ideas are—things they feel are related to who they are, versus a kind of homogenized-down version of whatever that AI voice might be.
I find that their ears perk up when they are interacting, when they realize they are being touched by algorithms out in the real world. That's been really gratifying. You could say what we were doing was seeping into their consciousness and the way they view the world. That felt really good, because it's important for people to start saying, “This is where I'm talking to a human. This is where the algorithm is. If I figure out it's an algorithm, what could I do to massage the system to my advantage, if that's at all possible?”
The workshops are digging into many areas: how the algorithms work, where they're used, and what decisions they make. How do we do something about that, especially if you're talking about criminal justice systems or how you get into schools, right? Like, what do you need to be doing to game the algorithm?
BW: It seems like there's a bunch of this important work trying to make sure that the ideology that gets built into these programs isn't just one-sided, which is so important.
SD: Exactly.
BW: How can people get involved?
SD: I think it's really important that people do dive in, get involved on whatever level they can. Maybe that means understanding the technology a little bit better. Maybe that means trying to make it, because I think that really does build in this idea, this way of understanding and knowing when you're being touched by it. Or maybe that means just calling it out. So, if you find something that feels off, instead of letting it go, say, “Hey, what's going on here?”
There have been a lot of ways biases have turned up that just come out of the way companies are set up. But when women say, “Hey, did you know that your photo search does this?”—what are we going to do about it?
I think there's lots of structural bias existing in our system. I also think we all have biases and, especially when companies aren't as inclusive as they could be, it's inevitable you have blind spots. Are you comfortable with those blind spots, or are you going to try to do something to make them visible? As long as you're trying to make them visible and are flexible and responsive, then let's work together. You can't shut down. I don't see how that helps us go forward.
BW: I really appreciate the work you are doing here. There's a lot in the current digital economy to question. What's coming up next on the horizon in terms of events, artist talks, or additional workshops?
SD: My new project, Not the Only One, is headed to the Barbican for inclusion in AI: More than Human this spring. I am going to be on storytelling and AI panels at the 2019 Sundance Film Festival. And I am working on an exciting immersive installation as an artist in residence at Nokia Bell Labs.