...if engineers should study ethics? (with Ayanna Howard)

Ever Wonder? / November 23, 2022


Photo of Ayanna Howard, roboticist, professor, director of the HumAnS Lab, and chair of the School of Interactive Computing at Georgia Tech
Courtesy of Ayanna Howard

For this episode, we’re taking a look back at Perry’s interview with roboticist and Professor Ayanna Howard. He spoke with Ayanna about how the fields of computer science and artificial intelligence have advanced, considered the ways robots can be biased, and explored the ethical and social aspects of the new technology engineers create….

Ever wonder if engineers should study ethics?

In this short, Perry and Ayanna discuss the importance of teaching ethics to technologists and engineers. We are, after all, the ones creating new technologies and shaping how the field of artificial intelligence will look 10, 50, or even 100 years from now. Ayanna shares her perspective on considering the human aspects in the new technology we create.

Have a question you've been wondering about? Send an email or voice recording to everwonder@californiasciencecenter.org to tell us what you'd like to hear in future episodes.

Follow us on Twitter (@casciencecenter), Instagram (@californiasciencecenter), and Facebook (@californiasciencecenter).

 

Transcript

Jennifer Aguirre (00:06):

Hello. This is Ever Wonder? from the California Science Center. I'm Jennifer Aguirre. For this episode, we're taking a look back at Perry's interview with roboticist and Professor Ayanna Howard. He spoke with Ayanna about how the fields of computer science and artificial intelligence have advanced, considered the ways robots can be biased, and explored the ethical and social aspects of the new technology engineers create. Ever wonder if engineers should study ethics? Perry and Ayanna discuss the importance of teaching ethics to technologists and engineers. We are, after all, the ones creating new technologies and shaping how the field of artificial intelligence will look 10, 50, or even 100 years from now. Ayanna shares her perspective on considering the human aspects in the new technology we create.

Ayanna Howard (00:56):

Um, this is the first time in, in the world, I would say, or in the, in terms of the time period that technology engineers and technologists are pretty much carving what the world is gonna look like for the next 10, 20, 30 years. Which means that if we as technologists aren't thinking about those human aspects, we are going to be the ones that mess up the world. And that means we are messing it up for ourselves. And so some of it is actually selfish. It's like, it was cool when we were working on gadgets. I mean, I was trained to work on gadgets. You didn't think about like the negatives, about like, you know, what are the harms? I don't really care. It's all about the, the cool gadgets. But we don't have that luxury anymore. Only because, you know, we've been pushed to the forefront because technology has such a foothold in our lives that we need to just make sure we do it right. And even if it's for a selfish reason, which is, if we don't, it's gonna come back to bite us. You know, the, the Frankenstein aspect, like we are going to create our own disaster and destruction, which I don't think any of us really want.

Perry Roth-Johnson (02:06):

A, another, uh, way to address these problems and improve 'em in the future is, uh, this notion that engineers or scientists, all of them should have to take some sort of ethics course, uh, in their training. So like in, in your view, is this something that should be required in engineering?

Ayanna Howard (02:24):

I think not only should it, should it be required, but it should be threaded throughout the entire curriculum. Um, so what, what's happened a lot of times is that, you know, there is an ethics course, right? And so you go into the ethics course and because it's not directly tied to, you know, your, your calculus or your dynamics or your robotics course, you're like, oh, this is that humanities course that I have to check off. Right? I mean, it's true. And, and so what we have to do, like I teach an ethical AI course, but what I teach is, I teach an algorithms course where you have to think about the ethics, right? Which is a different way of thinking about it, right? So we do, we'll talk about natural language processing and we'll talk about word embeddings. And it's like, oh, like that's cool. And then it's like, oh, but we're also gonna talk about how there's biases in these word embeddings, and let's think about a solution. Right? And so it's, it's a techy course, but it's also cast within this realm of, you have to think about the outcomes. And so I think if you think about all the things in the curriculum, um, any course that you have, you can drop in one of these, these, this concept or these modules. Uh, when you're in, uh, a class and you're designing a, a project. So I'm designing, say, you know, a robot manipulator that can grab objects, right? Like part of that should be, uh, the instructor says, now, does this actually work if you're grabbing objects in an environment such as a nursing home, right? Like just a little bit of things like that. And so then you're like, nursing home. Why would that be any different? It requires the student to do a deep dive. Oh, nursing home. There's a lot of older people there. Oh, older people might have mobility impairments. Uh, right? And so without even saying it's ethics, it makes people start to think about those human aspects of their technology.
Um, I think that's where a lot of institutions are. At Georgia Tech, in Computing, we're moving to this responsible computing where it's threaded throughout the curriculum. I think a lot of, uh, programs are, are thinking about how to do this.

Perry Roth-Johnson (04:34):

I, I like how you almost, you're, you're keeping your audience in mind, that folks who are into tech, 'cause they're engineering students, you want it to be relevant to their daily lives, but you almost like trick them in a way. It's like, okay, we're going to use that tech that you want to learn about, but we have this other thing snuck in. So now you're learning about ethics and you didn't even realize it, maybe.

Ayanna Howard (04:55):

And you didn't even realize it, right?

Perry Roth-Johnson (04:56):

Yeah.

Ayanna Howard (04:56):

But it's almost like, you know, if, if anyone, uh, I always think about baby food 'cause, you know, they have these commercials and you're watching TV all day. You know, think about baby food, it's vegetables. Like what kid loves vegetables? No one, right? But yet somehow they put it in this nice packaging and it has this nice coloring and you're just like, oh, it's, it's vegetables, right? It's the same kind of thing.

Jennifer Aguirre (05:20):

And that’s our show, thanks for listening! Ever Wonder? from the California Science Center is produced by me, Jennifer Aguirre, along with Perry Roth-Johnson. Liz Roth-Johnson is our editor. Theme music provided by Michael Nickolas and Pond5. We’ll drop a new episode every other Wednesday. If you’re a fan of the show, be sure to subscribe and leave us a rating or review on Apple Podcasts. It really helps other people discover our show. Have a question you’ve been wondering about? Send us an email or voice recording to everwonder@californiasciencecenter.org to tell us what you’d like to hear in future episodes.