Nov 16, 2018 | Atlanta, GA
Ari Schlesinger was spending time at Microsoft Research (MSR) in Cambridge, United Kingdom, shortly after a Microsoft AI chatbot made headlines for devolving into a racist, sexist mess within 24 hours of launch in 2016.
After the incident, an influx of think pieces about the chatbot, named “Tay,” attempted to explain that racism was a design issue. If designed better, they contended, chatbots wouldn’t encounter these problems.
Schlesinger and her MSR collaborators wrote a piece in response, contending that it’s not just a design flaw, but a problem with how tech firms and, more broadly, designers think about these issues in general.
“We wanted to point out research opportunities to figure out ways that we can do better at considering issues like race and identity when designing systems to avoid creating something like a chatbot that reproduces the types of problems that Tay produced,” she said. “It’s important that we really identify the central issue that causes these problems. It’s hard to address a problem if you can’t name it.”
These questions are central to the research she is conducting at Georgia Tech, where she is now a Ph.D. student advised by Professors Beki Grinter and Keith Edwards in the School of Interactive Computing. Recently, she was a finalist for the Foley Scholarship, for which she was recognized for her research into ways enterprises can operationalize strategies to support software development with fairness in mind.
Understand the social impact
It wasn’t a straightforward path to studying equity, inclusion, and fairness in computer science (CS), however.
In 2012, during her second year pursuing a CS major through Harvey Mudd at Pitzer College, Schlesinger had a realization. CS degrees, she noticed, were not focusing on the vast social impacts that computing was producing. She began to worry that an awareness of this social change might be missing in many CS educational environments.
“About a year and a half into my degree, I was just like – these machines, these programs, they’re ubiquitous,” she said. “They’re in everything. They’re changing the world, and we’re not talking about that.”
It was this realization that led her to course correct during her undergraduate degree. She redefined her major at Pitzer College, adjusted the trajectory of her career to pursue research full-time, and homed in on an area she says is vital to introducing mechanisms in enterprise, education, and beyond that protect against bias and exclusion.
One of the benefits of being at Pitzer College for her undergraduate degree was that Schlesinger was given the opportunity to define her own major. Her interests at the intersection of computer science, humanities, and social sciences led to a degree she designed, called “technology and social change.”
The social impacts at the heart of technology and CS are central to her interests, and she found that CS education was one of the few spaces she had experienced in computing that was really thinking about social impact. Upon graduation, she took a position at Harvey Mudd College running a National Science Foundation grant in CS education called “CS Teaching Tips.”
While there was a semblance of an ethics requirement in most CS degrees, whether or not it was a priority was unclear, and that was what concerned Schlesinger.
“Who is teaching it? How is it defined? What’s being covered? Often what you learn about ethics and these social concerns in CS depends on who you know and what you’re exposed to,” she explained. “Sometimes in academia, I think we have these siloing problems, where one discipline does this and another does that and it’s very hard to move between the two.”
It’s important, she said, that CS departments have someone within them who brings all of these disparate fields together, introducing people to literature and ideas they may not otherwise see in their respective disciplines.
It was this focus that drew her to Georgia Tech, specifically as it pertained to advisors Grinter and Edwards. She was looking for graduate advisors who could get excited about this idea of investigating and implementing equity and inclusion within things like programming languages or artificial intelligence. Thinking more broadly, she knew that this wasn’t just a problem within CS education, but within technology as a whole.
“The pot is full,” she said. “Those questions of who gets an advantage or not when we are designing software or when we build computing systems. Technical systems have the opportunity to minimize expansion of harm, but they also have the opportunity to further discriminate. What can we do to stop hurting each other?”
‘The next step depends on you’
Her future work at Georgia Tech will follow a similar path, examining some of these issues of equity and bias in online communities.
There are rampant issues of harassment and discrimination in traditional online communities. More specifically, there are issues of diversity and inclusion within open-source communities, where programmers interact and work on a tech product that might be widely adopted and will ultimately reflect some piece of those interactions.
“Online communities seem to be places where many people of color, women, people with various marginalized identities are harassed,” Schlesinger said. “That happens in the tech workplace, and it happens in these open-source spaces. Part of our work will look at this distilled problem space and ask questions about what is the connection between inclusion, discrimination and online communities. Are there ways these spaces are designed that inhibit good behaviors or promote bad?”
Of course, her next step in this research is only one approach, and she said that it’s important to note there are many paths to pursue. Asked what the next step in this space should be, Schlesinger turned it around.
“The answer is that there is a clear step for everybody and the world would be a better place if we took that next step, but the next step depends on you,” she said. “Who you are, what you’re doing, where you work, what you think about. There is something to do, but what that is depends on you.”