Ep 121: How Artificial Intelligence Creates Discrimination in HR & Recruiting


TRANSCRIPT


Episode 121: Future of Work: How Artificial Intelligence Creates Discrimination in HR & Recruiting

 

Episode Link: http://workolo.gy/ep121-wp

INTRO: [00:00:00] Welcome to the Workology Podcast, a podcast for the disruptive workplace leader. Join host Jessica Miller-Merrell, founder of Workology.com, as she sits down and gets to the bottom of trends, tools, and case studies for the business leader, HR, and recruiting professional who is tired of the status quo. Now here's Jessica with this episode of Workology.

 

Jessica: [00:00:25] Welcome to a new series on the Workology Podcast that we're kicking off, focused on the future of work. This series is in collaboration with the Partnership on Employment and Accessible Technology, or PEAT. You can learn more about PEAT at PEATworks.org.

 

Jessica: [00:00:38] Everywhere I turn, there seem to be conversations in the human resources and recruiting industry centered around artificial intelligence. It is truly permeating our landscape. However, PEAT and I wondered if AI was as inclusive as it could be. We're digging deep, looking at the inclusiveness of machine learning technology and exploring the implications for people with disabilities.

 

I'm joined by Dr. Jutta Treviranus. She is the director of the Inclusive Design Research Centre and a professor at OCAD University. Jutta has been working toward more inclusive and accessible technology for much of her life. Her focus is now on improving artificial intelligence systems so they can better serve everyone, including people with disabilities, and she organizes hackathons toward this goal.

 

Jutta, welcome to the Workology Podcast. Can you talk a little bit about your background? I think it's really interesting and different from that of many previous guests on this podcast.

Workology Podcast | www.workologypodcast.com | @workology

 

Jutta: [00:01:42] OK, sure. I've been interested in the liberating potential, and worried about the exclusionary risks, of computers and networks and digital systems since the emergence of personal computers in the late 70s. I was working with a very, very diverse group of individuals at a university, and I felt that computers and networks and digital systems had the potential to be wonderful translators.

 

So I worked in that particular field, getting to know the technology better, and I started the Inclusive Design Research Centre in 1993 to proactively work to make sure that emerging socio-technical practices and new technologies are inclusive of everyone. The team that I created, which continues and will be 25 years old as of next June, is a very, very diverse, distributed community with project partners all around the globe. We grew up with the web and have espoused open practices since before, I think, "open" was even a common term.

 

We moved to OCAD University in 2010, and I started a graduate program in inclusive design. The graduate program is intended to pioneer a more diversity-supportive form of education. We're experimenting with an inclusive education to prepare graduates for all of the variety of things that they're encountering and the many changes that are happening in our community. So one of the things I'd love to tell you about is our perspective on disability. At the IDRC we define disability not as a personal trait but as a mismatch between the needs of the individual and the service, product, or environment offered. So in that sense, we are all experiencing disability when the system is not designed to match our needs.

 

I'm hoping that we can see disability as a design issue that can be addressed, not as a personal trait that creates an us-versus-them scenario. And so designing our artificial intelligence, our HR systems, and the way that we recruit in our workplaces to be more inclusive ultimately means that we're creating them to be better for all of us, and ensuring that fewer and fewer of us experience disability.

 


Jessica: [00:04:19] This is fascinating to me, because you have been working in this field for 25 years, I think long before anyone had thought about artificial intelligence, and you've really been focused on inclusive technology and its use for a very long time.

 

Jutta: [00:04:35] Yes. Actually, 25 years ago I started the centre, but I started in this when the Apple II Plus emerged, and the Tandy Model 100, and all of those things that I'm sure very few people remember.

 

Jessica: [00:04:49] Can you talk to our podcast listeners about your work in artificial 

intelligence and disability inclusion? 

 

Jutta: [00:04:55] Sure. I've been concerned about how diversity, human variability, and especially outliers fare in quantified data systems for some time; this predates our use of big data analytics in artificial intelligence. When we use or require big numbers of homogeneous outcomes to draw any conclusions, people with disabilities are often the casualties. If you take, say, the needs and characteristics of any group of people and plot them on a scatterplot, you get something like a star.

 

It looks like an exploding star: in the middle there will be a dense cluster, and then other needs and characteristics will spread out toward the periphery. People with disabilities are represented some distance from that dense core, and if you look at the needs and characteristics at the periphery, where people with disabilities are in that starburst, you will see that they are much further apart from each other.

 

And this demonstrates that people with disabilities are more diverse than people in the center, but we tend to treat anyone beyond a certain invisible boundary away from the dense core the same. That huge variability among people with disabilities becomes a problem, because it means that you can never muster the numbers to be seen as significant in quantified data systems, even though the people we relegate to the disability category are the world's largest minority.

 

Jutta: [00:06:27] If not a majority. As far as quantified data that requires statistical power is concerned, and that's what we've required in all of our data systems, people with disabilities are outliers and noise in the data set, and what we do with noise and with outliers is that we eliminate it when we clean the data or when we norm the data. The difficulty with AI, and why this becomes such a huge issue with AI, is that AI bases its decisions on this data, and these are hugely important decisions that we don't question, because we think that artificial intelligence is not biased, that it's not subjective, that it's not subject to human foibles.
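The data-cleaning mechanism Jutta describes can be made concrete with a small sketch. This is purely illustrative and not from the episode: the population, scores, and thresholds are invented. It simulates a dense majority cluster plus a small, widely spread minority, then applies a routine z-score outlier filter of the kind many pipelines use before training, and measures who survives the cleaning:

```python
import random
import statistics

random.seed(0)

# Simulated scores on some feature: a dense majority cluster near 50,
# and a small minority spread widely at the periphery (near 15 or 85).
majority = [random.gauss(50, 5) for _ in range(950)]
minority = [random.choice([random.gauss(15, 10), random.gauss(85, 10)])
            for _ in range(50)]
population = [(x, "majority") for x in majority] + \
             [(x, "minority") for x in minority]

# A routine cleaning step: drop anything more than 2 standard
# deviations from the mean of the whole data set.
values = [x for x, _ in population]
mean, sd = statistics.mean(values), statistics.stdev(values)
kept = [(x, g) for x, g in population if abs(x - mean) <= 2 * sd]

# Fraction of each group that survives cleaning.
survival = {g: sum(1 for _, kg in kept if kg == g) / n
            for g, n in [("majority", 950), ("minority", 50)]}
print(survival)
```

The filter is statistically reasonable on its own terms, yet nearly all of the majority cluster survives while most of the minority is discarded before any model ever sees them, which is exactly the vicious cycle described here: the people removed as "noise" can never accumulate the data that would make them visible.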

 

So what happens is that AI does not recognize or understand people with disabilities; people with disabilities don't appear in, or fit, the models machines use to make inferences. And worse than that, AI automates this bias against the outliers and amplifies it. If AI is a black box, we can't challenge the AI's decisions. And so what is happening is that people with disabilities are being impacted in quite a number of pernicious ways. We're talking about HR, and especially competitive HR decisions: we use AI to filter the applications, to determine who's going to be the most promising candidate, who's going to get an interview.

 

Jutta: [00:08:04] And if you're not part of the data set, there's no data about your positive performance, and so someone with a disability, who is an outlier, who isn't part of the model, won't be selected. It also permeates into other areas of our lives: loans, credit, insurance. If your asset portfolio or profile is not something the machine is used to, it won't determine that you are a good loan risk, and if you have an anomalous medical history, then you will not be judged a low insurance risk. The same applies if there is a security assessment, or someone is determining sentencing or predicting recidivism.

 

Because you are unknown, something that is not part of the model or understood by the machine, you're more likely to get flagged. And so what you have is a vicious cycle: you are not part of the data set, so you're not understood, so you're not going to become part of the data set. It's not a case of "let's just add more data," because the problem is that all of that data is eliminating you. And so you'll never get an opportunity to be part of the data set.

 

Jessica: [00:09:29] This is fascinating, because I feel like the sales pitch in the human resources and recruiting industry for artificial intelligence is that it's going to remove bias and help make better, more consistent hiring decisions. But you're saying that it's eliminating an important group of people from even being considered for hire or promotion or anything related to the workplace.

 

Jutta: [00:09:59] Right, because it's based on past data, so it will perpetuate what has happened in the past. The evidence that someone has done well within a job will be based upon past data. Most data brokerages will eliminate the anomalous data; they're trying to find the dominant patterns, the strongest, largest group, so that a certain decision can be made. And so in that way, what is being eliminated is seen as noise, but in fact it's the individuals at the periphery of that scatterplot I was talking about, or people who are in small minorities.

 

Jessica: [00:10:47] One of the areas we've talked about throughout this Future of Work podcast series with PEAT is the gig economy and some of these web-based platforms. There are a lot of different platforms in human resources and recruiting, as well as in the gig economy, that use AI and machine learning components. How can these technologies do better at incorporating all the different types of individuals outside of that scatterplot core, so that they aren't impacting those who are different, failing to bring those people into the community, or denying them opportunities?

 

Jutta: [00:11:26] So platforms and the flexible economy, the sharing economy, hold a lot of promise for people who previously faced barriers to employment or who have difficulty participating in traditional employment, and they hold a great deal of promise for people with disabilities. When you are an outlier, it is hard to find someone nearby with the same needs that you have, and you are subject to, as I was mentioning, these vicious economic cycles. Products are not made for you; if they are, they cost more, meaning your scarce dollars are worth less. Education is not optimized for you. Work opportunities don't recognize your potential. If you're lucky enough to get a position, the tools you need to do your work won't be accessible, meaning you can't demonstrate your optimal performance. So someone who has a disability faces many barriers to employment in the current job market.

 

There are many things that you need to battle. Platforms are a potential way to support people by recognizing their diverse needs and thereby diversifying the demand as well, which can then help to trigger a diversification of production and supply. The more we push both the demand and the supply out to those edges, the better it is for inclusion and for people who are outliers, and it provides more choices for everybody. When there are more choices, you can find choices that fit you, whether it's a product or a job or a service.

 

So platforms can reduce fragmentation, allowing the sharing and pooling of resources, which makes it easier to address the requirements of individuals who can benefit from economies of scale. They also provide feedback loops to continuously refine available resources. And not only can tools and resources be shared, but also the building blocks and development tools needed to create things like inclusive products, support, and training for people who face barriers to employment, or to address gaps.

 

Jutta: [00:13:45] So I've been a great proponent of platforms for those reasons. But some of the platforms are largely extractive platforms: the value comes from the workers, but the workers don't govern the platform, nor do they receive the profit. Because the focus is on short-term competitive wealth production for the owners, these platforms are less likely to invest in diversification. Most of the AI products are used to get a quick win. Not only will people with disabilities fare badly in the predictive analytics.

 

But, as I was mentioning, if you are not participating, there won't be data that proves your successful performance, and the AI that is frequently used in these systems is there to find the quickest way to address an immediate demand. There is an alternative, and that is the platform co-op movement. These platforms are governed by workers, and the workers share the profit. They are motivated to create a thriving, diverse community; if you're interested in the greater social good, it pays to be as inclusive as possible. And so there is an evolution of some of these platforms, and there are emergent platforms that are not looking at data to support the immediate quick win but at data to support a thriving community, with a diversification of jobs and better benefits for the workers, but also for the employers and for the consumers of the services. At the Inclusive Design Research Centre...

 

Jessica: [00:15:40] Let's take a little bit of a reset. This is Jessica Miller-Merrell. You're listening to the Workology Podcast in partnership with PEAT. Today we are talking about machine learning and inclusion with Jutta Treviranus. You can connect with her on Twitter @juttatrevira.

 

Sponsor tag: [00:15:40] The Future of Work series is supported by PEAT, the Partnership on Employment and Accessible Technology. PEAT's initiative is to foster collaboration and action around accessible technology in the workplace. PEAT is funded by the U.S. Department of Labor's Office of Disability Employment Policy (ODEP). Learn more about PEAT at PEATworks.org.

 

Jessica: [00:15:40] What suggestions do you have for podcast listeners who are thinking about adding artificial intelligence technology to their human resources or workplace technology stack, so that they select the right one, or at least one that is more inclusive than the others? You focus on three dimensions of inclusive design. Can you walk us through these and maybe talk about how they apply to machine learning and AI technologies?

 

Jutta: [00:15:50] Sure. So Universal Design has a set of principles, and accessibility has a checklist, and one of the things that I was encouraged to do was to come up with a set of principles for inclusive design. But inclusive design is intended to be relative to the individual, so it's not a one-size-fits-all approach; it's a one-size-fits-one approach, because we grew up in the digital domain, and the digital can be adaptive. Unlike a building, where you have to have the entrance work for everyone that might approach it, a digital system can morph and adapt and present a different design to each person that visits.


 

So instead of a set of principles, I devised three dimensions. The first dimension is to recognize that everybody is unique, to help people understand their own uniqueness, and to create systems that fit that unique diversity of requirements: one size fits one. The second dimension is to ensure that there is an inclusive process. This means designing the tables so that everyone can participate in the decision making and in the design, because we all benefit from diverse perspectives.

 

We have far better planning, better prediction, and much greater creativity if we have a diversity of perspectives participating in the design, and what you want in inclusive design is co-designers, so authentic co-design, where the individual the design is intended for is part of the design process. And then the third dimension is to recognize that we're in a complex adaptive system. No design decision is made in isolation; it reverberates out to all of the connected systems that are in the context of the design. So create a design that benefits everyone, and create virtuous, not vicious, cycles.

 

Jessica: [00:18:17] I feel like all of these could be extremely helpful for many of these artificial intelligence and machine learning companies that are building these platforms, or really for anyone in technology to consider when they are trying to create a community or a technology that's inclusive to all.

 

Jutta: [00:18:41] Yes, definitely. Actually, when we take an inclusive design approach to AI, one activity that I often do in our inclusive design sessions is called the grandparent-grandchild conversation. If you have a toddler, you know there's that continuous unpacking of "why are we doing this?"

 

And that is a little bit of a characterization of what happened when I started to look at the inclusive design of artificial intelligence, because it prompted us to unpack not just what is happening within AI but also, being part of an academic institution, to think about how we are dealing with evidence and how we determine truth. Are the research methods that we're using supportive of diversity and an understanding of diversity?

 

And you can trace the history of where we ended up with AI and machine learning, and how we're teaching our machines; it traces right back to statistical research and how we determine what is true, what is statistically significant, or what has statistical power within all of our research and academia. In terms of applying those three dimensions: diversity-supportive data and evidence is something that benefits everyone. We have been exploring something called small, thick, bottom-up data that understands the outliers.

 

It's about making ourselves smarter, not just making machines smarter. Small data is also called "n = me" data or "n = 1" data; thick data is looking at data in context. The process that we try to encourage is transparent, participatory AI co-design: an inclusive process that adds diverse perspectives.

 

Jutta: [00:21:16] And the third dimension is understanding the complex adaptive system that we are in. If we create machines that understand and recognize diversity and have inclusion embedded in their rules and models, it will definitely benefit everyone. All of the experiments that we've done so far show that this is critical not just for people with disabilities but for all of us. The goals that we have for artificial intelligence are better met when we don't ignore the outliers, when in fact we are diversity-friendly in our knowledge and understanding.

 

The really interesting thing, and this is not part of your question, is that previously, when we talked about stereotypes or bias or prejudice in conversations with hard scientists or with our fellow academics, it was seen as soft science, as something that was not easily verifiable, and therefore, in a sort of hierarchy of academic rigor, it wasn't as well respected. But now that we can actually manifest the issues within artificial intelligence, we can show that if you don't include the data, or if you use only the dominant patterns and ignore the full diversity, if you base all of your knowledge and assumptions on homogeneous data, or the same effect in one large data group, then you're not going to achieve all of the things that you hope to, whether it's innovation, risk detection, or better prediction and planning.

 

Jessica: [00:23:14] I feel like that last piece is something that employers and businesses can really take advantage of and think about. What happens when you aren't including people from different backgrounds, education levels, and interests? How does that impact creativity and the success of your organization moving forward with a project? Now you have some data, maybe some research, to support the case for diversity. Work itself is changing quite significantly.

 

Jutta: [00:23:48] We still have systems and understandings that come from the industrial age, when we needed directed, replicable workers, and that's really not the reality at the moment. If an organization wishes to survive in the current reality, what you need is an agile, diverse, collaborative team, yet we've set up our systems, and the data that we're teaching those systems, with information and rules and algorithms from another era. The best way to effect the cultural change that we need within work is to create AI systems that are diversity-supportive, that don't rid us of that outlying information, and that don't ignore people who are at the periphery and who offer an alternative or greater variability of skills and understandings and perspectives.

 

Jutta: [00:25:54] So the question was what advice to give in choosing AI and machine learning programs, if somebody was purchasing one, is that right?

 

Jessica: [00:26:03] Yes, so let me give you an example. There is an artificial intelligence technology, a video technology, that uses AI to measure truthfulness and your body language, to determine if you're happy, angry, sad, or mad, and what type of personality you are. So HR and recruiting leaders are seeing a lot of different types of technology, whether it's video or sourcing technology, that will pull the best candidates to the front, score them, and present them to an employer. What kinds of things should they be thinking about when they're looking at these technologies? Because they are not all created equal, and they're certainly not all likely thinking about inclusive design.

 

Jutta: [00:26:55] The first thing that I would recommend is that you choose a system where it's apparent what the rules are and you know what data set the system is trained on. Is the system trained on the full diversity of the types of employees that you hope to choose, or is it trained on a homogeneous data set that has been cleaned of edge scenarios? The other thing that I would encourage you to do is to find a system where there is the opportunity for a feedback loop, where new data can be added, where you can add additional filters, or at least query the filters and remove filters that may cause bias.

 

The other thing that I would do is to choose a system that is reactive, or responsive, to the scenario that you are working in. So not something that only gives you generic data, but something that allows you to continuously adapt the system, to fine-tune it to the working environment and the requirements that you have.

 

And then lastly, I would encourage you to get a system that allows you to do some predictive modeling, so scenarios that may not have been found within the current data set. Because, as I was describing, people with disabilities, or people who have been traditionally left out of a work environment, will not have data that can be used for decisions. But there are ways to model current situations that were not there in the past, to see: what would this particular skill, what would this new strategy, what would this person add to the team that I have?

 

The last piece that I would say is that you want a system that understands a team, not a system that is trying to produce a set of repeatable, similar, homogeneous workers: an AI system that can orchestrate a variety of diverse perspectives and diverse skills to make a good team within the workplace.

 

Jessica: [00:29:18] Awesome. Well, thank you so much for taking the time to talk with us today. Where can people go to learn more about you and what you do?

 


Jutta: [00:29:27] They can go to the Inclusive Design Research Centre website. Actually, my name is a unique identifier, so if you search for my name, many things will come up related to inclusive design. You can learn more about the design challenges that we've been engaging in to stretch the design of artificial intelligence to be more inclusive.

 

And there we have a design challenge called Start Your Machine Learning Engines and Race to the Edge. In it, we have many different schools that are putting together edge scenarios that current artificial intelligence systems may not understand, may not have heard of, or may not recognize, and we are using those edge scenarios to challenge the AI developers to see which one is better at encompassing the full range of human diversity.

 

Jessica: [00:30:36] Well, we'll go ahead and put links to Bigidea.one and your program, as well as some other research that I found pretty interesting on your work focused on inclusive design. So thank you so much for taking the time to talk with us. Thank you.

 

This by far has been a really eye-opening discussion for me, and hopefully for you, on the subject of artificial intelligence and human resources and recruitment. I'm looking at some of the ways that this tech might identify patterns that unknowingly eliminate candidates or employees who don't fit traditional patterns or models. While I love technology, I absolutely do, and it is such an important part of what I do.

 

We do need to push back a bit on these AI technologies in our industry and educate ourselves outside of their demos and slick marketing, to really understand the things that we need to be asking questions about. Thank you for joining the Workology Podcast, a podcast for the disruptive workplace leader who is tired of the status quo. This is Jessica Miller-Merrell. Until next time, you can visit Workology.com to listen to all our previous podcast episodes.

 

Episode Link: http://workolo.gy/ep121-wp 
