
Podcast Episode: People With Disabilities Are The Original Hackers

People with disabilities were the original hackers. The world can feel closed to them, so they often have had to be self-reliant in how they interact with society. And that creativity and ingenuity is an unappreciated resource.

You can also find this episode on the Internet Archive.

Henry Claypool has been an observer and champion of that resource for decades, both in government and in the nonprofit sector. He’s a national policy expert and consultant specializing in both disability policy and technology policy, particularly where they intersect. He knows real harm can result from misuse of technology, intentionally or not, and people with disabilities frequently end up at the bottom of the list on inclusion. 

Claypool joins EFF’s Cindy Cohn and Jason Kelley to talk about motivating tech developers to involve disabled people in creating a world where people who function differently have a smooth transition into any forum and can engage with a wide variety of audiences, a seamless inclusion in the full human experience. 

In this episode, you’ll learn about:

  • How accessibility asks, “Can we knock on the door?” while inclusion says, “Let’s build a house that already has all of us inside it.”
  • Why affordable broadband programs must include disability-related costs.
  • Why disability inclusion discussions must involve intersectional voices such as people of color and the LGBTQI+ community.
  • How algorithms and artificial intelligence used in everything from hiring tools to social services platforms too often produce results skewed against people with disabilities.  

Henry Claypool is a technology policy consultant and former executive vice president at the American Association of People with Disabilities, which promotes equal opportunity, economic power, independent living and political participation for people with disabilities. He is the former director of the U.S. Health and Human Services Office on Disability and a founding principal deputy administrator of the Administration for Community Living. He was appointed by President Barack Obama to the Federal Commission on Long-Term Care, advising Congress on how long-term care can be better provided and financed for the nation’s older adults and people with disabilities, now and in the future. He is a visiting scientist with the Lurie Institute for Disability Policy in the Heller School for Social Policy and Management at Brandeis University, and principal of Claypool Consulting.

Transcript

HENRY CLAYPOOL
The disability community, I would argue, were the original hackers. The world is not a very accessible place, and therefore disabled people have largely been self-reliant in terms of how they’re going to get to, go to, and do things in our society. And that creativity and ingenuity is something that I think is kind of an unappreciated resource in our society.

CINDY COHN
One of the most powerful things about technology is the role that it can play in society as a tool for inclusivity, often far beyond the original goals. Of course inclusivity matters even if it’s only helpful to those it’s aimed at, but there are so many examples of the broader impact – from audio books to closed captioning to curb cuts – all of which I use on a regular basis. Technological changes aimed at assisting people with disabilities have so often not only accomplished their original goals, but helped those without as well.

But as with all tools, thought and intention are critical. Real harms can result from misuse of technology – whether intentional or not. And that’s especially true for marginalized groups. And people with disabilities end up at the bottom of the list on inclusion all too often. That’s not only bad for them, I think it’s bad for all of us.

I’m Cindy Cohn, the Executive Director of the Electronic Frontier Foundation.

JASON KELLEY
And I’m Jason Kelley, EFF’s Associate Director of Digital Strategy. This is our podcast series: How to Fix the Internet.

Our guest today is Henry Claypool. He’s a national policy expert and consultant specializing in both disability policy and technology policy – and is especially interested in where those two things intersect.

He is the co-author of a report put out by the Center for Democracy and Technology called “Centering Disability in Technology Policy” – we’ll be referencing the report a bit in our conversation, and you can find a link to it in our show notes.

CINDY COHN
Henry, thank you so much for joining us today. On this show, we try to imagine what the world looks like if we get these things right. So let’s start there. What does it look like if we’ve actually created an accessible digital world?

HENRY CLAYPOOL
Oh, I think it’s far more inclusive. People that function differently – basically, by definition, disability often boils down to that fact – would experience a kind of smooth transition into any forum and be able to engage with a wide variety of audiences. Not so much dependent on their ability to physically get somewhere or on making sure information was in an accessible format. All of these things would just be seamless. And so the interactions with disabled people, I think, would be much more welcoming and probably inviting to real, genuine human exchange. Instead, right now we have to rely on assistive technology, which can create its own barriers, even though it’s facilitating access to basic things.

We’ve got multifaceted barriers that we’re still trying to deal with. And technology can be leveraged to work on all fronts. And I think it’s a matter of getting the makers and the developers of the world to really think hard about these fundamental access challenges that the population faces, and then begin to address them in a more systematic way.

Right now we largely rely on people sitting in a garage thinking about what new big challenge they’re going to take on, but they don’t really think about people with disabilities when they’re developing their list of targets.

JASON KELLEY
We’ve heard you say, or write, that talking about accessibility isn’t enough, and I’m wondering: what should we think about when expanding the focus beyond just accessibility? What does that look like, and what would the implications of that be?

HENRY CLAYPOOL
Well, I think there are some fundamental challenges that we have with that. I think accessibility has provided a nice window for the tech community into the disability experience, but it’s largely focused on people with sensory disabilities. And honestly, I think what drives this is regulatory obligations, right?

Under the Americans with Disabilities Act, there are standards that need to be met. You need to facilitate effective communication for people with disabilities, and that tends to drive a lot of what people work on. And so, instead of just approaching this as a compliance issue, I think reframing this as inclusion of disabled people in the work gives us a better opportunity to really think about what the problems are, where the barriers are, and how we can go about addressing them.

JASON KELLEY
I wonder if the way you’re describing inclusion helps get at this sort of dichotomy that I’ve been thinking about when we talk about technology and how it intersects with disabilities. There’s this idea of, you know, is the tech accessible, and then there’s this second idea of, is it disproportionately harmful to people who are disabled? And I’m wondering, like, am I imagining that dichotomy correctly, and is the idea with something like, you know, reframing it as inclusion to sort of encompass those two ideas together?

HENRY CLAYPOOL
Yeah, I think that’s part of what’s going on. One of the ways I think about the reframing is that you broaden the aperture of the lens when you go beyond just accessibility. Because then we begin to bring in the whole concept of the social construct of disability. What does it mean? Who’s included in that population?

And so we start to think about it maybe even differently from how is technology gonna facilitate access? Instead, we begin to think about how do people with, you know, wide ranging disabilities function when they use technology? And it’s not so much a matter of accessibility as it is really thinking about what do we need to do to make sure that the technology is useful?

And I’ve got a really concrete example. When we talk about accessibility, and this is a trap that we often fall into, we really skip a primary consideration, which is often around affordability, because tech is expensive. And most people that have significant disabilities, who might rely on a program like Supplemental Security Income, don’t have the discretionary income to go out and buy a new piece of tech.

Or, you know, we now have broadband subsidies that are being made available through providers. Well, that’s okay, but they really haven’t focused on disability there, because of the additional costs that are disability-related. That program, I think, falls short of really achieving the equal access that the congressional framers set up when they funded the Affordable Connectivity Program.

CINDY COHN
Yeah, I think that’s really right. And I love that framing, because accessibility is like, can you knock on the door? And inclusion is, did you build a house that has us already inside it? Right. And I think that framing is better. And what I also like about it is that it’s easier to put into a kind of intersectional frame, right? And so by framing it as inclusion, you’re kind of moving the disabled community out of a silo and into something that’s more about what we’re trying to do more broadly. I think that’s part of what’s going on in that language.

HENRY CLAYPOOL
And maybe another dimension to it, I’ll just use myself as an example. As a white male who has a disability, but I acquired it. So my identity was really shaped and formed before I acquired this disability. So my lifelong struggle since disability has been incorporating it into who I am. But that really gives me insight into my experience. And when I think about identity and an intersectional approach, that is hugely influential in how we go about selecting the issues that we want to tackle from an inclusionary perspective, or, if we wanted to be more pragmatic, in asking: what are the tech policy issues that need to be addressed first?

And that idea that we don’t have people from marginalized communities at the table helping us shape that priority list, I think, is one of the fundamental challenges that we face today in technology policy, and in the field that I’ve, you know, typically come up in, around what disability policy is and how it’s gonna be defined in the future.

CINDY COHN
What’s the philosophy behind your approach to creating technology policy and, and how should we be thinking about how to frame disability issues when it comes to tech policy?

HENRY CLAYPOOL
Great question. And I do think it involves making sure we look around at who we’re talking to, and from the very beginning we have to make sure that we have voices that have traditionally and historically been excluded from that conversation. Once we get to that point, we can start to identify the types of issues that we want to address through our tech policy agenda, which in this case is largely gonna be focused on disabled people and addressing their concerns.

Because without that foundational piece, I think we’re really going to fall victim to some of what’s been discussed around accessibility and how we approach issues and the types of steps that we take to mitigate or address barriers. We miss out, I think, on what the Centering Disability in Technology Policy report does well. It enumerates a number of areas where marginalized communities actually are really disadvantaged simply because they live with a disability and they happen to be someone of color, have a different sexual orientation, or be non-binary. So these are all real challenges, and if they aren’t addressed from the outset, I think we end up moving in directions and pursuing solutions to problems that will perpetuate some of the barriers that our society’s built up, and we all know that leads to these significant inequities that we live with today.

CINDY COHN
Yeah. Could you give us a good example? I think this is a good one to help ground people a bit.

HENRY CLAYPOOL
I would really go back to kind of just basic access to broadband. If you really want to get online, that’s nice. And if you live in a middle-class house, where the broadband hasn’t been redlined, that’s great. And if you can afford the devices that will allow you to get online, even better. But if you’re disabled, a person of color, growing up in an area where the only access to broadband you have is at a school, or maybe the area where you live hasn’t been the focus of the provider’s rollout of their broadband infrastructure.

And so without, you know, making sure that those populations are thoughtfully included, we end up with what’s now being defined by the FCC as digital discrimination. But again, that has its limitations, even though I think it’s rightfully prioritizing these issues of race and income, which touch a lot of people with disabilities. The challenge is, it doesn’t go deeply enough into this intersectional exploration of disability to think about, oh, that means that the person will be needing a different type of device to access the internet. And therefore, our Affordable Connectivity Program offering up subsidies that limit devices to $75 isn’t really sufficient to achieve the goals of the statute, which are granting equal access to broadband for all.

JASON KELLEY
I think it’s important that we’re talking about broadband, since it is so critical to how people access the internet in general. But I wonder if we can expand out a little bit and talk about a newer technology as an example of a problem or concern that we often see in how technology impacts people with disabilities. So in the last few months we’ve seen this explosion of use in AI tools, these large language tools, ChatGPT, that I think could have a really positive impact on how people use technology and what they get out of it. And that would include, I hope, people with disabilities. But this morning I went to the ChatGPT website and saw it wasn’t accessible to someone with a screen reader. And that surprised me, because this isn’t some tiny company in someone’s garage, this is a company with a huge amount of funding, and if they can’t make their technology accessible, how do we know that the models they’re using, the data they’re trained on, and the results that come out of them aren’t going to be potentially harmful to people with disabilities, if it seems like they’re not already taking those folks into account? And I wonder if you have an idea of why it’s so hard for these newer technologies, even with large amounts of funding, to build something accessible.

HENRY CLAYPOOL
Well, I can answer that in two ways, and I think they’re both right. One is, this is largely driven by people that are trying to bring a product or service to market. And so when you look at disability as a target market, it may not look appealing. But I think if you take another look and think about how technology has facilitated access, ease of access, I’ll say, not just access for disabled people, I think voice recognition is a great example. Um, using what used to be called Dragon Dictate or something along those lines.

JASON KELLEY
I remember those days.

HENRY CLAYPOOL
Yeah. And, you know, it had to be heavily trained, you really had to work with it, but once you did, um, it could pretty effectively do your writing for you if you weren’t able to use a keyboard. And now that technology is just integral to so many things, and I think, like, generative AI and those types of things are going to be built in the same way. Yes, so there are really good examples there of how the market dynamic both stymies and, if we think anew about it, could help: if we think about solving a problem for a disabled person, what are the broader market applications that come out of that?

And so those folks that are just market oriented, I think there may be some incentives for them to think about learning from the past and how tools have been developed specific to disabled people. But at the end of the day, they’ve really just facilitated access more broadly or ease of access for the general population.

JASON KELLEY
Let’s take a quick moment to say thank you to our sponsor.

“How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians. So a tip of the hat to them for their assistance.

Now back to our conversation with Henry Claypool.

CINDY COHN
We’ve been talking a lot about the ways in which inclusion could be a good thing. I think we’re also in a time where we’re seeing a lot of, say, generative AI and machine learning tools and other tools that measure humans against some norm end up creating discriminatory outcomes for people with disabilities. And I know you’ve got some experience and have done a lot of thinking about that. I’d love for you to talk a little about, you know, the other side of this, which is not just that we want technology that’s inclusive, but we want technology that doesn’t supercharge discrimination either.

HENRY CLAYPOOL
Yeah. And boy, algorithms and AI really present challenges there. And just from what we’ve seen, I think the report touches on this, and since the report was issued we’ve done some deeper work on the automated hiring tools, and they’re kind of ubiquitous out there in the world now. So, to the extent that employers are using them, they haven’t been developed with this population in mind. And it’s really interesting from a policy perspective: the Americans with Disabilities Act has employment provisions, which actually require employers to offer up what might be considered a reasonable accommodation in that hiring process. Well, if you don’t know that you’re interacting with some AI that is going to be used to select who goes on to the next step, you’ve been denied your rights. You’ve not been able to step forward and say, ‘Hey, I’m not sure this format’s gonna work for me, and I think I need an accommodation.’ So that’s an example of how, just fundamentally, the development of the algorithms and the other decision-making tools has kind of missed the boat.

But there’s, I think, a larger challenge to AI. And that has to do with really thinking about how difficult it is to capture the disability experience. It’s not something that’s easily generalized. And the experience is extraordinarily heterogeneous. Just think about the different types of disabilities that exist in our society. And how are you going to make sure that the tool that you’re developing is sensitive to and aware of the predictions that it needs to put forward, not disadvantaging all those different populations. So they really are, I think, one of the bigger challenges that we in the disability community have to struggle with.

JASON KELLEY
One of the problems that I think people working in the tech space run into, people who are interested and thoughtful around inclusion and around determining how harm might be caused by the tools they’re making, is: how do you know when the tool you’re building is accessible to all the people, or how do you know when the tool you’re building is not going to cause harm?

I wonder if you have any advice for people who are working on that, to think about, you know, what they can do so that they don’t feel that there’s sort of a constant push and sort of moving of the goal line, even if there should be a constant rethinking of how the tool works. But how should people think about that difficulty of finding that end goal?

HENRY CLAYPOOL
I think that’s a great way of framing this because where the automated tool is deterministic and going to render a final result, one won’t know whether or not they’ve really experienced disability discrimination. The individual may, as an after effect of interacting with it, but the people that develop the tool won’t.

And there’s this really great example that came up: Allegheny County, where Pittsburgh is. Their human services department developed a tool to kind of screen calls that came into their child welfare lines. And so they were thoughtful. They hired a contractor who helped them think about an ethical approach to developing their tool. But what they didn’t think about was the fact that disabled people have children too, and maybe disabled people function as parents differently than what you might expect from parents that aren’t disabled. And lo and behold, the results of this algorithm have been found to disadvantage those parents, and the ACLU has actually looked at this; it’s been around for a little while. So you find that even those developing the tool with good intentions need to be vigilant about making sure that it’s modified over time to be more effective and more inclusive. And I think it’s possible to really build these steps in when the tool’s in use and it’s getting the types of outcomes that may be leading towards, in this example, disability discrimination.

You need to get in there and start thinking: all right, how did we build this? And in this case, it was fairly clear to me that the people that helped develop the tool really didn’t think about the fact that disabled people had and raised children, and that they might do it differently.

And I wonder if that might be really the way to best approach this as being that intentional from the outset.

CINDY COHN
Yeah. I think that what was happening in this situation, and in several others we’ve seen, is that disabled parents were getting flagged as potentially negligent, and I think in some of these cases even had their children taken away, because the automated systems’ presumption about what was normal didn’t reflect the perfectly reasonable alternate way that these folks were parenting.

HENRY CLAYPOOL
Yeah. And the reading that I did showed that they were very thoughtful about protecting the child. And there’s a bigger equation there, right, about who’s doing the parenting and how they go about doing that. That was probably left out in the original design, and so we can get back in, we can start to make adjustments, and hopefully those types of corrective measures lead to a better algorithm. You know, this algorithm was written up by the University of Pittsburgh as an example of intentional design. So ironically, it ends up getting highlighted as perpetuating disability discrimination against parents.

CINDY COHN
I hear a couple things in here that are really important. One is, you know, having disabled people, and people who have that lived experience, involved very early on as you’re building the tool. But I think also making sure that you’re monitoring, and that you’ve built a tool that can adjust, because inevitably it’s not gonna be perfect.

And you’re going to find other use cases, and especially around machine learning and AI systems, where it’s difficult to know what the system is solving for, it needs to be able to be flexible. So when you find that there’s a community that’s not being served or hasn’t been reflected in it, you can actually make those changes.

Those changes shouldn’t just be bolted on at the end. You need both: you need to build your best up front, but also the recognition that you’re gonna need an accountability strategy and you’re gonna need to be able to adjust when that happens. I think especially in hiring, we’ve seen some of these tools make extremely naive and unreasonable assumptions about, say, what makes a good employee, right?

For example, we’ve seen the hiring platforms that try to use eye contact as a proxy. It’s not good science – there’s no evidence that employees who can maintain eye contact during an interview are better employees – but it also discriminates. People with eyesight issues and also many who are neurodivergent will not score highly on these systems even though they may be great employees.

So I think that the disability community often presents a very easy-to-understand way in which this mistake plays out. We so often see technologies that substitute something we can track and count for the thing we actually want to find out. It’s critical to interrogate that connection, because it’s so often both wrong and discriminatory.

HENRY CLAYPOOL
Well, thanks for raising the example, because it gives me an opportunity to plug disability policy again, and what we have here in the Americans with Disabilities Act and its employment provisions. To determine whether or not somebody’s experienced disability discrimination, one must look at the essential job functions, and so that moves us away from this world of, gee, I think eye contact is important to be a good customer service person, to: no, no, employer, think about what the essential functions of this job are and then ask questions that are directly related to that. And that will really influence the tool that’s developed. And so I think that’s another great example of how applying disability policy will work well not only for disabled people, but for the broader population.

Cuz just think now about other civil rights communities that have had challenges with different types of automated hiring tools. This now has the employer take on the burden of thinking, in the design, about what the real central aspects of the job are. What are the skills that are needed to fulfill it?

And let me develop my tool based on those issues, instead of going into the realm of pseudoscience and thinking about what might be good personality matches, et cetera.

CINDY COHN
Yeah. And what I hear you saying that’s really important is that we actually have the law already. Unlike a lot of other situations, in which we need to think about how we change the law to help, here this is a situation in which we actually have a pretty good law and we just need to honestly enforce it and apply it in this new context. And in some ways that’s good news, because it puts us far ahead of some other issues where, you know, we actually need a legal framework.

HENRY CLAYPOOL
And there’s even been evidence of that. Ironically, or maybe it’s not an irony. Um, the EEOC and the DOJ issued guidance on algorithms and AI and automated hiring tools, generally telling employers to be aware of disability discrimination for the very reasons that we’ve discussed. Because the law is so clear in this regard, it’s easy for them to then take on the issue and say, we can develop guidance now about what disability discrimination would look like.

Whereas in other civil rights laws, we don’t have as clear a picture of what discrimination looks like for those populations. But again, bringing it all together, we can use this law to benefit other populations as well.

CINDY COHN
Thank you so much for talking to us and reinforcing, I think, what we’re seeing throughout this series of podcasts, which is how important inclusion is, especially of marginalized communities, if we’re gonna get to a place of a better internet. This community is vital and important, and I really appreciate the work that you and other people have done to make sure that the conversation includes people with disabilities as part of thinking about all the communities that we want to, you know, win from our better future.

HENRY CLAYPOOL
It was my pleasure. Really enjoyed this time and you know, I hope that your listeners can maybe find something in here that they can take away from it and apply in their work.

JASON KELLEY
I’m so glad we got a chance to speak with Henry about this issue. I really was struck when he reminded us, or me at least, that often people with disabilities have to hack. I think that’s so true: you know, frequently we hack things as sort of tech people because they don’t work the way we want them to, or they don’t work for us. And if tech is built without you in mind, that’s going to mean that you have to hack a lot more things. So, of course, that’s a group of people that we should all think of as sort of the original hackers. And I loved that point. Cindy, I’m wondering what else you took away from that conversation, which was so rich and full of good ideas.

CINDY COHN
I mean, what I loved hearing is this idea of moving from seeing the disability community as this outside silo of people who need accessibility, where, you know, they’re knocking on the door for tech and we need to open the door for them, to one where we really need to think about inclusion. And I like this metaphor, obviously, but this idea that we need to build a house that includes them to begin with, rather than just make sure that the door can be opened if they knock, and how that changes the way you think about it. And I also really appreciate that, you know, the way that framing is intersectional, right?

It’s really thinking about, first of all, the whole person, not just the disability. So the fact that people are disproportionately from other marginalized communities, especially around poverty. We know that the disabled community in the United States is often much poorer than the rest.
They have less access to broadband and other kinds of foundational things. And I think the shift in the thinking helps make sure that we’re talking about the whole experience of people with disabilities and not just focusing on the kind of individual disability of a bunch of different people.

JASON KELLEY
Absolutely.

CINDY COHN
I also think that, in terms of thinking about the disability community as hackers, it really does begin to shine a light on the needs for interoperability, the need for open standards, the need to create technologies that are open to being used in different ways other than the ways that the creator intended.

And how do we think about building technologies that do that? And I think one of the tools that EFF thinks about a lot in this regard is standards and interoperability and things like that, because their absence presents, you know, real barriers to people who, say, want to be able to plug a screen reader into ChatGPT or any other technology that they use. You know, this is a place where we need to free up space for hacking to happen.

And that, you know, takes us back to the kind of law and policy side of what EFF does. The other thing that I really appreciated, as we think about how we build a better world, is that the Americans with Disabilities Act is really already there and already has good principles and doesn’t need to really be amended or changed; we just need to apply it, and how helpful it is when you’ve got the law already aligned. It doesn’t mean that we’re done, but it is one of the pieces of building a better world that, at least in this instance, is already in place, and we don’t have to start from scratch on the legal side. You know, that doesn’t mean it’s easy or it’s done, but it does set us up in a better place to try to get to a better internet.

JASON KELLEY
At least there’s guidance.

Well that’s it for this episode of How to Fix the Internet.

Thank you so much for listening. If you want to get in touch about the show, you can write to us at podcast@eff.org or check out the EFF website to become a member, donate, or see what’s happening digital rights-wise this week and every week.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes music licensed Creative Commons Attribution 3.0 Unported by their creators. You can find their names and links to their music in our episode notes, or on our website at eff.org/podcast.

Our theme music is by Nat Keefe of BeatMower with Reed Mathis

How to Fix the Internet is supported by the Alfred P. Sloan Foundation’s program in public understanding of science and technology.

See you next time.

I’m Jason Kelley…

CINDY COHN
And I’m Cindy Cohn.

Music credits

This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by its creators:

Come Inside by Zep Hurme
Drops of H2O the filtered water treatment by J Lang


Published May 16, 2023 at 03:01AM
Read more on eff.org
