    Technocentrism Led Me to False Assumptions About the Deaf Community


    During Hyperjump's weekly sharing session, my colleague Kevin shared a perspective I hadn't considered before. He recounted a conversation with his friend, a sign language interpreter, about the potential impact of AI on their profession. With the rapid advancement of AI technologies, Kevin wondered if his friend felt his job might be at risk in the foreseeable future.

    Kevin's friend offered an example that reshaped our understanding. He described a scenario familiar to many of us: at a conference, organizers use AI to transcribe speakers' words into real-time subtitles, seemingly bridging the gap for deaf attendees. On the surface, this technological solution appears to address accessibility concerns effectively. But the assumption overlooks the deaf community's nuanced needs and ignores the rich cultural depth of sign language as a form of communication.

    The interpreter explained that while live subtitles seem thorough, they don't always help deaf attendees as much as one might assume. Many deaf individuals consider sign language their first language; written language, the form typically used for subtitles, is their second. Just as with any second language, proficiency levels vary, making subtitles less accessible for some.

    This revelation highlights a broader issue: technocentrism, the belief that technology can solve all our problems, including those of a complex, social nature. In our eagerness to embrace technology, we risk oversimplifying human experiences and needs. The assumption that AI subtitles could replace human interpreters is a prime example of this oversight: it fails to acknowledge the cultural and linguistic nuances of sign languages, which are not just visual versions of spoken words but rich languages with their own grammar and syntax.

    Moreover, this technocentric viewpoint underestimates the importance of the human touch in communication. Sign language interpreters do more than translate words: they convey emotions, emphasis, and cultural context, ensuring that the message is fully understood. An AI, no matter how advanced, cannot interpret these subtle yet crucial aspects of communication.

    Kevin's story is a powerful reminder of the limits of a purely tech-focused approach, especially when it comes to accessibility. Technology offers remarkable tools to improve our lives, but it cannot fully replace the deep understanding and empathy of human interaction. As we embrace the advancements AI brings to our world, we must remain mindful of the diverse needs within our communities. Making things accessible means offering different options that respect and support the unique language and choices of the deaf community.

    Kevin's conversation with his friend challenges us to look past the allure of technological fixes and to value the complexity of human needs and experiences. It shows the need for diverse perspectives in our conversations about accessibility and technology, ensuring that we leave no one behind in our quest for progress.

    Are you working in a team where the pull request process slows everyone down? Then grab a copy of my book, Pull Request Best Practices!

    Did you like this post?

    I'm looking for a job as a full-stack developer. If you're interested, you can read more about me here.