A few weeks ago we had our first meeting with our Core Team in Seattle, Washington, and it was my first foray into DeafBlind culture. The meeting included four DBI staff (all hearing/sighted), four individuals who are DeafBlind, and one hearing/sighted VR counselor with a DeafBlind caseload (an RCDB). Of the four DeafBlind individuals, two were the developers of ProTactile (and contractors on the project), one was a VR counselor, and one teaches in an interpreter training program, educating students about working with individuals who are DeafBlind. There were also six interpreters (three hearing and three deaf) working with three of the DeafBlind individuals. The meeting was conducted in ASL.

Hearing/sighted norms had no place here. First, spending the time to get to know who was there and to get to know each other was important. We all lined up in two rows facing each other, each chatted a few minutes with the person across from us, and then switched partners. It was a very intentional way of leveling the playing field: it wasn't just the hearing/sighted people who knew who was there and who could get a feel for the different personalities in the room.

Once we had all talked with each other, we were ready to join the circle; no rows of chairs and tables for this group. For one thing, people aren't likely to be taking a lot of notes on computers if they are relying on some type of tactile sign language. And there are often lighting challenges for people with low vision: going back and forth between room lighting and computer lighting can be very tiring for eyes that are already taxed in a 3.5-day meeting. Thus, having a designated notetaker (a talented team member who could watch the language and take notes at the same time) was important for everyone; tables were not. Tables also get in the way. DeafBlind culture is a culture of touch.
When you are sitting next to someone, you always stay in physical contact, whether through your leg or foot resting against theirs or your hand on their thigh. This lets them know you are still there, and it allows them to signal whether they agree or disagree with a speaker (for example, by patting your leg) without interrupting the interpretation. (If you are talking, they will also have a hand on your thigh, indicating they are following you.) The interpreters all needed to be able to see who was signing so that they could interpret in PTASL, and tables definitely get in the way of the flexibility that is needed.

As someone who currently knows very little about the features of PTASL, I found it fascinating to watch and try to figure out what was being done differently and why. For example, I noticed that when a speaker was setting up a sequence (e.g., 1st, 2nd, 3rd), she indicated the sequence on the other person's fingers, not her own. I saw this happen several times before I figured out that indicating it on your own hand is a sighted way of doing it. Touch is the DeafBlind way.

As time goes on, we'll have more contributors to the blog explaining other features of PTASL and their experiences. We are excited and honored to be a part of this growing movement, and we look forward to seeing the impact it will have on the autonomy of DeafBlind individuals everywhere.

CHERYL DAVIS
Cheryl Davis is the DBI Project Director. Her role is administrative and evaluative, ensuring that the project activities are completed on schedule and within budget, and that they adhere to the values and mission of the project.