
Stakeholders Advocating for Fair and Ethical AI in Interpreting

Interpreting SAFE AI Task Force Guidance (Ethical Principles) AI and Interpreting Services

Open for public comment until May 29, 2024

Perceptions on Automated Interpreting

Overview of Findings Presented by Hélène Pielmeier, Senior Analyst at CSA Research

12:31:15 So hi everyone. I’m the senior analyst at CSA Research who worked on the perception survey for the SAFE AI Task Force.

12:31:26 The interpreting industry is facing a tremendous amount of angst over improvements in artificial intelligence. Does it mean the end of the profession?

12:31:36 Will meaningful language access and meaningful communication access be maintained when using automated solutions? How can you ensure that AI is used safely to prevent negative outcomes?

12:31:50 There are a lot of questions to address. So, we conducted in-depth research, and we recently published a 350-page report packed with lots of information.

12:32:02 And in this session, I’m going to give you a summary of core data points that you will see analyzed in greater detail in the report.

12:32:11 But let’s start by discussing the context for this research. It was commissioned by the Safe AI task force.

12:32:19 It’s a diverse group of industry stakeholders who got together to establish, disseminate, and promote guidelines for responsible AI adoption in the field of interpreting.

12:32:30 And they requested that CSA Research conduct for them a large-scale perception study of end users.

12:32:39 Here, think doctors, patients, etc. Then of requesters, those are the buyers of language services. And then the providers of interpreting services and technology, so that’s of course the language service providers, the interpreters, and the AI tech vendors.

12:32:54 The goal of the study was to capture current perceptions about both spoken and signed AI for interpreting with a focus on the US market.

12:33:06 And for those of you who don’t know us, CSA research is an independent market research company.

12:33:11 We’ve been around for 20 years. And we focus exclusively on the language services industry. Whether the supply side or the demand side.

12:33:20 And the scope of the research that the task force requested was massive. So, we are grateful to the sponsors that the task force recruited to help fund this research.

12:33:31 Their names are on screen here and this research would not have been possible if not for these companies backing the project.

12:33:39 And I also want to acknowledge our sign language interpretation right now, with MasterWord donating the sign language interpretation for the session today.

12:33:51 The report will synthesize the results of 2 surveys. One was with requesters and providers of language services.

12:34:00 So here again, think procurement teams, schedulers, interpreters, etc. And then the other was with the end users.

12:34:05 Again, think doctor, patient, and all those permutations. That end user survey was available in 10 languages.

12:34:13 And a great promotion effort from the task force led to over 2,500 responses from 82 different countries.

12:34:21 In the United States, we were able to collect responses from 48 different states. So, it had a broad reach.

12:34:28 Altogether, we collected 118 data points which we correlated against each other and that allowed us to publish 9,400 values.

12:34:38 And we also offered ample opportunities for respondents to provide free text answers, and we collected an amazing 3,400 comments that totaled over 62,000 words.

12:34:51 Some people honestly wrote entire essays. So, we’ve read and analyzed every single one of them, and we sprinkled anonymized quotes throughout the report.

12:35:03 And I briefly want to touch on the profile of respondents because it did affect the results. Two-thirds of respondents were either spoken or signed language interpreters.

12:35:13 We had targeted 11 different groups. So that means that the average is influenced greatly by interpreter responses.

12:35:20 And that’s why each graphic in the report will indicate responses by group so you can contrast responses from their different perspectives.

12:35:29 We also had 69% of respondents that had some sort of a connection to the healthcare field.

12:35:36 And this is important because it’s a sector where mistakes can have life and death consequences. So, the survey collected more negative reactions to AI than it would have if more respondents had come from less high-risk domains.

12:35:51 And then only 11% of respondents across our sample had either moderate or extensive experience with automated spoken or signed language solutions.

12:36:03 And that honestly was a little bit of a problem because perceptions ended up being based on opinions more than on facts.

12:36:09 So on each graphic, we distinguish responses from those who had experience and those who had either a little or none.

12:36:17 And I also want to mention that 79% of respondents came from the United States. That was the primary recruitment target for the task force.

12:36:28 On graphics, you will see respondents in the United States versus outside US. Outside the US tended to be a bit more open to AI.

12:36:37 But that could have had to do with the fact that the survey might have reached predominantly people who were already interested in the topic

12:36:44 in those other geographies.

12:36:47 Okay, so let’s get started with context data. This is from end users. Meaning either the service recipients or frontline professionals who participate in sessions where the interpreting occurs.

12:37:00 We asked them to what degree they trust interpreting from 5 different sources. And what you see here are the results for those who say that they fully trust the source.

12:37:11 So, no surprise, face-to-face interpreters come in the lead, with 78% of end users who fully trust them.

12:37:20 Once the interpreter is remote, that confidence level drops to just 56%. Bilingual friends and family members who are not professionals did not rank very high here with only 21% of end users who fully trust them.

12:37:35 The last 2 options refer to the automated interpreting options. So, when an organization provides interpreting, think for example about a conference where the organizer gives you a link to access the automated interpreting.

12:37:51 In such cases respondents trust the output a little bit more than when using apps that they found online, often those free apps that you can just download by yourself.

12:38:03 But either way, it’s not very much trust in either of these 2 cases.

12:38:09 Let’s now explore a series of 5 AI-powered services related to the topic: automated captioning,

12:38:17 automated transcription, automated subtitling, automated spoken language interpreting, and automated sign language interpreting.

12:38:26 And we asked survey takers about their level of experience with them, and if they had used them, what they thought about the quality.

12:38:34 Note that the end users of interpreting services did not see captioning, transcription, and subtitling questions because we were trying to simplify the survey for them.

12:38:47 So let’s start with captions. That’s when the text on screen appears in the same language as the one spoken in the session. Over a quarter of respondents had either used an automated system to produce captions or tested one, either moderately or extensively.

12:39:03 That’s actually a pretty good number. And over half of them found results either good or excellent.

12:39:12 And this was the highest-level performance across the 5 services that we tracked.

12:39:17 The next service is related to captioning. But instead of the text appearing on a screen, the AI produces a document that may include elements such as time codes and who is speaking.

12:39:29 A little fewer people had experience with automated transcription. But overall, over half again found results either good or excellent.

12:39:38 What such high numbers indicate is the potential for technology to be useful to assist in some situations. Moving on to subtitling, the third best performing AI service.

12:39:51 That’s when captions appear on screen in a different language than the language spoken in the session. 19% of respondents had some experience with automation, and 40% of them found some value in the results.

12:40:05 Numbers are a little less exciting than for captions because you compound errors at each step you add in the process.

12:40:12 So to simplify the concept: in captioning, you essentially just use voice recognition. I’m simplifying here.

12:40:20 But for subtitling, you apply machine translation to the captions. So, you have doubled the chance of errors in the process.

12:40:29 Now, once you apply a synthetic voice to the subtitles, it’s called automated interpreting, machine interpreting, or automated speech translation; the terminology out there is not necessarily very formalized. What we predominantly use in the report is automated interpreting, or AI.

12:40:49 And that is the focus of the report. Unfortunately, only 11% of respondents had first-hand experience with such technology and just a little over a third of respondents found results either good or excellent.

12:41:05 One of the challenges of automated interpreting is that very often people prefer to read subtitles rather than listen to a robotic-sounding voice that lacks a little bit in intonation.
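
To make the compounding-error idea concrete, here is a minimal sketch of the pipeline described above: speech recognition, then machine translation, then voice synthesis. The per-stage accuracies are invented for illustration; they are not figures from the report.

```python
# Hypothetical sketch of the automated-interpreting pipeline described above:
# speech recognition -> machine translation -> voice synthesis.
# The per-stage accuracies are illustrative assumptions, not report data.

def pipeline_accuracy(stage_accuracies):
    """Multiply per-stage accuracies to show how errors compound."""
    total = 1.0
    for accuracy in stage_accuracies:
        total *= accuracy
    return total

captioning = pipeline_accuracy([0.95])                # speech recognition only
subtitling = pipeline_accuracy([0.95, 0.90])          # adds machine translation
interpreting = pipeline_accuracy([0.95, 0.90, 0.97])  # adds voice synthesis

print(f"captioning:   {captioning:.1%}")    # ~95%
print(f"subtitling:   {subtitling:.1%}")    # ~85.5%: errors compound
print(f"interpreting: {interpreting:.1%}")  # ~82.9%
```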

12:41:19 And finally, the last AI driven service we asked about was automated signed language, which refers to text to sign technology using avatars.

12:41:27 And sign-to-text technology using complex sign decoding systems. And the reality is that the technology is much less developed than spoken language interpreting.

12:41:39 So barely 1% of respondents had even experienced any of it. And for those who did, quality perceptions fell below the 30%

12:41:46 mark, meaning that the use cases are going to be very limited until the technology improves.

12:41:53 Now I also want to show you some data on whether respondents thought that automated interpreting could reach the same level of accuracy as qualified human interpreters.

12:42:04 And for simple conversations, 9% believe that AI already reaches human parity, and 25% believe it will happen soon.

12:42:14 Numbers naturally drop for complex conversations with a single percentage point who believe human parity already exists and 8% who think it will happen soon.

12:42:25 The flip side of course is that a large portion of respondents think it will take a really long time to get there.

12:42:32 Or it may never happen at all. And that’s why it gets very interesting to see the response difference between those who already have some experience and those who don’t.

12:42:42 What you’ll see in the report is that respondents without much or any AI experience tend to underestimate AI’s capabilities.

12:42:53 So let me prove this to you. We assigned a score to quality perceptions: those who say quality is excellent have a value of 2, good is 1, poor is minus 1, and unacceptable is minus 2.

12:43:09 And that’s what you see on the y-axis here, and then on the x-axis you see the level of experience.
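
As a minimal sketch of how such a perception score can be computed; the exact rating-to-value mapping is my reading of the spoken description, so treat it as an assumption, and the sample responses are invented.

```python
# Rating-to-score mapping as described above (inferred from the talk;
# the report's exact mapping may differ).
SCORE = {"excellent": 2, "good": 1, "poor": -1, "unacceptable": -2}

def perception_score(ratings):
    """Average numeric score for a list of quality ratings."""
    return sum(SCORE[r] for r in ratings) / len(ratings)

# Hypothetical responses from two experience levels:
little_or_no_experience = ["poor", "good", "unacceptable", "poor"]
moderate_or_extensive = ["good", "excellent", "good", "poor"]

print(perception_score(little_or_no_experience))  # -0.75: skeptical
print(perception_score(moderate_or_extensive))    #  0.75: more favorable
```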

12:43:17 And for each of the 5 AI driven services that we tracked, the more experience people have, the more positively they think about AI capabilities.

12:43:28 The effect on the results is very real. So, I highly encourage providers to gain some experience with it; interpreters in particular had the least experience with AI.

12:43:41 You should really test some tools so that you know what they can or cannot do. Otherwise, you can only say what they should or should not do.

12:43:50 That’s important too, but that is not all there is to this discussion.

12:43:57 Okay, moving on, I now want to show you a series of 10 statements that we made, where we asked survey respondents to tell us to what degree they agreed with them.

12:44:11 And I’m going to show you responses for those who agree. The first statement was, it’s okay to use machines for routine and repetitive conversations that they can handle.

12:44:21 Nearly half of respondents either agree or strongly agree. And this confirms what we saw earlier about simple conversations.

12:44:30 AI can be helpful for some easier interactions or one-way communications. The next statement inquired whether it’s better to have a machine interpret rather than have no interpretation at all.

12:44:43 And this is the whole premise behind the argument that AI increases language and communication access. 44% of respondents either agreed or strongly agreed.

12:44:53 that imperfect results are better than no language support. Now, that leaves a lot of people who disagree.

12:45:01 Note also that respondents commented on the fact that AI could actually reduce meaningful language and communication access, because organizations would push AI options in unsuitable scenarios.

12:45:15 If language and communication access decreases, or reduces in quality, then discrimination, physical harm, or many other consequences can occur.

12:45:27 And obviously that is not something that anybody wants to see happen.

12:45:32 The next statement epitomizes the contradiction of the results throughout the report. Many respondents claim that machines cannot deliver when they haven’t even tested the systems.

12:45:45 Their answers are often more about saying whether machines should be tasked with handling interpreting services.

12:45:54 And that is why nearly 3 in 4 respondents, a whopping 74%, either agree or strongly agree that it is not right to replace people with machines for interpreting.

12:46:06 Now remember we had a lot of responses from providers who are afraid to lose their job. So that affected the results to some degree.

12:46:15 And when you contrast the responses of the previous ethics question to the potential financial motives of organizations to push for AI, you really get to the crux of the issue.

12:46:26 Over 2 out of 3 respondents either agree or strongly agree that requesters only want automation to reduce costs.

12:46:33 The ones who really disagreed with that statement were people in procurement roles, which is normal because they see the whole picture, with all the elements that go into such decisions.

12:46:45 Procurement teams try to fix problems for their companies and their primary motives are efficiency even before cost.

12:46:53 So, for example, a language professional may not see the cost of the administrative burden involved in scheduling interpreters.

12:47:01 Or what happens when there is a no-show from either the service recipient or the interpreter?

12:47:06 Does that mean the only answer to their problem is AI? Of course not, but you need to understand that buyers are frustrated with inadequate scheduling systems, and that the lack of system integration and even interpreter demands

12:47:20 can affect decisions regarding the role of humans versus AI.

12:47:26 Continuing the analysis on cost, we asked end users whether who pays for the interpreting affects their thoughts on AI usage.

12:47:34 39% either agree or strongly agree that they might choose a machine if they have to pay for the interpreter themselves.

12:47:41 Note, of course, that the balance would still prefer a human interpreter. And this point is tied to the proliferation of pocket translators.

12:47:53 These are apps you can find online, generally for tourists, and that affects the price point that technology vendors can charge.

12:47:59 And then this in turn drives the AI developers to create products primarily for enterprises and government institutions, because developing quality applications is very costly.

12:48:14 And they need a return on their investment. And this also explains why some of the better AI products are not accessible to direct consumers.

12:48:25 The next statement was the opposite of the previous one: we’re contrasting payment by someone else versus by the end users themselves.

12:48:36 And if someone else pays the bill fully, 76% of end users will favor a human interpreter.

12:48:43 This shift in pay is important to remember in the context of respondents who feel that organizations push AI as a cost saving measure.

12:48:51 The cost factor matters for requesters and users alike. After all, this is interpreting. Nobody wants to pay for it.

12:48:59 Even though it is very necessary for all the people who are involved in communication.

12:49:06 Now, switching gears a little bit, the next statement was on whether there are situations where end users would prefer an automated interpreting solution over a human.

12:49:16 AI admittedly comes with its fair share of concerns over data privacy. However, being able to hold a conversation away from the ears and eyes of a fellow human might have some advantages too, especially in small communities where you personally know the interpreter.

12:49:34 Using a machine can be more comfortable when dealing with matters of a highly personal nature, anything that is sensitive or very private, because despite the humanity that an interpreter brings to the table, sometimes that human element can be a little bit too much. Here you can see that 31% agreed that sometimes AI could be better

12:49:56 for them. And finally, the stress factor of using an automated solution also enters into the discussion. About half of respondents expect that using a machine interpreter would cause them stress.

12:50:13 So why is that? AI-driven processes add a burden on participants to figure out if a mistake was made and, when it was made, to speak up.

12:50:24 And they will likely fear loss of vital information that could affect a ruling, a diagnosis, or whatever the next step action is going to be.

12:50:33 Each of which can have a huge financial and human cost. And ultimately, distress can contribute to the feeling of not being heard.

12:50:45 And I also want to add another element: the robotic nature of some of the AI tools’ output may be distracting, and it might make it more difficult for the listeners to pay attention to the content and the next steps.

12:51:02 And we close a series of statements with a question for end users of whether they want to know if a person or a machine is doing the interpreting.

12:51:10 And the answer here was an overwhelming yes for 89% of respondents. Advances in voice technology

12:51:20 can make the distinction between a human and a machine difficult when the end user doesn’t have a visual reference.

12:51:28 So you can use the analogy of phone systems, which we have all struggled with: it can become frustrating sometimes not to know if you’re talking to a human or a machine, and the same would happen if you deal with a live translator versus a bot and don’t know which one you’re dealing with.

12:51:48 So now let’s tackle the pros and cons of automated solutions. I’ll present only the data portion of what’s in the report; note that there are dozens of pages in the report with an analysis of the free text answers that went with it.

12:52:02 Respondents had a lot to say, especially when it comes to the drawbacks of potential AI use.

12:52:09 But here I’ll strictly share the data portion. So, respondents could select multiple options from a list of common benefits.

12:52:17 Requesters and providers of interpreting services saw more options than end users did.

12:52:23 You can see that with the little stars here in the legend. So, what does this graph tell us?

12:52:30 Well, the most frequent advantage was around the clock availability. 66% selected that advantage but end users and requesters were showing even greater numbers.

12:52:49 That’s when the interpreting solution is available 24/7, 365 days a year.

12:52:56 It’s technology that’s easy to implement. Now, in second position, there was a tie between no need to schedule an

12:53:10 interpreter and the low-cost element, where 58% of respondents selected those responses. But what I also want you to pay attention to is the indirect dig against the interpreters.

12:53:10 Beyond the ability to skip the scheduling, some elements that respondents view positively about AI underscore issues, even if they’re not significant,

12:53:18 issues that affect end users’ and requesters’ perception of human performance and therefore influence their reaction to AI.

12:53:28 So the numbers you see on screen are the average for all respondent groups. But let me show you examples from requesters only, meaning the buyers and decision makers.

12:53:38 38% like that AI doesn’t arrive late at appointments. 31% think that they could achieve higher fill rates on job assignments.

12:53:46 And 16% think that they would deal with fewer professionalism issues. Averages don’t tell the whole story, so when you read the report, pay attention to the difference between the different audiences; it’s on every graphic.

12:54:01 Okay, so now let’s examine responses for drawbacks. Big mistakes are where respondents worried the most.

12:54:10 We defined these big mistakes as situations when the main idea might be wrong or the mistake could cause harm or some sort of a legal problem.

12:54:20 And that worries 81% of respondents. When AI makes mistakes, the errors can be more severe than those humans would make.

12:54:28 But humans make mistakes too, especially when they deliver services in real time. In contrast, though, they can rebound a little better.

12:54:41 Now, I also want you to look, if you jump ahead a little bit here, at the small mistakes. Those are situations where the main idea is clear, but the details are wrong.

12:54:55 That still worries 48%, but that number jumps to 57% for the end users.

12:55:01 So really compare the results by audience once you see the report because they tell different stories. Okay, so let’s go back to the beginning of the graphic here, the second bar.

12:55:11 The need for special tech worries another 67% of all respondents. The implementation of telephone and video

12:55:22 remote interpreting launched the introduction of these dual-handset phones and, you know, those physical carts that transport video devices.

12:55:30 And if you take the example of a medical setting, someone has to find a cart, and maybe they didn’t order enough for their department, or maybe the last person who used it didn’t store it where it should have been, and somebody has to hunt

12:55:43 for it. So, if automated solutions use the same hardware, it will not really make the situation any easier for frontline professionals.

12:55:54 Okay, the third bar here: respondents also fear something going wrong. 59% think there are more chances that something will go wrong with automated solutions.

12:56:03 And that always means more work for whoever must find a solution to troubleshoot and fix the issue. And then 56% fear that users will not accept machines.

12:56:15 Overcoming the stigma would require training and experience for all parties involved. But then on top of it for some end users, machines will never be the right solution.

12:56:26 Think for example about a patient who has multiple disabilities. There is a really high chance AI will not be able to assist them.

12:56:36 Okay, so most of the drawbacks are tied to the limitations of the AI in getting into the complex conversations, where a human interpreter is by far the superior option.

12:56:48 So that is why we asked survey takers if they thought that an automated interpreting system would be more useful if they could quickly get a person to help when there is a communication problem.

12:57:01 So, the logic behind the question is that the app would include some sort of a button where you can request to talk to a human interpreter,

12:57:09 either by telephone, video, or even in person. You would do that if either party in the conversation feels there’s a communication challenge.

12:57:19 It’s really similar to the “press 0 to talk to a customer service representative” in a phone-based system.

12:57:26 And some technology providers have already built such an escalation mechanism into their platform. Here we found that more than half of respondents think the ability to escalate to a human being would increase the usefulness of automated interpreting either a lot or a little.
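
A minimal sketch of what such an escalation mechanism might look like; the help button, confidence threshold, channels, and function names are all hypothetical and not taken from any vendor’s platform.

```python
# Hypothetical escalation logic for an automated interpreting app: either
# party can press a help button, or low machine confidence can trigger a
# hand-off to a human interpreter. Names and thresholds are illustrative.

ESCALATION_CHANNELS = ("telephone", "video", "in_person")

def should_escalate(button_pressed: bool, confidence: float,
                    threshold: float = 0.75) -> bool:
    """Escalate when a participant asks for help or confidence drops."""
    return button_pressed or confidence < threshold

def route_session(button_pressed: bool, confidence: float,
                  preferred: str = "video") -> str:
    if should_escalate(button_pressed, confidence):
        channel = preferred if preferred in ESCALATION_CHANNELS else "telephone"
        return f"handing off to a human interpreter via {channel}"
    return "continuing with automated interpreting"

print(route_session(False, 0.92))  # stays automated
print(route_session(True, 0.92))   # a participant requested a human
print(route_session(False, 0.40))  # low confidence triggers the hand-off
```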

12:57:44 This ties a bit more to the symbiotic relationship between AI and interpreters.

12:57:51 So I’m not covering this here today, but in the report, we cover AI-driven computer-aided interpreting,

12:58:01 which is when the AI helps with, you know, named entity recognition, numbers, memory recall, terminology, a lot of different aspects to empower the interpreter to do a better job.

12:58:17 I already mentioned that respondents cited a lot of concerns about AI performance and a lot of it relates to what machines cannot capture yet.

12:58:27 I’ll present a high-level categorization of what we explore more in depth in the report.

12:58:34 Respondents were truly prolific in their free text comments about the importance of humans in dealing with subtleties and ambiguities.

12:58:43 This graphic depicts a high-level categorization of their arguments, from requiring the ability to deal with the context of language, culture, tone, emotions, and interaction background.

12:58:55 AI cannot capture visual cues or cultural inferences. It doesn’t understand the mood of the participants, and it doesn’t really understand regionalisms.

12:59:05 So in short, the respondents fear that because AI cannot see nuance or read between the lines, service recipients have too much to lose when the machine doesn’t understand this context and culture.

12:59:18 But similarly, the language that people speak is not textbook perfect. Neither are the situations in which the interpreting occurs.

12:59:27 Respondents here were terrific too at providing examples of challenging situations where AI would struggle: interpreting for someone who uses poor grammar, has a speech impediment, or is in an active psychotic episode requires some serious skills.

12:59:43 So the graphic on screen shows some of the many such imperfect scenarios that can lead to a communication breakdown or technology failure.

12:59:53 The reason that humans remain the better choice is that they provide language accommodations for unusual or unexpected situations by adapting what and how they interpret, because they use their knowledge and their common sense.

13:00:09 And this takes us to an important concept that underpins the research. That of the role of the interpreter.

13:00:16 And I had a bit of fun with AI myself: I asked ChatGPT to create a graphic depicting the traditional view of interpreting, the one that revolves around the concept of the linguistic conduit.

13:00:29 Meaning interpreters act as neutral parties who convey messages from one language to another without adding, omitting, or altering the content.

13:00:39 Ultimately, that’s what automated interpreting strives to accomplish. The thing though is that the interpreter’s role has evolved over time to be more than translators of spoken or signed words.

13:00:53 And in this active model, the interpreter plays a more involved role in the communication process. They may clarify meanings, ensure cultural appropriateness, and sometimes even mediate the conversation.

13:01:06 So in this model, the interpreters take responsibility to ensure successful communication. They work to verify that messages are perceived correctly by both parties, potentially adapting language, tone, and cultural references to fit the context.

13:01:22 So this approach recognizes the interpreter as an active participant in the communication process, acknowledging that their presence and their decisions can influence the outcome of the interaction.

13:01:35 AI is nowhere close to capable of reaching this level of proficiency. So, who will lose if AI pushes organizations to go back to a conduit model?

13:01:47 Of course, the end users. It may be less of an issue when interpreting for a business meeting or, you know, a trade show or conference, but it certainly is in contexts such as healthcare, legal, and social services.

13:02:03 And that then takes me to giving you a snapshot of data on AI suitability by use case scenario.

13:02:10 There are close to 6,000 data points on that alone in the report. So today I’m just giving you the most interesting ones.

13:02:18 Let’s examine end users’ responses first. The data you see here summarizes the percentage of service recipients and frontline professionals who find AI either mostly or totally suitable for a conversation type.

13:02:34 On the left you have low-risk, non-technical, non-urgent conversations, which scored the highest, while their logical counterparts on the right, high-risk, technical, and urgent, were seen as much less ready for deployment.

13:02:50 The trick of course is defining the characteristics of these conversation types as conversations rarely stick to a single type all the way through.

13:03:00 And if we now look at responses from requesters and providers, so no longer end users. Here you can see the top 5 use cases out of 58 that we provided across 11 areas.

13:03:14 Nearly two-thirds of respondents think it’s mostly or totally suitable to use automated interpreting for notifications or announcements.

13:03:25 For example, when you report an outage, an absence, or a cancellation. And I’ll skip ahead here and have you look at number 4, which is a similar situation but in an education context.

13:03:37 And then look at the type for number 5. That was an emergency services scenario, where 37% believe that AI is okay to notify of a weather emergency.

13:03:50 But let’s go back to number 2 here, scheduling sessions: to select, notify, or remind of the time of the meeting, where it will take place, and what you need to prepare for it.

13:04:01 We found 56, sorry, 57%, who found that AI was suitable for that.

13:04:09 And then number 3 came from our series on client service scenarios. That’s the ability to use AI

13:04:16 to check on the balance of an account. That’s something that is frequently already done through AI phone systems as it is.

13:04:27 And then in number 5, the logistics call to explain how a session or conversation will take place reaches 37% approval.

13:04:35 It’s a little similar to what was in number 2 here. Now these are very strong use cases.

13:04:41 Other sectors did not fare as well. But responses do vary significantly by use case, not just by domain.

13:04:50 So for example, if you take healthcare. The delivery of bad news to a patient is clearly unsuitable.

13:04:58 You all know this, and the data proved it. But if you’re doing patient registration, there was some potential there.

13:05:05 However, if you’re doing patient registration, say with a patient that has cognitive differences, that would still be better with a human interpreter.

13:05:15 So there are a lot of elements that go into the decision. So that’s why the next part of the analysis identifies criteria to use when determining whether to use automated interpreting.

13:05:26 We provided respondents with elements, and they rated each as to whether it should be a major criterion, a minor one, or not a criterion at all.

13:05:38 And I’m showing you here just the percentage for those who believe it’s a major factor. We told respondents to answer as if the decision was up to them.

13:05:48 The level of accuracy required from the interpreting leads the list, no surprise, followed by the risk of possible harm, no surprise there either.

13:05:58 And then comes the complexity of the language; not all languages are rendered equally well through AI. But even after that, you can see that the other elements we suggested are all quite important.

13:06:11 And what you need to note is that one element alone is not enough to determine whether AI can or cannot do the job.

13:06:19 It’s always a multi-factor decision that is not just done at the organizational level, but for each single interaction.

13:06:28 So an organization might decide to procure an automated interpreting system. That doesn’t mean they should use it in every interaction, even if it’s a basic one.
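
As a minimal sketch of such a per-interaction, multi-factor check; the criteria weights and cut-off below are invented for illustration, since the talk names the criteria but does not prescribe a formula.

```python
# Hypothetical per-interaction suitability check combining several of the
# criteria named above. Weights and the cut-off are illustrative only.

CRITERIA_WEIGHTS = {
    "accuracy_required": 3,        # leads the list in the survey
    "risk_of_harm": 3,
    "language_complexity": 2,
    "conversation_complexity": 2,
}

def ai_suitable(concern_levels: dict, cutoff: int = 12) -> bool:
    """concern_levels maps criterion -> 0 (low) .. 2 (high).
    AI is suitable only while the weighted concern stays below the cut-off."""
    concern = sum(CRITERIA_WEIGHTS[c] * concern_levels.get(c, 0)
                  for c in CRITERIA_WEIGHTS)
    return concern < cutoff

# A routine appointment reminder versus delivering a diagnosis:
reminder = {"accuracy_required": 1, "risk_of_harm": 0,
            "language_complexity": 0, "conversation_complexity": 0}
diagnosis = {"accuracy_required": 2, "risk_of_harm": 2,
             "language_complexity": 1, "conversation_complexity": 2}

print(ai_suitable(reminder))   # True: low-risk, simple interaction
print(ai_suitable(diagnosis))  # False: escalate to a human interpreter
```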

13:06:37 Okay, so I will wrap up this presentation of data highlights with responses from procurement teams about their plans for the next year for the 5 services that we discussed upfront.

13:06:51 Note that the number of respondents here is very small: just 40 people who followed that survey path.

13:06:59 And the bulk of answers show no plans to use automated captioning, automated transcription, automated interpreting,

13:07:08 subtitling, or signed language, none of those. But let’s explore those who already went ahead with it or plan to do so.

13:07:15 So captioning and transcription are in the lead in terms of existing implementation. 18% of respondents have automated solutions.

13:07:25 in place for captioning, and 13% for transcription. Those are the 2 monolingual services that were tracked in the survey.

13:07:33 And only 5%, which given the small sample of people, represents just 2 respondents.

13:07:43 Only 5% have solutions in place for interpreting and subtitling. None of them report any AI for sign language right now.

13:07:52 Now, there is little change planned on the radar. Those who were ready to implement automated captioning and subtitling already did so.

13:08:00 And implementation plans are negligible for other services. Testing is a bit different: all but sign language will see some testing.

13:08:10 Truth be told, studies like the present one likely prompted some procurement teams to think it’s time to see what the fuss is all about.

13:08:19 And then what’s also interesting here in this data set is that more procurement teams plan to reduce their use of automated subtitling, than those who plan to continue using it.

13:08:31 And that is probably because the output is not up to expectations. That is problematic for the automated interpreting tech vendors.

13:08:42 Because, to simplify, they mostly just add voice synthesis to the subtitles. Now, we also noticed some reductions in the use of AI for captioning and transcription.

13:08:55 Okay, so I’m not going to take the time today to get into all the recommendations, which will be listed in the report for each of the audiences.

13:09:04 But I do want to give a flyby overview of the ones we gave for the task force. What’s important for everyone to remember

13:09:10 is that we only scratched the surface, and we did not get anywhere close to enough feedback from end users.

13:09:17 The findings in this report alone are not sufficient to establish all-encompassing guidelines. If you rush to establish guidelines, you will likely end up restricting use in situations where AI could have been a valid option.

13:09:31 So I want to urge you to remember your mission, which is to find out what is right for end users, not necessarily to defend interpreters

13:09:40 or technology vendors. That was not the mission of the task force. So, what should you do? Now comes the hard work, with more in-depth research to establish safe boundaries of when you can or cannot use AI.

13:09:56 So first you need studies with end users: think focus groups and testing labs where end users can compare the different outputs,

13:10:05 rate them, and categorize which scenarios they trust the system to do an adequate job in. It should be for a variety of scenarios, not just the high-risk ones, because realistically those are not ready for prime time.

13:10:18 The survey was also a little bit too theoretical for many end users who don’t even know what AI is or can only

13:10:25 imagine what automated outputs might sound like. This really needs to be hands-on. And you should compare different tools, because the accuracy varies from one system to another.

13:10:37 It’s not equal out there. You should also conduct an in-depth analysis of the nature of conversations.

13:10:45 So we talked about simple versus complex conversations. And for each use case scenario you’re considering AI for, you need to really understand the percentage of conversations that are truly simple.

13:10:58 and when they turn complex. What are the trigger words that indicate a conversation tipped from simple to complex?

13:11:07 Because without that, we default to any conversation having the chance to turn complex, and then you lack the data to put proper escalation systems in place when using AI.

13:11:19 And next, our research barely touched on captioning and subtitling. Automation use cases are much greater for those technologies, and such research should identify the tipping point of when end users are satisfied with subtitling versus when they want voice synthesis instead.

13:11:36 And the same logic applies to captioning versus signed language by an avatar. Again, results will vary based on the scenario and the audience.

13:11:47 Finally, the present perception study was conducted a little too early to collect accurate perceptions. In 6 months, a year, or even 2 years, many more people will have been exposed to AI solutions and will have perceptions more in line with the capabilities.

13:12:05 Also, don’t forget, technology is also constantly improving. So, what is not suitable now may well be suitable in a matter of months.

13:12:15 So this is it for the presentation today. You can read the report online on our platform or download a PDF. Plan some time to read it.

13:12:25 It probably takes about 8 hours to read it straight through, and that’s even at a pretty good pace.

13:12:29 It is packed with information, and I recommend you don’t skip the second chapter, which tells you how to read the graphics, because a graphic like the one you see on screen has about a hundred data points, and you really need to understand

13:12:42 what each of them represents to get the most out of the report. And I also want to mention that the 350 pages of the report are a bit daunting for some people, so there will be a summary version available pretty soon.

13:12:57 This concludes the presentation of the findings. Thank you everyone for watching. Thank you.

Deaf Advisory Group on AI and Sign Language Interpreting

Experienced End User Intelligence about Automated Interpreting

00:00:01:06 – 00:00:01:20
Hello.

00:00:01:20 – 00:00:03:23
Hello, everyone.

00:00:03:23 – 00:00:06:23
My name is Tim Riker

00:00:07:02 – 00:00:11:05
and I am going to be the presenter today,

00:00:11:12 – 00:00:12:11
one of the presenters.

00:00:12:11 – 00:00:13:22
I’m from Brown University

00:00:13:22 – 00:00:17:05
and I’m a member of this advisory board,

00:00:17:05 – 00:00:19:21
which is for artificial intelligence

00:00:19:21 – 00:00:22:03
and sign language interpreting.

00:00:22:03 – 00:00:23:19
Today we are thrilled

00:00:23:19 – 00:00:26:20
because we are going to be providing you

00:00:26:20 – 00:00:28:04
a presentation.

00:00:28:04 – 00:00:30:08
We’ll be talking about our report.

00:00:30:08 – 00:00:32:18
And

00:00:32:18 – 00:00:34:17
the advisory council is here today

00:00:34:17 – 00:00:35:20
together.

00:00:35:20 – 00:00:37:07
We’ll be talking about some of the work

00:00:37:07 – 00:00:39:08
that we’ve done, collecting data

00:00:39:08 – 00:00:40:17
from three webinars

00:00:40:17 – 00:00:43:09
that we hosted last fall.

00:00:43:09 – 00:00:45:01
The reason that we decided

00:00:45:01 – 00:00:46:05
to host these webinars

00:00:46:05 – 00:00:47:02
and do this research

00:00:47:02 – 00:00:48:17
is because we wanted to get more

00:00:48:17 – 00:00:50:06
of the deaf perspective

00:00:50:06 – 00:00:51:23
and to take that perspective

00:00:51:23 – 00:00:54:00
and do

00:00:54:00 – 00:00:55:10
some more research to inform the

00:00:55:10 – 00:00:57:20
Task Force for Safe AI.

00:00:57:20 – 00:01:00:09
That report is going to be presented

00:01:00:09 – 00:01:03:16
today and we

00:01:05:02 – 00:01:06:10
would like to share now with you

00:01:06:10 – 00:01:09:10
the topic of the session.

00:01:12:02 – 00:01:14:15
I’ll go ahead and introduce myself.

00:01:14:15 – 00:01:17:06
And so in terms of visual description,

00:01:17:06 – 00:01:20:11
I am currently wearing a black shirt.

00:01:21:11 – 00:01:22:07
It’s a long sleeve

00:01:22:07 – 00:01:23:04
black shirt,

00:01:23:04 – 00:01:24:09
collared shirt

00:01:24:09 – 00:01:27:09
with buttons and a white male

00:01:27:16 – 00:01:30:16
with reddish blondish hair.

00:01:30:21 – 00:01:33:20
I have a bit of a mustache

00:01:33:20 – 00:01:36:14
and facial hair.

00:01:36:14 – 00:01:38:23
So today our presentation

00:01:38:23 – 00:01:42:00
topic is going to be Deaf Safe

00:01:42:07 – 00:01:43:23
AI,

00:01:43:23 – 00:01:47:03
meaning artificial intelligence.

00:01:47:15 – 00:01:48:20
And our goal

00:01:48:20 – 00:01:50:12
is to have a legal foundation

00:01:50:12 – 00:01:53:12
for ubiquitous automatic interpreting,

00:01:53:17 – 00:01:55:19
using artificial intelligence

00:01:55:19 – 00:01:58:19
and how that relates to interpreting.

00:01:58:20 – 00:02:01:16
So

00:02:01:16 – 00:02:04:05
I wanted to talk a bit about why

00:02:04:05 – 00:02:07:05
this topic is so important.

00:02:08:24 – 00:02:09:21
As you know,

00:02:09:21 – 00:02:11:23
there are many users

00:02:11:23 – 00:02:13:13
of sign language interpreters.

00:02:13:13 – 00:02:17:17
We sometimes go through frustrations

00:02:18:02 – 00:02:19:03
and situations

00:02:19:03 – 00:02:22:04
because we are using technology,

00:02:22:13 – 00:02:25:07
and sometimes technology

00:02:25:07 – 00:02:26:10
can be very beneficial,

00:02:26:10 – 00:02:28:09
while other times technology

00:02:28:09 – 00:02:30:03
can cause harm.

00:02:38:23 – 00:02:41:20
So with VRI, for example,

00:02:41:20 – 00:02:45:08
we have video remote interpreters

00:02:45:08 – 00:02:47:07
that come up on the screen.

00:02:47:07 – 00:02:50:07
And sometimes it can be a great idea

00:02:50:17 – 00:02:52:04
because, you know,

00:02:52:04 – 00:02:55:04
when it first came out, we saw that

00:02:55:16 – 00:02:58:06
there was a lot of freezing.

00:02:58:06 – 00:03:00:23
Sometimes there might be challenges

00:03:00:23 – 00:03:01:20
internally, like,

00:03:01:20 – 00:03:02:19
let’s say

00:03:02:19 – 00:03:03:14
things are going on

00:03:03:14 – 00:03:06:02
in that room that cause issues.

00:03:06:02 – 00:03:06:20
Various issues

00:03:06:20 – 00:03:09:20
would happen with this technology.

00:03:09:21 – 00:03:13:01
Now, if you think about AI too,

00:03:13:01 – 00:03:15:00
we think about automatic interpreting

00:03:15:00 – 00:03:16:07
or artificial intelligence

00:03:16:07 – 00:03:17:23
and interpreting.

00:03:17:23 – 00:03:19:11
Are we ready for that?

00:03:19:11 – 00:03:20:20
How will that impact

00:03:20:20 – 00:03:22:19
the greater community?

00:03:22:19 – 00:03:24:01
What will be the community’s

00:03:24:01 – 00:03:26:03
view of this?

00:03:26:03 – 00:03:28:07
So last fall

00:03:28:07 – 00:03:31:07
we hosted three webinars

00:03:31:10 – 00:03:33:23
and those webinars happened here

00:03:33:23 – 00:03:35:10
at Brown University.

00:03:35:10 – 00:03:38:03
They were through Zoom

00:03:38:03 – 00:03:40:13
and we had a panel

00:03:40:13 – 00:03:41:21
and we had discussions

00:03:41:21 – 00:03:43:00
and we gathered the view

00:03:43:00 – 00:03:44:12
of multiple people.

00:03:44:12 – 00:03:46:07
We also had deaf community members

00:03:46:07 – 00:03:48:02
who came in to watch

00:03:48:02 – 00:03:49:18
and make comments

00:03:49:18 – 00:03:51:23
and talk about their perspective.

00:03:53:17 – 00:03:56:12
So in that discussion we saw that

00:03:56:12 – 00:03:58:16
there was a lot of rich information

00:03:58:16 – 00:04:00:11
that was shared

00:04:00:11 – 00:04:03:11
and the team, the advisory group,

00:04:04:03 – 00:04:07:04
decided to work to analyze

00:04:07:14 – 00:04:08:24
that information that came

00:04:08:24 – 00:04:10:15
from the discussions.

00:04:10:15 – 00:04:11:17
And right away,

00:04:11:17 – 00:04:14:08
we knew we had a lot of rich content

00:04:14:08 – 00:04:15:15
and that we would be able

00:04:15:15 – 00:04:17:05
to take that content

00:04:17:05 – 00:04:19:01
and add it to the report

00:04:19:01 – 00:04:20:03
so that we could get

00:04:20:03 – 00:04:22:23
a general understanding of what fair

00:04:22:23 – 00:04:25:00
AI would look like.

00:04:25:00 – 00:04:28:06
And we had several different questions

00:04:28:06 – 00:04:30:00
that we put into a survey

00:04:30:00 – 00:04:32:18
and we sent those out and we were able

00:04:32:18 – 00:04:36:01
to get those responses from the deaf community.

00:04:36:08 – 00:04:36:23
Unfortunately,

00:04:36:23 – 00:04:39:08
we did not have a survey in ASL,

00:04:39:08 – 00:04:41:06
but we knew that we needed

00:04:41:06 – 00:04:42:14
to have these discussions

00:04:42:14 – 00:04:44:03
in order to gather the information

00:04:44:03 – 00:04:45:14
we were looking for.

00:04:45:14 – 00:04:46:03
Let’s go ahead and

00:04:46:03 – 00:04:47:00
head to the next slide.

00:04:53:03 – 00:04:55:06
So

00:04:55:06 – 00:04:55:17
I’d like

00:04:55:17 – 00:04:56:12
to introduce

00:04:56:12 – 00:04:59:12
you to other members of our team.

00:04:59:15 – 00:05:02:07
We all participated together

00:05:02:07 – 00:05:03:15
in gathering this research

00:05:03:15 – 00:05:06:05
and we’ve been working hard as a group

00:05:06:05 – 00:05:07:10
to get that information.

00:05:07:10 – 00:05:08:24
And these lovely people here

00:05:08:24 – 00:05:10:09
volunteer their time.

00:05:10:09 – 00:05:12:01
I’d like to introduce you to them now

00:05:12:01 – 00:05:14:03
so that you can get to know them a bit.

00:05:14:03 – 00:05:15:13
And they will also be talking

00:05:15:13 – 00:05:17:09
about the report today.

00:05:17:09 – 00:05:19:18
So

00:05:19:18 – 00:05:22:16
let me go ahead and pass it over to you.

00:05:22:16 – 00:05:25:09
Let’s start with Teresa.

00:05:25:09 – 00:05:25:17
Teresa,

00:05:25:17 – 00:05:28:07
if you’d like to introduce yourself.

00:05:28:07 – 00:05:28:17
Sure.

00:05:28:17 – 00:05:30:13
Good morning, everyone.

00:05:30:13 – 00:05:33:08
My name is Teresa Blankmeyer

00:05:33:08 – 00:05:36:08
Burke,

00:05:36:09 – 00:05:38:09
and I work at Gallaudet University.

00:05:38:09 – 00:05:40:12
I’m a professor of philosophy

00:05:40:12 – 00:05:41:24
and my research

00:05:41:24 – 00:05:44:11
is specifically in ethics.

00:05:44:11 – 00:05:47:11
and its application

00:05:47:15 – 00:05:48:21
to technology.

00:05:48:21 – 00:05:50:09
And I’m thrilled to be here with you

00:05:50:09 – 00:05:50:22
all today.

00:05:50:22 – 00:05:52:00
I’m looking forward

00:05:52:00 – 00:05:53:17
to discussing the webinar

00:05:53:17 – 00:05:54:24
and having other discussions

00:05:54:24 – 00:05:56:03
with all of you today.

00:05:56:03 – 00:05:58:05
I would like to add

00:05:58:05 – 00:06:01:05
that in terms of visual description,

00:06:01:11 – 00:06:02:10
I am a middle aged

00:06:02:10 – 00:06:06:03
woman with olive skin

00:06:06:12 – 00:06:08:24
and also I have brown eyes,

00:06:08:24 – 00:06:09:11
I’m wearing

00:06:09:11 – 00:06:11:06
glasses and I have a brown

00:06:11:06 – 00:06:12:00
I have brown hair

00:06:12:00 – 00:06:13:13
and my hair is in a bun today

00:06:13:13 – 00:06:16:09
and I’m wearing a gray sweater

00:06:18:04 – 00:06:20:20
and I’m here in my office at Gallaudet.

00:06:20:20 – 00:06:22:22
Thank you, Teresa.

00:06:22:22 – 00:06:24:20
This next, let’s have Jeff.

00:06:24:20 – 00:06:25:02
Jeff,

00:06:25:02 – 00:06:26:15
would you like to introduce yourself?

00:06:26:15 – 00:06:29:04
Hello, my name is Jeff Schall

00:06:29:04 – 00:06:32:04
and I work

00:06:32:06 – 00:06:35:06
to develop

00:06:35:06 – 00:06:38:02
AI for the deaf and hard of hearing.

00:06:38:02 – 00:06:41:08
And in terms of a visual description,

00:06:41:17 – 00:06:43:00
I am wearing a white

00:06:43:00 – 00:06:44:21
and black plaid shirt.

00:06:44:21 – 00:06:47:15
I have facial hair and brown eyes

00:06:47:15 – 00:06:49:06
and I’m here.

00:06:49:06 – 00:06:50:10
My office is a background

00:06:50:10 – 00:06:51:17
and I work for GoSign.AI.

00:06:52:24 – 00:06:55:24
Next we will have Holly.

00:06:56:03 – 00:06:56:22
Yes, hello.

00:06:56:22 – 00:06:58:11
Good morning.

00:06:58:11 – 00:07:00:15
My name is Holly.

00:07:00:15 – 00:07:03:07
Last name is Jackson

00:07:03:07 – 00:07:06:07
and visual description.

00:07:06:16 – 00:07:09:16
I am an African-American black female.

00:07:10:04 – 00:07:14:02
I have light skin and I have curly

00:07:14:02 – 00:07:17:02
natural hair today.

00:07:17:19 – 00:07:18:12
And

00:07:19:11 – 00:07:22:11
I have a dark Navy

00:07:22:11 – 00:07:25:11
blue suit jacket on.

00:07:26:10 – 00:07:29:09
And my navy blue

00:07:29:09 – 00:07:29:22
suit

00:07:29:22 – 00:07:32:22
jacket has light white stripes on it.

00:07:33:08 – 00:07:36:02
And I have a shirt beneath my jacket,

00:07:36:02 – 00:07:39:07
and it is a light blue tan

00:07:40:19 – 00:07:42:17
lace

00:07:42:17 – 00:07:43:20
top.

00:07:43:20 – 00:07:45:11
That’s the design.

00:07:45:11 – 00:07:48:14
And my background today

00:07:49:18 – 00:07:52:15
is light gray

00:07:52:15 – 00:07:54:17
plain, light gray background.

00:07:54:17 – 00:07:57:02
And I’m an interpreter.

00:07:57:02 – 00:07:59:09
I’m a hearing interpreter

00:07:59:09 – 00:08:01:15
and also an educator

00:08:01:15 – 00:08:04:08
and educator of ASL and interpreting.

00:08:04:08 – 00:08:06:11
I work for an ASL and

00:08:06:11 – 00:08:09:23
interpreting program, and also I am here

00:08:11:03 – 00:08:13:19
for NAOBI,

00:08:13:19 – 00:08:15:13
as the representative of NAOBI,

00:08:15:13 – 00:08:18:15
the Atlanta chapter, and

00:08:18:15 – 00:08:21:15
I serve as the secretary for the board.

00:08:21:22 – 00:08:23:15
That’s my position this year.

00:08:23:15 – 00:08:24:13
Thank you very much.

00:08:24:13 – 00:08:26:13
I’m happy to be here.

00:08:26:13 – 00:08:27:22
Thank you, Holly.

00:08:27:22 – 00:08:30:15
And last but not least, Anne Marie.

00:08:32:10 – 00:08:35:10
Hello, I’m Anne Marie.

00:08:35:19 – 00:08:37:15
Last name is Killian,

00:08:37:15 – 00:08:40:15
and this is my sign name,

00:08:40:18 – 00:08:43:19
and I am the CEO for

00:08:45:23 – 00:08:49:16
TDIforAccess, and my visual description

00:08:49:16 – 00:08:50:14
is that I’m a white

00:08:50:14 – 00:08:53:19
female, middle aged with medium

00:08:53:19 – 00:08:54:24
length hair,

00:08:54:24 – 00:08:56:19
brown hair, and I’m wearing glasses.

00:08:56:19 – 00:08:57:17
Today

00:08:57:17 – 00:08:59:01
I have on a suit jacket

00:08:59:01 – 00:09:00:18
that is black with a purple shirt

00:09:00:18 – 00:09:01:23
beneath it.

00:09:01:23 – 00:09:03:11
And in the background

00:09:03:11 – 00:09:06:11
you can see my dining room table

00:09:06:13 – 00:09:09:21
and black chairs, and

00:09:12:08 – 00:09:13:13
you might see two dogs

00:09:13:13 – 00:09:14:19
running around in the background.

00:09:14:19 – 00:09:16:23
If that happens, I apologize in advance.

00:09:16:23 – 00:09:18:09
Like everyone else.

00:09:18:09 – 00:09:19:13
I’m thrilled to be here today.

00:09:19:13 – 00:09:21:02
Thank you.

00:09:21:02 – 00:09:23:12
Great.

00:09:23:12 – 00:09:26:12
So let’s go to the next slide.

00:09:28:17 – 00:09:29:11
So today

00:09:29:11 – 00:09:31:07
we will be talking about

00:09:31:07 – 00:09:32:14
multiple things.

00:09:32:14 – 00:09:34:04
And during these presentations,

00:09:34:04 – 00:09:36:05
we’ll go in depth about our studies

00:09:36:05 – 00:09:38:04
and what we have found through analyzing

00:09:38:04 – 00:09:39:02
this data.

00:09:39:02 – 00:09:40:13
We will be sharing with you

00:09:40:13 – 00:09:42:17
this information today.

00:09:42:17 – 00:09:44:04
The first thing we’re going to be doing

00:09:44:04 – 00:09:45:08
is identifying

00:09:45:08 – 00:09:47:21
three critical impact areas.

00:09:49:00 – 00:09:51:02
And so these impact

00:09:51:02 – 00:09:52:16
areas are quite important.

00:09:52:16 – 00:09:54:05
We’ll be talking more in depth

00:09:54:05 – 00:09:56:12
and describing what they are for you.

00:09:56:12 – 00:09:59:12
Secondly,

00:09:59:12 – 00:10:00:17
with this analysis

00:10:00:17 – 00:10:03:17
and with this research in our webinars,

00:10:04:07 – 00:10:06:14
we were able to go through the data

00:10:06:14 – 00:10:08:00
and that data helped us

00:10:08:00 – 00:10:11:06
to build a better understanding of what

00:10:11:06 – 00:10:12:23
the deaf community’s perspective

00:10:12:23 – 00:10:14:18
and experience has been

00:10:14:18 – 00:10:17:17
and their experiences with interpreters

00:10:17:17 – 00:10:19:04
and the harms that have happened

00:10:19:04 – 00:10:20:18
and the way that these experiences

00:10:20:18 – 00:10:21:08
have impacted

00:10:21:08 – 00:10:22:07
their life

00:10:22:07 – 00:10:23:13
in terms of access,

00:10:23:13 – 00:10:25:11
in terms of communication access.

00:10:25:11 – 00:10:26:20
And so it’s very important

00:10:26:20 – 00:10:28:17
to have this deaf community perspective

00:10:28:17 – 00:10:29:20
so that we can understand

00:10:29:20 – 00:10:30:19
what they’ve been through

00:10:30:19 – 00:10:32:00
when it comes to

00:10:32:00 – 00:10:34:01
their experiences in interpreting

00:10:34:01 – 00:10:36:05
and where harm has happened.

00:10:36:05 – 00:10:36:19
Often

00:10:36:19 – 00:10:38:17
that is an experience

00:10:38:17 – 00:10:40:16
that is common in our communities.

00:10:40:16 – 00:10:42:23
So we would like to mitigate those harms

00:10:42:23 – 00:10:47:02
and ensure that if we do put out

00:10:47:02 – 00:10:48:00
new technology,

00:10:48:00 – 00:10:49:08
that we’re going

00:10:49:08 – 00:10:50:16
to be mindful of those harms

00:10:50:16 – 00:10:52:06
that have been experienced.

00:10:52:06 – 00:10:53:14
Third, we’re going to talk

00:10:53:14 – 00:10:55:10
about the value of the big picture

00:10:55:10 – 00:10:57:08
lens on possibilities.

00:10:57:08 – 00:10:58:12
We’ll talk about

00:10:58:12 – 00:11:00:01
what it looks like to do right

00:11:00:01 – 00:11:02:11
and prevent possible disaster.

00:11:02:11 – 00:11:05:23
We want to make sure that we are looking

00:11:05:23 – 00:11:08:12
through a lens where we are

00:11:08:12 – 00:11:10:02
showing care and concern

00:11:10:02 – 00:11:11:24
about the future of the community

00:11:11:24 – 00:11:13:10
and taking all of these things

00:11:13:10 – 00:11:15:00
into account. Next slide.

00:11:24:19 – 00:11:26:11
So for today

00:11:26:11 – 00:11:29:03
in this webinar, like you mentioned,

00:11:29:03 – 00:11:30:06
we had a panel

00:11:30:06 – 00:11:32:11
and today we will be going

00:11:32:11 – 00:11:34:06
through different presentations

00:11:34:06 – 00:11:37:10
and we’ll be going through the report

00:11:37:10 – 00:11:39:22
and having discussions about that.

00:11:39:22 – 00:11:42:20
So we will be following this order,

00:11:42:20 – 00:11:45:20
as you see outlined here on the slide.

00:11:46:24 – 00:11:48:05
we will be discussing

00:11:48:05 – 00:11:51:05
ethics and fairness.

00:11:52:02 – 00:11:54:17
We will be discussing how this research

00:11:54:17 – 00:11:55:24
was studied

00:11:55:24 – 00:11:58:24
and our approach to the research.

00:11:59:06 – 00:12:02:02
We will be talking about what we found.

00:12:02:02 – 00:12:04:10
So our findings,

00:12:04:10 – 00:12:07:13
we will also be discussing the results

00:12:08:03 – 00:12:11:07
and outcomes of our research.

00:12:12:02 – 00:12:13:17
And also we will be talking

00:12:13:17 – 00:12:14:18
about concerns

00:12:14:18 – 00:12:17:18
as it relates to the deaf community.

00:12:18:07 – 00:12:20:14
We will also be discussing

00:12:20:14 – 00:12:23:14
technology and the quality

00:12:25:05 – 00:12:27:23
and we will be asking,

00:12:27:23 – 00:12:29:10
are we ready for this?

00:12:29:10 – 00:12:30:20
Are we ready for AI

00:12:33:00 – 00:12:36:00
and sign language to come together?

00:12:36:20 – 00:12:37:18
Are we ready?

00:12:37:18 – 00:12:40:18
And if we are, what does that look like?

00:12:41:18 – 00:12:44:18
What kind of risks are involved?

00:12:46:00 – 00:12:48:11
We need to be proactive

00:12:48:11 – 00:12:50:19
in understanding

00:12:50:19 – 00:12:52:09
and predicting those risks

00:12:52:09 – 00:12:53:21
so that we can mitigate

00:12:53:21 – 00:12:56:21
or resolve them before they occur.

00:12:59:18 – 00:13:02:08
We will also discuss the future

00:13:02:08 – 00:13:04:17
and what we can anticipate

00:13:04:17 – 00:13:08:02
and what we can recommend for

00:13:09:06 – 00:13:11:03
anyone

00:13:11:03 – 00:13:13:22
who is going to be working in relation

00:13:13:22 – 00:13:15:08
to this topic.

00:13:15:08 – 00:13:18:08
All of us here today are impacted

00:13:18:08 – 00:13:22:23
or affected by this topic, as are many people

00:13:22:23 – 00:13:23:15
who are not here

00:13:23:15 – 00:13:24:15
today,

00:13:24:15 – 00:13:27:15
the entire community that we represent

00:13:27:22 – 00:13:30:22
and also work with.

00:13:32:05 – 00:13:34:17
So now

00:13:34:17 – 00:13:36:11
Teresa

00:13:36:11 – 00:13:39:11
will go ahead and begin this discussion

00:13:39:13 – 00:13:40:20
and we’re going to start off again

00:13:40:20 – 00:13:42:09
with ethics and fairness.

00:13:42:09 – 00:13:43:18
Teresa, I’ll turn it over to you.

00:13:49:06 – 00:13:50:09
one moment.

00:13:50:09 – 00:13:53:09
Let me make sure I have this up.

00:13:54:11 – 00:13:55:09
I always have to make sure

00:13:55:09 – 00:13:58:05
I have my spotlight on so that I’m seen.

00:13:58:05 – 00:13:58:17
Okay.

00:13:58:17 – 00:14:02:12
So, ethics and fairness,

00:14:03:01 – 00:14:04:23
what is the emphasis

00:14:04:23 – 00:14:06:14
and what are we looking at here

00:14:06:14 – 00:14:07:18
with ethics and fairness?

00:14:07:18 – 00:14:09:12
What do we want to avoid?

00:14:09:12 – 00:14:11:00
We don’t want to cause harm.

00:14:11:00 – 00:14:13:17
How do we reduce the risk of harm?

00:14:13:17 – 00:14:15:02
And what does harm mean

00:14:15:02 – 00:14:16:14
to the deaf community

00:14:16:14 – 00:14:18:16
and deaf individuals?

00:14:18:16 – 00:14:21:22
Harm in general, to humanity,

00:14:21:22 – 00:14:23:12
is what we want to avoid.

00:14:23:12 – 00:14:27:00
So we have two

00:14:27:00 – 00:14:30:00
topics really

00:14:30:00 – 00:14:32:08
coexisting here: ethics and fairness.

00:14:32:08 – 00:14:35:13
So what you’ll notice here,

00:14:35:20 – 00:14:38:24
we are concerned with controlling bias

00:14:39:24 – 00:14:42:00
and we want to

00:14:42:00 – 00:14:43:10
assign responsibility

00:14:43:10 – 00:14:46:10
or accountability, rather, as it

00:14:46:22 – 00:14:49:11
relates to AI to avoid harm.

00:14:51:13 – 00:14:52:04
The second

00:14:52:04 – 00:14:52:17
point I’d like

00:14:52:17 – 00:14:53:00
to make

00:14:53:00 – 00:14:55:05
is that we need to be very clear

00:14:55:05 – 00:14:58:14
and transparent with our documentation

00:14:58:14 – 00:15:00:02
about who is accountable

00:15:00:02 – 00:15:02:00
in the design of AI

00:15:02:00 – 00:15:03:24
and the development of AI

00:15:03:24 – 00:15:06:22
and the application and evaluation.

00:15:06:22 – 00:15:09:07
Everything related to A.I.:

00:15:09:07 – 00:15:11:19
who is responsible for this portion

00:15:11:19 – 00:15:13:11
ethically?

00:15:13:11 – 00:15:16:11
Next slide.

00:15:20:01 – 00:15:20:13
Okay.

00:15:20:13 – 00:15:22:05
So we spoke about ethics,

00:15:22:05 – 00:15:23:01
and now I want to talk

00:15:23:01 – 00:15:25:08
a little bit about fairness.

00:15:25:08 – 00:15:28:19
How we in our society

00:15:28:20 – 00:15:31:20
have developed the meaning behind this.

00:15:31:20 – 00:15:35:12
We have a lot of negative biases,

00:15:35:16 – 00:15:36:08
and we also have

00:15:36:08 – 00:15:38:18
positive biases in society.

00:15:38:18 – 00:15:39:18
But what we want to do

00:15:39:18 – 00:15:42:02
is make sure that in AI

00:15:42:02 – 00:15:44:07
we want to reflect the best of society

00:15:44:07 – 00:15:45:24
and not the worst.

00:15:45:24 – 00:15:46:20
So with that being said,

00:15:46:20 – 00:15:48:05
we want to eliminate bias

00:15:48:05 – 00:15:49:10
and we want to eliminate

00:15:49:10 – 00:15:50:24
favoritism

00:15:50:24 – 00:15:52:19
with measurable results.

00:15:52:19 – 00:15:56:12
And also we want to be able to observe

00:15:56:12 – 00:15:59:21
how people are interacting with that.

00:15:59:21 – 00:16:01:13
We have statistics and evidence

00:16:01:13 – 00:16:03:01
related to fairness,

00:16:03:01 – 00:16:04:10
and we need to use that

00:16:04:10 – 00:16:06:10
to keep fairness evidence-based.

00:16:07:15 – 00:16:10:15
Next slide.

00:16:14:17 – 00:16:15:13
Okay.

00:16:15:13 – 00:16:17:01
This one is a little bit interesting

00:16:17:01 – 00:16:18:09
because it’s a little bit

00:16:18:09 – 00:16:21:09
of a mixture of

00:16:21:09 – 00:16:23:09
written English and sign language.

00:16:23:09 – 00:16:26:09
So what we see here is the word A.I.

00:16:26:09 – 00:16:29:09
by A.I.: A.I.

00:16:29:18 – 00:16:30:20
times A.I.

00:16:30:20 – 00:16:32:21
on the screen in a mathematical look.

00:16:32:21 – 00:16:34:09
But we have a few different ways

00:16:34:09 – 00:16:37:10
that we are expressing this concept now.

00:16:37:10 – 00:16:38:13
If we’re signing it,

00:16:38:13 – 00:16:40:13
you may see a sign A.I.

00:16:40:13 – 00:16:42:05
squared. Okay?

00:16:42:05 – 00:16:43:01
But when we say

00:16:43:01 – 00:16:46:01
that, we’re referring to automated

00:16:46:01 – 00:16:46:19
or automatic

00:16:46:19 – 00:16:48:00
interpreting by

00:16:48:00 – 00:16:50:02
artificial intelligence, A.I.

00:16:50:02 – 00:16:51:05
by A.I..

00:16:51:05 – 00:16:51:23
Now, sometimes

00:16:51:23 – 00:16:55:13
you may see people sign A.I. by A.I.,

00:16:55:24 – 00:16:57:05
and it’s the same concept.

00:16:57:05 – 00:16:58:21
It means the same thing.

00:16:58:21 – 00:17:00:00
Now, in written form.

00:17:00:00 – 00:17:03:00
We’ll see A.I. x A.I.

00:17:03:01 – 00:17:04:11
Those are the three ways you’ll see it.

00:17:04:11 – 00:17:06:08
But the main point is understanding

00:17:06:08 – 00:17:07:03
that we’re referring

00:17:07:03 – 00:17:10:12
to automatic interpreting by artificial

00:17:11:09 – 00:17:12:04
intelligence.

00:17:15:02 – 00:17:16:07
Now, with that being said,

00:17:16:07 – 00:17:19:07
I’m going to turn this over to

00:17:20:13 – 00:17:21:18
I’m sorry, I don’t remember

00:17:21:18 – 00:17:22:23
who’s next on here.

00:17:22:23 – 00:17:25:23
Let me see,

00:17:26:13 – 00:17:29:13
just a moment.

00:17:31:12 – 00:17:31:23
Hello.

00:17:31:23 – 00:17:34:23
Okay, so

00:17:35:24 – 00:17:37:12
as Teresa mentioned,

00:17:37:12 – 00:17:39:23
we hosted three webinars.

00:17:39:23 – 00:17:41:12
And during those webinars,

00:17:41:12 – 00:17:42:06
we invited

00:17:42:06 – 00:17:43:09
so many of the deaf

00:17:43:09 – 00:17:46:09
community members to participate with us.

00:17:46:19 – 00:17:48:05
In those webinars,

00:17:48:05 – 00:17:50:22
we discussed a variety of topics

00:17:50:22 – 00:17:51:23
and issues

00:17:51:23 – 00:17:54:11
covering AI and interpreting

00:17:54:11 – 00:17:55:18
and how they relate.

00:17:55:18 – 00:17:57:14
As Teresa mentioned, A.I.

00:17:57:14 – 00:17:59:21
squared is how we were referring to A.I.

00:17:59:21 – 00:18:00:21
by A.I..

00:18:00:21 – 00:18:02:22
There was so much

00:18:02:22 – 00:18:05:09
great dialog and conversation,

00:18:05:09 – 00:18:07:07
ideas and issues

00:18:07:07 – 00:18:09:18
discussed during these webinars.

00:18:09:18 – 00:18:12:16
Now we recorded these webinars

00:18:12:16 – 00:18:15:16
and we made a transcript of them

00:18:15:22 – 00:18:17:18
and we went through these transcripts

00:18:17:18 – 00:18:19:01
with a fine-tooth comb

00:18:19:01 – 00:18:20:07
and we looked at them

00:18:20:07 – 00:18:21:12
and looked at patterns

00:18:21:12 – 00:18:23:04
that arose in each

00:18:23:04 – 00:18:24:04
discussion, different

00:18:24:04 – 00:18:25:11
themes that popped up

00:18:25:11 – 00:18:26:21
throughout the entirety

00:18:26:21 – 00:18:28:02
of these webinars.

00:18:28:02 – 00:18:29:11
And we wanted to make sure

00:18:29:11 – 00:18:31:12
that we were able to pinpoint

00:18:31:12 – 00:18:32:21
and really understand

00:18:32:21 – 00:18:34:21
what issues are most prevalent

00:18:34:21 – 00:18:36:10
to the community at large.

00:18:38:02 – 00:18:39:17
We did

00:18:39:17 – 00:18:42:08
a time analysis as well

00:18:42:08 – 00:18:44:22
and we looked at how frequently

00:18:44:22 – 00:18:46:14
different themes

00:18:46:14 – 00:18:48:18
popped up in said webinars

00:18:48:18 – 00:18:50:13
and what we found —

00:18:50:13 – 00:18:52:07
our findings —

00:18:52:07 – 00:18:53:06
are shown in the next slide.
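
To make the time analysis described here concrete, a minimal Python sketch of counting how often theme keywords appear across transcript files follows; the file paths and keyword lists are hypothetical stand-ins, not the report’s actual codebook, and the actual coding was done by human reviewers rather than by a script.

import re
from collections import Counter
from pathlib import Path

# Hypothetical theme keywords -- illustrative stand-ins, not the real codebook.
THEMES = {
    "results_and_outcomes": ["outcome", "impact", "result"],
    "readiness": ["ready", "readiness", "prepared"],
    "technological_quality": ["data", "model", "accuracy", "quality"],
}

def theme_counts(transcript_text):
    """Count how often each theme's keywords appear in one transcript."""
    words = re.findall(r"[a-z']+", transcript_text.lower())
    counts = Counter()
    for theme, keywords in THEMES.items():
        counts[theme] = sum(words.count(k) for k in keywords)
    return counts

totals = Counter()
for path in Path("transcripts").glob("webinar*.txt"):  # hypothetical files
    totals += theme_counts(path.read_text(encoding="utf-8"))

for theme, n in totals.most_common():
    print(f"{theme}: {n}")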

00:18:53:06 – 00:18:54:01
And if you’re interested

00:18:54:01 – 00:18:55:09
in more detailed information,

00:18:55:09 – 00:18:56:19
please read our report.

00:18:56:19 – 00:18:59:22
It goes into a very in-depth

00:19:01:05 – 00:19:03:02
reporting of our findings

00:19:03:02 – 00:19:05:00
and that is available online as well.

00:19:05:00 – 00:19:08:00
Next slide, please.

00:19:10:20 – 00:19:14:12
So this is a snapshot of our findings.

00:19:14:18 – 00:19:16:01
We put down

00:19:16:01 – 00:19:16:19
all the themes

00:19:16:19 – 00:19:17:12
that popped up

00:19:17:12 – 00:19:20:08
most frequently in our webinars,

00:19:20:08 – 00:19:21:22
and we categorized those

00:19:21:22 – 00:19:25:22
into three different areas of study.

00:19:27:00 – 00:19:30:08
The first area is related

00:19:30:08 – 00:19:33:08
to results and outcomes,

00:19:34:01 – 00:19:35:03
and we’ll talk about this

00:19:35:03 – 00:19:36:09
more in the next slide.

00:19:36:09 – 00:19:39:09
We’re focused more on discussing

00:19:39:20 – 00:19:42:19
what kind of society,

00:19:42:19 – 00:19:44:24
what kind of societal results

00:19:44:24 – 00:19:48:20
will arise from the impact of AI by AI.

00:19:50:08 – 00:19:53:04
Our next theme was readiness.

00:19:53:04 – 00:19:56:04
How ready is the community at large?

00:19:56:21 – 00:20:00:05
How ready are stakeholders for

00:20:01:22 – 00:20:03:14
this technology?

00:20:03:14 – 00:20:04:19
We looked at the feedback

00:20:04:19 – 00:20:06:03
and the requirements and things

00:20:06:03 – 00:20:07:04
that need to happen

00:20:07:04 – 00:20:08:15
for the community at large

00:20:08:15 – 00:20:10:19
in this technological realm.

00:20:10:19 – 00:20:13:05
But as you can see on here,

00:20:13:05 – 00:20:16:05
at over half,

00:20:16:08 – 00:20:17:09
the biggest issue

00:20:17:09 – 00:20:19:01
that came up most frequently

00:20:19:01 – 00:20:20:08
and was most prevalent on people’s

00:20:20:08 – 00:20:22:17
minds was technological quality.

00:20:22:17 – 00:20:24:08
What type of regulations

00:20:24:08 – 00:20:27:19
and standards are required to minimize

00:20:28:03 – 00:20:31:03
the possibility of harm?

00:20:31:15 – 00:20:33:08
And how do we maximize

00:20:33:08 – 00:20:35:01
the potential benefits

00:20:35:01 – 00:20:36:17
of this technology?

00:20:36:17 – 00:20:37:22
Now, for this,

00:20:37:22 – 00:20:39:14
we went into more in-depth

00:20:39:14 – 00:20:42:03
research in our report,

00:20:42:03 – 00:20:43:24
and I’m going to turn it

00:20:43:24 – 00:20:44:20
over to Ann Marie,

00:20:44:20 – 00:20:45:12
so she

00:20:45:12 – 00:20:46:10
can discuss this

00:20:46:10 – 00:20:47:08
a little bit more in depth.

00:20:49:05 – 00:20:50:08
Thank you so much, Jeff.

00:20:50:08 – 00:20:53:08
I really appreciate it.

00:20:53:08 – 00:20:54:06
The first thing that we’re going

00:20:54:06 – 00:20:55:07
to look at, again,

00:20:55:07 – 00:20:59:03
as Jeff mentioned in his synopsis

00:20:59:03 – 00:21:01:13
of the percentages that we looked at,

00:21:01:13 – 00:21:03:06
it is staggering.

00:21:03:06 – 00:21:06:06
The biggest section was on technological

00:21:06:20 – 00:21:09:20
aspects, but

00:21:09:20 – 00:21:13:02
I’m looking at the desired results

00:21:13:02 – 00:21:14:15
and outcomes as well,

00:21:14:15 – 00:21:16:07
and it was a very hot topic

00:21:16:07 – 00:21:18:18
in the discussions in the webinar.

00:21:18:18 – 00:21:19:20
It was really important

00:21:19:20 – 00:21:22:20
to emphasize that as well.

00:21:24:05 – 00:21:25:08
Now, more than one

00:21:25:08 – 00:21:26:17
third of the discussions

00:21:26:17 – 00:21:29:17
regarding this were really focused on

00:21:30:02 – 00:21:33:02
these critical areas of control:

00:21:34:16 – 00:21:36:19
control

00:21:36:19 – 00:21:39:16
by the language community

00:21:39:16 – 00:21:40:18
and the authority

00:21:40:18 – 00:21:42:24
who is involved in the decision making,

00:21:42:24 – 00:21:44:15
who is involved in the

00:21:44:15 – 00:21:45:23
development and design

00:21:45:23 – 00:21:48:02
and the impact the A.I. is going to have.

00:21:49:16 – 00:21:50:24
The biggest concern

00:21:50:24 – 00:21:53:24
was really regarding the legality,

00:21:54:01 – 00:21:58:20
the structure and the synthesis of this.

00:21:59:03 – 00:22:00:20
And it was critical.

00:22:00:20 – 00:22:02:00
It was impressive

00:22:02:00 – 00:22:05:00
to see how much the deaf leaders

00:22:05:01 – 00:22:07:01
felt that,

00:22:07:01 – 00:22:08:03
you know, in history

00:22:08:03 – 00:22:09:20
based on their experience,

00:22:09:20 – 00:22:11:18
the deaf have not always

00:22:11:18 – 00:22:13:17
had a voice at the table,

00:22:13:17 – 00:22:15:05
whether it was in the development

00:22:15:05 – 00:22:17:15
of different views or different

00:22:17:15 – 00:22:19:01
processes.

00:22:19:01 – 00:22:20:01
The deaf perspective

00:22:20:01 – 00:22:20:15
has often

00:22:20:15 – 00:22:21:07
been missing

00:22:21:07 – 00:22:22:18
and it’s really critical

00:22:22:18 – 00:22:23:18
that they are included

00:22:23:18 – 00:22:24:17
in the conversation.

00:22:24:17 – 00:22:26:03
And what does that look like?

00:22:26:03 – 00:22:27:19
What does the deaf representation

00:22:27:19 – 00:22:28:06
look like

00:22:28:06 – 00:22:29:15
in the authoritative process

00:22:29:15 – 00:22:32:15
of developing these standards?

00:22:32:21 – 00:22:34:17
Also,

00:22:34:17 – 00:22:38:07
it is very important to be stringent

00:22:38:07 – 00:22:42:08

about any violation of these processes —

00:22:44:14 – 00:22:45:09
you know, if

00:22:45:09 – 00:22:47:01
these processes are violated,

00:22:47:01 – 00:22:48:20
what type of ramifications

00:22:48:20 – 00:22:49:15
does that have?

00:22:49:15 – 00:22:51:11
That’s been a very big discussion

00:22:51:11 – 00:22:53:07
topic as well. Next slide.

00:23:03:20 – 00:23:05:02
Now you see this next slide.

00:23:05:02 – 00:23:06:06
We’ve also included

00:23:06:06 – 00:23:09:09
some concerns based on our webinars

00:23:10:01 – 00:23:11:21
about control.

00:23:11:21 – 00:23:13:16
There are two levels of control,

00:23:13:16 – 00:23:17:06
individual control and cultural groups,

00:23:17:12 – 00:23:18:17
their control

00:23:18:17 – 00:23:19:20
and the impact

00:23:19:20 – 00:23:21:04
and the protections

00:23:21:04 – 00:23:23:16
needed for each of these groups.

00:23:23:16 – 00:23:25:02
We need to be sensitive to those,

00:23:25:02 – 00:23:28:02
especially with children.

00:23:29:21 – 00:23:31:19
Then there’s the focus of the community at large.

00:23:31:19 – 00:23:33:16
The biggest priority here

00:23:33:16 – 00:23:37:08
was to make sure that the legal framework

00:23:39:01 – 00:23:41:00
is established with A.I. by A.I.

00:23:41:00 – 00:23:44:00
Again, you know, it’s

00:23:44:12 – 00:23:48:11
one where we’ve had to look at leaders involved

00:23:48:16 – 00:23:50:00
with the deaf community

00:23:50:00 – 00:23:51:17
involvement in research

00:23:51:17 – 00:23:54:17
and in development for these

00:23:55:02 – 00:23:55:20
technologies.

00:23:55:20 – 00:23:57:05
It’s imperative

00:23:57:05 – 00:23:58:20
because it impacts their lives —

00:23:58:20 – 00:23:59:21
for the deaf community

00:23:59:21 – 00:24:00:10
at large,

00:24:00:10 – 00:24:00:23
for the deaf

00:24:00:23 – 00:24:01:20
blind community

00:24:01:20 – 00:24:02:18
as well,

00:24:02:18 – 00:24:04:14
for those that are losing their hearing

00:24:04:14 – 00:24:05:21
or hard of hearing

00:24:05:21 – 00:24:07:10
and learning sign,

00:24:07:10 – 00:24:09:14
this impacts all of them.

00:24:09:14 – 00:24:11:13
Again, I really can’t emphasize

00:24:11:13 – 00:24:12:17
this enough.

00:24:12:17 – 00:24:13:24
You know, the deaf

00:24:13:24 – 00:24:15:18
leadership needs to be involved

00:24:15:18 – 00:24:16:14
in the developmental

00:24:16:14 – 00:24:18:23
process of these platforms

00:24:20:09 – 00:24:23:09
for A.I. by A.I.

00:24:25:04 – 00:24:26:10
also it’s really interesting

00:24:26:10 – 00:24:27:04
to note

00:24:27:04 – 00:24:30:04
the importance of the concern about

00:24:30:24 – 00:24:33:24
A.I. by A.I. and the data.

00:24:33:24 – 00:24:36:07
What does that data storage

00:24:36:07 – 00:24:37:01
look like?

00:24:37:01 – 00:24:39:10
Who has custodianship

00:24:39:10 – 00:24:40:20
of this storage?

00:24:40:20 – 00:24:41:21
What are the analytics

00:24:41:21 – 00:24:44:00
and the statistics looking at?

00:24:44:00 – 00:24:45:15
Where is that data held

00:24:45:15 – 00:24:47:00
and how is it protected?

00:24:47:00 – 00:24:49:10
That’s a very big discussion as well.

00:24:49:10 – 00:24:51:15
There’s a lot of concern about,

00:24:51:15 – 00:24:54:24
you know, hiring individuals

00:24:54:24 – 00:24:57:24
that could influence the development and

00:25:00:20 – 00:25:02:09
leadership of this.

00:25:02:09 – 00:25:03:13
It doesn’t always happen

00:25:03:13 – 00:25:05:00
that deaf individuals are

00:25:05:00 – 00:25:07:04
in this type of role

00:25:07:04 – 00:25:09:08
and involved in that process.

00:25:09:08 – 00:25:11:21
So will this process in the future

00:25:11:21 – 00:25:12:17
involve deaf

00:25:12:17 – 00:25:14:11
individuals in the beginning,

00:25:14:11 – 00:25:16:17
the foundational development of this,

00:25:16:17 – 00:25:18:16
or will they only bring them in for a

00:25:18:16 – 00:25:20:20
perspective and that’s it?

00:25:20:20 – 00:25:22:00
So I think another good point

00:25:22:00 – 00:25:22:19
of discussion

00:25:22:19 – 00:25:23:08
as well:

00:25:23:08 – 00:25:24:17
who has the ownership and the

00:25:24:17 – 00:25:26:14
influence in this process?

00:25:29:22 – 00:25:31:20
You know, not only

00:25:31:20 – 00:25:33:19
hiring deaf consultants,

00:25:33:19 – 00:25:35:09
but all through the line,

00:25:35:09 – 00:25:36:01
through A.I. by A.I. —

00:25:36:01 – 00:25:38:04

having deaf

00:25:38:04 – 00:25:40:16
hands on the process is vital.

00:25:40:16 – 00:25:42:08
And I’m going to turn this over

00:25:42:08 – 00:25:45:08
to Jeff again.

00:25:49:08 – 00:25:51:20
Hi, it’s Jeff here again for you.

00:25:51:20 – 00:25:53:12
So

00:25:53:12 – 00:25:55:22
about the technological quality.

00:25:55:22 – 00:25:56:16
During the webinar,

00:25:56:16 – 00:25:58:19
we discussed this quite in depth

00:25:58:19 – 00:26:00:09
and how we can break down

00:26:00:09 – 00:26:03:10
this idea into several subcategories.

00:26:04:16 – 00:26:07:03
The first or the major subcategory

00:26:07:03 – 00:26:08:10
that a lot of people really

00:26:08:10 – 00:26:11:10
discussed in the webinar was data.

00:26:11:14 – 00:26:14:21
Many people agreed that the quality

00:26:15:05 – 00:26:16:19
and the diversity of the data

00:26:16:19 – 00:26:19:10
is key to the foundation of building A.I.

00:26:19:10 – 00:26:21:07
by A.I..

00:26:21:07 – 00:26:22:08
You know, with machine

00:26:22:08 – 00:26:25:08
learning in the community,

00:26:26:15 – 00:26:28:22
it’s garbage in, garbage out

00:26:28:22 – 00:26:30:06
more often than not.

00:26:30:06 – 00:26:32:12
And so what does that look like?

00:26:32:12 – 00:26:34:13
So if we have a model trained

00:26:35:14 – 00:26:36:08
by that,

00:26:36:08 – 00:26:38:03
with that data,

00:26:38:03 – 00:26:40:19
the output will be of the same quality.

00:26:40:19 – 00:26:42:11
So

00:26:42:11 – 00:26:44:08
that leads us to our second

00:26:44:08 – 00:26:47:08
largest issue: modeling.

00:26:47:23 – 00:26:50:09
So if we have the garbage in modeling,

00:26:50:09 – 00:26:51:07
it’s garbage out.
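
To see garbage in, garbage out in miniature, here is a toy Python sketch: a trivial memorizing “model” trained on mislabeled examples faithfully reproduces those errors. The data is invented for the demonstration.

from collections import Counter

def train(pairs):
    """Learn the most frequent output seen for each input."""
    seen = {}
    for sign, gloss in pairs:
        seen.setdefault(sign, Counter())[gloss] += 1
    return {sign: counts.most_common(1)[0][0] for sign, counts in seen.items()}

clean = [("HELLO-handshape", "hello")] * 3
noisy = clean + [("HELLO-handshape", "goodbye")] * 5  # bad labels dominate

print(train(clean)["HELLO-handshape"])  # -> hello
print(train(noisy)["HELLO-handshape"])  # -> goodbye: the garbage came out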

00:26:51:07 – 00:26:52:09
But people are looking at it

00:26:52:09 – 00:26:53:09
and saying, okay,

00:26:53:09 – 00:26:55:15
so what is the primary use?

00:26:55:15 – 00:26:57:11
What is the primary model?

00:26:57:11 – 00:26:59:07
What features are they using?

00:26:59:07 – 00:27:02:03
Are they focusing only on hand shape

00:27:02:03 – 00:27:02:18
or will

00:27:02:18 – 00:27:04:18
they also be including facial shape

00:27:04:18 – 00:27:05:08
and mouth

00:27:05:08 – 00:27:08:08
shapes, expressions and other key

00:27:08:15 – 00:27:11:15
contextual clues of the language?

00:27:12:09 – 00:27:14:22
How will we be able to evaluate

00:27:14:22 – 00:27:15:24
that model?

00:27:15:24 – 00:27:17:07
Can we decide

00:27:17:07 – 00:27:20:09
which model to use over another model,

00:27:20:21 – 00:27:23:21
and how do we use metrics

00:27:24:01 – 00:27:24:22
to decide

00:27:24:22 – 00:27:26:06
if that is the quality

00:27:26:06 – 00:27:27:20
that we need or not?

00:27:27:20 – 00:27:29:06
Many people really had

00:27:29:06 – 00:27:30:06
that on their minds

00:27:30:06 – 00:27:32:00
in the webinar discussions.
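
To make the model-comparison question concrete, a minimal Python sketch of scoring two candidate models against reference glosses with a simple accuracy metric follows; the models, test clips, and minimum threshold are hypothetical stand-ins, and real sign language evaluation would need much richer metrics than exact-match accuracy.

def accuracy(predict, clips, gold):
    """Fraction of test clips whose predicted gloss matches the reference."""
    hits = sum(predict(c) == g for c, g in zip(clips, gold))
    return hits / len(gold)

# Hypothetical test set: clip identifiers paired with reference glosses.
clips = ["clip1", "clip2", "clip3"]
gold = ["HELLO", "THANK-YOU", "HELP"]

# Two hypothetical candidate models, each mapping a clip to a gloss.
model_a = lambda c: {"clip1": "HELLO", "clip2": "THANK-YOU", "clip3": "WANT"}[c]
model_b = lambda c: {"clip1": "HELLO", "clip2": "PLEASE", "clip3": "WANT"}[c]

score_a = accuracy(model_a, clips, gold)
score_b = accuracy(model_b, clips, gold)
print(f"model A: {score_a:.0%}, model B: {score_b:.0%}")

MINIMUM = 0.95  # a hypothetical minimum quality criterion
print("meets minimum criteria" if max(score_a, score_b) >= MINIMUM
      else "below minimum criteria")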

00:27:32:00 – 00:27:33:24
Now, throughout these discussions,

00:27:33:24 – 00:27:35:07
we also noticed the need

00:27:35:07 – 00:27:37:07
for deaf leadership

00:27:37:07 – 00:27:40:02
involvement and oversight as well.

00:27:40:02 – 00:27:44:01
It was key that that topic came up

00:27:44:01 – 00:27:44:24
many times.

00:27:44:24 – 00:27:46:13
We needed to set up

00:27:46:13 – 00:27:50:19
at least minimum criteria for

00:27:52:00 – 00:27:54:15
the bidirectional interpreting for A.I.

00:27:54:15 – 00:27:55:13
by A.I..

00:27:55:13 – 00:27:58:13
Next slide, please.

00:28:01:16 – 00:28:04:18
Now many more subcategories

00:28:04:18 – 00:28:06:05
arose from this discussion,

00:28:06:05 – 00:28:08:02
and one of them was quality,

00:28:08:02 – 00:28:10:02
and it was very imperative.

00:28:10:02 – 00:28:13:02
This topic popped up quite a bit

00:28:13:24 – 00:28:16:24
as well.

00:28:17:01 – 00:28:18:24
The participants all agreed

00:28:18:24 – 00:28:23:03
that during the process of gathering

00:28:23:13 – 00:28:26:18
data, storing data, using data,

00:28:27:05 – 00:28:29:24
sharing data, publishing

00:28:29:24 – 00:28:30:17
all of that,

00:28:30:17 – 00:28:31:18
that process

00:28:31:18 – 00:28:33:19
needed to be quite transparent

00:28:33:19 – 00:28:34:23
so that we could understand

00:28:34:23 – 00:28:37:12
everything as it happened.

00:28:37:12 – 00:28:39:14
We need the opt-in or opt-out

00:28:39:14 – 00:28:41:07
option as well.

00:28:41:07 – 00:28:43:03
Also, discussing the ability

00:28:43:03 – 00:28:44:12
to withdraw

00:28:44:12 – 00:28:46:14
our consent in the future,

00:28:46:14 – 00:28:48:12
meaning if we opt into this

00:28:48:12 – 00:28:49:06
and then later

00:28:49:06 – 00:28:50:09
we change our minds and say,

00:28:50:09 – 00:28:50:18
you know what,

00:28:50:18 – 00:28:52:10
we don’t want to participate.

00:28:52:10 – 00:28:54:23
We have the option to withdraw

00:28:54:23 – 00:28:55:18
that consent.
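
As a rough illustration of that kind of ongoing, revocable consent, a minimal Python sketch follows: a consent record that can be granted, checked before any use of the data, and withdrawn later. The fields and the flow are assumptions for illustration, not the task force’s actual system.

from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ConsentRecord:
    participant_id: str
    granted_at: Optional[datetime] = None
    withdrawn_at: Optional[datetime] = None

    def opt_in(self):
        self.granted_at = datetime.now()
        self.withdrawn_at = None  # a fresh grant clears an earlier withdrawal

    def withdraw(self):
        self.withdrawn_at = datetime.now()

    @property
    def active(self):
        """Data may be used only while consent is granted and not withdrawn."""
        return self.granted_at is not None and self.withdrawn_at is None

record = ConsentRecord("participant-042")  # hypothetical identifier
record.opt_in()
assert record.active       # data may be used
record.withdraw()
assert not record.active   # all further use must stop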

00:28:57:10 – 00:28:59:04
Another topic that we discussed

00:28:59:04 – 00:28:59:22
quite in depth

00:28:59:22 – 00:29:03:01
was the topic of safeguards.

00:29:03:23 – 00:29:05:12
Those need to be put in place

00:29:05:12 – 00:29:07:10
to minimize harm.

00:29:07:10 – 00:29:10:10
Many of the what ifs

00:29:10:13 – 00:29:12:18
that arose during our discussion

00:29:12:18 – 00:29:14:03
really related to this.

00:29:14:03 – 00:29:16:12
What if someone downloads this?

00:29:16:12 – 00:29:20:15
What if there is a breach of data?

00:29:20:21 – 00:29:23:14
What if the information is leaked?

00:29:23:14 – 00:29:25:19
What if the system crashes?

00:29:25:19 – 00:29:26:15
There were so many

00:29:26:15 – 00:29:28:09
what ifs and questions

00:29:28:09 – 00:29:30:09
that popped up in that discussion.

00:29:30:09 – 00:29:31:00
And again,

00:29:31:00 – 00:29:33:00
for more analysis of this, please

00:29:33:00 – 00:29:33:23
look at our report.

00:29:33:23 – 00:29:36:17
We have a lot of it detailed in there.

00:29:36:17 – 00:29:39:17
And we also talked about readiness.

00:29:40:10 – 00:29:42:08
And Theresa is going to talk about

00:29:42:08 – 00:29:44:19
that more in depth now.

00:29:44:19 – 00:29:47:19
I’ll hand it over to Theresa.

00:29:51:10 – 00:29:53:20
Okay.

00:29:53:20 – 00:29:56:20
So

00:29:57:08 – 00:29:59:20
in terms of readiness,

00:29:59:20 – 00:30:02:05
when we look at what that includes,

00:30:02:05 – 00:30:04:06
we have readiness

00:30:04:06 – 00:30:05:16
when it comes to technology,

00:30:05:16 – 00:30:09:03
but also there’s a discussion of society

00:30:09:10 – 00:30:11:12
and what our awareness

00:30:11:12 – 00:30:12:01
looks like

00:30:12:01 – 00:30:13:19
in terms of ethical issues

00:30:13:19 – 00:30:15:22
and ethical concerns.

00:30:15:22 – 00:30:18:21
So do we have

00:30:18:21 – 00:30:21:05
technological availability?

00:30:21:05 – 00:30:23:00
Do we have representation

00:30:23:00 – 00:30:25:16
and understanding and creation?

00:30:25:16 – 00:30:28:09
That’s one part of readiness.

00:30:30:12 – 00:30:33:12
Next slide.

00:30:37:24 – 00:30:39:22
So now we’ll see that

00:30:39:22 – 00:30:41:12
there are various components here.

00:30:41:12 – 00:30:42:11
So you can see here

00:30:42:11 – 00:30:43:19
where it talks

00:30:43:19 – 00:30:44:12
about readiness

00:30:44:12 – 00:30:45:13
when it comes to sign

00:30:45:13 – 00:30:47:12
language recognition.

00:30:47:12 – 00:30:50:24
And we have readiness of

00:30:50:24 – 00:30:53:01
the American Deaf community.

00:30:53:01 – 00:30:55:07
That’s another component.

00:30:55:07 – 00:30:58:09
Does the deaf community have an in-depth

00:30:58:09 – 00:31:01:23
understanding of the quality,

00:31:02:06 – 00:31:05:00
ethical and technological aspects

00:31:05:00 – 00:31:06:17
when it comes to A.I.

00:31:06:17 – 00:31:08:04
by A.I.?

00:31:08:04 – 00:31:11:20
And the last type of readiness would be

00:31:11:20 – 00:31:12:14
when it comes

00:31:12:14 – 00:31:15:22
to public and private entities,

00:31:16:09 – 00:31:20:08
Are we ready for this kind of technology

00:31:20:15 – 00:31:22:04
and what would the responsibilities

00:31:22:04 – 00:31:24:02
be that come along with that

00:31:24:02 – 00:31:27:02
and the accountability?

00:31:28:04 – 00:31:31:04
Next slide, please.

00:31:34:15 – 00:31:36:06
So as you can see here,

00:31:36:06 – 00:31:37:05
one of the components

00:31:37:05 – 00:31:38:24
that we talked about are civil

00:31:38:24 – 00:31:40:13
rights and civil protections.

00:31:41:12 – 00:31:43:04
So do we have

00:31:43:04 – 00:31:45:23
quality control and certification

00:31:45:23 – 00:31:47:13
when it comes to the interpreters

00:31:47:13 – 00:31:49:01
and certification for the A.I.

00:31:49:01 – 00:31:50:17
technology itself?

00:31:50:17 – 00:31:51:06
Also,

00:31:51:06 – 00:31:53:03
we have to consider the impact

00:31:53:03 – 00:31:55:09
that this would have on the culture,

00:31:55:09 – 00:31:56:18
the current culture

00:31:56:18 – 00:31:58:08
and the culture of the future.

00:31:58:08 – 00:31:59:20
And

00:31:59:20 – 00:32:00:19
what does

00:32:00:19 – 00:32:03:11
state-of-the-art technology mean?

00:32:03:11 – 00:32:06:05
How would we make sure that we are up

00:32:06:05 – 00:32:07:12
to date with technology

00:32:07:12 – 00:32:09:04
as it changes over time

00:32:09:04 – 00:32:10:02
and ensure

00:32:10:02 – 00:32:11:12
that we have the appropriate

00:32:11:12 – 00:32:13:03
response to those changes?

00:32:13:03 – 00:32:16:03
Next slide.

00:32:26:00 – 00:32:27:08
So I believe that, well,

00:32:27:08 – 00:32:28:22
I can go ahead and do this part.

00:32:28:22 – 00:32:33:06
And so in just a few hours.

00:32:33:08 – 00:32:36:08
Next slide, please.

00:32:40:04 – 00:32:40:21
Okay, great.

00:32:40:21 – 00:32:42:01
Sorry about that.

00:32:42:01 – 00:32:45:22
And okay. So within a few hours,

00:32:46:05 – 00:32:48:08
we had these internal discussions

00:32:48:08 – 00:32:50:00
at the three webinars,

00:32:50:00 – 00:32:51:11
and within those hours

00:32:51:11 – 00:32:52:23
we had a lot of ethical issues

00:32:52:23 – 00:32:54:10
that came up to be discussed

00:32:54:10 – 00:32:56:06
and we touched on ethics,

00:32:56:06 – 00:32:58:10
we touched on fairness and safety.

00:32:58:10 – 00:32:59:09
And I believe

00:32:59:09 – 00:33:00:23
all of these things were covered.

00:33:00:23 – 00:33:02:20
Also, the deaf participants

00:33:02:20 – 00:33:04:21
were able to explain

00:33:04:21 – 00:33:07:08
how we can use our principles

00:33:07:08 – 00:33:09:03
and how we can use our models

00:33:09:03 – 00:33:10:15
in a way that’s ethical

00:33:10:15 – 00:33:11:19
in order to prevent

00:33:11:19 – 00:33:13:00
any ongoing

00:33:13:00 – 00:33:14:00
harm

00:33:14:00 – 00:33:15:09
or anything that could go

00:33:15:09 – 00:33:16:24
against the deaf community.

00:33:16:24 – 00:33:18:00
So we talked about that

00:33:18:00 – 00:33:19:04
fairness and equity

00:33:19:04 – 00:33:20:21
in these conversations,

00:33:20:21 – 00:33:23:09
how it pertains to society.

00:33:23:09 – 00:33:24:00
Next slide.

00:33:29:06 – 00:33:31:18
So we do have a few suggestions

00:33:31:18 – 00:33:35:02
and we suggest that there be

00:33:35:08 – 00:33:36:23
the appropriate level of protection

00:33:36:23 – 00:33:38:19
and privacy and confidentiality

00:33:38:19 – 00:33:39:20
when it comes to A.I.

00:33:39:20 – 00:33:41:07
by A.I.,

00:33:41:07 – 00:33:42:11
and that will allow us

00:33:42:11 – 00:33:43:22
to have better protections

00:33:43:22 – 00:33:44:22
across the Internet

00:33:44:22 – 00:33:47:09
for all kinds of applications.

00:33:47:09 – 00:33:51:05
Also, another suggestion was

00:33:52:15 – 00:33:55:19
that these concerns regarding risk

00:33:55:23 – 00:33:58:21
be established and considered in order

00:33:58:21 – 00:33:59:13
to make sure

00:33:59:13 – 00:34:01:20
that we have strict regulations in place

00:34:01:20 – 00:34:03:21
to avoid any adverse

00:34:03:21 – 00:34:05:10
downstream consequences

00:34:05:10 – 00:34:06:18
for governance, business

00:34:06:18 – 00:34:09:18
and social infrastructures.

00:34:10:13 – 00:34:13:13
Next slide.

00:34:14:15 – 00:34:15:08
Okay.

00:34:15:08 – 00:34:17:06
So this word, STS,

00:34:17:06 – 00:34:20:06
or socio-technical systems,

00:34:21:01 – 00:34:23:12
again STS for short,

00:34:23:12 – 00:34:26:07
means that we have to recognize

00:34:26:07 – 00:34:28:07
that we do have technology

00:34:28:07 – 00:34:31:12
and we also have the socio aspect;

00:34:31:20 – 00:34:34:01
when these two things come together,

00:34:34:01 – 00:34:34:19
they’re going to have

00:34:34:19 – 00:34:36:15
an influence on each other.

00:34:36:15 – 00:34:38:03
So we need to always

00:34:38:03 – 00:34:38:20
be sure

00:34:38:20 – 00:34:41:18
to look at the system of technology

00:34:41:18 – 00:34:44:04
while considering its impact on society

00:34:44:04 – 00:34:45:14
and vice versa.

00:34:45:14 – 00:34:47:03
We have to look at how both of these

00:34:47:03 – 00:34:48:20
things interact. Next slide.

00:34:54:12 – 00:34:55:17
So I’ll give you a moment

00:34:55:17 – 00:34:58:17
to read through the slide.

00:35:06:07 – 00:35:07:06
So what’s important

00:35:07:06 – 00:35:08:09
here to take note of

00:35:08:09 – 00:35:09:20
is that STS

00:35:09:20 – 00:35:13:18
refers to how things are correlated, how

00:35:13:18 – 00:35:15:11
these co-influences

00:35:15:11 – 00:35:18:01
happen in both society and technology

00:35:18:01 – 00:35:21:01
as it pertains to the organization.

00:35:21:08 – 00:35:23:03
So it’s so important

00:35:23:03 – 00:35:25:04
that during the design process

00:35:25:04 – 00:35:28:03
we consider both of these areas

00:35:28:03 – 00:35:29:19
and look at our results

00:35:29:19 – 00:35:34:04
to ensure that A.I. by A.I. is optimizing

00:35:34:09 – 00:35:37:09
both of these two subsystems.

00:35:41:18 – 00:35:44:19
So the key here is to attend to

00:35:44:19 – 00:35:47:21
how the social behaviors of humans

00:35:47:21 – 00:35:50:00
are going to combine with and be influenced

00:35:50:00 – 00:35:52:16
by the structures of technology

00:35:52:16 – 00:35:55:16
and vice versa.

00:35:56:06 – 00:35:59:06
So these are not two separate things.

00:35:59:15 – 00:36:02:10
We do have a few recommendations

00:36:02:10 – 00:36:04:02
that we would like to share.

00:36:04:02 – 00:36:05:19
The first one is that

00:36:05:19 – 00:36:09:10
it needs to be understood that AI by AI

00:36:09:10 – 00:36:12:11
is a socio-technical system

00:36:13:10 – 00:36:15:11
and also we need to

00:36:15:11 – 00:36:16:07
make sure

00:36:16:07 – 00:36:19:10
that the deaf wisdom from the community

00:36:19:15 – 00:36:21:02
is a vital part of this.

00:36:21:02 – 00:36:22:10
We want to use our wisdom

00:36:22:10 – 00:36:24:03
and our experience

00:36:24:03 – 00:36:26:09
from interpreting experiences

00:36:26:09 – 00:36:29:19
to our experience with VRS and VRI,

00:36:29:24 – 00:36:31:01
and also

00:36:31:01 – 00:36:33:09
how we as a community

00:36:33:09 – 00:36:35:08
experience technology.

00:36:35:08 – 00:36:38:21
And we want to be able to impart

00:36:38:24 – 00:36:40:00
our experience

00:36:40:00 – 00:36:43:01
and our wisdom as the framework is being

00:36:43:01 – 00:36:46:01
developed in this realm.

00:36:46:01 – 00:36:50:00
Thirdly, we want to continue to engage

00:36:50:00 – 00:36:51:14
with the deaf community

00:36:51:14 – 00:36:52:21
and build our knowledge

00:36:52:21 – 00:36:55:17
and our awareness through funding.

00:36:55:17 – 00:36:58:17
One example of that funding is grants,

00:36:58:20 – 00:37:00:07
and so an example

00:37:00:07 – 00:37:02:15
is the Civic Innovation grant,

00:37:02:15 – 00:37:04:05
where we can receive money

00:37:04:05 – 00:37:04:21
for this

00:37:04:21 – 00:37:06:02
and continue

00:37:06:02 – 00:37:07:20
to plug in to the deaf community

00:37:07:20 – 00:37:09:04
and get their input

00:37:09:04 – 00:37:10:09
and find what the deaf

00:37:10:09 – 00:37:11:15
community would like to see

00:37:11:15 – 00:37:12:14
and what they feel needs

00:37:12:14 – 00:37:14:04
to be brought to attention

00:37:14:04 – 00:37:15:08
as these laws and

00:37:15:08 – 00:37:16:13
policies are developed.

00:37:18:11 – 00:37:21:11
Next slide, please.

00:37:24:21 – 00:37:25:08
And as we

00:37:25:08 – 00:37:26:09
get close to wrapping up,

00:37:26:09 – 00:37:28:13
I’d like to make the point that A.I. by

00:37:28:13 – 00:37:30:04
A.I. will probably never be able

00:37:30:04 – 00:37:31:22
to totally replace

00:37:31:22 – 00:37:32:16
humans

00:37:32:16 – 00:37:35:16
because of the interactivity issues.

00:37:35:20 – 00:37:37:01
Often we can predict

00:37:37:01 – 00:37:39:00
there would be misunderstandings

00:37:39:00 – 00:37:42:00
and so we need to have that deep

00:37:42:05 – 00:37:44:03
human knowledge there.

00:37:44:03 – 00:37:47:03
So we’ll have to look for

00:37:47:15 – 00:37:50:00
a human-in-the-loop design

00:37:50:00 – 00:37:51:06
as this technology

00:37:51:06 – 00:37:53:18
is developed over time.

00:37:53:18 – 00:37:57:03
And in terms of human-in-the-loop design,

00:37:57:05 – 00:38:00:05
to explain a bit about that,

00:38:00:12 – 00:38:01:22
of course we would have the A.I.

00:38:01:22 – 00:38:05:03
technology and often

00:38:05:03 – 00:38:08:03
we see that A.I. by A.I. gets information

00:38:08:08 – 00:38:10:05
through different sources,

00:38:10:05 – 00:38:12:14
but we want to be able to emphasize

00:38:12:14 – 00:38:14:12
the importance of A.I.

00:38:14:12 – 00:38:16:00
getting feedback

00:38:16:00 – 00:38:19:23
from human interactivity.

00:38:20:12 – 00:38:22:08
There needs to be a feedback loop

00:38:22:08 – 00:38:23:07
in place,

00:38:23:07 – 00:38:24:05
where humans are always

00:38:24:05 – 00:38:27:12
involved in this process so that

00:38:28:05 – 00:38:30:05
the design and the development

00:38:30:05 – 00:38:31:24
and the operation

00:38:31:24 – 00:38:33:22
and everything is verified

00:38:33:22 – 00:38:34:21
by humans

00:38:34:21 – 00:38:37:02
who are involved with the project.

00:38:37:02 – 00:38:38:17
That would be the end goal.
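
As a rough illustration of that feedback loop, a minimal Python sketch follows in which machine output below a confidence threshold is escalated to a human interpreter and the correction is logged for later review and retraining; the function names, outputs, and threshold are all hypothetical.

CONFIDENCE_THRESHOLD = 0.90  # hypothetical cutoff for accepting machine output
feedback_log = []            # corrections to fold back into future training

def machine_interpret(utterance):
    """Stand-in for an AI-by-AI model returning (interpretation, confidence)."""
    return f"[auto] {utterance}", 0.62  # deliberately low for the demo

def human_interpret(utterance):
    """Stand-in for escalation to a qualified human interpreter."""
    return f"[human] {utterance}"

def interpret(utterance):
    output, confidence = machine_interpret(utterance)
    if confidence >= CONFIDENCE_THRESHOLD:
        return output
    corrected = human_interpret(utterance)               # human takes over
    feedback_log.append((utterance, output, corrected))  # the feedback loop
    return corrected

print(interpret("Where is the registration desk?"))
print(f"{len(feedback_log)} correction(s) queued for review")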

00:38:38:17 – 00:38:39:18
Next slide, please.

00:38:45:23 – 00:38:46:17
Hello everyone.

00:38:46:17 – 00:38:50:03
I’m back. So,

00:38:50:05 – 00:38:53:22
we have the advisory group on AI

00:38:53:22 – 00:38:55:02
and sign language interpreting,

00:38:55:02 – 00:38:56:00
and

00:38:56:00 – 00:38:57:24
there have been so many

00:38:57:24 – 00:38:59:06
benefits to this group,

00:38:59:06 – 00:39:00:15
so many hours

00:39:00:15 – 00:39:02:18
and so much work has gone into this.

00:39:02:18 – 00:39:05:15
And so this isn’t the end of our work.

00:39:05:15 – 00:39:07:00
This is just the springboard

00:39:07:00 – 00:39:08:15
to the future for more discussions

00:39:08:15 – 00:39:09:11
and more partnerships

00:39:09:11 – 00:39:11:00
with the community at large.

00:39:11:00 – 00:39:13:05
So we have an event to share with you.

00:39:13:05 – 00:39:14:21
A Save the Date.

00:39:14:21 – 00:39:17:21
We have a symposium

00:39:18:02 – 00:39:21:02
on AI and sign language interpreting

00:39:21:14 – 00:39:23:10
and it will be hosted on April

00:39:23:10 – 00:39:27:08
20th and 21st, a Saturday and Sunday,

00:39:27:08 – 00:39:28:11
this year

00:39:28:11 – 00:39:31:11
and will be here at Brown University.

00:39:32:21 – 00:39:35:04
And we also will have

00:39:35:04 – 00:39:37:13
accessibility options to join us

00:39:37:13 – 00:39:38:14
through Zoom.

00:39:38:14 – 00:39:40:16
Anyone can join from anywhere.

00:39:40:16 – 00:39:43:09
We are planning that on Saturday

00:39:43:09 – 00:39:46:16
it will be from 9 to 6, and Sunday 9

00:39:46:16 – 00:39:49:16
to 2, just a half day on Sunday.

00:39:49:22 – 00:39:52:00
And this is Eastern Standard Time.

00:39:52:24 – 00:39:53:21
This is very

00:39:53:21 – 00:39:55:02
important for us

00:39:55:02 – 00:39:57:20
to bring in different perspectives,

00:39:57:20 – 00:40:00:08
different experts and different people

00:40:00:08 – 00:40:02:16
who have in-depth experience in A.I.

00:40:02:16 – 00:40:04:10
and sign language interpreting

00:40:04:10 – 00:40:05:07
so we can really have

00:40:05:07 – 00:40:06:15
an in-depth discussion

00:40:06:15 – 00:40:08:02
on what this looks like.

00:40:08:02 – 00:40:09:15
And it’s also an opportunity

00:40:09:15 – 00:40:11:12
for us to do a deeper dive

00:40:11:12 – 00:40:12:21
into what our presenters

00:40:12:21 – 00:40:13:13
have really talked

00:40:13:13 – 00:40:14:21
about during this session

00:40:14:21 – 00:40:15:18
as well,

00:40:15:18 – 00:40:17:21
to flesh out different topics and issues

00:40:17:21 – 00:40:18:20
that may arise.

00:40:18:20 – 00:40:20:01
I hope to see you there.

00:40:20:01 – 00:40:23:01
I’m very excited about it.

00:40:23:11 – 00:40:24:15
I’m going to go ahead

00:40:24:15 – 00:40:27:15
and turn this over to Anne Marie.

00:40:28:03 – 00:40:30:13
Thank you so much, Tim.

00:40:30:13 – 00:40:32:20
So I would just like to add

00:40:32:20 – 00:40:34:11
a little bit more information

00:40:34:11 – 00:40:35:21
for your awareness.

00:40:35:21 – 00:40:38:21
And in terms of the participation,

00:40:39:03 – 00:40:42:08
the participants, we did have 300

00:40:42:20 – 00:40:44:21
people come in that participated

00:40:44:21 – 00:40:46:11
during these webinars.

00:40:46:11 – 00:40:48:15
So we had a very great showing for that

00:40:48:15 – 00:40:50:12
and we were on Zoom,

00:40:50:12 – 00:40:52:05
so we were able to see

00:40:52:05 – 00:40:54:04
a lot of comments in the Q&A.

00:40:54:04 – 00:40:55:21
We saw lots of questions come through,

00:40:55:21 – 00:40:57:05
which was great.

00:40:57:05 – 00:41:00:09
And so after this meeting today,

00:41:00:11 – 00:41:01:15
we can share

00:41:01:15 – 00:41:02:16
some more information with you.

00:41:02:16 – 00:41:04:02
But we had a lot of people

00:41:04:02 – 00:41:06:17
that were coming in

00:41:06:17 – 00:41:09:08
for this research, for these webinars,

00:41:11:08 – 00:41:14:17
and we had 98%

00:41:14:17 – 00:41:16:12
of the people who participated

00:41:16:12 – 00:41:18:05
sign and agree

00:41:18:05 – 00:41:20:01
to share their information.

00:41:20:01 – 00:41:25:10
And we also had ASL involved

00:41:25:10 – 00:41:26:24
for all of those

00:41:26:24 – 00:41:29:24
participants.

00:41:31:01 – 00:41:32:00
Out of the 55

00:41:32:00 – 00:41:32:20
people

00:41:32:20 – 00:41:35:20
that were on the forum,

00:41:36:06 – 00:41:38:18
we had 98% of the people

00:41:38:18 – 00:41:40:07
who were involved agree

00:41:40:07 – 00:41:43:07
to share the information

00:41:47:00 – 00:41:50:00
and to add to that

00:41:51:05 – 00:41:54:05
in terms of the topics and

00:41:54:15 – 00:41:56:15
what we were talking about,

00:41:56:15 – 00:41:58:22
we talked about the values

00:41:58:22 – 00:42:01:22
and the perspectives of the community.

00:42:03:15 – 00:42:04:19
We talked about research

00:42:04:19 – 00:42:05:23
findings

00:42:05:23 – 00:42:07:16
and the level of participation

00:42:07:16 – 00:42:08:10
was just great.

00:42:08:10 – 00:42:10:23
We were able to do a lot of research

00:42:10:23 – 00:42:13:13
and go through those topics

00:42:13:13 – 00:42:14:15
and get the specifics

00:42:14:15 – 00:42:15:20
from the participants

00:42:15:20 – 00:42:18:24
so that we had very clear results.

00:42:19:07 – 00:42:20:11
Next slide, please.

00:42:28:01 – 00:42:29:21
And in terms of the

00:42:29:21 – 00:42:31:16
discussion we were on

00:42:31:16 – 00:42:34:03
for about 173 minutes

00:42:34:03 – 00:42:37:10
and we were able to show

00:42:37:11 – 00:42:39:07
a lot of in-depth discussion

00:42:39:07 – 00:42:41:11
and comments that came through.

00:42:41:11 – 00:42:43:22
It gave a lot of value

00:42:43:22 – 00:42:44:21
to our discussion.

00:42:44:21 – 00:42:47:00
We had quite a bit of participation

00:42:47:00 – 00:42:48:18
and consideration to go over

00:42:48:18 – 00:42:50:04
and some of the topics

00:42:50:04 – 00:42:52:12
that were discussed were really focused.

00:42:52:12 – 00:42:55:10
We talked about some of those earlier.

00:42:55:10 – 00:42:56:01
For example,

00:42:56:01 – 00:42:57:04
we talked about deaf

00:42:57:04 – 00:42:59:09
community readiness for AI.

00:42:59:09 – 00:43:00:14
Is the deaf community

00:43:00:14 – 00:43:02:04
actually ready for this?

00:43:02:04 – 00:43:06:09
So these discussions were so important

00:43:06:09 – 00:43:08:01
and I think having the participants

00:43:08:01 – 00:43:10:03
there to talk about AI

00:43:10:03 – 00:43:11:06
and to consider

00:43:11:06 – 00:43:11:18
whether or not

00:43:11:18 – 00:43:12:19
they were ready

00:43:12:19 – 00:43:15:21
to have this method of communication

00:43:15:21 – 00:43:19:14
as part of their life, to have AI there.

00:43:20:03 – 00:43:20:23
They discussed whether or

00:43:20:23 – 00:43:21:24
not they were ready for that.

00:43:23:06 – 00:43:24:14
And many of the

00:43:24:14 – 00:43:27:14

participants — 17

00:43:29:17 – 00:43:30:23

of them,

00:43:30:23 – 00:43:33:23
that were there from the advisory group,

00:43:37:21 – 00:43:38:14
were involved

00:43:38:14 – 00:43:41:14
in this process.

00:43:43:13 – 00:43:46:13
And

00:43:55:09 – 00:43:56:02
let me go back

00:43:56:02 – 00:43:59:02
for a second.

00:44:08:21 – 00:44:09:03
Okay.

00:44:09:03 – 00:44:12:03
Going back a bit to the process,

00:44:15:23 – 00:44:18:23
the discussion went on for 173 minutes

00:44:19:10 – 00:44:20:13
and we were able

00:44:20:13 – 00:44:22:01
to share different comments.

00:44:22:01 – 00:44:24:18
The participants went over

00:44:24:18 – 00:44:26:08
the things that we talked about today

00:44:26:08 – 00:44:29:08
and also

00:44:30:09 – 00:44:32:18
hashtag Safe Deaf AI

00:44:32:18 – 00:44:37:06
became a hashtag that we used

00:44:37:06 – 00:44:40:06
and we got a lot of feedback

00:44:41:21 – 00:44:44:18
and also recommendations.

00:44:44:18 – 00:44:46:16
And we talked about the broader influence

00:44:46:16 – 00:44:49:16
that this will have on the community.

00:44:55:22 – 00:44:56:09
We talked

00:44:56:09 – 00:44:59:09
about the organizations,

00:45:00:19 – 00:45:02:20
and these discussions were very important

00:45:02:20 – 00:45:05:06
to talk about collecting the data

00:45:05:06 – 00:45:08:06
and the process

00:45:09:07 – 00:45:10:01
involved.

00:45:10:01 – 00:45:11:06
Next slide, please.

00:45:31:09 – 00:45:32:02
Okay.

00:45:32:02 – 00:45:33:17
So we had three

00:45:33:17 – 00:45:34:15
categories

00:45:34:15 – 00:45:36:02
to flesh out

00:45:36:02 – 00:45:38:00
all of this information through.

00:45:38:00 – 00:45:40:04
We wanted to go through

00:45:40:04 – 00:45:44:00
this with our advisory group members, and

00:45:44:00 – 00:45:47:20
we decided, as a team, that more thematic

00:45:47:20 – 00:45:51:06
areas of research had arisen.

00:45:51:16 – 00:45:53:04
All six of them were shown

00:45:53:04 – 00:45:56:04
in the first codebook,

00:45:56:12 – 00:45:58:04
and now we’re going to add

00:45:58:04 – 00:46:01:10
that to what we’ll send out to everyone.

00:46:01:17 – 00:46:04:03
And that’s also included in the report.

00:46:04:03 – 00:46:07:14
I think I saw some questions in the Q&A.

00:46:07:16 – 00:46:10:16
It is available online.

00:46:10:20 – 00:46:12:22
It will be available

00:46:12:22 – 00:46:16:00
as we’re doing an invitation process

00:46:16:00 – 00:46:19:23
to add that to our website.

00:46:30:18 – 00:46:32:05
I

00:46:32:05 – 00:46:33:03
do you want me to

00:46:33:03 – 00:46:36:03
move on to the next slide?

00:46:36:06 – 00:46:37:04
I think we’re at the end.

00:46:37:04 – 00:46:37:13
Okay.

00:47:00:05 – 00:47:03:05
So

00:47:05:07 – 00:47:06:08
I’m asking if the team

00:47:06:08 – 00:47:08:12
could all turn on their cameras,

00:47:08:12 – 00:47:11:12
we’ll start in with the Q&A.

00:47:16:08 – 00:47:16:19
All right.

00:47:16:19 – 00:47:17:20
Are we ready to address

00:47:17:20 – 00:47:20:03
the questions in the Q&A?

00:47:20:03 – 00:47:22:08
Let’s do it, though.

00:47:22:08 – 00:47:23:21
The first thing I’d like to do

00:47:23:21 – 00:47:25:19
is thank you all for your questions.

00:47:25:19 – 00:47:28:19
Thank you so very much.

00:47:29:04 – 00:47:29:24
The first question

00:47:29:24 – 00:47:32:24
that we have for the panel

00:47:34:01 – 00:47:35:16
in the future,

00:47:35:16 – 00:47:38:16
say 10 to 15 years down the road,

00:47:38:20 – 00:47:39:08
A.I.

00:47:39:08 – 00:47:42:08
by A.I., what will that look like?

00:47:43:21 – 00:47:46:21
How many applications

00:47:46:22 – 00:47:48:12
are possible?

00:47:48:12 – 00:47:53:08
For example, TV, theaters, movie theaters?

00:47:53:19 – 00:47:55:16
Will there be an automatic interpreter

00:47:55:16 – 00:47:58:16
popping up on the screen?

00:48:00:08 – 00:48:01:12
What do you think

00:48:01:12 – 00:48:02:24
is the potential for A.I.

00:48:02:24 – 00:48:05:24
in the future?

00:48:08:21 – 00:48:10:14
Jeff Would you like to go?

00:48:10:14 – 00:48:13:05
Jeff saying yes, I will talk about that.

00:48:13:05 – 00:48:13:24
A.I. by

00:48:13:24 – 00:48:15:19
A.I. will have so many applications

00:48:15:19 – 00:48:16:14
in the future.

00:48:16:14 – 00:48:18:07
I think in 10 to 15 years,

00:48:18:07 – 00:48:20:09
the possibilities are astounding.

00:48:20:09 – 00:48:23:23
You know, I would say

00:48:24:00 – 00:48:25:08
that it equates

00:48:25:08 – 00:48:28:00
to about 100 years of human development

00:48:28:00 – 00:48:29:06
because technology moves

00:48:29:06 – 00:48:30:21
at such a rapid pace.

00:48:30:21 – 00:48:31:10
One thing

00:48:31:10 – 00:48:33:03
I would like to say for sure is that

00:48:33:03 – 00:48:34:19
I know that it will have improved

00:48:34:19 – 00:48:36:02
drastically by then

00:48:36:02 – 00:48:38:00
because I think it’s going to continue

00:48:38:00 – 00:48:39:16
marching on

00:48:39:16 – 00:48:41:14
and improving as time goes by.

00:48:41:14 – 00:48:43:12
Now, will we be more trusting

00:48:43:12 – 00:48:44:18
of these applications

00:48:44:18 – 00:48:47:18
in the future and more confident with it?

00:48:47:19 – 00:48:51:10
I think in low-risk situations,

00:48:52:22 – 00:48:54:14
I don’t think it should be too bad.

00:48:54:14 – 00:48:57:14
For example,

00:48:57:14 – 00:48:59:18
not a police encounter,

00:48:59:18 – 00:49:00:17
medical encounter

00:49:00:17 – 00:49:03:20
or anything of that sort, but A.I. by

00:49:03:20 – 00:49:06:12
A.I. could be used for automatic

00:49:06:12 – 00:49:08:09
conversation and dialogs like,

00:49:08:09 – 00:49:10:04
you know, with robocalls,

00:49:10:04 – 00:49:12:14
sharing information,

00:49:12:14 – 00:49:13:24
having something set,

00:49:13:24 – 00:49:17:13
you know, an automated system.

00:49:18:03 – 00:49:19:10
But in the future

00:49:19:10 – 00:49:22:10
I foresee more captioning being involved,

00:49:22:17 – 00:49:24:20
not necessarily ASL only,

00:49:24:20 – 00:49:26:19
but I think captioning

00:49:26:19 – 00:49:29:16
and multilingual captioning as a whole

00:49:29:16 – 00:49:32:06
will have developed so much over time.

00:49:32:06 – 00:49:33:21
I think there are so many possibilities

00:49:33:21 – 00:49:34:16
and different directions

00:49:34:16 – 00:49:35:23
it could go again.

00:49:35:23 – 00:49:37:11
It’s hard to predict everything

00:49:37:11 – 00:49:38:19
that may happen in the future,

00:49:38:19 – 00:49:39:20
but in a nutshell,

00:49:39:20 – 00:49:42:01
I believe that’s what would happen

00:49:42:01 – 00:49:45:03
and I think that some examples of that

00:49:45:08 – 00:49:46:13
do exist.

00:49:46:13 – 00:49:47:06
For example,

00:49:47:06 – 00:49:49:21
when you’re driving through a place

00:49:49:21 – 00:49:51:20
and you want to order coffee

00:49:51:20 – 00:49:53:12
or you want to order food

00:49:53:12 – 00:49:55:22
and that kind of situation,

00:49:55:22 – 00:49:58:13
and when you’re ordering your food,

00:49:58:13 – 00:49:59:00
you know,

00:50:00:04 – 00:50:00:14
it’s

00:50:00:14 – 00:50:01:05
not

00:50:01:05 – 00:50:02:03
going to be something

00:50:02:03 – 00:50:03:10
that’s disastrous to your life

00:50:03:10 – 00:50:05:00
if something messes up.

00:50:05:00 – 00:50:06:20
But when it comes to the barriers,

00:50:06:20 – 00:50:09:01
the deaf community experiences

00:50:09:01 – 00:50:10:12
in other situations,

00:50:10:12 – 00:50:12:06
there’s so many different possibilities

00:50:12:06 – 00:50:13:17
of how this could go.

00:50:13:17 – 00:50:15:04
And I think, again,

00:50:15:04 – 00:50:17:14
we have to make sure that the technology

00:50:17:14 – 00:50:19:08
is ready, verified

00:50:19:08 – 00:50:21:16
and it’s going to be more beneficial

00:50:21:16 – 00:50:23:18
and not harmful.

00:50:23:18 – 00:50:25:18
I think that’s what we have to see,

00:50:25:18 – 00:50:27:02
you know?

00:50:27:02 – 00:50:27:19
Theresa here,

00:50:27:19 – 00:50:29:03
I’d like to make a comment as well,

00:50:29:03 – 00:50:30:15
and I’d like to emphasize the point

00:50:30:15 – 00:50:34:01
about low-risk situations for a moment.

00:50:34:01 – 00:50:35:06
I think that’s important.

00:50:35:06 – 00:50:37:08
It’s imperative for us to realize

00:50:37:08 – 00:50:38:23
that we always have a need

00:50:38:23 – 00:50:41:23
for human interpreters,

00:50:41:24 – 00:50:43:12
in specific situations

00:50:43:12 – 00:50:44:24
like medical situations,

00:50:44:24 – 00:50:47:06
legal situations, court,

00:50:47:06 – 00:50:49:04
law enforcement interactions.

00:50:49:04 – 00:50:50:07
It’s really imperative

00:50:50:07 – 00:50:51:24
that we include the human aspect

00:50:51:24 – 00:50:54:08
and human judgment in those situations

00:50:55:19 – 00:50:56:24
in this century.

00:50:56:24 – 00:51:00:19
And to add to that, I think one thing

00:51:00:19 – 00:51:01:16
that we also have to

00:51:01:16 – 00:51:04:16
emphasize is knowing the language itself

00:51:04:19 – 00:51:07:06
and knowing that process,

00:51:07:06 – 00:51:09:14
where the responses come from,

00:51:09:14 – 00:51:11:00
and also looking at

00:51:11:00 – 00:51:12:22
whether it’s a live person,

00:51:12:22 – 00:51:16:06
or if it’s going to be a mixed situation,

00:51:16:10 – 00:51:17:15
we have to recognize

00:51:17:15 – 00:51:18:23
exactly what is involved

00:51:18:23 – 00:51:20:21
situation by situation

00:51:20:21 – 00:51:24:08
so that we can make sure that the access

00:51:24:08 – 00:51:25:21
that’s being provided is as close

00:51:25:21 – 00:51:27:17
to 100% accurate as possible.

00:51:27:17 – 00:51:29:16
So like Theresa said, humans

00:51:29:16 – 00:51:31:13
have to be involved in this

00:51:31:13 – 00:51:33:05
and in this feedback loop,

00:51:33:05 – 00:51:35:15
and it’s very critical that we

00:51:35:15 – 00:51:37:00
emphasize that.

00:51:37:00 – 00:51:37:16
Theresa saying,

00:51:37:16 – 00:51:39:01
I’d like to add a little bit more.

00:51:39:01 – 00:51:40:01
One thing to think about

00:51:40:01 – 00:51:41:12
is in my past experience,

00:51:41:12 – 00:51:45:13
for many years, living in New Mexico,

00:51:45:13 – 00:51:47:00
the deaf population

00:51:47:00 – 00:51:48:20
in the community in New Mexico,

00:51:48:20 – 00:51:51:22
you know,

00:51:52:19 – 00:51:56:00
they were able to be

00:51:56:00 – 00:51:57:15
mainstreamed into the school

00:51:59:14 – 00:52:00:20
and the dialects are

00:52:00:20 – 00:52:02:22
different in different areas.

00:52:02:22 – 00:52:05:06
A.I. may not recognize that.

00:52:05:06 – 00:52:06:14
So growing up in those schools

00:52:06:14 – 00:52:08:13
where the communities were small,

00:52:08:13 – 00:52:10:07
the dialects were diverse,

00:52:10:07 – 00:52:12:21
A.I. may not have that capability

00:52:12:21 – 00:52:15:05
just yet to recognize that.

00:52:15:05 – 00:52:17:17
So we need to make sure that our A.I.

00:52:17:17 – 00:52:20:17
is ready for this specifically,

00:52:20:18 – 00:52:21:14
for example,

00:52:21:14 – 00:52:24:14
with our black deaf signers,

00:52:24:21 – 00:52:26:08
the American sign language

00:52:26:08 – 00:52:27:02
that they use

00:52:27:02 – 00:52:28:17
is so culturally distinct

00:52:28:17 – 00:52:30:12
from mainstream American Sign Language.

00:52:30:12 – 00:52:32:00
We have to make sure

00:52:32:00 – 00:52:32:19
that we have the right

00:52:32:19 – 00:52:35:10
understanding of interpreting

00:52:35:10 – 00:52:38:10
for this approach.

00:52:43:14 – 00:52:44:03
Yes.

00:52:44:03 – 00:52:46:19
We have more questions.

00:52:46:19 – 00:52:47:17
Very exciting.

00:52:47:17 – 00:52:49:12
Okay, so another question.

00:52:49:12 – 00:52:52:22
Question number two is related

00:52:52:22 – 00:52:55:22
to the term

00:52:56:06 – 00:52:59:06
sign language

00:53:03:19 – 00:53:05:01
and anonymization.

00:53:05:01 – 00:53:08:01


00:53:09:05 – 00:53:09:23
So basically

00:53:09:23 – 00:53:12:23
related to privacy,

00:53:13:03 – 00:53:15:23
protecting individuals and their data.

00:53:18:00 – 00:53:19:14
So from

00:53:19:14 – 00:53:20:21
the research and the report

00:53:20:21 – 00:53:22:15
that you all produced,

00:53:22:15 – 00:53:24:20
can you guys discuss and share a bit

00:53:24:20 – 00:53:27:20
about the data privacy

00:53:28:18 – 00:53:30:06
and confidentiality,

00:53:30:06 – 00:53:33:14
and also for automatic interpretation,

00:53:33:23 – 00:53:36:24
would you use an avatar and

00:53:37:10 – 00:53:38:13
what does this look like?

00:53:38:13 – 00:53:39:22
How would you be able to

00:53:39:22 – 00:53:41:06
represent individuals

00:53:41:06 – 00:53:44:06
while also protecting them?

00:53:44:09 – 00:53:44:24
Jeff Here

00:53:44:24 – 00:53:46:03
I would like to take that one,

00:53:46:03 – 00:53:47:18
if that’s all right with the group.

00:53:47:18 – 00:53:50:12
So I prefer to sign

00:53:50:12 – 00:53:51:20
automatic interpreting.

00:53:51:20 – 00:53:53:14
For data collection,

00:53:53:14 – 00:53:56:00
the face is involved in that.

00:53:56:00 – 00:53:58:21
So one aspect of data collection with

00:53:58:21 – 00:54:00:05
signing is

00:54:00:05 – 00:54:01:09
that they have to have our face,

00:54:01:09 – 00:54:02:18
and it’s imperative

00:54:02:18 – 00:54:05:15
to compare that with speech recognition:

00:54:05:15 – 00:54:07:06
through automatic speech recognition,

00:54:07:06 – 00:54:08:23
you have voice and intonation,

00:54:08:23 – 00:54:09:15
but with signing,

00:54:09:15 – 00:54:11:06
if you don’t include the faith,

00:54:11:06 – 00:54:12:22
that’s a deep part of the language

00:54:12:22 – 00:54:14:15
that’s missing itself.

00:54:14:15 – 00:54:18:16
So there’s no real easy way to avoid that

00:54:18:16 – 00:54:20:24
with data collection,

00:54:20:24 – 00:54:22:23
because we have to have our face

00:54:22:23 – 00:54:25:13
for the language foundation and tone

00:54:25:13 – 00:54:27:02
and meaning to be there.

00:54:27:02 – 00:54:28:13
So we have to be very careful

00:54:28:13 – 00:54:30:08
with the data itself.

00:54:30:08 – 00:54:31:24
We have to protect that

00:54:31:24 – 00:54:32:12
to make sure

00:54:32:12 – 00:54:35:03
that people are fully informed.

00:54:35:03 – 00:54:37:10
Now, informed consent,

00:54:37:10 – 00:54:38:22
we’re going to use that in the training,

00:54:38:22 – 00:54:40:02
of automated interpreting.

00:54:40:02 – 00:54:41:20
We will be talking about

00:54:41:20 – 00:54:42:23
facial recognition

00:54:42:23 – 00:54:44:23
and other identifying information.

00:54:44:23 – 00:54:47:07
My background may be there as well.

00:54:47:07 – 00:54:49:22
So we have to really filter out

00:54:49:22 – 00:54:51:04
that information

00:54:51:04 – 00:54:54:03
and see what part of that is protected

00:54:54:03 – 00:54:56:11
and what part of it is stored.

00:54:56:11 – 00:54:58:00
And we also have to think about

00:54:58:00 – 00:55:00:06
what is private.

00:55:00:06 – 00:55:02:21
And with avatars, you know,

00:55:02:21 – 00:55:03:22
it may be possible

00:55:03:22 – 00:55:06:11
to produce signs with avatars,

00:55:06:11 – 00:55:08:03
but it could be a little bit off.

00:55:08:03 – 00:55:11:03
For example, with the avatar:

00:55:11:07 – 00:55:13:14
can we train that avatar

00:55:13:14 – 00:55:14:18
to make the signs?

00:55:14:18 – 00:55:16:24
And can we identify that

00:55:16:24 – 00:55:18:02
and know that that’s

00:55:18:02 – 00:55:20:03
what the person is signing?

00:55:20:03 – 00:55:21:23
That can be a little bit ambiguous

00:55:21:23 – 00:55:23:00
in trying to identify

00:55:23:00 – 00:55:26:15
what data the machine

00:55:26:15 – 00:55:29:22
learning is actually taking in.

00:55:31:18 – 00:55:34:15
All of that goes back to

00:55:34:15 – 00:55:37:15
the subject of informed consent.
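
To make the avatar idea concrete: one direction this discussion points to is re-rendering a signer as an avatar from extracted body and face landmarks, keeping the linguistic signal while discarding the identifying pixels. The following is a minimal Python sketch of that pipeline shape only; every helper in it is hypothetical, and working signing avatars remain an open research problem.

# Shape of a pose-based anonymization pipeline for signed video.
# Every helper here is a hypothetical placeholder, not a real model.

def anonymize_signed_video(frames: list) -> list:
    landmarks = []
    for frame in frames:
        # Keep the linguistic signal: hands, body, and facial grammar as landmarks.
        landmarks.append(extract_pose_and_face_landmarks(frame))
    # Discard the identifying pixels (the person's face, their background)
    # and re-render the same movements on a neutral avatar.
    return [render_avatar(lm) for lm in landmarks]

def extract_pose_and_face_landmarks(frame):
    return {"pose": [], "face": []}  # placeholder landmark extraction

def render_avatar(landmarks):
    return "<avatar frame>"  # placeholder avatar renderer

if __name__ == "__main__":
    print(anonymize_signed_video(["frame0"]))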

00:55:39:02 – 00:55:40:23
And I’d like to add

00:55:40:23 – 00:55:42:17
to that one more thing.

00:55:42:17 – 00:55:45:18
In terms of informed consent, often

00:55:45:18 – 00:55:47:07
we think, okay,

00:55:47:07 – 00:55:48:20
this is just a one time thing

00:55:48:20 – 00:55:51:08
I’m signing and I’m giving my consent,

00:55:51:08 – 00:55:52:24
but informed consent

00:55:52:24 – 00:55:54:08
really needs to happen

00:55:54:08 – 00:55:56:03
on an ongoing basis.

00:55:56:03 – 00:55:57:17
We need to be reminded

00:55:57:17 – 00:55:58:20
and we need to make sure

00:55:58:20 – 00:56:01:06
that we continue to give that consent

00:56:01:06 – 00:56:04:07
and continue to agree and remind people

00:56:04:13 – 00:56:06:11
that they have permission

00:56:06:11 – 00:56:07:13
to remove themselves

00:56:07:13 – 00:56:09:14
or to take that consent back.

00:56:09:14 – 00:56:09:23


00:56:09:23 – 00:56:10:19
And so they have

00:56:10:19 – 00:56:11:16
the right to be involved.

00:56:11:16 – 00:56:14:16
They have the right to decline.
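
The ongoing, revocable consent described above can be pictured as a small data-management rule: consent is checked every time the data is used, not only when it is collected, and revocation takes effect immediately. Below is a minimal Python sketch of that rule; the record fields, names, and renewal period are invented for illustration, not taken from any actual system.

# Minimal sketch of ongoing, revocable consent for signed-video data.
# All fields and names are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ConsentRecord:
    signer_id: str
    granted_at: datetime
    revoked_at: datetime | None = None
    # Consent is reconfirmed periodically rather than treated as one-time.
    reconfirm_every: timedelta = timedelta(days=365)

    def revoke(self) -> None:
        """The signer can take consent back at any time."""
        self.revoked_at = datetime.now()

    def is_active(self) -> bool:
        if self.revoked_at is not None:
            return False
        # Stale consent must be renewed before the data is used again.
        return datetime.now() - self.granted_at <= self.reconfirm_every

def can_use_for_training(record: ConsentRecord) -> bool:
    # Pipelines check consent at use time, not only at collection time.
    return record.is_active()

if __name__ == "__main__":
    record = ConsentRecord(signer_id="participant-001", granted_at=datetime.now())
    assert can_use_for_training(record)
    record.revoke()
    assert not can_use_for_training(record)  # revocation is immediate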

00:56:19:04 – 00:56:20:09
Okay.

00:56:20:09 – 00:56:21:19
QUESTION

00:56:21:19 – 00:56:23:01
Thank you so much for your question.

00:56:23:01 – 00:56:23:21
So the next question

00:56:23:21 – 00:56:24:21
we’re going to address

00:56:24:21 – 00:56:28:01
is I’d like to add a little bit to

00:56:28:01 – 00:56:29:23
this as well.

00:56:32:08 – 00:56:35:08
The discussion about

00:56:35:23 – 00:56:38:23
I’m sorry, the discussion about how

00:56:40:18 – 00:56:43:24
this task force and this advisory group

00:56:44:05 – 00:56:45:20
have collaborated

00:56:45:20 – 00:56:48:02
with other organizations,

00:56:48:02 – 00:56:50:15
the NAD, Gallaudet University,

00:56:50:15 – 00:56:53:18
other educational bodies.

00:56:53:18 – 00:56:55:10
What does that look like?

00:56:55:10 – 00:56:58:10
What does your partnership look like?

00:57:02:05 – 00:57:04:02
And I can answer that question.

00:57:04:02 – 00:57:08:15
And so I think that first of all,

00:57:09:03 – 00:57:09:24
the advisory

00:57:09:24 – 00:57:12:07
council is really

00:57:12:07 – 00:57:13:18
a diverse group.

00:57:13:18 – 00:57:16:14
We welcome people

00:57:16:14 – 00:57:18:15
from various organizations

00:57:18:15 – 00:57:21:06
to make sure that we’re reflecting

00:57:21:06 – 00:57:23:00
various people, like, for example,

00:57:23:00 – 00:57:24:22
from Gallaudet University,

00:57:24:22 – 00:57:28:16
people from NAOBI, people from the NAD.

00:57:29:00 – 00:57:32:00
We have participants from all over, and

00:57:32:06 – 00:57:33:18
I think that’s important.

00:57:33:18 – 00:57:36:01
But also we want to recognize

00:57:36:01 – 00:57:37:06
that we want to continue

00:57:37:06 – 00:57:40:22
to involve individuals like, for example,

00:57:40:22 – 00:57:45:10
management.

00:57:45:10 – 00:57:47:03
Omar That’s another one.

00:57:47:03 – 00:57:51:12
We want to have diverse representation

00:57:51:12 – 00:57:52:21
to be able to continue

00:57:52:21 – 00:57:54:14
to work with everyone,

00:57:54:14 – 00:57:56:22
because there are groups that

00:57:56:22 – 00:57:59:15
have not been included

00:57:59:15 – 00:58:01:05
in these types of processes in the past.

00:58:01:05 – 00:58:03:12
So we want to make sure that our research

00:58:03:12 – 00:58:05:06
and our study can continue

00:58:05:06 – 00:58:06:17
that kind of collaboration

00:58:06:17 – 00:58:08:06
that we’ve already established.

00:58:08:06 – 00:58:08:22
Like I said,

00:58:08:22 – 00:58:12:07
whether it be with Gallaudet University,

00:58:12:07 – 00:58:13:09
other organizations, I’m

00:58:13:09 – 00:58:14:04
not sure if anyone else

00:58:14:04 – 00:58:15:20
has something to add to that comment.

00:58:17:23 – 00:58:19:07
I’d like to add to the comment.

00:58:19:07 – 00:58:20:15
So we have individuals

00:58:20:15 – 00:58:23:15
with specific skills and backgrounds.

00:58:23:15 – 00:58:27:08
We have work where, you know, I myself

00:58:27:08 – 00:58:28:19
work at Gallaudet University.

00:58:28:19 – 00:58:32:01
I'm a researcher focused on bioethics

00:58:32:01 – 00:58:35:12
and I volunteer to participate

00:58:35:12 – 00:58:37:09
as an individual researcher.

00:58:37:09 – 00:58:38:15
I’m not representing

00:58:38:15 – 00:58:39:14
Gallaudet University,

00:58:39:14 – 00:58:40:13
but at the same time,

00:58:40:13 – 00:58:41:24
we do have discussions

00:58:41:24 – 00:58:44:24
with many people in Gallaudet University

00:58:45:09 – 00:58:47:03
throughout different areas,

00:58:47:03 – 00:58:48:23
recognizing the importance of this.

00:58:48:23 – 00:58:50:01
And sometimes we have a person

00:58:50:01 – 00:58:51:08
who’s willing to work

00:58:51:08 – 00:58:53:06
who is working for Gallaudet University,

00:58:53:06 – 00:58:54:10
but they’re not representing

00:58:54:10 – 00:58:55:24
the university itself

00:58:55:24 – 00:58:58:07
in their role in this research.

00:58:58:07 – 00:59:00:14
So in my opinion, my research

00:59:00:14 – 00:59:01:21
and what I’m looking at

00:59:01:21 – 00:59:03:19
is not representing the university

00:59:03:19 – 00:59:05:20
at large itself.

00:59:05:20 – 00:59:07:18
So I hope that clarifies that

00:59:07:18 – 00:59:08:14
to a degree.

00:59:11:14 – 00:59:13:19
This is Anne-Marie and

00:59:13:19 – 00:59:17:09
I think in terms of education

00:59:17:09 – 00:59:20:11
and nonprofit groups and organizations,

00:59:20:11 – 00:59:21:22
such as the NAD

00:59:21:22 – 00:59:23:00
and other organizations

00:59:23:00 – 00:59:24:14
that we’ve had involved,

00:59:24:14 – 00:59:26:15
these different signing

00:59:26:15 – 00:59:29:03
groups are working closely together,

00:59:29:03 – 00:59:32:03
different companies who have technology.

00:59:32:08 – 00:59:33:19
And I’d like to emphasize

00:59:33:19 – 00:59:36:22
that we are all working together

00:59:37:06 – 00:59:40:22
and that the thing is to have

00:59:41:03 – 00:59:42:14
a platform for discussion

00:59:42:14 – 00:59:44:00
for the deaf community.

00:59:44:00 – 00:59:45:08
And I know often

00:59:45:08 – 00:59:46:07
a lot of individuals

00:59:46:07 – 00:59:47:21
are overlooked in our community

00:59:47:21 – 00:59:48:15
or they're pushed

00:59:48:15 – 00:59:50:15
out of these discussions.

00:59:50:15 – 00:59:51:16
They don’t get the opportunity

00:59:51:16 – 00:59:53:12
to explain their perspective.

00:59:53:12 – 00:59:55:04
There are education issues.

00:59:55:04 – 00:59:59:20
And so, you know, I think with A.I.

01:00:00:00 – 01:00:02:06
here, we often see people say,

01:00:02:06 – 01:00:03:23
you know, AI is here.

01:00:03:23 – 01:00:05:13
What is that going to look like?

01:00:05:13 – 01:00:06:23
And some people say, no,

01:00:06:23 – 01:00:08:03
we don’t want to see this happen.

01:00:08:03 – 01:00:09:09
We don’t want A.I.

01:00:09:09 – 01:00:11:21
to be part of this interpreting process

01:00:11:21 – 01:00:14:11
even though we know it's coming.

01:00:14:11 – 01:00:16:11
And so I think it’s our responsibility

01:00:16:11 – 01:00:18:21
to ensure that this collaborative effort

01:00:18:21 – 01:00:20:17
stays in place

01:00:20:17 – 01:00:21:24
so that we have deaf community

01:00:21:24 – 01:00:23:18
representation

01:00:23:18 – 01:00:26:00
in the development of AI over time,

01:00:26:00 – 01:00:27:16
not just for education,

01:00:27:16 – 01:00:30:16
but also that these organizations all over America

01:00:30:21 – 01:00:32:04
have the responsibility

01:00:32:04 – 01:00:34:08
of ensuring that deaf individuals

01:00:34:08 – 01:00:35:07
are at the table

01:00:35:07 – 01:00:36:14
when these discussions happen.

01:00:40:00 – 01:00:42:16
Let’s move on to the next question.

01:00:42:16 – 01:00:42:23
Okay.

01:00:42:23 – 01:00:44:21
So for our next question,

01:00:44:21 – 01:00:47:14
what are we working on related

01:00:47:14 – 01:00:50:20
to the legislation

01:00:50:20 – 01:00:53:20
and for protection of the deaf community?

01:00:54:07 – 01:00:57:06
What does that look like?

01:00:57:06 – 01:00:58:18
I could take that.

01:00:58:18 – 01:01:01:19
So in terms of this,

01:01:02:22 – 01:01:04:11
I can probably clarify

01:01:04:11 – 01:01:05:13
a bit of the explanation

01:01:05:13 – 01:01:07:07
we’ve given already, but

01:01:07:07 – 01:01:08:24
when it comes to structure,

01:01:08:24 – 01:01:11:06
we do have our advisory group.

01:01:11:06 – 01:01:14:06
We also have SAFE AI.

01:01:14:06 – 01:01:16:00
It’s a task force.

01:01:16:00 – 01:01:18:11
And that task force represents

01:01:18:11 – 01:01:20:12
so many different languages

01:01:20:12 – 01:01:23:12
and interpreters, providers

01:01:23:21 – 01:01:25:19
and tech people

01:01:25:19 – 01:01:27:21
that are involved in the tech development

01:01:27:21 – 01:01:28:22
of it.

01:01:28:22 – 01:01:31:00
We also have consumers

01:01:31:00 – 01:01:32:21
who would be using the services,

01:01:32:21 – 01:01:33:20
so we have a broad

01:01:33:20 – 01:01:34:20
range of people involved

01:01:34:20 – 01:01:36:05
in the task force.

01:01:36:05 – 01:01:39:24
Now, in terms of the deaf perspective,

01:01:39:24 – 01:01:43:03
we wanted to make sure that

01:01:43:19 – 01:01:44:19
they got involved,

01:01:44:19 – 01:01:46:18
but it was not until a bit later

01:01:46:18 – 01:01:47:12
after the Task

01:01:47:12 – 01:01:49:00
Force was formed that they got involved.

01:01:49:00 – 01:01:49:14
And so we said,

01:01:49:14 – 01:01:50:08
you know, we don't want silos;

01:01:50:08 – 01:01:51:00
we want

01:01:51:00 – 01:01:52:08
all of this included.

01:01:52:08 – 01:01:55:08
So we established the advisory council

01:01:55:08 – 01:01:58:04
to ensure that AI in sign language

01:01:58:04 – 01:01:59:23
interpreting was represented

01:01:59:23 – 01:02:01:06
from the deaf perspective.

01:02:01:06 – 01:02:02:08
And of course

01:02:02:08 – 01:02:03:18
there’s diverse organizations,

01:02:03:18 – 01:02:05:15
the academic perspective,

01:02:05:15 – 01:02:07:10
designers, developers, all of that.

01:02:07:10 – 01:02:10:18
They were involved in the group

01:02:11:02 – 01:02:11:11
so that

01:02:11:11 – 01:02:14:18
they could give their advice to SAFE AI.

01:02:15:06 – 01:02:17:03
Now SAFE

01:02:17:03 – 01:02:18:02
AI has been working

01:02:18:02 – 01:02:19:08
with the advisory group,

01:02:19:08 – 01:02:20:22
and the goal is to continue

01:02:20:22 – 01:02:24:19
to develop the policy and law

01:02:24:24 – 01:02:27:21
suggestions that we have for the group

01:02:27:21 – 01:02:31:19
so that our parts and our recommendations

01:02:31:19 – 01:02:32:12
are included

01:02:32:12 – 01:02:33:20
in their reporting

01:02:33:20 – 01:02:36:08
and in their data collection and surveys.

01:02:36:08 – 01:02:37:07
Right now

01:02:37:07 – 01:02:38:18
there are ten different languages

01:02:38:18 – 01:02:39:23
that are being looked at.

01:02:39:23 – 01:02:41:06
And so in terms of American

01:02:41:06 – 01:02:44:15
Sign Language, it was noted

01:02:44:15 – 01:02:46:04
that there

01:02:46:04 – 01:02:47:23
needed to be another opportunity

01:02:47:23 – 01:02:48:14
to collaborate

01:02:48:14 – 01:02:50:02
more with the deaf community

01:02:50:02 – 01:02:51:05
to send out surveys

01:02:51:05 – 01:02:52:10
in American Sign Language.

01:02:52:10 – 01:02:53:12
So that’s where we are right now.

01:02:53:12 – 01:02:54:13
In the process.

01:02:54:13 – 01:02:56:17
We’re hoping that our dialog

01:02:56:17 – 01:02:57:17
and our discussion

01:02:57:17 – 01:03:00:04
and our collection of information

01:03:00:04 – 01:03:03:17
will become a great contributor

01:03:03:17 – 01:03:05:08
to the bigger picture.

01:03:05:08 – 01:03:06:23
Does anyone have anything to add?

01:03:06:23 – 01:03:08:12
I just wanted to clarify a comment

01:03:08:12 – 01:03:09:14
that I had made earlier.

01:03:15:22 – 01:03:16:19
Okay.

01:03:16:19 – 01:03:19:19
Moving on to the next question

01:03:20:02 – 01:03:23:03
for this report and research.

01:03:23:23 – 01:03:26:06
We are focusing on ASL.

01:03:26:06 – 01:03:28:10
What about other signed languages

01:03:28:10 – 01:03:29:15
in other countries?

01:03:29:15 – 01:03:32:03
Will we be looking at others

01:03:32:03 – 01:03:35:03
in the future?

01:03:36:13 – 01:03:37:22
I can take that question.

01:03:37:22 – 01:03:39:12
It would be great.

01:03:39:12 – 01:03:40:16
It’s a dream

01:03:40:16 – 01:03:43:10
for ASL not to be the only language

01:03:43:10 – 01:03:44:06
that we look at, right?

01:03:44:06 – 01:03:46:04
We want to consider all of this,

01:03:46:04 – 01:03:48:05
but in terms of the task force

01:03:48:05 – 01:03:51:08
and the advisory council,

01:03:51:12 – 01:03:53:01
we’re working with different

01:03:53:01 – 01:03:54:19
American organizations

01:03:54:19 – 01:03:57:20
and we have been mostly focusing on ASL,

01:03:58:01 – 01:04:01:01
but we are seeing more effort right now

01:04:01:01 – 01:04:02:17
in other parts of the world

01:04:02:17 – 01:04:04:05
where they are focusing

01:04:04:05 – 01:04:06:09
on the automatic interpreting

01:04:06:09 – 01:04:08:01
for other languages.

01:04:08:01 – 01:04:11:21
And so I am aware of the fact

01:04:11:21 – 01:04:14:15
that there are some places in Europe

01:04:14:15 – 01:04:15:23
that are focusing on these things.

01:04:15:23 – 01:04:17:09
I don’t have the specific names.

01:04:17:09 – 01:04:19:19
They don't come to me.

01:04:19:19 – 01:04:20:21
These names are not coming to me

01:04:20:21 – 01:04:21:06
right now.

01:04:21:06 – 01:04:25:06
But yeah, and Tim can talk more about

01:04:26:05 – 01:04:26:12
one of

01:04:26:12 – 01:04:29:12
those organizations.

01:04:31:16 – 01:04:32:16
We do have people involved

01:04:32:16 – 01:04:33:23
in Europe and Canada.

01:04:33:23 – 01:04:35:10
I know that there are many more

01:04:35:10 – 01:04:36:09
all over the world

01:04:36:09 – 01:04:37:09
who are also looking

01:04:37:09 – 01:04:39:05
at the same technology,

01:04:39:05 – 01:04:42:20
but because right now, the SAFE AI

01:04:42:21 – 01:04:45:01
Task Force is focusing on

01:04:45:01 – 01:04:47:04
American policy legislation

01:04:47:04 – 01:04:50:12
and all of that, the current focus is

01:04:50:12 – 01:04:52:20
specifically what’s happening in America

01:04:52:20 – 01:04:53:21
and North America.

01:04:53:21 – 01:04:55:01
But at the same time,

01:04:55:01 – 01:04:58:13
we could have an impact on Canada,

01:04:58:22 – 01:04:59:18
and some of our research

01:04:59:18 – 01:05:01:11
could impact Europe as well.

01:05:01:11 – 01:05:03:17
So I think, as this process continues,

01:05:03:17 – 01:05:06:17
we will probably

01:05:07:02 – 01:05:09:01
continue to move forward

01:05:09:01 – 01:05:12:01
and see more replication

01:05:12:01 – 01:05:15:01
of our studies or expansion of our focus.

01:05:15:06 – 01:05:17:05
Looking at it on a more global scale

01:05:17:05 – 01:05:19:08
as we move forward.

01:05:19:08 – 01:05:21:24
Theresa Here I’d like to also add as well

01:05:21:24 – 01:05:23:01
that I think it’s imperative

01:05:23:01 – 01:05:24:12
that we recognize people

01:05:24:12 – 01:05:26:17
using sign language here in the U.S.

01:05:26:17 – 01:05:29:02
are not only using ASL.

01:05:29:02 – 01:05:30:19
So it’s important for us to know

01:05:30:19 – 01:05:32:23
that there are many foreign sign

01:05:32:23 – 01:05:35:16
languages used here in the US

01:05:35:16 – 01:05:38:08
and we've had feedback from users of sign

01:05:38:08 – 01:05:39:09
language in the U.S.,

01:05:39:09 – 01:05:41:15
not just American Sign language.

01:05:41:15 – 01:05:45:06
We will be reaching out to

01:05:45:07 – 01:05:46:11
other sign languages

01:05:46:11 – 01:05:48:15
and pulling in that data and information

01:05:48:15 – 01:05:50:11
and their experiences as well.

01:05:50:11 – 01:05:51:08
And it’s important

01:05:51:08 – 01:05:54:14
for us to include everyone at the table.

01:05:58:22 – 01:05:59:12
Okay.

01:05:59:12 – 01:06:01:01
Next question.

01:06:01:01 – 01:06:04:01
So this question is related to readiness.

01:06:05:21 – 01:06:07:20
So in terms

01:06:07:20 – 01:06:10:04
of these technological companies

01:06:10:04 – 01:06:11:12
in development

01:06:11:12 – 01:06:13:16
and a lot of these people

01:06:13:16 – 01:06:14:06
who are working

01:06:14:06 – 01:06:14:24
in this

01:06:14:24 – 01:06:16:10
arena and making these decisions

01:06:16:10 – 01:06:18:08
don’t always include deaf people.

01:06:18:08 – 01:06:21:04
So what approach do we think is needed?

01:06:21:04 – 01:06:23:08
Or how can we make sure

01:06:23:08 – 01:06:24:17
that deaf individuals

01:06:24:17 – 01:06:25:07
are involved

01:06:25:07 – 01:06:26:09
in these conversations

01:06:26:09 – 01:06:27:02
and that they’re always

01:06:27:02 – 01:06:31:07
a part of the process for automated interpreting,

01:06:32:06 – 01:06:35:06
machine learning, and everything else?

01:06:37:02 – 01:06:38:12
Emery Here, I’ll take that.

01:06:38:12 – 01:06:39:01
I’d be happy

01:06:39:01 – 01:06:40:18
to make a comment about that.

01:06:40:18 – 01:06:42:18
So with regards to

01:06:42:18 – 01:06:44:13
what we’ve been speaking about,

01:06:44:13 – 01:06:45:15
it’s important

01:06:45:15 – 01:06:49:06
for all of the impacted organizations

01:06:49:10 – 01:06:51:09
that are serving the community,

01:06:51:09 – 01:06:55:01
educational, corporate America,

01:06:55:01 – 01:06:55:24
different areas.

01:06:55:24 – 01:06:57:10
We know that, as part

01:06:57:10 – 01:06:59:18
of our screening process,

01:06:59:18 – 01:07:01:21
people want to sponsor our organization.

01:07:01:21 – 01:07:04:07
They want to look into this.

01:07:04:07 – 01:07:05:21
They want to look at the screening:

01:07:05:21 – 01:07:10:00
is disability access included?

01:07:10:00 – 01:07:11:01
Is that supported?

01:07:11:01 – 01:07:12:10
Are you hiring deaf

01:07:12:10 – 01:07:14:13
and hard of hearing employees?

01:07:14:13 – 01:07:15:14
There are so many things

01:07:15:14 – 01:07:17:15
to look at and discuss with them.

01:07:17:15 – 01:07:19:10
It’s very creative

01:07:19:10 – 01:07:21:12
in the approach to HIPAA.

01:07:21:12 – 01:07:23:17
And so it's a very hot topic.

01:07:25:01 – 01:07:27:16
It's not only

01:07:27:16 – 01:07:30:05
the community and organizations at large,

01:07:30:05 – 01:07:31:14
but also the individual.

01:07:31:14 – 01:07:32:17
Again,

01:07:32:17 – 01:07:33:21
what it looks like to us

01:07:33:21 – 01:07:35:13
is we have to really partner

01:07:35:13 – 01:07:38:19
and really push this.

01:07:38:19 – 01:07:41:19
It's important

01:07:42:05 – 01:07:43:08
for all of them,

01:07:43:08 – 01:07:45:20
because they're impacted

01:07:45:20 – 01:07:47:18
by the development and design of this.

01:07:47:18 – 01:07:48:05
It’s important

01:07:48:05 – 01:07:49:24
for all people to be included,

01:07:49:24 – 01:07:52:01
not just hire them for, you know,

01:07:52:01 – 01:07:53:10
a consulting position,

01:07:53:10 – 01:07:55:22
or temporary feedback on this,

01:07:55:22 – 01:07:58:00
but have them involved in the development

01:07:58:00 – 01:08:00:20
and design of this. It’s imperative.

01:08:00:20 – 01:08:03:20
It’s so important.

01:08:06:17 – 01:08:08:10
Okay.

01:08:08:10 – 01:08:10:00
For the next question,

01:08:10:00 – 01:08:12:22
automatic speech recognition,

01:08:12:22 – 01:08:16:19
ASR, in comparison to automated interpreting

01:08:16:24 – 01:08:20:08
AI: how comfortable are you

01:08:20:08 – 01:08:22:15
and how much can you trust these

01:08:22:15 – 01:08:25:15
two? And can you compare the two?

01:08:25:22 – 01:08:28:02
Jeff Here I’d like to take that.

01:08:28:02 – 01:08:31:07
You know, it’s like comparing apples to

01:08:31:07 – 01:08:32:24
oranges, both of them are fruit,

01:08:32:24 – 01:08:34:15
but they are a little bit different.

01:08:34:15 – 01:08:37:22
So ASR is a very specific piece

01:08:37:22 – 01:08:39:07
of technology,

01:08:39:07 – 01:08:42:05
and it focuses on converting language from one

01:08:42:05 – 01:08:45:14
form to another:

01:08:45:14 – 01:08:46:17
so,

01:08:48:00 – 01:08:48:19
spoken

01:08:48:19 – 01:08:51:20
speech into written form.

01:08:52:02 – 01:08:54:24
Automated interpreting AI focuses on interpreting,

01:08:54:24 – 01:08:57:20
and that is automatic, on-the-spot,

01:08:57:20 – 01:09:00:17
in-the-moment sign production.

01:09:00:17 – 01:09:03:03
So that means we have information

01:09:03:03 – 01:09:07:07
feeding into the process

01:09:07:14 – 01:09:10:13
and speech recognition is just picking up

01:09:10:13 – 01:09:12:08
on the speech itself.

01:09:12:08 – 01:09:13:14
The context is different,

01:09:13:14 – 01:09:15:18
the information processing is different,

01:09:15:18 – 01:09:17:15
and the closest equivalent

01:09:17:15 – 01:09:20:14
to speech recognition for ASL

01:09:20:14 – 01:09:23:16
would be sign recognition,

01:09:24:01 – 01:09:27:14
and that technology is part of

01:09:27:14 – 01:09:28:14
automated interpreting.

01:09:28:14 – 01:09:29:19
But automated interpreting

01:09:29:19 – 01:09:31:03
includes a variety of

01:09:31:03 – 01:09:32:05
other technologies

01:09:32:05 – 01:09:35:05
to help those components work together.

01:09:35:10 – 01:09:37:22
It picks up on the subject,

01:09:37:22 – 01:09:40:22
processing body language,

01:09:41:05 – 01:09:44:01
situational context clues,

01:09:44:01 – 01:09:45:23
and

01:09:45:23 – 01:09:48:23
pragmatics,

01:09:49:04 – 01:09:50:13
concepts that go

01:09:50:13 – 01:09:53:13
beyond transcription.
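
The apples-to-oranges contrast above can be sketched schematically: ASR is essentially one conversion step, while automated sign language interpreting has to coordinate several components. In the Python outline below, every stage is a named placeholder standing in for a model; the stage names are assumptions drawn from this discussion, not the architecture of any actual product.

# Schematic contrast: ASR (speech to text) versus automated interpreting.
# Every function here is a placeholder, not a real model.

def asr(audio: bytes) -> str:
    """ASR converts language from one form to another: speech to written text."""
    return transcribe(audio)  # a single core task

def automated_interpreting(video_frames: list) -> str:
    """Interpreting is on-the-spot production that needs more than recognition."""
    signs = recognize_signs(video_frames)          # rough analogue of ASR
    face = read_facial_grammar(video_frames)       # the face carries grammar and tone
    context = infer_situational_context(video_frames)
    meaning = combine(signs, face, context)        # pragmatics, not just transcription
    return render_target_language(meaning)

# Stub implementations so the sketch runs end to end.
def transcribe(audio): return "<text>"
def recognize_signs(frames): return ["<sign>"]
def read_facial_grammar(frames): return "<nonmanual markers>"
def infer_situational_context(frames): return "<context>"
def combine(signs, face, context): return (signs, face, context)
def render_target_language(meaning): return "<interpretation>"

if __name__ == "__main__":
    print(asr(b""))                    # one conversion step
    print(automated_interpreting([]))  # several coordinated steps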

01:09:53:24 – 01:09:55:05
And just to add to that,

01:09:55:05 – 01:09:56:07
this is Tim here.

01:09:56:07 – 01:10:00:15
And in terms of speech recognition,

01:10:01:16 – 01:10:02:05
that is

01:10:02:05 – 01:10:05:05
something where we already see

01:10:05:15 – 01:10:06:19
much more development,

01:10:06:19 – 01:10:08:11
because there have been years

01:10:08:11 – 01:10:09:05
of investment

01:10:09:05 – 01:10:11:14
in the development of that technology

01:10:11:14 – 01:10:12:19
while sign language

01:10:12:19 – 01:10:14:24
recognition is behind.

01:10:14:24 – 01:10:18:07
And so that becomes an issue of equity.

01:10:18:14 – 01:10:20:01
And there’s a concern there

01:10:20:01 – 01:10:21:17
that with speech recognition,

01:10:21:17 – 01:10:23:20
because of the time and the investment

01:10:23:20 – 01:10:24:22
that’s already there,

01:10:24:22 – 01:10:26:09
with spoken language,

01:10:26:09 – 01:10:28:15
hearing people continue to benefit from it,

01:10:28:15 – 01:10:29:10
while deaf

01:10:29:10 – 01:10:31:18
individuals cannot have the same access

01:10:31:18 – 01:10:33:09
or the same benefit.

01:10:33:09 – 01:10:35:00
We go back to, for example,

01:10:35:00 – 01:10:37:00
the invention of the telephone.

01:10:37:00 – 01:10:38:04
Of course, hearing

01:10:38:04 – 01:10:39:17
people were able to use the phone

01:10:39:17 – 01:10:42:02
for many years and enjoy that technology.

01:10:42:02 – 01:10:43:06
It wasn't until about 80 years later that

01:10:43:06 – 01:10:44:21
we finally got the videophone

01:10:44:21 – 01:10:46:16
and these other forms of technology

01:10:46:16 – 01:10:46:24
where deaf

01:10:46:24 – 01:10:48:05
people could benefit

01:10:48:05 – 01:10:50:18
from the same kind of experience.

01:10:50:18 – 01:10:53:06
So we always have to look at

01:10:53:06 – 01:10:54:08
these situations

01:10:54:08 – 01:10:56:24
and make sure that there is funding

01:10:56:24 – 01:11:00:12
and that research is being done to try to

01:11:00:19 – 01:11:04:08
ensure that sign language is caught up

01:11:04:12 – 01:11:07:08
to what the hearing community is able

01:11:07:08 – 01:11:07:22
to enjoy.

01:11:11:08 – 01:11:12:23
Okay. The next question is

01:11:12:23 – 01:11:14:06
regarding people

01:11:14:06 – 01:11:18:01
with intellectual disabilities, autism,

01:11:18:17 – 01:11:21:06
learning disabilities, language

01:11:21:06 – 01:11:24:06
fluency

01:11:24:08 – 01:11:25:07
and the like.

01:11:25:07 – 01:11:28:18
How does this relate to automated interpreting

01:11:29:01 – 01:11:32:14
AI? Will their abilities and disabilities

01:11:32:14 – 01:11:35:14
be included in this process?

01:11:36:23 – 01:11:38:24
I can take that. Yeah.

01:11:38:24 – 01:11:41:09
Thank you. This is a great question.

01:11:41:09 – 01:11:42:13
Similar

01:11:42:13 – 01:11:44:12
to what we were talking about earlier.

01:11:44:12 – 01:11:47:22
When it comes to, for example,

01:11:48:16 – 01:11:49:24
the FCC,

01:11:49:24 – 01:11:52:13
the Federal Communications Commission,

01:11:52:13 – 01:11:55:13
and talking about language itself

01:11:55:23 – 01:11:58:23
and speech recognition,

01:12:00:21 – 01:12:03:05
there’s understanding

01:12:03:05 – 01:12:05:05
of people

01:12:05:05 – 01:12:06:23
that have different backgrounds.

01:12:06:23 – 01:12:09:24
The same thing happens with this AI

01:12:09:24 – 01:12:11:02
and the approach.

01:12:11:02 – 01:12:12:20
We want to make sure that we resolve

01:12:12:20 – 01:12:13:22
some of these issues.

01:12:13:22 – 01:12:14:24
We need to be able

01:12:14:24 – 01:12:16:17
to develop something for them

01:12:16:17 – 01:12:19:20
to make sure that it’s successful.

01:12:20:02 – 01:12:23:02
And so it’s still to be seen.

01:12:23:10 – 01:12:24:20
We are not able

01:12:24:20 – 01:12:26:04
to answer that at this time

01:12:26:04 – 01:12:28:04
unless there’s anyone else here involved.

01:12:28:04 – 01:12:30:02
But from what I understand,

01:12:30:02 – 01:12:31:01
I think it’s still a hot

01:12:31:01 – 01:12:32:10
topic of discussion

01:12:32:10 – 01:12:33:12
and it’s something

01:12:33:12 – 01:12:34:17
that the community of people

01:12:34:17 – 01:12:36:13
is still talking about.

01:12:36:13 – 01:12:38:18
But yeah, great question

01:12:38:18 – 01:12:41:12
And yes, Jeff had mentioned the

01:12:41:12 – 01:12:44:18
data. You need good data and

01:12:45:23 – 01:12:47:00
good models.

01:12:47:00 – 01:12:50:08
So with that design process,

01:12:50:14 – 01:12:51:17
we have to prepare

01:12:51:17 – 01:12:53:09
for a variety of deaf members

01:12:53:09 – 01:12:54:15
in the community.

01:12:54:15 – 01:12:55:00
You know,

01:12:55:00 – 01:12:55:16
in the beginning

01:12:55:16 – 01:12:57:00
we’ve got to collect data

01:12:57:00 – 01:12:58:23
from the community at large

01:12:58:23 – 01:13:01:23
and make sure that it is

01:13:02:09 – 01:13:05:00
appropriate and, moving forward,

01:13:05:00 – 01:13:05:16
that it’s going to be

01:13:05:16 – 01:13:07:10
beneficial to the community.

01:13:07:10 – 01:13:08:18
Now, if we don’t include

01:13:08:18 – 01:13:09:20
those in the beginning,

01:13:09:20 – 01:13:12:10
that could be a problem with our models.

01:13:12:10 – 01:13:15:10
They won’t be prepared for that.

01:13:17:10 – 01:13:18:19
This is Theresa,

01:13:18:19 – 01:13:21:17
and I’d like to add a specific example

01:13:21:17 – 01:13:23:07
in terms of writing English.

01:13:23:07 – 01:13:25:01
So we see that, I know,

01:13:25:01 – 01:13:26:10
most people are probably familiar

01:13:26:10 – 01:13:29:10
with ChatGPT.

01:13:29:24 – 01:13:32:12
Currently you're able to ask

01:13:32:12 – 01:13:36:20
ChatGPT to develop

01:13:37:00 – 01:13:40:16
a draft in, for example,

01:13:40:16 – 01:13:44:24
plain language, meaning the concept

01:13:44:24 – 01:13:46:19
of using it for people

01:13:46:19 – 01:13:49:16
with intellectual disabilities. So

01:13:51:05 – 01:13:52:07
that reflects the

01:13:52:07 – 01:13:53:23
importance of the inclusion

01:13:53:23 – 01:13:56:08
of people in design.

01:13:56:08 – 01:13:57:17
And also

01:13:57:17 – 01:14:00:05
when we are having these discussions

01:14:00:05 – 01:14:03:01
about how to design technology,

01:14:03:01 – 01:14:04:13
we have to ask the question

01:14:04:13 – 01:14:06:06
who’s going to be involved?

01:14:06:06 – 01:14:08:00
Because sometimes the people who are

01:14:08:00 – 01:14:10:01
there are not the people

01:14:10:01 – 01:14:11:01
that need to be involved.

01:14:11:01 – 01:14:12:05
So we want to make sure

01:14:12:05 – 01:14:13:15
that we are recognizing

01:14:13:15 – 01:14:14:10
and not forgetting

01:14:14:10 – 01:14:15:16
about these individuals

01:14:15:16 – 01:14:18:16
and these various communities.

01:14:19:08 – 01:14:20:21
I mean, one thing

01:14:20:21 – 01:14:22:03
that I think about is,

01:14:22:03 – 01:14:23:23
for example, a CDI.

01:14:23:23 – 01:14:24:15
Often

01:14:24:15 – 01:14:25:23
we have seen the use

01:14:25:23 – 01:14:27:21
of a certified deaf interpreter

01:14:27:21 – 01:14:30:05
who comes into the situation

01:14:30:05 – 01:14:32:05
and we look at how that changes

01:14:32:05 – 01:14:34:01
and improves communication.

01:14:34:01 – 01:14:35:09
The experience for the deaf

01:14:35:09 – 01:14:37:04
individual is improved,

01:14:37:04 – 01:14:39:02
and I think we can see that benefit

01:14:39:02 – 01:14:40:02
and understand how it would

01:14:40:02 – 01:14:42:10
apply as well to automated interpreting.

01:14:47:02 – 01:14:47:12
Okay.

01:14:47:12 – 01:14:51:03
Next question is related to

01:14:53:01 – 01:14:55:03
there are three different webinars

01:14:55:03 – 01:14:58:03
that you guys hosted

01:14:58:15 – 01:15:01:08
and during those webinars,

01:15:01:08 – 01:15:04:08
did you have a group,

01:15:04:13 – 01:15:07:05
or did you have people

01:15:07:05 – 01:15:10:00
who were not using sign language,

01:15:10:00 – 01:15:11:24
involved in the group, people

01:15:11:24 – 01:15:13:19
who do not use sign language

01:15:13:19 – 01:15:18:00
at all to communicate, to discuss automated interpreting?

01:15:23:22 – 01:15:26:12
and this is Emery here.

01:15:26:12 – 01:15:28:09
Another good question

01:15:28:09 – 01:15:32:00
and yeah, we had asked them

01:15:32:12 – 01:15:34:04
to specifically identify

01:15:34:04 – 01:15:37:14
their level of sign skill and

01:15:38:01 – 01:15:39:06
I know that

01:15:39:06 – 01:15:40:19
the panel was talking about that.

01:15:40:19 – 01:15:43:23
I can't remember exactly how they said it,

01:15:43:23 – 01:15:44:16
but it was

01:15:44:16 – 01:15:45:08
the question

01:15:45:08 – 01:15:47:08
was about their level of signing.

01:15:47:08 – 01:15:50:14
And so it was an open discussion

01:15:50:14 – 01:15:51:21
and we told everyone that

01:15:51:21 – 01:15:53:16
it was going to be in sign language.

01:15:53:16 – 01:15:55:09
But the percentage of people

01:15:55:09 – 01:15:57:07
who were not fluent in

01:15:57:07 – 01:15:59:10
sign, I’m not totally sure.

01:15:59:10 – 01:16:01:22
I don’t know if anyone has a different

01:16:01:22 – 01:16:03:08
recall, something different.

01:16:05:09 – 01:16:05:19
I don’t

01:16:05:19 – 01:16:06:19
think that we had

01:16:06:19 – 01:16:08:19
collected enough data about that.

01:16:08:19 – 01:16:11:15
I think it was hard for us to evaluate,

01:16:11:15 – 01:16:12:14
let alone

01:16:12:14 – 01:16:14:08
evaluate our own data

01:16:14:08 – 01:16:15:17
that we had collected.

01:16:15:17 – 01:16:18:17
So

01:16:21:08 – 01:16:24:08
And in terms of the captioning,

01:16:24:08 – 01:16:26:17
English-based captions,

01:16:26:17 – 01:16:28:16
we were not focused on that.

01:16:28:16 – 01:16:29:05
We were more

01:16:29:05 – 01:16:30:19
focused on the sign language

01:16:30:19 – 01:16:32:10
interpreting aspect.

01:16:32:10 – 01:16:36:16
So we do already have a lot of discussion

01:16:36:21 – 01:16:38:00
happening

01:16:38:00 – 01:16:40:22
around the text-based aspect of this,

01:16:40:22 – 01:16:43:13
but sign language, there are some gaps,

01:16:43:13 – 01:16:46:13
so we want to focus on that specifically.

01:16:51:00 – 01:16:54:00
Okay,

01:16:55:18 – 01:16:56:17
Please hold one moment

01:16:56:17 – 01:16:59:17
while I go through these questions.

01:17:04:07 – 01:17:04:16
Okay.

01:17:04:16 – 01:17:08:00
This question is regarding

01:17:08:00 – 01:17:11:00
the development of the curriculum

01:17:11:07 – 01:17:13:19
at the university level regarding

01:17:13:19 – 01:17:16:19
different interpreting processes

01:17:19:10 – 01:17:22:04
and

01:17:22:04 – 01:17:24:03
for VRS,

01:17:24:03 – 01:17:26:24
VRI, and the like.

01:17:26:24 – 01:17:29:21
Now, are we going to be adding AI,

01:17:31:02 – 01:17:34:01
and is there any feedback

01:17:34:01 – 01:17:36:00
that you could share regarding

01:17:36:00 – 01:17:39:12
curriculum development on this subject?

01:17:47:08 – 01:17:48:17
Okay, that’s a good question.

01:17:48:17 – 01:17:49:07
Again,

01:17:49:07 – 01:17:53:09
I think maybe for a future discussion,

01:17:53:09 – 01:17:54:24
it might be more beneficial.

01:17:54:24 – 01:17:55:14
We haven’t

01:17:55:14 – 01:17:57:08
necessarily gotten to that point yet.

01:17:57:08 – 01:17:59:00
We’ll have to do some training

01:17:59:00 – 01:17:59:19
with the curriculum

01:17:59:19 – 01:18:01:10
and workshops and all of that.

01:18:01:10 – 01:18:02:03
But again,

01:18:02:03 – 01:18:02:21
I would say

01:18:02:21 – 01:18:06:12
that it could come up at a symposium

01:18:06:22 – 01:18:09:17
or maybe another opportunity

01:18:09:17 – 01:18:13:20
for discussion could become available

01:18:13:20 – 01:18:16:20
in the future.

01:18:21:17 – 01:18:23:24
We haven’t discussed this as of yet,

01:18:23:24 – 01:18:27:05
but we are considering the

01:18:27:05 – 01:18:30:05
ethical foundations of this.

01:18:30:10 – 01:18:33:04
So for interpreting,

01:18:33:04 – 01:18:36:16
we do have the ethical foundation

01:18:36:23 – 01:18:38:00
for the Deaf community.

01:18:38:00 – 01:18:42:02
We do have our ethical expectations

01:18:42:20 – 01:18:44:12
and norms

01:18:44:12 – 01:18:46:05
as well as other aspects of that.

01:18:46:05 – 01:18:48:01
But when we enter into this

01:18:48:01 – 01:18:48:24
with some knowledge,

01:18:48:24 – 01:18:52:13
now we have new technology

01:18:52:18 – 01:18:54:07
and what kind of new questions

01:18:54:07 – 01:18:56:05
are going to arise from this?

01:18:56:05 – 01:18:57:17
And that’s part of what we’re hoping

01:18:57:17 – 01:18:59:24
to have some discussions regarding

01:18:59:24 – 01:19:03:01
that topic in the next symposium

01:19:03:01 – 01:19:03:17
next month.

01:19:07:14 – 01:19:08:04
Wow, I forgot.

01:19:08:04 – 01:19:09:18
It’s March already.

01:19:09:18 – 01:19:12:18
It is.

01:19:14:16 – 01:19:15:00
Okay.

01:19:15:00 – 01:19:16:08
Next question.

01:19:16:08 – 01:19:19:01
One person had a comment

01:19:19:01 – 01:19:20:13
and said, a lot is happening

01:19:20:13 – 01:19:20:23
right now

01:19:20:23 – 01:19:24:11
in Europe as it relates to automated interpreting.

01:19:24:18 – 01:19:28:04
Are you familiar with that, and also how

01:19:28:04 – 01:19:32:16
that relates to the GDPR?

01:19:33:14 – 01:19:36:11
And so that's

01:19:36:11 – 01:19:39:17
the General Data Protection

01:19:41:01 – 01:19:44:06
Regulation, the EU law.

01:19:44:15 – 01:19:47:20
It’s a very strict

01:19:48:00 – 01:19:51:00
law on privacy and protection.

01:19:51:09 – 01:19:52:08
So can you guys

01:19:52:08 – 01:19:55:08
discuss and touch a bit on that topic?

01:19:56:11 – 01:19:57:12
I would like to do that.

01:19:57:12 – 01:19:59:14
Thank you for bringing that up.

01:19:59:14 – 01:20:04:16
In general, the GDPR has been

01:20:04:18 – 01:20:07:10
one of the best

01:20:07:10 – 01:20:09:17
data privacy laws.

01:20:09:17 – 01:20:13:07
Hats off to the EU

01:20:13:07 – 01:20:14:10
for developing that.

01:20:14:10 – 01:20:15:09
It’s been wonderful

01:20:15:09 – 01:20:17:17
so I think one of the biggest concepts

01:20:17:17 – 01:20:19:02
or take away from this

01:20:19:02 – 01:20:22:02
is the topic of the right

01:20:22:05 – 01:20:24:02
to be forgotten.

01:20:24:02 – 01:20:25:02
Meaning

01:20:25:02 – 01:20:27:06
we can go to them and say, hey,

01:20:27:06 – 01:20:29:17
I want you to remove my information

01:20:29:17 – 01:20:32:17
and they have to honor that request.

01:20:32:18 – 01:20:35:00
And that is one of the biggest takeaways

01:20:35:00 – 01:20:37:23
for this foundational concept.
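
Mechanically, the right to be forgotten obliges a data holder to delete a subject's personal data on request and to be able to show that it did so. Here is a minimal Python sketch of that flow; the in-memory store and function names are invented stand-ins for a real database, since the GDPR specifies the obligation, not the implementation.

# Minimal sketch of honoring an erasure ("right to be forgotten") request.
# The store and names below are illustrative assumptions only.

data_store: dict[str, dict] = {
    "user-42": {"videos": ["clip1.mp4"], "profile": {"name": "..."}},
}
erasure_log: list[str] = []  # an auditable trace that requests were honored

def handle_erasure_request(subject_id: str) -> bool:
    """Remove the subject's personal data and record that we did so."""
    if subject_id not in data_store:
        return False
    del data_store[subject_id]      # the request must be honored, not optional
    erasure_log.append(subject_id)  # keep evidence of the deletion
    return True

if __name__ == "__main__":
    assert handle_erasure_request("user-42")
    assert "user-42" not in data_store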

01:20:37:23 – 01:20:40:13
And so another thing to look at as

01:20:40:13 – 01:20:43:08
well is minors and

01:20:43:08 – 01:20:46:08
the age of consent for data collection.

01:20:46:10 – 01:20:48:21
Parents have to approve that or

01:20:50:03 – 01:20:51:15
revoke approval.

01:20:51:15 – 01:20:52:14
And so really,

01:20:52:14 – 01:20:54:17
in some of these aspects, the EU

01:20:54:17 – 01:20:55:20
is ahead of us,

01:20:55:20 – 01:20:59:12
and, you know, American legislation

01:20:59:12 – 01:21:01:22
does have some protections,

01:21:01:22 – 01:21:03:15
but it’s not as focused.

01:21:03:15 – 01:21:06:07
It's mostly focused on children under age,

01:21:06:07 – 01:21:09:07
I believe, 13 or 14,

01:21:10:03 – 01:21:11:15
but the EU is ahead of us

01:21:11:15 – 01:21:13:07
in this respect.

01:21:13:07 – 01:21:16:01
With the legislation there,

01:21:16:01 – 01:21:17:09
there's a lot of harm

01:21:17:09 – 01:21:19:10
that can be done with this data.

01:21:19:10 – 01:21:20:21
And so that's something that really needs

01:21:20:21 – 01:21:22:09
to be fleshed out. Would anyone else

01:21:22:09 – 01:21:25:09
like to add to that?

01:21:30:02 – 01:21:31:05
I could go on

01:21:31:05 – 01:21:32:23
for ages about this,

01:21:32:23 – 01:21:34:14
but it's been the biggest thing

01:21:34:14 – 01:21:35:15
since

01:21:35:15 – 01:21:39:09
the regulation was established:

01:21:39:09 – 01:21:44:01
data collection and privacy,

01:21:44:01 – 01:21:47:20
and who is responsible for making sure

01:21:47:20 – 01:21:50:20
that the data is retained safely,

01:21:51:02 – 01:21:53:12
that it’s not leaked?

01:21:53:12 – 01:21:56:16
And if that data is leaked,

01:21:56:18 – 01:21:59:17
how are they going to inform individuals

01:21:59:17 – 01:22:02:13
whose data has been breached

01:22:02:13 – 01:22:04:01
that this has occurred.

01:22:04:01 – 01:22:05:21
So that’s part of the process

01:22:05:21 – 01:22:07:14
that should be included

01:22:07:14 – 01:22:09:07
in the transparency

01:22:09:07 – 01:22:12:00
and, you know, maintaining that contact

01:22:12:00 – 01:22:15:11
with individuals on the subject.

01:22:16:16 – 01:22:19:16
Thank you.

01:22:21:17 – 01:22:22:04
Okay.

01:22:22:04 – 01:22:25:04
We still have more questions.

01:22:26:02 – 01:22:27:17
This question is related

01:22:27:17 – 01:22:30:17
to machine learning, ML.

01:22:30:20 – 01:22:32:24
So with sign language

01:22:32:24 – 01:22:35:24
recognition for ASL:

01:22:36:19 – 01:22:38:15
who will be training

01:22:38:15 – 01:22:41:15
and teaching the language data.

01:22:41:22 – 01:22:43:17
Where is it going to come from?

01:22:43:17 – 01:22:46:05
From interpreters, from people?

01:22:46:05 – 01:22:49:05
Where will that come from?

01:22:51:01 – 01:22:52:14
So that is an excellent question.

01:22:52:14 – 01:22:53:03
Again,

01:22:53:03 – 01:22:54:07
that is something that

01:22:54:07 – 01:22:56:20
we can’t really control.

01:22:56:20 – 01:22:59:15
It’s up to the company

01:22:59:15 – 01:23:01:18
who is developing that

01:23:01:18 – 01:23:03:06
and doing the work.

01:23:03:06 – 01:23:06:06
So I know that in academia

01:23:06:08 – 01:23:07:19
there is a lot of work

01:23:07:19 – 01:23:08:24
by authors,

01:23:08:24 – 01:23:12:11
like, for example, Oscar Kilner.

01:23:15:06 – 01:23:19:14
Lopez was an author in the sign language

01:23:19:14 – 01:23:23:14
recognition community and used interpreter data.

01:23:23:23 – 01:23:27:24
For example, there was a very well-known

01:23:29:03 – 01:23:31:00
individual from Germany

01:23:31:00 – 01:23:34:03
and they had

01:23:35:05 – 01:23:37:11
a system

01:23:37:11 – 01:23:38:14
that would do

01:23:38:14 – 01:23:40:03
weather reports and alerts

01:23:40:03 – 01:23:41:06
and it had an interpreter

01:23:41:06 – 01:23:42:13
down in the corner.

01:23:42:13 – 01:23:45:17
They recorded that for many years

01:23:46:07 – 01:23:49:07
and they used that data to train

01:23:49:07 – 01:23:51:11
the machine.

01:23:51:11 – 01:23:55:09
And so there was a very limited context,

01:23:55:09 – 01:23:57:23
only using that one interpreter.

01:23:57:23 – 01:24:01:02
So it's a great idea in theory,

01:24:01:02 – 01:24:04:02
but using it in the future,

01:24:04:19 – 01:24:06:04
you know, in other situations,

01:24:06:04 – 01:24:08:01
may be limited in that way.

01:24:08:01 – 01:24:11:01
It’s very hard

01:24:11:24 – 01:24:14:11
because the data requires

01:24:14:11 – 01:24:16:16
retention and storage

01:24:16:16 – 01:24:19:16
and a lot of video data available

01:24:19:16 – 01:24:20:10
on the web out

01:24:20:10 – 01:24:23:04
there is not the best quality.

01:24:23:04 – 01:24:24:03
For example,

01:24:24:03 – 01:24:27:03
if you look at like an ASL, one class

01:24:27:05 – 01:24:31:15
student signing hey or a song or whatever

01:24:31:21 – 01:24:33:07
you’re looking and you’re saying,

01:24:33:07 – 01:24:34:07
Hey, you’re doing a good job,

01:24:34:07 – 01:24:35:11
you’re learning, you’re doing well.

01:24:35:11 – 01:24:36:15
But that’s not the model

01:24:36:15 – 01:24:37:22
that we want to use in

01:24:37:22 – 01:24:39:02
training the machines

01:24:39:02 – 01:24:40:03
for machine learning.

01:24:41:04 – 01:24:44:04
And the issue that arises, you know,

01:24:44:07 – 01:24:47:02
is a standard for A.I.:

01:24:47:02 – 01:24:47:12
you know,

01:24:47:12 – 01:24:48:09
where do we get this

01:24:48:09 – 01:24:49:08
data collection from?

01:24:49:08 – 01:24:50:18
From various sources,

01:24:50:18 – 01:24:51:02
you know,

01:24:51:02 – 01:24:53:06
and interpreting is a bidirectional

01:24:53:06 – 01:24:53:24
process.

01:24:53:24 – 01:24:55:16
But there’s a big problem with that

01:24:55:16 – 01:24:56:18
as well.

01:24:56:18 – 01:24:58:11
Consent and privacy

01:24:58:11 – 01:25:02:01
and confidentiality are highly regulated.

01:25:02:01 – 01:25:04:10
And so we can't use VRS,

01:25:04:10 – 01:25:06:02
even though that would be the best place

01:25:06:02 – 01:25:07:13
for data collection.

01:25:07:13 – 01:25:10:19
So we have both signed and speech

01:25:11:06 – 01:25:13:04
and we are working on

01:25:13:04 – 01:25:15:04
how the two interact

01:25:15:04 – 01:25:19:05
and that’s what is really good.

01:25:19:05 – 01:25:20:06
But we’ve got to look

01:25:20:06 – 01:25:21:11
at different sources

01:25:21:11 – 01:25:23:18
and the organizations themselves.

01:25:23:18 – 01:25:26:12
We've

01:25:26:12 – 01:25:28:01
got to choose

01:25:28:01 – 01:25:28:22
what data we use.

01:25:28:22 – 01:25:30:16
It's very important to start

01:25:30:16 – 01:25:34:07
thinking about our legal framework

01:25:35:12 – 01:25:36:23
and try to

01:25:36:23 – 01:25:39:23
encourage and remind them:

01:25:40:00 – 01:25:40:07
you know,

01:25:40:07 – 01:25:43:19
we've got to make sure that the data set

01:25:44:07 – 01:25:48:11
will include data from different sources.

01:25:48:17 – 01:25:50:13
The accuracy is there,

01:25:50:13 – 01:25:52:09
the variety is there.

01:25:52:09 – 01:25:54:06
And in the models,

01:25:54:06 – 01:25:57:22
for example, there may

01:25:57:22 – 01:26:00:07
be some fluent individuals,

01:26:00:07 – 01:26:01:14
deaf individuals, there.

01:26:01:14 – 01:26:02:13
And that alone is not the best

01:26:02:13 – 01:26:03:23
signing model to have.

01:26:03:23 – 01:26:05:03
I know that I’m not the best

01:26:05:03 – 01:26:06:07
signing model to have.

01:26:06:07 – 01:26:08:02
I’m not perfectly fluent myself.

01:26:08:02 – 01:26:09:18
It’s my native language

01:26:09:18 – 01:26:12:10
and I want it to understand me.

01:26:12:10 – 01:26:15:02
And the same applies to other

01:26:15:02 – 01:26:16:14
signing styles

01:26:16:14 – 01:26:18:23
and signers with different abilities.

01:26:18:23 – 01:26:20:08
So there’s so many groups

01:26:20:08 – 01:26:21:13
that need to be included

01:26:21:13 – 01:26:24:13
and represented in this machine learning.
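
One practical way to act on this point about representation is to audit a training set's metadata before using it, flagging signing varieties or fluency groups whose share of the data falls below a chosen floor. The Python sketch below invents the metadata fields, the sample clips, and the threshold purely for illustration.

# Minimal sketch of auditing a sign language data set for representation.
# The metadata fields and the 40% threshold are illustrative assumptions.
from collections import Counter

clips = [
    {"signer": "a", "variety": "Black ASL", "fluency": "native"},
    {"signer": "b", "variety": "ASL", "fluency": "native"},
    {"signer": "c", "variety": "ASL", "fluency": "second-language"},
]

def underrepresented(clips: list, key: str, min_share: float = 0.4) -> list:
    """Return the groups under `key` whose share of clips is below min_share."""
    counts = Counter(clip[key] for clip in clips)
    total = sum(counts.values())
    return [group for group, n in counts.items() if n / total < min_share]

if __name__ == "__main__":
    print(underrepresented(clips, "variety"))  # e.g. ['Black ASL']
    print(underrepresented(clips, "fluency"))  # e.g. ['second-language']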

01:26:30:01 – 01:26:31:11
And to add to that,

01:26:31:11 – 01:26:32:23
it can be very challenging,

01:26:32:23 – 01:26:35:23
but also it can become an opportunity.

01:26:36:00 – 01:26:36:20
So for example,

01:26:36:20 – 01:26:38:01
we have a lot of deaf

01:26:38:01 – 01:26:39:14
individuals all over with

01:26:39:14 – 01:26:42:04
their stories, their history.

01:26:42:04 – 01:26:43:06
A lot of times

01:26:43:06 – 01:26:44:20
this is not shared

01:26:44:20 – 01:26:47:00
and so we want to collect and

01:26:47:00 – 01:26:48:03
save that data.

01:26:49:22 – 01:26:50:21
So I think it becomes a

01:26:50:21 – 01:26:51:15
bit of a project

01:26:51:15 – 01:26:54:15
to see what the impact is.

01:26:54:15 – 01:26:55:23
Understanding stories

01:26:55:23 – 01:26:58:24
and being able to record this history.

01:26:58:24 – 01:27:00:12
And also it’s an opportunity

01:27:00:12 – 01:27:01:16
to take a deeper dive

01:27:01:16 – 01:27:05:02
into these generational situations,

01:27:05:11 – 01:27:06:10
getting information

01:27:06:10 – 01:27:08:21
from all over from diverse groups.

01:27:08:21 – 01:27:11:03
But I think the challenge can be funding,

01:27:11:03 – 01:27:13:13
and it’s also challenging to find people

01:27:13:13 – 01:27:14:21
who are able to go out

01:27:14:21 – 01:27:16:21
and record this information

01:27:16:21 – 01:27:18:02
in a quality way.

01:27:18:02 – 01:27:20:20
And so that’s a project in and of itself.

01:27:20:20 – 01:27:22:24
But we have this opportunity now

01:27:22:24 – 01:27:25:24
to look at different domains

01:27:26:03 – 01:27:27:24
and how to use sign language

01:27:27:24 – 01:27:30:17
in the medical and legal realms.

01:27:30:17 – 01:27:31:24
And we can see how

01:27:31:24 – 01:27:35:06
that becomes a bigger project.

01:27:36:04 – 01:27:39:04
But again,

01:27:40:09 – 01:27:43:12
I think we need to go into that more

01:27:43:12 – 01:27:44:18
and we need to see where

01:27:44:18 – 01:27:46:01
we can get that funding from,

01:27:46:01 – 01:27:47:20
because that’s one of the challenges.

01:27:53:00 – 01:27:53:15
Okay.

01:27:53:15 – 01:27:55:10
Next question.

01:27:55:10 – 01:28:00:02
We have a question from, hang on,

01:28:00:02 – 01:28:03:02
a deaf interpreter,

01:28:05:02 – 01:28:06:02
and the question

01:28:06:02 – 01:28:10:11
is about using AI captioning at work

01:28:10:11 – 01:28:13:23
or for a meeting or something of the like.

01:28:14:15 – 01:28:15:20
It is one of the many tools

01:28:15:20 – 01:28:17:00
that have been tested

01:28:17:00 – 01:28:18:11
in other countries

01:28:18:11 – 01:28:21:11
for their language and their access.

01:28:21:11 – 01:28:23:15
But it seems to fail

01:28:23:15 – 01:28:26:07
or it’s not as accurate

01:28:26:07 – 01:28:28:09
at capturing everything.

01:28:28:09 – 01:28:31:09
What approach would you use for automated interpreting

01:28:31:10 – 01:28:35:04
AI, making sure it's accurate and

01:28:36:06 – 01:28:39:06
successful?

01:28:44:15 – 01:28:45:05
It’s all about

01:28:45:05 – 01:28:48:05
the data collection.

01:28:48:06 – 01:28:49:07
This is Emery here.

01:28:49:07 – 01:28:52:03
I think, again, a great question.

01:28:52:03 – 01:28:53:14
And to expand on that,

01:28:53:14 – 01:28:55:17
I think we see the same situation

01:28:55:17 – 01:28:57:11
with live captioning:

01:28:57:11 – 01:29:01:00
it misses nuance and voice tone.

01:29:01:04 – 01:29:04:04
Sign language is similar;

01:29:04:04 – 01:29:05:12
we have to see how we’re going

01:29:05:12 – 01:29:07:18
to approach that when it comes to AI,

01:29:08:17 – 01:29:10:11
AI is already challenging

01:29:10:11 – 01:29:12:09
for American Sign Language

01:29:12:09 – 01:29:14:03
because it’s a conceptual

01:29:14:03 – 01:29:16:06
and a visual language.

01:29:16:06 – 01:29:18:12
So to be able to capture that

01:29:18:12 – 01:29:21:02
and replicate it in AI, you know,

01:29:21:02 – 01:29:22:02
this is a great question

01:29:22:02 – 01:29:24:05
because how can we approach this

01:29:24:05 – 01:29:25:15
to make this happen?

01:29:25:15 – 01:29:27:07
It’s still a hot topic.

01:29:27:07 – 01:29:28:17
There’s a lot of discussion on this

01:29:28:17 – 01:29:31:10
because people are still on this process

01:29:31:10 – 01:29:32:14
of trying to screen

01:29:32:14 – 01:29:35:00
and figure out how this pertains to A.I..

01:29:35:00 – 01:29:37:04
But yeah,

01:29:37:04 – 01:29:38:14
Tim, here I’d like to add,

01:29:38:14 – 01:29:40:13
if you look at different fields,

01:29:40:13 – 01:29:43:13
for example, linguistic studies

01:29:43:16 – 01:29:45:01
of sign language:

01:29:45:01 – 01:29:48:23
they start in about 1960

01:29:49:07 – 01:29:51:23
to '68 with Stokoe and his team.

01:29:51:23 – 01:29:53:23
And so you look at ASL

01:29:53:23 – 01:29:55:06
and you look at that

01:29:55:06 – 01:29:57:05
in how it’s developed over time,

01:29:57:05 – 01:29:58:17
and it’s really only been studied

01:29:58:17 – 01:30:01:02
in depth for 50 to 60 years.

01:30:01:02 – 01:30:03:12
It’s in its infancy at best.

01:30:03:12 – 01:30:05:15
And so there are so many sign languages

01:30:05:15 – 01:30:06:11
all around the world

01:30:06:11 – 01:30:07:17
that have not been studied

01:30:07:17 – 01:30:09:05
and have not been documented

01:30:09:05 – 01:30:10:13
to that degree.

01:30:10:13 – 01:30:12:24
So we still need some more research.

01:30:12:24 – 01:30:15:24
And, you know,

01:30:16:08 – 01:30:17:12
it’s very important

01:30:17:12 – 01:30:19:13
for us to really look at that

01:30:19:13 – 01:30:20:18
with sign languages,

01:30:20:18 – 01:30:22:10
but also with many spoken languages

01:30:22:10 – 01:30:23:07
as well.

01:30:23:07 – 01:30:24:24
There’s, you know,

01:30:24:24 – 01:30:26:21
thousands of languages in the world

01:30:26:21 – 01:30:29:06
and they don’t have a written form

01:30:29:06 – 01:30:31:07
for every one of them.

01:30:31:07 – 01:30:32:22
And so

01:30:32:22 – 01:30:34:10
there’s a lot of minority languages

01:30:34:10 – 01:30:36:18
as well and dialects.

01:30:36:18 – 01:30:40:04
And so those are at risk for extinction

01:30:40:11 – 01:30:41:18
because they’re not written,

01:30:41:18 – 01:30:43:07
they’re not studied,

01:30:43:07 – 01:30:44:24
they’re not documented.

01:30:44:24 – 01:30:46:08
And in a few years,

01:30:46:08 – 01:30:47:13
the most common languages

01:30:47:13 – 01:30:50:02
that will be used are the major languages,

01:30:50:02 – 01:30:51:06
and the minority languages

01:30:51:06 – 01:30:52:22
will have dissipated.

01:30:52:22 – 01:30:56:03
And so that is something that is a risk.

01:30:56:03 – 01:30:59:23
And we have to look at that; we can't leave

01:30:59:23 – 01:31:00:21
those behind.

01:31:04:20 – 01:31:05:03
I’d like

01:31:05:03 – 01:31:06:15
to add to that comment

01:31:06:15 – 01:31:08:24
when we talk about data.

01:31:08:24 – 01:31:10:16
I made a comment recently about that, and

01:31:10:16 – 01:31:11:03
I was wrong.

01:31:11:03 – 01:31:13:12
I should have said that more data

01:31:13:12 – 01:31:15:08
needs to come from

01:31:15:08 – 01:31:17:18
underrepresented communities

01:31:17:18 – 01:31:19:14
and that we need to identify

01:31:19:14 – 01:31:20:20
those communities

01:31:20:20 – 01:31:23:03
and make them aware

01:31:23:03 – 01:31:24:19
that we’d like them to collaborate

01:31:24:19 – 01:31:25:16
with us

01:31:25:16 – 01:31:26:12
to make sure

01:31:26:12 – 01:31:29:12
that they are represented in the data.

01:31:38:24 – 01:31:39:09
Okay.

01:31:39:09 – 01:31:40:20
Tim recently made a comment

01:31:40:20 – 01:31:42:14
about research,

01:31:42:14 – 01:31:45:17
and I’d like to piggyback

01:31:45:17 – 01:31:46:19
my question off of that.

01:31:46:19 – 01:31:49:19
Are there any publications, books,

01:31:49:24 – 01:31:51:14
resources, articles

01:31:51:14 – 01:31:53:21
related to the issue of A.I.

01:31:53:21 – 01:31:58:01
and the Deaf community in conflict,

01:31:59:17 – 01:32:02:10
I'm sorry, together, in

01:32:02:10 – 01:32:05:04
intersection.

01:32:05:04 – 01:32:08:03
Can you spell that again?

01:32:08:03 – 01:32:09:20
I’m sorry.

01:32:09:20 – 01:32:10:16
Intersection.

01:32:10:16 – 01:32:13:16
Intersection.

01:32:13:18 – 01:32:16:13
Okay, so my focus in the research

01:32:16:13 – 01:32:19:19
is more on the technical side of things.

01:32:19:19 – 01:32:22:15
I'm not necessarily too heavy

01:32:23:14 – 01:32:24:23
on the other

01:32:24:23 – 01:32:26:20
area with the deaf community

01:32:26:20 – 01:32:28:19
and looking into the socio side of it.

01:32:28:19 – 01:32:29:18
But I do think it’s

01:32:29:18 – 01:32:32:21
very interesting research and

01:32:34:02 – 01:32:36:01
I think it’s a good way to do

01:32:36:01 – 01:32:39:08
some formal search of the literature.

01:32:39:14 – 01:32:40:21
So I would suggest

01:32:40:21 – 01:32:42:07
that if you’re interested in this,

01:32:42:07 – 01:32:43:17
that you look at things

01:32:43:17 – 01:32:45:02
like, for example,

01:32:45:02 – 01:32:48:20
Richard Dana or Danny Bragg

01:32:49:04 – 01:32:51:01
and they both have work

01:32:51:01 – 01:32:54:02
focused on the ethical aspects

01:32:54:22 – 01:32:59:24
of this and the surrounding topics.

01:32:59:24 – 01:33:01:04
And so it’s a lot,

01:33:01:04 – 01:33:02:08
but I could probably

01:33:02:08 – 01:33:06:13
share a bibliography for you all

01:33:06:13 – 01:33:07:16
so that you could take a look

01:33:07:16 – 01:33:08:15
at those authors

01:33:08:15 – 01:33:11:15
and look more into their work.

01:33:13:21 – 01:33:15:14
I wish we could,

01:33:15:14 – 01:33:15:24
you know,

01:33:15:24 – 01:33:17:09
with the advisory group,

01:33:17:09 – 01:33:20:03
we touch on so many things

01:33:20:03 – 01:33:21:20
and I wish we could keep discussing

01:33:21:20 – 01:33:25:01
how

01:33:25:01 – 01:33:26:23
the shared study and research

01:33:26:23 – 01:33:28:04
really impacts everything.

01:33:30:00 – 01:33:30:19
Good idea.

01:33:30:19 – 01:33:33:19
Good idea.

01:33:40:01 – 01:33:40:16
Okay.

01:33:40:16 – 01:33:43:05
Any more questions?

01:33:43:05 – 01:33:45:07
Okay, I do have

01:33:45:07 – 01:33:47:09
more questions here,

01:33:47:09 – 01:33:50:09
so I will

01:33:50:23 – 01:33:53:08
copy exactly what

01:33:53:08 – 01:33:54:11
I'm seeing here in the question;

01:33:54:11 – 01:33:55:15
this might be a good question

01:33:55:15 – 01:33:57:07
for Jeff to answer.

01:33:57:07 – 01:34:00:08
So, Jeff, we’re wondering

01:34:00:08 – 01:34:02:15
if the deaf community

01:34:02:15 – 01:34:05:15
is open to this system.

01:34:06:05 – 01:34:09:05
So for each individual user,

01:34:11:19 – 01:34:14:06
will they all be trained

01:34:14:06 – 01:34:17:19
and taught in their language of ASL?

01:34:17:19 – 01:34:18:14
For example,

01:34:18:14 – 01:34:20:07
if there’s a website,

01:34:20:07 – 01:34:23:07
that has American Sign Language,

01:34:23:09 – 01:34:26:09
and will it be copied and saved,

01:34:26:18 – 01:34:30:07
and will there be anything like

01:34:30:07 – 01:34:31:11
if this is saved,

01:34:31:11 – 01:34:32:16
will personal information

01:34:32:16 – 01:34:34:14
be saved on a server?

01:34:34:14 – 01:34:37:04
How will they ensure confidentiality?

01:34:37:04 – 01:34:37:16
Deaf

01:34:37:16 – 01:34:41:12
individuals have the personal right

01:34:41:12 – 01:34:43:06
to say yes, I’m

01:34:43:06 – 01:34:44:19
okay with releasing my information

01:34:44:19 – 01:34:46:14
to this company. How will that work?

01:34:48:14 – 01:34:50:10
Yeah, that’s a great question.

01:34:50:10 – 01:34:53:04
I think the formal term for

01:34:53:04 – 01:34:56:04
that is called fine tuning.

01:34:57:13 – 01:35:00:08
So with fine tuning you can go in

01:35:00:08 – 01:35:03:13
and make sure that the model is a fit

01:35:03:13 – 01:35:05:12
for your personal style,

01:35:05:12 – 01:35:08:11
your personal terms of choice.

01:35:08:11 – 01:35:11:15
So we can do that currently with English.

01:35:11:15 – 01:35:13:07
When it comes to LLMs,

01:35:13:07 – 01:35:15:08
large language models like for example,

01:35:15:08 – 01:35:18:19
ChatGPT, you can go in and personalize

01:35:18:19 – 01:35:20:01
and fine-tune

01:35:20:01 – 01:35:21:23
the data that comes out of it.

01:35:21:23 – 01:35:23:15
And so I would imagine

01:35:23:15 – 01:35:24:09
that the same thing

01:35:24:09 – 01:35:25:09
will eventually happen

01:35:25:09 – 01:35:28:18
when it comes to AI for ASL, and also

01:35:28:18 – 01:35:31:18
in terms of sharing with others.

01:35:31:23 – 01:35:34:19
I don’t see any reason why

01:35:34:19 – 01:35:36:01
it would take a long time

01:35:36:01 – 01:35:36:14
for people

01:35:36:14 – 01:35:38:07
to be able

01:35:38:07 – 01:35:40:01
to give their informed consent.

01:35:40:01 – 01:35:41:15
So it is your data

01:35:41:15 – 01:35:42:20
and you’ll be able to do

01:35:42:20 – 01:35:44:22
whatever you want to do with it.

01:35:44:22 – 01:35:46:23
That would be the idea.

01:35:46:23 – 01:35:48:02
I don’t know if anyone has

01:35:48:02 – 01:35:49:08
anything else to add to that
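
For readers curious what this "fine-tuning" could look like in practice, here is a minimal sketch in Python using the Hugging Face transformers and datasets libraries. The base model ("gpt2"), the corpus file ("my_writing.txt"), and the hyperparameters are illustrative assumptions, not anything the panel named; the point is only that a general-purpose model can be adapted to one person's own language data, supplied with their consent.

```python
# Minimal fine-tuning sketch (illustrative only): adapting a general
# language model to one user's own text. Model, file name, and
# hyperparameters are assumptions, not anything the panel referenced.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

# A hypothetical corpus of the user's own writing, one sample per line,
# provided with their informed consent.
dataset = load_dataset("text", data_files={"train": "my_writing.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="personalized-model",
        num_train_epochs=1,
        per_device_train_batch_size=4,
    ),
    train_dataset=tokenized,
    # Causal-LM collator: labels are the input tokens shifted by one.
    data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False),
)
trainer.train()  # nudges the base model toward the user's style and terms
```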

01:35:51:14 – 01:35:52:16
Emery here, I

01:35:52:16 – 01:35:54:05
would like to add a comment.

01:35:54:05 – 01:35:55:19
I think that

01:35:55:19 – 01:35:58:19
the community feels empowered

01:35:59:14 – 01:36:02:11
to have these options available.

01:36:02:11 – 01:36:05:00
So I think whether

01:36:05:00 – 01:36:07:02
everyone agrees or not,

01:36:07:02 – 01:36:08:18
the idea of having options

01:36:08:18 – 01:36:11:10
available to personalize their data

01:36:11:10 – 01:36:12:23
I think would be optimal

01:36:12:23 – 01:36:15:23
and well received in general

01:36:17:16 – 01:36:19:01
to say, okay, you know,

01:36:19:01 – 01:36:20:15
I see that this is happening,

01:36:20:15 – 01:36:22:10
but I’m just one person.

01:36:22:10 – 01:36:25:01
But that would be my guess.

01:36:25:01 – 01:36:26:17
This is Tim.

01:36:26:17 – 01:36:28:22
I remember back in what was it, 2001

01:36:28:22 – 01:36:30:11
when I was in college,

01:36:30:11 – 01:36:33:02
we had speech recognition software

01:36:33:02 – 01:36:35:05
and it was called Dragon

01:36:35:05 – 01:36:37:04
NaturallySpeaking.

01:36:37:04 – 01:36:41:04
And so that required a lot of training,

01:36:41:12 – 01:36:44:24
a lot of feeding into the technology to

01:36:44:24 – 01:36:46:00
develop it.

01:36:46:00 – 01:36:47:22
But there was some control

01:36:47:22 – 01:36:49:18
of sound and clarity.

01:36:49:18 – 01:36:51:24
But I think today the technology

01:36:51:24 – 01:36:54:01
seems to have gotten even better,

01:36:54:01 – 01:36:57:06
and so there’s less of a curve there.

01:36:57:06 – 01:36:59:09
But I think it would be possible

01:36:59:09 – 01:37:01:16
for bias to still be involved.

01:37:01:16 – 01:37:03:09
And so, of course, we see

01:37:03:09 – 01:37:04:16
that it’s quite standard English,

01:37:04:16 – 01:37:06:05
especially in the particular tool

01:37:06:05 – 01:37:07:03
I was just discussing.

01:37:07:03 – 01:37:10:08
So when it comes to dialect and accents

01:37:10:08 – 01:37:11:07
and that kind of thing,

01:37:11:07 – 01:37:12:13
from my understanding,

01:37:12:13 – 01:37:14:17
the bigger concern is coming from

01:37:14:17 – 01:37:15:20
the greater community

01:37:15:20 – 01:37:16:14
based on what we see

01:37:16:14 – 01:37:18:03
on speech recognition.

01:37:18:03 – 01:37:20:02
So we can only imagine

01:37:20:02 – 01:37:20:21
what it might look like

01:37:20:21 – 01:37:21:19
with sign language.

01:37:21:19 – 01:37:23:08
So with over 20 years

01:37:23:08 – 01:37:25:11
invested in these types of tools

01:37:25:11 – 01:37:27:10
and seeing how that goes,

01:37:27:10 – 01:37:28:04
I think that

01:37:28:04 – 01:37:29:23
we need to take into consideration

01:37:29:23 – 01:37:32:04
that we would need better technology

01:37:32:04 – 01:37:35:04
and we would need, you know, that,

01:37:35:14 – 01:37:37:13
for example, leapfrog technology

01:37:37:13 – 01:37:40:13
where we would come in and

01:37:40:20 – 01:37:41:24
be able to say,

01:37:41:24 – 01:37:43:04
we’ve already seen these things

01:37:43:04 – 01:37:45:05
develop, we’ve seen how this has gone.

01:37:45:05 – 01:37:46:22
So maybe we wouldn’t

01:37:46:22 – 01:37:48:23
need as much time to catch up.

01:37:48:23 – 01:37:49:24
I would hope,

01:37:49:24 – 01:37:50:22
like I mentioned, that

01:37:50:22 – 01:37:53:22
that Dragon technology is like 20 years in,

01:37:53:22 – 01:37:57:22
so, I’m not necessarily

01:37:57:22 – 01:38:00:12
into the technical aspect as much,

01:38:00:12 – 01:38:01:02
but that’s just,

01:38:01:02 – 01:38:02:17
based on my experience, how

01:38:02:17 – 01:38:04:06
I predict it could go.

01:38:07:12 – 01:38:08:24
Holly says the person that asked

01:38:08:24 – 01:38:10:08
that question added a comment

01:38:10:08 – 01:38:12:11
and said, Yes,

01:38:12:11 – 01:38:14:09
I know exactly what you’re talking about.

01:38:14:09 – 01:38:15:09
The Dragon NaturallySpeaking

01:38:15:09 – 01:38:16:10
technology.

01:38:16:10 – 01:38:18:20
I’m familiar with that.

01:38:18:20 – 01:38:20:14
Okay.

01:38:20:14 – 01:38:23:14
Another question says,

01:38:23:23 – 01:38:26:23
I am a deaf leader in my community.

01:38:28:13 – 01:38:32:19
How can we build more of a

01:38:33:04 – 01:38:34:17
bond,

01:38:34:17 – 01:38:37:18
a stronger bond with the interpreters?

01:38:38:03 – 01:38:40:14
Because many of them are afraid

01:38:40:14 – 01:38:43:14
to reach out to the deaf community.

01:38:45:01 – 01:38:47:08
And they said, a general question.

01:38:47:08 – 01:38:49:10
Emery says, Can you repeat the question,

01:38:49:10 – 01:38:52:10
please?

01:38:53:08 – 01:38:54:13
It’s a general question.

01:38:54:13 – 01:38:55:16
Holly says,

01:38:55:16 – 01:38:58:16
I am a deaf leader in my community

01:38:59:10 – 01:39:02:06
and I want to know how we as the deaf

01:39:02:06 – 01:39:05:07
community, can build stronger bonds

01:39:06:19 – 01:39:09:18
with interpreters.

01:39:09:18 – 01:39:12:17
Many interpreters are afraid,

01:39:12:17 – 01:39:15:17
and they don’t reach out to us in

01:39:15:17 – 01:39:16:11
the deaf community,

01:39:19:01 – 01:39:21:14
Tim says,

01:39:21:14 – 01:39:24:08
I think that

01:39:24:08 – 01:39:26:18
it all comes back to trust,

01:39:26:18 – 01:39:28:10
trust issues.

01:39:28:10 – 01:39:29:23
So from my understanding,

01:39:29:23 – 01:39:32:06
many deaf people are afraid

01:39:32:06 – 01:39:34:03
and resistant to technology

01:39:34:03 – 01:39:36:17
because they’re afraid that

01:39:36:17 – 01:39:38:03
this technology

01:39:38:03 – 01:39:39:10
would take away their ability

01:39:39:10 – 01:39:40:20
to have an informed choice

01:39:40:20 – 01:39:42:00
to make decisions.

01:39:42:00 – 01:39:44:03
Same thing with video

01:39:44:03 – 01:39:45:05
remote interpreting.

01:39:45:05 – 01:39:47:04
A lot of deaf people didn’t want that

01:39:47:04 – 01:39:48:00
because,

01:39:48:00 – 01:39:48:05
you know,

01:39:48:05 – 01:39:49:23
they have to fight the system

01:39:49:23 – 01:39:51:22
to get an in-person interpreter.

01:39:51:22 – 01:39:53:11
So based on that experience

01:39:53:11 – 01:39:54:16
and those challenges,

01:39:54:16 – 01:39:57:16
it’s caused a lot of issues with trust.

01:39:57:19 – 01:40:00:01
And

01:40:00:01 – 01:40:00:16
I know

01:40:00:16 – 01:40:03:03
especially when it comes to health care,

01:40:03:03 – 01:40:04:12
it’s already challenging

01:40:04:12 – 01:40:05:14
to make appointments

01:40:05:14 – 01:40:06:07
to come

01:40:06:07 – 01:40:09:07
in, to have access and all of that.

01:40:09:10 – 01:40:12:20
So it can be very exhausting and cause

01:40:12:20 – 01:40:13:15
deaf individuals

01:40:13:15 – 01:40:14:04
to feel like

01:40:14:04 – 01:40:15:08
they don’t want to go to the doctor

01:40:15:08 – 01:40:16:09
because they don’t want to deal

01:40:16:09 – 01:40:17:14
with all of that.

01:40:17:14 – 01:40:19:17
Now, when it comes to interpreters,

01:40:19:17 – 01:40:23:10
of course, there are some things

01:40:23:10 – 01:40:24:00
to consider,

01:40:24:00 – 01:40:25:21
like the code of ethics and how it applies

01:40:25:21 – 01:40:26:18
in reality.

01:40:26:18 – 01:40:28:13
And I think deaf people may feel,

01:40:28:13 – 01:40:29:09
you know what,

01:40:29:09 – 01:40:32:09
I would prefer to have AI, because

01:40:32:23 – 01:40:35:12
in this case, there’s no baggage.

01:40:35:12 – 01:40:36:15
I don’t have to deal with

01:40:36:15 – 01:40:39:15
the human aspect of trust issues.

01:40:39:16 – 01:40:42:11
I know that with AI

01:40:42:11 – 01:40:45:08
and that kind of collaboration and dialog

01:40:45:08 – 01:40:47:12
also, it can lead to the discussion

01:40:47:12 – 01:40:48:18
of what the meaning

01:40:48:18 – 01:40:50:12
of trust is, what confidentiality

01:40:50:12 – 01:40:51:14
should look like,

01:40:51:14 – 01:40:53:24
the code of ethics, how that applies,

01:40:53:24 – 01:40:55:04
and just to make sure

01:40:55:04 – 01:40:58:02
that if our expectations of AI

01:40:58:02 – 01:41:00:02
are high,

01:41:01:14 – 01:41:02:15
we need to know

01:41:02:15 – 01:41:03:09
whether those are accurate,

01:41:03:09 – 01:41:04:20
or whether we can expect something

01:41:04:20 – 01:41:06:08
like what we’ve had with the human experience.

01:41:06:08 – 01:41:09:08
So that’s another discussion to have.

01:41:10:15 – 01:41:12:03
Theresa, I don’t know if you want to add

01:41:12:03 – 01:41:13:01
to that.

01:41:13:01 – 01:41:15:02
Theresa says, Yes, I’m just thinking.

01:41:15:02 – 01:41:18:23
I think that maybe the first step

01:41:19:02 – 01:41:21:18
would be to start these discussions

01:41:21:18 – 01:41:24:15
and to have some time in smaller

01:41:24:15 – 01:41:25:16
communities

01:41:25:16 – 01:41:28:08
where everyone knows each other, right?

01:41:28:08 – 01:41:29:11
We are all familiar

01:41:29:11 – 01:41:30:23
with those types of situations

01:41:30:23 – 01:41:32:07
where the interpreters know all the deaf

01:41:32:07 – 01:41:33:04
people and the deaf people

01:41:33:04 – 01:41:34:07
know the interpreters.

01:41:34:07 – 01:41:37:22
but maybe we would start with some kind

01:41:37:22 – 01:41:39:12
of, let’s say, for example,

01:41:39:12 – 01:41:43:07
have your local deaf

01:41:43:07 – 01:41:47:17
community groups and local RID

01:41:47:19 – 01:41:49:10
or interpreting organizations

01:41:49:10 – 01:41:50:20
come together.

01:41:50:20 – 01:41:51:13
So, for example,

01:41:51:13 – 01:41:52:13
maybe they come together

01:41:52:13 – 01:41:53:18
and watch this video

01:41:53:18 – 01:41:55:17
and then they host a discussion

01:41:55:17 – 01:41:57:02
and ask questions.

01:41:57:02 – 01:41:59:22
But I’m just thinking, how can we start

01:41:59:22 – 01:42:01:17
to develop this discussion

01:42:01:17 – 01:42:03:02
and this dialog?

01:42:03:02 – 01:42:05:05
Because without the dialog,

01:42:05:05 – 01:42:07:08
there’s so many misunderstandings.

01:42:07:08 – 01:42:08:19
And I think that

01:42:08:19 – 01:42:10:19
with the dialog, misunderstandings

01:42:10:19 – 01:42:12:08
will still happen as well,

01:42:12:08 – 01:42:14:06
but it is definitely an opportunity

01:42:14:06 – 01:42:14:23
to have

01:42:14:23 – 01:42:16:00
more understanding

01:42:16:00 – 01:42:17:03
and more of an opportunity

01:42:17:03 – 01:42:18:15
to listen to each other

01:42:18:15 – 01:42:20:20
and figure out how we can discuss

01:42:20:20 – 01:42:23:09
this together. It’s huge.

01:42:23:09 – 01:42:24:13
It’s coming

01:42:24:13 – 01:42:26:08
and we need to be sure

01:42:26:08 – 01:42:28:18
that we know how to respond to this.

01:42:28:18 – 01:42:30:08
We need to start with

01:42:30:08 – 01:42:32:24
the grassroots community

01:42:32:24 – 01:42:34:08
and go from there.

01:42:34:08 – 01:42:35:22
So the grassroots

01:42:35:22 – 01:42:36:16
community is the heart

01:42:36:16 – 01:42:37:16
of our deaf community.

01:42:43:04 – 01:42:44:04
Thank you, Anne Marie.

01:42:44:04 – 01:42:44:24
And Jeff, would you,

01:42:44:24 – 01:42:46:13
do you have anything you want to add?

01:42:46:13 – 01:42:49:13
Jeff saying, I agree wholeheartedly.

01:42:50:04 – 01:42:51:20
Emery Here,

01:42:51:20 – 01:42:55:18
this one topic itself is just,

01:42:56:03 – 01:42:57:09
It’s huge.

01:42:57:09 – 01:42:59:09
There’s no way to describe it

01:42:59:09 – 01:43:00:05
otherwise;

01:43:00:05 – 01:43:01:21
there are so many things

01:43:01:21 – 01:43:04:17
to look at: the process itself,

01:43:04:17 – 01:43:08:02
the trust in the process, the

01:43:08:03 – 01:43:11:03
transparency, the data collection, all of it

01:43:11:06 – 01:43:12:17
together.

01:43:12:17 – 01:43:13:17
But you know,

01:43:13:17 – 01:43:16:06
and how everything moves in tandem.

01:43:16:06 – 01:43:17:06
It’s an opportunity

01:43:17:06 – 01:43:20:06
to always create a safe space

01:43:20:23 – 01:43:23:06
for the community to come together

01:43:23:06 – 01:43:24:17
and to discuss.

01:43:24:17 – 01:43:25:19
And it’s important

01:43:25:19 – 01:43:26:08
for us

01:43:26:08 – 01:43:29:08
to look at that process as a whole.

01:43:31:13 – 01:43:33:04
Very good discussion.

01:43:33:04 – 01:43:35:11
Very good discussion,

01:43:35:11 – 01:43:35:24
Holly saying,

01:43:35:24 – 01:43:38:24
okay, next question is about the research

01:43:38:24 – 01:43:42:14
process from last October.

01:43:42:14 – 01:43:45:14
You had three different webinar sessions,

01:43:45:18 – 01:43:48:15
The deaf participants in those sessions:

01:43:48:15 – 01:43:49:24
What were their backgrounds,

01:43:49:24 – 01:43:52:19
where were they from geographically?

01:43:52:19 – 01:43:54:07
And

01:43:55:22 – 01:43:58:22
demographically?

01:43:59:03 – 01:44:00:20
Jeff saying, go ahead,

01:44:00:20 – 01:44:01:07
Emery.

01:44:01:07 – 01:44:03:21
So the backgrounds were very diverse.

01:44:03:21 – 01:44:05:18
We had some that were deaf interpreters,

01:44:05:18 – 01:44:07:13
deaf consumers,

01:44:07:13 – 01:44:11:05
deaf individuals who are professionals

01:44:11:10 – 01:44:14:08
working in the field of education,

01:44:14:08 – 01:44:16:07
working in the field of interpreting

01:44:16:07 – 01:44:17:21
a wide variety

01:44:17:21 – 01:44:20:21
of backgrounds and

01:44:21:12 – 01:44:23:04
areas as well.

01:44:23:04 – 01:44:24:22
Jeff, did you want to add some more?

01:44:24:22 – 01:44:25:20
Jeff says, Yes,

01:44:25:20 – 01:44:26:22
I think the next step

01:44:26:22 – 01:44:28:22
is to decide how we expand

01:44:28:22 – 01:44:30:11
and how we grow our audience

01:44:30:11 – 01:44:32:05
from those webinars

01:44:32:05 – 01:44:34:00
and to really include

01:44:34:00 – 01:44:35:21
even more of the community at large,

01:44:35:21 – 01:44:38:21
to have all those perspectives as well

01:44:40:06 – 01:44:41:09
And adding, yes,

01:44:41:09 – 01:44:42:23
the webinars were really

01:44:42:23 – 01:44:47:15
our first step into this

01:44:47:15 – 01:44:49:19
realm of testing things out,

01:44:49:19 – 01:44:50:21
looking at different things,

01:44:50:21 – 01:44:52:02
and this symposium

01:44:52:02 – 01:44:53:04
will just continue

01:44:53:04 – 01:44:55:21
to be a springboard into the future

01:44:55:21 – 01:44:57:17
and as long as we have

01:44:57:17 – 01:44:59:13
AI, we will be having these discussions

01:45:00:24 – 01:45:01:16
and raising them.

01:45:01:16 – 01:45:02:11
I’d like to add

01:45:02:11 – 01:45:06:05
also that all of the individuals,

01:45:06:13 – 01:45:08:03
the participants here today,

01:45:08:03 – 01:45:09:16
you guys are critical

01:45:09:16 – 01:45:11:16
for this process as well.

01:45:11:16 – 01:45:13:01
We are not finished now

01:45:13:01 – 01:45:14:15
that this report is published.

01:45:14:15 – 01:45:16:05
This is just the beginning.

01:45:16:05 – 01:45:18:12
And even all of your questions

01:45:18:12 – 01:45:20:04
have really spurred our thoughts

01:45:20:04 – 01:45:22:04
into how we move this forward

01:45:22:04 – 01:45:23:04
and what the next steps

01:45:23:04 – 01:45:24:06
are in this process.

01:45:24:06 – 01:45:27:06
We’ve just scratched the surface.

01:45:34:04 – 01:45:35:21
Okay, Holly here. I think

01:45:35:21 – 01:45:36:23
we have navigated

01:45:36:23 – 01:45:38:06
through all of the questions

01:45:38:06 – 01:45:39:22
that we have for the day.

01:45:39:22 – 01:45:45:02
We do have one more, but it is:

01:45:45:07 – 01:45:46:17
with this

01:45:46:17 – 01:45:49:17
webinar complete and this report complete,

01:45:49:21 – 01:45:52:01
the sharing of the recording,

01:45:52:01 – 01:45:53:10
when will that be done?

01:45:53:10 – 01:45:57:08
The Deaf Safe AI

01:45:57:16 – 01:46:01:10
Advisory website, the report,

01:46:01:16 – 01:46:03:05
whether that will be shared,

01:46:03:05 – 01:46:06:09
and how to register

01:46:06:09 – 01:46:09:09
for the Brown University Symposium.

01:46:09:16 – 01:46:10:14
Several people have asked

01:46:10:14 – 01:46:12:06
for that information as well.

01:46:14:18 – 01:46:17:11
Tim here, so I will answer that.

01:46:17:11 – 01:46:21:10
The report: we have both the Safe

01:46:21:18 – 01:46:25:19
AI report, their report, as well as ours,

01:46:25:19 – 01:46:27:00
and we’ve been working

01:46:27:00 – 01:46:29:08
with CSA Research.

01:46:29:08 – 01:46:31:10
They have spent so much time

01:46:31:10 – 01:46:32:21
going through this project.

01:46:32:21 – 01:46:36:04
The surveys, volunteers, so much work

01:46:36:04 – 01:46:37:13
has gone into that report

01:46:37:13 – 01:46:38:04
and that will

01:46:38:04 – 01:46:40:00
be published as well.

01:46:40:00 – 01:46:41:15
We had

01:46:41:15 – 01:46:43:12
a presentation scheduled yesterday,

01:46:43:12 – 01:46:45:11
but there were technological problems

01:46:45:11 – 01:46:47:00
and so we’re going to be rescheduling

01:46:47:00 – 01:46:47:24
that for Wednesday morning,

01:46:47:24 – 01:46:50:24
I believe, at 11 Eastern time.

01:46:51:00 – 01:46:54:02
And so with that being said,

01:46:54:02 – 01:46:55:13
I really encourage you all

01:46:55:13 – 01:46:57:12
to watch that presentation.

01:46:57:12 – 01:46:59:08
It’s about languages in general,

01:46:59:08 – 01:47:00:14
not just sign languages,

01:47:00:14 – 01:47:01:23
it’s languages in general.

01:47:01:23 – 01:47:03:17
And so that will be made

01:47:03:17 – 01:47:05:10
available to the public as well.

01:47:05:10 – 01:47:06:06
This one,

01:47:06:06 – 01:47:09:17
I believe we have some editing to do

01:47:09:23 – 01:47:12:10
and we will make it available here soon.

01:47:12:10 – 01:47:13:10
The presentation

01:47:13:10 – 01:47:14:01
next Wednesday

01:47:14:01 – 01:47:16:07
will also be available online.

01:47:16:07 – 01:47:18:22
Our website does have links

01:47:18:22 – 01:47:21:22
to Safe A.I.

01:47:22:01 – 01:47:23:11
Advisory Group,

01:47:23:11 – 01:47:25:04
and then we have our own Deaf

01:47:25:04 – 01:47:27:04
Advisory Group page

01:47:27:04 – 01:47:29:02
and the links to those.

01:47:29:02 – 01:47:30:08
I will send them out

01:47:30:08 – 01:47:32:07
and make them available there.

01:47:32:07 – 01:47:34:16
Also, the symposium:

01:47:34:16 – 01:47:37:16
we are currently working on that platform

01:47:37:20 – 01:47:38:22
and the save-

01:47:38:22 – 01:47:40:12
the-date was just announced today.

01:47:40:12 – 01:47:42:16
This is our first announcement for that

01:47:42:16 – 01:47:44:19
I will send out a more formal

01:47:44:19 – 01:47:45:14
save-the-date

01:47:45:14 – 01:47:47:22
with information on registration

01:47:47:22 – 01:47:49:02
and the like.

01:47:49:02 – 01:47:50:05
It will be sent out.

01:47:50:05 – 01:47:52:08
That is currently in process.

01:47:52:08 – 01:47:53:19
So look forward to that.

01:47:55:06 – 01:47:55:21
Do

01:47:55:21 – 01:47:56:20
any of the other

01:47:56:20 – 01:47:58:06
advisory council members

01:47:58:06 – 01:47:58:21
have something

01:47:58:21 – 01:48:01:10
that they would like to add?

01:48:01:10 – 01:48:02:05
Theresa saying,

01:48:02:05 – 01:48:03:23
I just want to thank all of you

01:48:03:23 – 01:48:06:12
and Anne Marie saying, yes, I agree. Thank you.

01:48:06:12 – 01:48:08:09
Thank you for your interest

01:48:08:09 – 01:48:09:04
in this topic.

01:48:09:04 – 01:48:10:04
Thank you for coming

01:48:10:04 – 01:48:12:03
and listening to our presentation.

01:48:12:03 – 01:48:13:15
We appreciate it.

01:48:13:15 – 01:48:15:09
We think that it’s wonderful

01:48:15:09 – 01:48:16:19
that all of you were involved

01:48:16:19 – 01:48:18:22
and we really appreciate everyone

01:48:18:22 – 01:48:20:12
that was involved in the study

01:48:20:12 – 01:48:21:04
Jeff saying, yes.

01:48:21:04 – 01:48:24:04
Thank you so much.

01:48:24:11 – 01:48:25:17
Thank you, Holly, as well

01:48:25:17 – 01:48:26:21
for your time today.

01:48:26:21 – 01:48:29:06
We appreciate you joining us

01:48:29:06 – 01:48:32:06
in this presentation.

01:48:32:07 – 01:48:35:07
Thank you very much,

01:48:37:06 – 01:48:37:12
Holly.

01:48:37:12 – 01:48:37:19
Same.

01:48:37:19 – 01:48:39:09
I want to make sure that we have a clear

01:48:39:09 – 01:48:40:13
answer about

01:48:40:22 – 01:48:43:10
if this recording of the webinar

01:48:43:10 – 01:48:44:22
will be broadcast

01:48:44:22 – 01:48:46:09
and then

01:48:46:09 – 01:48:47:17
we will be sending out

01:48:47:17 – 01:48:49:06
a follow up email as well.

01:48:49:06 – 01:48:49:19
For everyone

01:48:49:19 – 01:48:50:16
who registered

01:48:50:16 – 01:48:53:16
with the website information

01:48:53:16 – 01:48:56:24
and Brown University Symposium

01:48:58:07 – 01:48:59:23
information

01:48:59:23 – 01:49:03:02
and what else was there?

01:49:03:02 – 01:49:03:24
I think that’s it.

01:49:03:24 – 01:49:06:24
So we will, correct?

01:49:07:15 – 01:49:08:21
Yes, simply yes,

01:49:08:21 – 01:49:09:24
we will all just say yes,

01:49:09:24 – 01:49:12:24
it will all be available.

01:49:15:22 – 01:49:18:15
Also, for the webinars,

01:49:18:15 – 01:49:19:23
we have the video

01:49:19:23 – 01:49:21:03
recordings of the webinars,

01:49:21:03 – 01:49:22:13
if you’d like to see those as well.

01:49:22:13 – 01:49:25:13
We will make those available.

01:49:29:23 – 01:49:31:24
All right,

01:49:31:24 – 01:49:33:18
Tim saying, I believe this concludes

01:49:33:18 – 01:49:36:18
our meeting.

01:49:38:24 – 01:49:40:10
Jeff saying, Thank you, everyone.

01:49:40:10 – 01:49:42:06
We appreciate your time.

01:49:42:06 – 01:49:43:18
Thank you so much. Bye bye.

 
