Friday, 6 December 2019

A case most curious

Unheard Stories from VSO on Vimeo.

Nobody has stated the very obvious: the community she belongs to (the Islamic one) is the higher priority, and such deaf people do not even integrate with the wider deaf communities either.  Even the Jewish and black ethnic areas have their own clubs and organisations.  One wonders what her views are on her peers dressing head to toe in black and hiding their faces?

Hardly conducive to lip-reading, is it? Or to deaf people being able to read facial features.  In my area we had to demand a different doctor when one wore a face covering at a hospital; no one had any idea what she was really saying.  In the end such staff ran separate consultations for their own (hearing) people, which is hardly inclusion.  Deaf people were then accused of anti-Islamic discrimination; you could not script it.  We rely on seeing facial features.

Google’s Live Caption


Google is bringing the Pixel 4’s Live Caption tech to its Pixel 3 series, two months after unveiling the feature for the first time. 

The company began rolling the real-time transcription service out to Pixel 3 and Pixel 3a users on Tuesday. With the tap of a button, fans of Google’s 2018 and early 2019 models can now automatically generate captions for pretty much any audio or video content on their phones.

The feature was designed as a tool for users who are hard of hearing or who simply need a bit of help following the audio on their phone, whether that is because they are learning a new language, trying not to wake a baby or struggling to hear their favourite podcast on a busy train. “Live Caption wouldn’t have been possible without the Deaf and hard of hearing communities who helped guide us from the very beginning,” wrote Android Accessibility Product Manager Brian Kemler back in October.

“Similar to how we designed Live Transcribe earlier this year, we developed Live Caption in collaboration with individuals from the community and partners like Gallaudet University, the world’s premier university for Deaf and hard of hearing people. An early Deaf tester, Naiajah Wilson, explained how Live Caption would impact her daily life: ‘Now I don’t have to wake up my mom or dad and ask what’s being said.’”

Perhaps one of the most exciting aspects of Live Caption is that it works offline – the app processes speech directly on your device. You can even adjust the size of the captions or drag them around the screen, so if you’re watching a video they never get in the way of anything important. Unfortunately, the feature is not yet able to transcribe phone calls. It is also only available in English right now, though Google does have plans to support more languages in the future.
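The article doesn’t mention any developer-facing hook for Live Caption itself (it captions media playback system-wide), so the Kotlin sketch below is only a rough, hypothetical illustration of the on-device idea, using Android’s standard SpeechRecognizer with its prefer-offline flag. Unlike Live Caption it listens to the microphone rather than media audio, needs the RECORD_AUDIO permission, and the class and callback names are made up for the example.

```kotlin
import android.content.Context
import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer

// Minimal sketch only: on-device speech-to-text with Android's standard
// SpeechRecognizer API. This is NOT Google's Live Caption implementation;
// it merely illustrates offline, on-device transcription of spoken audio.
class OfflineCaptionSketch(private val context: Context) {

    private var recognizer: SpeechRecognizer? = null

    fun start(onCaption: (String) -> Unit) {
        // Bail out if no recognition service is installed on this device.
        if (!SpeechRecognizer.isRecognitionAvailable(context)) return

        recognizer = SpeechRecognizer.createSpeechRecognizer(context).apply {
            setRecognitionListener(object : RecognitionListener {
                override fun onPartialResults(partialResults: Bundle) {
                    // Stream interim hypotheses so the caption updates as speech arrives.
                    partialResults.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                        ?.firstOrNull()?.let(onCaption)
                }
                override fun onResults(results: Bundle) {
                    results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                        ?.firstOrNull()?.let(onCaption)
                }
                // The remaining callbacks are not needed for this sketch.
                override fun onReadyForSpeech(params: Bundle?) {}
                override fun onBeginningOfSpeech() {}
                override fun onRmsChanged(rmsdB: Float) {}
                override fun onBufferReceived(buffer: ByteArray?) {}
                override fun onEndOfSpeech() {}
                override fun onError(error: Int) {}
                override fun onEvent(eventType: Int, params: Bundle?) {}
            })

            val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
                putExtra(
                    RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                    RecognizerIntent.LANGUAGE_MODEL_FREE_FORM
                )
                // Ask the recogniser to stay on-device (API 23+); English only,
                // mirroring Live Caption's current language support.
                putExtra(RecognizerIntent.EXTRA_PREFER_OFFLINE, true)
                putExtra(RecognizerIntent.EXTRA_LANGUAGE, "en-US")
                putExtra(RecognizerIntent.EXTRA_PARTIAL_RESULTS, true)
            }
            startListening(intent)
        }
    }

    fun stop() {
        recognizer?.destroy()
        recognizer = null
    }
}
```

The design choice here mirrors what makes the feature notable: keeping recognition on the device means captions keep working with no network connection and the audio never has to leave the phone.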

Live Caption was initially introduced alongside the Pixel 4 at the Made By Google event in October. The update will continue to roll out across Pixel 3 and 3a handsets this week.

Sign language, identity, and assistive technology.


“MIT is the best place to be an anthropologist studying issues of science and technology ...”
It would help if researchers did not automatically assume the ASL stance is the same as being anti-alleviation, or indeed anti-implantation or anti-technology.  Online views are mostly those of activism, NOT the grassroots.

If researchers want a topic that tests their research abilities, then try the NON-signing deaf and serious hearing loss areas (the majority, who survive without the angst, the culture or the support, and without the need for sign too): how DO they do this?  Anyone can cut and paste online from the usual suspects and get sidetracked by the smokescreen of deprivation, discrimination and cultural martyrdom.  This isn't 1880, it's 2019.


For an undergraduate research project, Loh merged these two interests — sign language and the Middle East — and received a grant to study the pedagogical structure of a school for the deaf in Jordan, picking up some Jordanian Sign Language in the process to carry out the research. “Sign languages are different in every country,” Loh explains, “because they emerge naturally within communities. They develop individually and become different languages, just as spoken languages do. American Sign Language and British Sign Language, for example, are different sign languages even though these signers are all surrounded by English speakers.”

Soon, however, Loh began to explore assistive technology and, in particular, cochlear implants. These devices are surgically implanted and bypass the normal acoustic hearing process with electronic signals; these stimulate the auditory nerve to provide a sense of sound to the user. “Implants were controversial within the deaf community in the United States at first,” says Loh, “and still are, to some extent. There was a fear of what they would mean for the future of the deaf community. There were scholars who described cochlear implants for the deaf as a form of cultural or linguistic genocide. That sounds like an extreme description, but it really does index the depth of attachment that people have to a sense of themselves as deaf. So, I started thinking about the implications that technology has in the world of the deaf and for their ability to navigate the world.”