sign language at a distance, human-computing interface meta, emerging need 

Oi, great and mighty fedi, whomst collectively knows much that is hidden from i/we, a lowly engineer...

...what are good options for integrating JUST sign language support into an app?

--

(This post brought to you by the extensive sign language shared between myself & kiddo: when I am outside the room, or not paying attention when they try to communicate, their distress at being unable to communicate (still only a handful of verbalized words) or at having nobody who understands it well enough to translate pushes their anxiety levels (and subsequent externalizing of their internal chaos & frustration) into the space.)

===
What development kits are available, are there things that actually work at the moment, give me what you've got.

Due to...stress...among other things...I am having increasingly extended periods of nonverbal. During those times, my sign language use can become...difficult to read for people operating outside my hyperaccelerated internal time model. Even a simple computer has enough processing power to more than keep up with my rate of hand-words tho, and my first thought, using something like the Leap Motion controller (adafruit.com/product/2106), seemed...like it was going to run into some problems almost immediately. Most significantly, my lack of knowledge about what currently works, because I'd like to start there rather than devving in isolation, as is my normal.

This is way outside my knowledge / experience level, but I am certain that I can eventually tinker my way towards understanding and a solution of sorts, and I'm actively seeking advice here.

The goal is something capable of USB output in the form of English-language words, but the idea is a completely self-contained sensor & display that would allow me to set it down, make a series of signs, and then mildly correct the intent before showing it to someone, sending it in chat, etc.
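That capture → review → send loop could be sketched roughly like this. This is a pure-Python sketch of the flow only; `recognize` is a stand-in, since the actual recognizer is the whole open question of this thread:

```python
# Sketch of the sign -> review -> send flow described above.
# recognize() is a placeholder, not a real recognizer.

def recognize(frame_batch):
    """Stand-in: maps a batch of captured frames to candidate words."""
    return ["hello", "world"]  # hypothetical output

def compose_message(frame_batches, edit=lambda words: words):
    """Run recognition over each batch, then let the signer
    correct the draft before it goes anywhere."""
    draft = [w for batch in frame_batches for w in recognize(batch)]
    return " ".join(edit(draft))

# Usage: drop one word during the "mildly correct the intent" step.
msg = compose_message([["frame0"], ["frame1"]],
                      edit=lambda ws: [w for w in ws if w != "world"])
# msg == "hello hello"
```

The important structural point is the `edit` hook: recognition output is always a draft, never sent directly.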

(This is an interface, and comments pushing dogma or techno-theology will not be appreciated; I've already heard it a few times and still disagree with basically everything, thanks.)

Boost okay, interactions and questions are okay, even just a link to that one thing you saw and saved about an adjacent topic might give me a place to evolve from.

Thanks, y'all.
Peace on your path.

sign language at a distance, human-computing interface meta, emerging need 

@jakimfett I just want to say that I'm LOVING this idea. Please pursue this <3

sign language at a distance, human-computing interface meta, emerging need 

@jakimfett I must admit, I have caught myself dreaming of a sign-language implementation of the uxntal assembler 😳

sign language at a distance, human-computing interface meta, emerging need 

@jakimfett Is that Leap Motion thing able to track everything needed for sign language? The videos seem to imply hand-only tracking, whereas I thought sign language used most of the whole upper body.

If you have to buy hardware anyways, a Kinect might be a better starting point since it tracks the whole body.

"sign language gesture recognition" actually throws up a lot of results... I was doing OpenCV homework anyway, so I'll take a look

sign language at a distance, human-computing interface meta, emerging need 

@jakimfett I just realized I have no idea if a Kinect can track individual fingers. Probably not, given its age.

This one seems to aim to not require anything more than a smartphone:
sciencedirect.com/science/arti

sign language at a distance, human-computing interface meta, emerging need 

@jakimfett
This apparently won some kind of contest and is under a CC license.

github.com/jackyjsy/CVPR21Chal

sign language at a distance, human-computing interface meta, emerging need 

@jakimfett
There are a bunch more under this GitHub tag github.com/topics/sign-languag

re: sign language at a distance, human-computing interface meta, emerging need 

@csepp you did some heavy lifting for me, thank you so much!

And yeah, I'd originally discounted the Kinect bc the easy-to-find ones are the older ones, and I'm optimizing this to work with, like...two old smartphones that you prop up at 90ish degrees to one another. Flexible enough to swap in the Leap Motion plus a Kinect for improved accuracy once the software is "working as intended".
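For what the two-phones-at-90° geometry buys you: if the cameras are roughly orthogonal, the front camera sees (x, y) and the side camera sees (z, y), so a crude fusion into 3D is just combining axes. This sketch assumes the cameras are aligned and share a scale, which a real rig would need proper stereo calibration to achieve:

```python
# Crude 3D fusion for two cameras at ~90 degrees to one another.
# Assumes aligned, same-scale cameras -- a real setup needs actual
# stereo calibration; this only shows why two views give depth.

def fuse(front_xy, side_zy):
    """Front camera sees (x, y); side camera sees (z, y).
    Average the shared y axis and combine into (x, y, z)."""
    x, y_front = front_xy
    z, y_side = side_zy
    return (x, (y_front + y_side) / 2, z)
```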

Gonna delve into this and see what will work with what I've got on hand, and stepping-stone from there.

Thank you again for putting in this work.

re: sign language at a distance, human-computing interface meta, emerging need 

@jakimfett No prob. UwU

I also forwarded it to my linguist friend who is studying sign language, and they sent me this:
sign-lang.uni-hamburg.de/hamno
Maybe it would be possible to decompose motions into "sign characters".
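The decomposition idea maps onto a simple data model: HamNoSys-style notation breaks a sign into parameters like handshape, location, and movement, so a lexicon keyed on component tuples means the recognizer only has to get the components right, not whole signs. A sketch — these entries are invented examples, not real HamNoSys transcriptions:

```python
# Sketch of component-based sign lookup: a sign is a tuple of
# (handshape, location, movement), and the lexicon maps component
# tuples to words. Entries are hypothetical, for illustration only.

LEXICON = {
    ("flat", "chin", "forward"): "thank-you",
    ("fist", "chest", "circle"): "sorry",
}

def lookup(handshape, location, movement):
    """Resolve recognized components to a word, if known."""
    return LEXICON.get((handshape, location, movement), "<unknown>")
```

The appeal for a home-grown sign system is that adding a new sign is just adding a lexicon entry, rather than retraining a whole-sign model.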

Anyways, I hope you figure something out! I quickly realized most of it requires machine learning which I have not studied, but I'll be curious how this progresses.

re: sign language at a distance, machine learning, human-computing interface meta, emerging need 

@csepp oh that's excellent, exactly the sort of evolution of the idea I was hoping had emerged in the last few years since I poked this.

Also, machine learning is...a bit of a grey area for me. I've avoided it, intentionally, because of how closely it approaches using my skills to enforce servitude, and because of how often I hear people talking about how the underlying mainstream approaches to ML/AI have taken a very much unethical-if-they-were-sentient-but-they-aren't-so-who-cares sort of turn over the course of the last decade or so.

At one point in time, there was an effort to re-imagine neural nets as less of a tool and more of a curiosity, like one might make space for an ant farm or bird feeder, rather than as something to accomplish an output. I probably need to think and study this a bit more before I start getting too deep into my feelings about computers as companions rather than...idk. Whatever that other shyte is that mainstream does. XD

(soz for soapboxing <_<)

sign language at a distance, human-computing interface meta, emerging need 

@jakimfett I don't have any useful place to start but one thing to note is that sign language is not international. That might seem obvious but that doesn't just mean German Sign Language is different, but also BSL (British) is VERY different from ASL (American). Makes finding resources or even collaboration harder. Good luck in your endeavours though!

re: sign language at a distance, human-computing interface meta, emerging need 

@martyn yup.

The implementation is / would be in our local language (definitely not English, lol; I did say it was unique between me and kiddo, which kinda implies a nonstandard bias), and the plan is to make it easy for my kid to swap in their own algorithm for language, and ideally a way to train the recognition model locally too. And of course, making it easy for other kids to make it work on their shyte.

We'll see how far I get. Dreams are fragile things in this world, especially now.
