Old 01-23-2011, 02:23 PM   #12
Lamplighter
Quote:
Originally Posted by Clodfobble
Er... because if you have enough muscle strength and cognition to operate an artificial speech machine with your jaw and tongue... you could just talk. There are, however, mouth-operated devices for people with physical disabilities that can move a wheelchair around, or reach for things.
Clod, I didn't say it very well.
I was trying to get across the idea that no matter what movements the person made
with their jaw and/or tongue, those movements could gradually be learned, interpreted (and filtered)
by the computer, and translated into speech or other more purposeful actions.
Sort of like this game
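
Something like this rough Python sketch is the idea (the sensor readings, class name, and phrases are all made up for illustration): the computer learns whichever movements the person can actually repeat, filters out jitter, and maps each one to speech or some other action.

Code:
# Minimal sketch: learn whatever movements a user can repeat and map them
# to output phrases. The sensor format, feature vectors, and phrases are
# hypothetical -- just to illustrate "learn, filter, translate".
from collections import Counter, deque

class MovementTranslator:
    def __init__(self, smoothing_window=5):
        self.centroids = {}                           # label -> averaged feature vector
        self.recent = deque(maxlen=smoothing_window)  # rolling window to filter jitter

    def train(self, label, samples):
        """Average several repetitions of one movement into a centroid."""
        n = len(samples)
        dims = len(samples[0])
        self.centroids[label] = [
            sum(s[d] for s in samples) / n for d in range(dims)
        ]

    def _nearest(self, features):
        """Return the label whose centroid is closest (squared distance)."""
        return min(
            self.centroids,
            key=lambda lbl: sum(
                (f - c) ** 2 for f, c in zip(features, self.centroids[lbl])
            ),
        )

    def translate(self, features):
        """Classify one reading, then majority-vote over the recent window."""
        self.recent.append(self._nearest(features))
        return Counter(self.recent).most_common(1)[0][0]


# Hypothetical usage: two jaw/tongue gestures mapped to two phrases.
t = MovementTranslator()
t.train("yes", [[0.9, 0.1], [0.8, 0.2], [0.85, 0.15]])
t.train("water", [[0.1, 0.9], [0.2, 0.8], [0.15, 0.85]])
print(t.translate([0.82, 0.18]))   # -> "yes"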

I've been put off by some learning situations where so many different actions
need to be coordinated and perfected... such as hand-eye.

I once saw a demonstration of an artificial hand where a deaf-blind person could finger-spell on the hand and a computer translated it into speech.
It also worked in the other direction... with speech recognition, the computer
would manipulate the artificial hand into finger-spelling to be read by the deaf-blind person.
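
A toy version of that two-way idea, assuming a made-up per-finger pose encoding: speech-recognized text gets spelled out as hand poses for the deaf-blind reader, and poses sensed on the person's fingers get turned back into text for speech output.

Code:
# Minimal sketch of the two-way finger-spelling hand. The pose encoding is
# invented; a real device would drive servo positions and read contact sensors.

LETTER_TO_POSE = {
    "h": (1, 0, 1, 1, 0),   # hypothetical per-finger positions
    "i": (0, 0, 0, 0, 1),
}
POSE_TO_LETTER = {pose: letter for letter, pose in LETTER_TO_POSE.items()}

def spell_to_hand(text):
    """Speech-recognized text -> sequence of hand poses for the artificial hand."""
    return [LETTER_TO_POSE[ch] for ch in text.lower() if ch in LETTER_TO_POSE]

def read_from_hand(poses):
    """Finger-spelled poses sensed on the hand -> text for speech output."""
    return "".join(POSE_TO_LETTER.get(p, "?") for p in poses)

print(spell_to_hand("hi"))                  # poses to actuate
print(read_from_hand([(1, 0, 1, 1, 0)]))    # -> "h"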