Welcome back to Reading the Research, where I trawl
the Internet to find noteworthy research on autism and related subjects,
then discuss it in brief with bits from my own life, research, and
observations.
Today's article discusses a prototype piece of assistive technology that may help autistic individuals learn how to make better conversation.
Remember the Google Glass hubbub a few years back? Google developed and tested, essentially, a wearable computer built into a pair of glasses. It met a huge backlash, including privacy and safety concerns. After all, if you're driving and the glasses put something in the middle of your field of vision, how are you supposed to see the cars in front of you? And of course, nobody in Witness Protection wants someone near them taking pictures all willy-nilly. Same goes for a lot of plain folks, even some with nothing in particular to hide.
Apparently, despite the project's eventual death for public sale, "wearable assistive technology" isn't out of the picture entirely. Wikipedia lists a number of similar products under the name "Smartglasses."
This particular study merely names Google Glass as one of the types of "wearable technology" their prototype could be used with. They tested an app called "Holli" with 15 autistic children, asking them to be guided by the app while interacting socially. Holli was made to listen to a conversation, then provide possible appropriate responses for the user to give to move the conversation along. When the user gives a response, the provided responses disappear until the next exchange in the conversation.
Their simple example in the article was that, when the user was greeted with "Welcome," the app provides responses like "Hey," "Hello," and "Afternoon." The study went very well according to the article, with the app managing to complete most conversations without error, and the children having no real trouble following the promptings from the app. Basically, a successful trial run.
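The basic loop the article describes — hear an utterance, show candidate replies, clear them once the user responds — can be sketched in a few lines. To be clear, the study doesn't publish Holli's actual implementation; every name and phrase table here is my own illustration.

```python
# Toy sketch of the prompt-and-clear loop described above.
# The SUGGESTIONS table and function names are hypothetical,
# not taken from the Holli study.

# Hypothetical lookup: heard phrase -> candidate replies.
SUGGESTIONS = {
    "welcome": ["Hey", "Hello", "Afternoon"],
    "how are you?": ["Good, thanks", "Fine", "Not bad"],
}

def suggest_replies(heard: str) -> list[str]:
    """Return a fresh list of candidate responses for the phrase just heard."""
    return list(SUGGESTIONS.get(heard.strip().lower(), []))

def conversation_turn(heard: str, pick_reply) -> str:
    """One exchange: show suggestions, take the user's pick,
    then 'clear' the display until the next utterance."""
    options = suggest_replies(heard)
    reply = pick_reply(options)   # the user chooses (or speaks freely)
    options.clear()               # prompts disappear after responding
    return reply

# Example: the user takes the first suggestion offered.
print(conversation_turn("Welcome", lambda opts: opts[0]))  # prints "Hey"
```

Obviously the real app layers speech recognition and a heads-up display on top of something like this, but the exchange-by-exchange structure is the same.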
The researchers suggested further progress might be made in customizing the text size and placement, and in improving the app's speech recognition. Not everyone with autism has 20/20 vision, and of course some people don't enunciate well. But they seem optimistic about the matter.
While I recognize the importance of being able to carry on a conversation, and suspect such a device might have been helpful while I was growing up, I think they're missing the obvious possibility: body language. Assuming we can get past the privacy issues with cameras on glasses, of course.
Body language, you see, is unfortunately rather important to carrying out a conversation successfully. If a person is smiling, you want to know what kind of smile it is (mean, polite, friendly, happy), so you can adjust accordingly. Mean smiles mean you might be in trouble or someone might be making fun of you. Polite ones mean the person is being businesslike and you should handle your business efficiently and then leave them alone. Friendly ones mean you can consider making conversation a bit, but watch carefully for shifts in expression because not everyone wants to spend two minutes hearing about your day or your fascination with a video game. Happy smiles are awfully situational to me, and I tend to try to figure out whether they're happy about something unrelated to me, or if I've done something that made them happy.
That's one kind of facial expression. There's a ton of expressions that would be good to put into something like this: anger, fear, sadness, shock, tiredness, disgust, embarrassment, and thoughtfulness all come to mind as relatively different-looking expressions that could be programmed in.
Then there's gross body language. Like posture. People slouch or hunch, but they do so for multiple reasons. Some people hunch defensively. That can signal low self-esteem or being overwhelmed. I hunched when I stood, walked, and sat for most of my life. It wasn't on purpose. It was just what was natural to me, possibly because I was always depressed, always felt alone, and always felt defensive and overwhelmed. People also sometimes slouch if they're relaxed. They feel more comfortable like that than sitting bolt upright, or at least sitting straight. Finally, people can slouch in a threatening manner, a sort of "you don't scare me" relaxed-but-ready posture.
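If you wanted to wire that kind of cue-reading into an app like Holli, the simplest version is just a table pairing each detected cue with a short on-screen hint. This is purely my own sketch of the extension I'm proposing — the cue names and hint text are illustrations, not anything from the study.

```python
# Hypothetical mapping from a detected body-language cue to a display
# hint, following the smile and posture readings described above.
# None of this comes from the Holli study.

CUE_HINTS = {
    "mean smile": "Caution: you may be in trouble or being teased.",
    "polite smile": "Keep it brief and businesslike.",
    "friendly smile": "Small talk is OK; watch for shifts in expression.",
    "happy smile": "They're pleased; check whether it's about you.",
    "defensive hunch": "They may feel overwhelmed; go gently.",
    "relaxed slouch": "They're comfortable; no alarm needed.",
}

def hint_for(cue: str) -> str:
    """Return the display hint for a detected cue, or a neutral default."""
    return CUE_HINTS.get(cue, "No strong cue detected.")

print(hint_for("polite smile"))  # prints "Keep it brief and businesslike."
```

The hard part, of course, isn't the table — it's the computer vision that reliably tells a mean smile from a friendly one, which is exactly the skill autistic users would be leaning on the app for.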
Does this seem complicated to you when I explain it? Yet most people take knowing all this for granted. Simply having the socially correct words displayed in front of you doesn't seem like enough to me. Helpful, maybe, in basic social situations. But you could do so much more.