"Collaborative networked framework for the rehabilitation of children with Down's Syndrome" (PDF) is an old paper from the University of Averio in Portugal, but the project described is really interesting. Presented at the third International Conference on Disability, Virtual Reality and Associated Technology in Alghero, Italy in 2000, the authors propose "a multi-user virtual communication platform that enables rehabilitation and social integration of Down's Syndrome children."
UK charity AbilityNet is partnering with Excitim Ltd. to make PlayStation, PS2, and PS3 controllers that can be operated with head movement. Part of their "Dream" line of adapted toys for kids with disabilities, the Dream-Gamer comes with a motion-sensing baseball cap which "enable[s] individuals to use head movements to control aspects of the game such as moving left, right, forward or backwards."
A ScienceDaily article from May profiles a video game that helps people with disabilities learn to shop at a supermarket. The game's development team was led by three 2008 graduates of Rensselaer Polytechnic Institute and is part of the CapAbility Games Research Group. The game is called Casual Shopper, and its supermarket is based on a local brick-and-mortar store called Price Chopper, right down to the blueprints:
A computer monitor set up directly in front of the user simulates the layout of the store, and a second monitor to the left displays a virtual shopping list. Users start the game by selecting a meal they’d like to make—such as a spaghetti dinner, a holiday ham, or even rotini with alfredo lobster sauce—and complete it when they’ve found all the items on their list.
I think simulating an actual place in a video game is a really neat project. Of course, I'm biased.
Helma van Rijn, a graduate student at the Delft University of Technology, developed a computerized toy to help young autistic children learn language. The project is called LINKX. A person can say a word (e.g. "fishbowl") into a kind of pictogram called a "speech-o-gram"; they then attach the speech-o-gram to the object it names. Kids can link special blocks to the speech-o-grams, which light up with colors and play sounds. Here's a video of LINKX in action:
(Note: While the video's spoken language is Dutch, English subtitles explain the action taking place.)
Fragile X syndrome is the most common inherited cause of mental retardation that we know of. (Down Syndrome is more common, but only 3-4% of cases are inherited.) Nevertheless, fragile X can take a long time to diagnose properly, because many doctors aren't familiar with its features. According to an article in last month's newsletter of UC Davis's Center for Information Technology Research in the Interest of Society (CITRIS), Dr. Randi Hagerman of the M.I.N.D. Institute is teaming up with media artists Greg Niemeyer and Kimiko Ryokai to create a video game that will not only help screen young children for some of the weaknesses associated with fragile X syndrome, such as deficits in visual-motor and visual-spatial skills, but could also help kids improve those skills.
Last month, the Meaningful Play conference was held at Michigan State University, sponsored by MSU's Serious Games department. Papers from the proceedings are available online, including John Richardson's paper "The Social Construction Model of Interactive Gaming for Disabled Users." The abstract states:
"Though some pragmatic thought has been put into making computer and video games as accessible to the disabled as such media as film and music, there has been a paucity of research and discourse on the social construction model as it applies to interactive games. With this model, such media impacts the self-identity, social spheres, and coping mechanisms of users with mobility, orientation, and/or neurological challenges. I explain how, on a high-level and conceptual basis, this model emerges out of the generative experiences and inherent feedback components of the interactive game medium, and attempt to frame both the importance of and challenges in implementing greater accessibility from a development perspective. The intent is not to merely state how the industry is overlooking an important demographic, but also to explain how interactive games can play a supportive role in the enrichment of the lives of those within it."
PhD candidate Stephen Vickers and his team at De Montfort University in Leicester, UK are designing software to help people with disabilities who rely on eye-gaze control play video games. Here are videos of Stephen demonstrating the software, called Snap Clutch, in World of Warcraft and Second Life.
In a paper Vickers co-authored for the 4th Cambridge Workshop on Universal Access and Assistive Technology ("Gaze Interaction with Virtual On-Line Communities: Levelling the Playing Field for Disabled Users"—available in his list of publications), eye-gaze control was compared to mouse control in Second Life. While eye gaze performed comparably in moving the avatar from place to place and in manipulating objects, it fared poorly when using applications within the game (thanks to the applications' tiny buttons) and when communicating in-game via an on-screen keyboard.
This is a video of a virtual disability simulation; its object is to get a character who uses a wheelchair from one end of a city to the other. The simulation uses the Cube 2 engine. It was designed in the summer of 2007 by Project Beta, a team of Philadelphia high school students involved in the Building Information Technology Skills (bITS) program. bITS is sponsored by the Information Technology and Society Research Group (ITSRG) at Temple University.