As anyone who has attempted a bakasana knows, perfecting certain yoga poses takes time, and watching an instructor twist and bend into position can help a lot.
The blind or sight-impaired, however, don't have the advantage of being able to see a teacher's movements.
Enter Eyes-Free Yoga, a software program out of the University of Washington that works with the cameras in Microsoft's Kinect motion sensor device to track users' position and offer spoken feedback in real time. "Rotate your shoulders left," it might say. "Lean sideways toward your left," "Bend your right leg further," or "Bring your arms closer to your head."
Currently, the virtual yoga instructor offers spoken feedback for six yoga poses, including Warrior I and II, Tree, and Chair, and contains about 30 different commands for improving each.
Project lead Kyle Rector, a UW doctoral student in computer science and engineering, conferred with a number of yoga teachers to nail the proper stance for each pose. She also did lots of yoga while writing the Eyes-Free Yoga code, testing and tweaking it by purposefully making mistakes while exercising.
Using skeletal-tracking technology and basic geometry, the Kinect suggests alignment for a user's core, then proceeds to the head and neck area and the arms and legs. It also gives kudos when someone holds a pose correctly.
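The paper's exact rules and thresholds aren't reproduced here, but the underlying idea is straightforward: compare the angle formed by three tracked joints against a target and speak a correction when it falls outside tolerance. A minimal sketch of that geometry follows; the function names, the 90-degree target, and the tolerance are illustrative assumptions, not values from the UW system.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by the segments b->a and b->c.

    Points are (x, y) coordinates such as those reported by a
    skeletal tracker for hip, knee, and ankle joints.
    """
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    mag = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(dot / mag))

def knee_feedback(hip, knee, ankle, target=90.0, tolerance=10.0):
    """Return a spoken-style correction for a bent front knee
    (as in Warrior pose), or None if the angle is close enough.
    Target and tolerance here are hypothetical, not the paper's values."""
    angle = joint_angle(hip, knee, ankle)
    if angle > target + tolerance:
        return "Bend your right leg further"
    if angle < target - tolerance:
        return "Straighten your right leg slightly"
    return None  # pose held correctly; the real system offers praise here
```

For example, a straight leg (hip, knee, and ankle collinear) yields a 180-degree angle and triggers "Bend your right leg further," while a right-angled knee produces no correction.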
Rector and her collaborators -- Julie Kientz, a UW assistant professor in Human Centered Design & Engineering, and Cynthia Bennett, a research assistant in computer science and engineering -- detail their research in the conference proceedings of the Association for Computing Machinery's SIGACCESS International Conference on Computers and Accessibility, which takes place in Bellevue, Wash., next week.
They tested Eyes-Free Yoga on 16 blind and low-vision people, some of whom had never tried yoga before. Of the testers, 13 said they would recommend the "exergame." Though they suggested ways the researchers could improve the program, almost all said they would use it again.
Some tools for visually impaired yoga practitioners already exist, including a Visually Impaired Yoga Mat with raised and depressed sections strategically placed to guide a student's hands, feet, and head, and a yoga board that communicates through body sensations.
The UW researchers acknowledge that for more complex yoga poses, human instruction may need to augment the customized vocal commands of Eyes-Free Yoga. But they nonetheless regard it as a promising way to transform a typically visually guided activity into something blind people can enjoy more easily.
"I see this as a good way of helping people who may not know much about yoga to try something on their own and feel comfortable and confident doing it," Kientz said in a statement. "We hope this acts as a gateway to encouraging people with visual impairments to try exercise on a broader scale."
This isn't the first time the Kinect or Kinect-like hardware has been tapped to assist the blind. It's been turned into a haptic navigation belt and used to give voice notifications of physical obstacles on a path. The Kinect has also been used to advance health and health education in other ways: enabling surgeons to view and manipulate medical images via gesture and voice control and helping to teach anatomy.
The UW team plans to make Eyes-Free Yoga available online so users, blind and sighted alike, can download the program, plug in their Kinect, and start getting into Tree pose. The team also is pursuing other fitness-related projects.