It's probably not all that surprising that brand new Stanford undergraduates would be interested in messing around with robots, computer programming and touch-based feedback systems, the subject of a freshman seminar called "Haptics: Engineering Touch" taught by mechanical engineering professor Allison Okamura, PhD. But it was interesting, Okamura says, how many of her students chose to combine those elements into devices to help people with conditions like diabetes or blindness.
Her students are just starting out in college and many are unsure about their future plans, making their focus on assistive technologies all the more intriguing, says Okamura, a member of Stanford Bio-X and the Stanford Neurosciences Institute.
Okamura held an open house recently, where the class's six teams got a chance to show off the devices they had created, which included a high-tech version of the game Operation intended to help train surgeons, a vibrating glove designed to give something like depth perception to blind people, and something called the "Haptic Headband." A steady stream of curious graduate students, postdocs and professors came by to check out the scene, a testament to just how interesting the students' work was, Okamura says.
Among those curious visitors were university photographer Linda Cicero and me. We chronicled our visit in a story and photo gallery, including the shot above of visiting scholar Rui Yang testing a depth-sensing glove. The glove could help blind people map out their environment, or aid workers in low-light situations, team member Alema Fitisemanu (not pictured) explained.
Previously: Replicating the sensitivity of human touch in robots and Touch-sensitive, self-healing synthetic skin could yield smarter prosthetics
Photo by L.A. Cicero