When our company’s co-founder encouraged everyone in our Engineering department to participate in the Google Glass Explorer contest, I thought about project ideas that could help people using the unique features of this new augmented-reality technology. I remembered a project some fellow students did during a robotics class I took in graduate school: it used eye-tracking technology to remotely control the motors on a vehicle. After confirming that Google planned to embed eye-tracking technology in their new product, I realized this idea could work for applications such as wheelchairs.

My plan is to provide feedback about the wearer’s surroundings, including obstacles and suggested paths, and to let the wearer control the wheelchair with eye movements. The original student project used patterns of the user’s eyes being open or closed to switch between types of motion; for my project, I want to use subtle yet deliberate eye movements so the user can interact seamlessly with the surrounding environment. I think this technology could be life-changing for people with disabilities, and I hope that working on this project with the support of the Google Glass Explorer program will help make it a reality.
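To make the idea concrete, here is a rough sketch of how deliberate eye gestures might be turned into wheelchair commands. Everything here is an illustrative assumption on my part: the gesture names, thresholds, and command set are invented for the sketch, not part of any real Glass API or wheelchair controller. A long deliberate blink toggles a “driving” mode (so ordinary glances don’t move the chair), and while driving, looking well left or right steers while looking ahead moves forward.

```python
# Hypothetical sketch only: gesture names, thresholds, and commands are
# illustrative assumptions, not a real Glass or wheelchair API.
from dataclasses import dataclass
from enum import Enum


class Command(Enum):
    STOP = "stop"
    FORWARD = "forward"
    TURN_LEFT = "turn_left"
    TURN_RIGHT = "turn_right"


@dataclass
class GazeSample:
    """One eye-tracker reading: horizontal gaze offset (degrees) and
    whether the eye is currently closed."""
    x_deg: float
    closed: bool


class GazeController:
    """Turns a stream of gaze samples into motion commands.

    A sustained long blink toggles between idle and driving, so casual
    glances never move the chair; while driving, gaze direction steers.
    """
    LONG_BLINK_SAMPLES = 5    # consecutive closed samples = deliberate blink
    TURN_THRESHOLD_DEG = 10.0  # how far off-center counts as a turn

    def __init__(self) -> None:
        self.driving = False
        self._closed_run = 0

    def update(self, sample: GazeSample) -> Command:
        if sample.closed:
            self._closed_run += 1
            if self._closed_run == self.LONG_BLINK_SAMPLES:
                self.driving = not self.driving  # long blink toggles mode
            return Command.STOP  # always stop while the eye is closed
        self._closed_run = 0
        if not self.driving:
            return Command.STOP
        if sample.x_deg <= -self.TURN_THRESHOLD_DEG:
            return Command.TURN_LEFT
        if sample.x_deg >= self.TURN_THRESHOLD_DEG:
            return Command.TURN_RIGHT
        return Command.FORWARD
```

A real system would of course need debouncing, safety interlocks, and obstacle feedback layered on top, but the state-machine core above captures the “deliberate gesture switches mode” idea from the original student project.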

I wrote up my idea and posted it on Google Plus with the #ifihadglass hashtag, and some fellow Wayfairians tweeted about it. Walter Frick of BostInno saw a tweet, interviewed me, and wrote an article about the project. You can read the full story at these links:

BostInno: http://bit.ly/X7nD6b
Popular Science write-up: http://bit.ly/16unjVS