# pp-sunspot-sunflare

Our project concept is to use the wireless capabilities and accelerometers of the SunSPOT platform to record and recognize human gestures for command and control applications. Users with limited mobility often have difficulty using traditional computer input devices such as keyboards and mice. Our gesture recognition framework, the SunSPOT Framework and Language for Action Recognition (SunFLARE), will allow users to dynamically capture movements they are capable of making (accommodating a range of physical disabilities) and associate those movements with actions.
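
As a rough illustration of the capture step, the sketch below samples a SunSPOT's built-in accelerometer while a gesture is performed. This is a minimal sketch, not part of the project: the class name, sampling rate, and trace format are illustrative assumptions, and the accelerometer calls follow the standard SunSPOT demo sensor board library rather than any API defined here.

```java
import com.sun.spot.sensorboard.EDemoBoard;
import com.sun.spot.sensorboard.peripheral.IAccelerometer3D;
import com.sun.spot.util.Utils;
import java.io.IOException;

// Hypothetical sketch: sample the accelerometer at roughly 20 Hz while the
// user performs a gesture, collecting raw (x, y, z) readings that a
// recognizer could later compare against stored gesture templates.
public class GestureCapture {
    public double[][] record(int samples) throws IOException {
        IAccelerometer3D accel = EDemoBoard.getInstance().getAccelerometer();
        double[][] trace = new double[samples][3];
        for (int i = 0; i < samples; i++) {
            trace[i][0] = accel.getAccelX();
            trace[i][1] = accel.getAccelY();
            trace[i][2] = accel.getAccelZ();
            Utils.sleep(50); // ~20 Hz sampling interval
        }
        return trace;
    }
}
```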

We envision SunFLARE as an extensible plug-in framework that lets developers add application-specific control functionality. Examples of applications that could take advantage of the framework (some of which we plan to develop as part of the semester project) include manipulation of an on-screen keyboard, limited-mobility games, and even motor control/physical therapy applications. See the Developer's Guide for more information on how to build third-party applications using the SunFLARE Service.
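
To make the plug-in idea concrete, the hypothetical sketch below shows one way a third-party plug-in might map recognized gesture names to on-screen keyboard actions. The interface name, method signature, and gesture names are illustrative assumptions, not the actual SunFLARE Service API; consult the Developer's Guide for the real interfaces.

```java
// Hypothetical plug-in contract: the framework recognizes a gesture and
// forwards its name to each registered plug-in, which maps it to an
// application-specific action.
public interface SunFlarePlugin {
    /** Called when a registered gesture is recognized. */
    void onGesture(String gestureName);
}

// Example plug-in driving an on-screen keyboard (gesture names are made up).
class OnScreenKeyboardPlugin implements SunFlarePlugin {
    public void onGesture(String gestureName) {
        if ("swipe-right".equals(gestureName)) {
            moveCursorRight();   // advance the keyboard cursor
        } else if ("nod".equals(gestureName)) {
            selectCurrentKey();  // commit the highlighted key
        }
    }

    private void moveCursorRight() { /* application-specific */ }
    private void selectCurrentKey() { /* application-specific */ }
}
```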

This project will benefit not only the disabled community but also general users interested in associating a particular movement with an action. Though beyond the scope of our semester project, our gesture recording and plug-in framework could be used to rapidly develop motion-response applications for SunSPOTs.