Team-Arm-E-Hands-SIH-24

We read electromyography (EMG) signals from sensors, then amplify, filter, and digitize them with an ADC before processing. The processed readings are used to train a deep-learning (DL) model that classifies which hand movement produced them. The trained model runs on an ESP/Raspberry Pi, which controls the prosthetic arm based on the predicted label, giving us an affordable yet effective myoelectric prosthetic arm.
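The processing chain described above (rectification, smoothing, and feature extraction before classification) can be sketched roughly as follows. This is an illustrative example only: the baseline, window sizes, and function names are assumptions, not taken from the repository's code.

```python
# Hypothetical sketch of the EMG pre-processing chain: rectify the raw
# ADC samples, smooth them with a moving-average envelope, and compute
# per-window features that a DL model could classify.
# All names and constants here are illustrative, not from this repo.

def rectify(samples, baseline=2048):
    """Full-wave rectify raw 12-bit ADC readings around a mid-scale baseline."""
    return [abs(s - baseline) for s in samples]

def envelope(samples, window=8):
    """Moving-average envelope over the rectified signal."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1) : i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def window_features(env, size=64):
    """Mean absolute value per window -- a common EMG feature."""
    return [sum(env[i : i + size]) / size
            for i in range(0, len(env) - size + 1, size)]
```

A classifier would then map each feature window to a movement label that the ESP/Raspberry Pi translates into arm motion.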

To run the project:

  1. Make sure the EMG data collection sensors are connected.
  2. Make sure you have ESP-IDF installed on your device.
  3. Go to src/EMG-Firmware.
  4. In the terminal, run the following commands:

```
get_idf
idf.py build
idf.py flash monitor
```

This starts the EMG data collection firmware. After updating the Wi-Fi SSID and password in the firmware, copy the ESP/Raspberry Pi's IP address from the monitor output.
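Once you have the device's IP address, a small client can read the streamed readings. The port number and the newline-delimited, comma-separated line format below are assumptions about the firmware's output, not documented behavior; adjust them to match what the monitor actually shows.

```python
# Hypothetical client sketch: connect to the ESP's IP (copied from the
# monitor) and parse newline-delimited ADC readings from a TCP stream.
# Port 3333 and the "ch0,ch1,..." line format are assumptions.
import socket

def parse_emg_line(line):
    """Parse one 'ch0,ch1,...' line of comma-separated ADC values."""
    return [int(v) for v in line.strip().split(",") if v]

def stream_samples(ip, port=3333):
    """Yield parsed sample rows from the device's TCP stream (assumed framing)."""
    with socket.create_connection((ip, port)) as sock:
        with sock.makefile("r") as stream:
            for line in stream:
                yield parse_emg_line(line)
```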
