Gesture Volume Control #401
This project is made for issue #401
amankumar100 committed Jun 17, 2023
1 parent cad33f2 commit 4324a56
Showing 2 changed files with 68 additions and 0 deletions.
53 changes: 53 additions & 0 deletions Machine Learning/Gesture Volume Control/Code.py
@@ -0,0 +1,53 @@
import cv2
import mediapipe as mp
import numpy as np
from ctypes import cast, POINTER
from comtypes import CLSCTX_ALL
from pycaw.pycaw import AudioUtilities, IAudioEndpointVolume

# Initialize the MediaPipe hand detection model
mp_hands = mp.solutions.hands
hands = mp_hands.Hands(max_num_hands=1)

# Initialize the pycaw volume controller (Windows only)
devices = AudioUtilities.GetSpeakers()
interface = devices.Activate(IAudioEndpointVolume._iid_, CLSCTX_ALL, None)
volume = cast(interface, POINTER(IAudioEndpointVolume))
volMin, volMax, _ = volume.GetVolumeRange()

# Create a video capture object for the default webcam
cap = cv2.VideoCapture(0)

while True:
    # Capture a frame from the webcam
    ret, frame = cap.read()
    if not ret:
        break

    # MediaPipe expects RGB input; OpenCV captures frames in BGR
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)

    # Process the frame with the hand detection model
    results = hands.process(rgb)

    # Check if any hands were detected
    if results.multi_hand_landmarks:
        # Get the landmarks for the first hand
        hand = results.multi_hand_landmarks[0]

        # Landmarks are normalized to [0, 1]; convert to pixel coordinates
        h, w, _ = frame.shape
        thumb_x, thumb_y = hand.landmark[4].x * w, hand.landmark[4].y * h
        index_x, index_y = hand.landmark[8].x * w, hand.landmark[8].y * h

        # Calculate the distance between the thumb tip and index fingertip
        length = np.hypot(thumb_x - index_x, thumb_y - index_y)

        # Map the pixel distance to the volume range (in dB)
        volume_level = np.interp(length, [30, 350], [volMin, volMax])

        # Set the master volume level
        volume.SetMasterVolumeLevel(volume_level, None)

    # Display the frame
    cv2.imshow('Frame', frame)

    # Press 'q' to quit
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

# Release the video capture object and close all windows
cap.release()
cv2.destroyAllWindows()
15 changes: 15 additions & 0 deletions Machine Learning/Gesture Volume Control/Readme.md
@@ -0,0 +1,15 @@
This code uses the MediaPipe hand detection model to track the user's hand. The distance between the thumb tip and the index fingertip controls the volume level: pinching the fingers together lowers the volume, and spreading them apart raises it. The code also displays the webcam frame so that the user can see the hand gestures they are making.
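The core of the gesture mapping can be sketched in isolation. The helper below is a hypothetical standalone function, not part of the committed script; the pixel range [30, 350] and the default dB range are illustrative values, not measured from a specific device:

```python
import numpy as np

def distance_to_volume(thumb, index, vol_min=-65.25, vol_max=0.0):
    """Map the thumb-index fingertip distance (in pixels) to a volume level.

    thumb and index are (x, y) pixel coordinates. np.interp clips inputs
    outside [30, 350], so the result always stays within [vol_min, vol_max].
    """
    length = np.hypot(thumb[0] - index[0], thumb[1] - index[1])
    return float(np.interp(length, [30, 350], [vol_min, vol_max]))
```

Because `np.interp` clamps at both ends, a fully closed pinch maps to the minimum volume and any spread beyond 350 pixels maps to the maximum.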

To run the code, you will need to install the following libraries (pip package names in parentheses):

OpenCV (opencv-python)
MediaPipe (mediapipe)
NumPy (numpy)
pycaw (pycaw; Windows only)

Once you have installed the libraries, save the code as a Python file and run it from the command line. For example, if you saved the code as gesture_volume_control.py, run:

python gesture_volume_control.py

Alternatively, run the file directly from your editor or IDE.

1 comment on commit 4324a56

@amankumar100
Author


Please approve this one
