A Python-based computer vision system that uses MediaPipe Pose to recognize body gestures like jumping, crouching, hand-joining, and horizontal movement to control games like Subway Surfers or Temple Run without touching a keyboard!
This project is designed to be extended to play endless runner games using your body:
| Game | Gesture | Action Triggered |
|---|---|---|
| Subway Surfers | Jump | Character jumps |
| Subway Surfers | Crouch | Character rolls |
| Subway Surfers | Move Left/Right | Character dodges |
| Temple Run | Hands Joined | Start or pause the game |
With some automation via `pyautogui`, gestures can be mapped to keystrokes (e.g., jump = up arrow, crouch = down arrow).
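A minimal sketch of that gesture-to-keystroke mapping (the gesture names, the `GESTURE_KEYS` table, and `trigger_action` are illustrative assumptions, not the project's actual code):

```python
# Hypothetical mapping from detected gesture names to keyboard keys.
GESTURE_KEYS = {
    "jump": "up",
    "crouch": "down",
    "left": "left",
    "right": "right",
}

def trigger_action(gesture):
    """Press the key mapped to a detected gesture, if any.

    Returns the key name that was pressed, or None for unmapped gestures.
    """
    key = GESTURE_KEYS.get(gesture)
    if key is not None:
        import pyautogui  # imported lazily so the mapping itself works headless
        pyautogui.press(key)  # simulate a single keystroke in the focused game window
    return key
```

The game window must have keyboard focus for the simulated key presses to reach it.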
- Real-time pose tracking using MediaPipe
- Hand gesture detection (joined / not joined)
- Horizontal movement detection (left / center / right)
- Posture classification (jump / stand / crouch)
- Webcam-based gesture recognition
- Easy integration with automation tools
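The posture and horizontal-movement checks can be sketched as simple threshold tests on MediaPipe's normalized landmark coordinates, where `y = 0` is the top of the frame (the function names and threshold values below are illustrative assumptions, not the project's actual code):

```python
def classify_posture(shoulder_y, baseline_y, jump_thresh=0.08, crouch_thresh=0.08):
    """Classify posture from the normalized shoulder height.

    baseline_y is the shoulder height recorded while standing upright;
    the thresholds are illustrative and need tuning per camera setup.
    """
    if shoulder_y < baseline_y - jump_thresh:
        return "jump"    # shoulders moved up relative to baseline
    if shoulder_y > baseline_y + crouch_thresh:
        return "crouch"  # shoulders moved down relative to baseline
    return "stand"

def classify_horizontal(nose_x):
    """Split the frame into thirds to get a left / center / right position."""
    if nose_x < 1 / 3:
        return "left"
    if nose_x > 2 / 3:
        return "right"
    return "center"
```

Recording the standing baseline once at startup (e.g., while the hands-joined start gesture is held) keeps the classification robust to different camera heights.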
- Python
- OpenCV
- MediaPipe
- Matplotlib
- PyAutoGUI (optional, for automation)
1. Clone the repository:

   ```
   git clone https://github.com/yourusername/pose-gesture-detection.git
   cd pose-gesture-detection
   ```

2. Install the required packages:

   ```
   pip install opencv-python mediapipe matplotlib pyautogui
   ```

3. Run the program:

   ```
   python subwaySurfers.py
   ```
This will:

- Open your webcam
- Detect your pose and classify gestures
- Optionally simulate key presses based on the detected gestures
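For example, the hands-joined gesture that starts or pauses the game can be detected by checking whether the two wrist landmarks are close together in normalized image coordinates (the function and threshold below are an illustrative sketch, not the project's code):

```python
import math

def hands_joined(left_wrist, right_wrist, thresh=0.1):
    """Return True when the wrists are close enough to count as joined.

    left_wrist / right_wrist are (x, y) pairs in MediaPipe's normalized
    [0, 1] image coordinates; thresh is an illustrative tuning value.
    """
    return math.dist(left_wrist, right_wrist) < thresh
```

Debouncing the result over a few consecutive frames avoids toggling start/pause on a single noisy detection.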
```
.
├── subwaySurfers.py   # Main script to run the real-time system
└── README.md          # Project documentation
```
Built using:

- MediaPipe Pose
- OpenCV
- PyAutoGUI