This Human-Computer Interaction (HCI) project, developed using Python 3.6, enables users to control the mouse cursor with facial gestures via a regular webcam. No external hardware, sensors, or touch required.
- Completely hands-free mouse control
- Works with standard webcams
- No wearable gear or special sensors
- Squinting: Partial eye closure (as in bright light)
- Winking: Brief closure of one eye
- Head movement: Pitch and yaw of the head
- Mouth opening: A small open-mouth gesture
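The gestures above can be turned into cursor actions with a simple per-frame classifier. This is only an illustrative sketch, not the project's actual logic: the function name, thresholds, and gesture-to-action mapping are all assumptions.

```python
# Hypothetical sketch: map per-frame gesture measurements to cursor actions.
# Thresholds and the action mapping are illustrative, not the project's values.

def classify_gestures(ear_left, ear_right, mar,
                      ear_thresh=0.20, mar_thresh=0.60):
    """Return the list of action names triggered this frame.

    ear_left / ear_right: eye aspect ratios (drop as the eye closes).
    mar: mouth aspect ratio (rises as the mouth opens).
    """
    actions = []
    left_closed = ear_left < ear_thresh
    right_closed = ear_right < ear_thresh
    if left_closed and right_closed:
        actions.append('scroll_mode')    # both eyes squinted
    elif left_closed:
        actions.append('left_click')     # left wink
    elif right_closed:
        actions.append('right_click')    # right wink
    if mar > mar_thresh:
        actions.append('toggle_cursor')  # open-mouth gesture
    return actions
```

For example, `classify_gestures(0.15, 0.30, 0.2)` would report a left wink as `['left_click']`. A real implementation would also debounce across frames so a blink is not mistaken for a wink.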
Configuration for custom gestures is under development.
Big thanks to Adrian Rosebrock for his insightful blog posts, code snippets, and the imutils library, which made this project feasible.
- Numpy 1.13.3
- OpenCV 3.2.0
- PyAutoGUI 0.9.36
- Dlib 19.4.0
- Imutils 0.4.6
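The pinned versions above can be captured in a `requirements.txt`. The PyPI package names below are assumptions (on PyPI, Dlib is published as `dlib` and OpenCV as `opencv-python`, whose releases use four-part version strings, so the OpenCV pin may need adjusting):

```
numpy==1.13.3
opencv-python==3.2.0
PyAutoGUI==0.9.36
dlib==19.4.0
imutils==0.4.6
```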
- Install the dependencies listed above.
- Download the pretrained model from Dlib: `shape_predictor_68_face_landmarks.dat.bz2`. Extract the archive and place the `.dat` file in the `model/` directory.
- Run the application:

      python mouse-cursor-control.py
If you run into issues, feel free to open an issue.
Some gestures may feel awkward in public. As a person managing benign positional vertigo, I empathize and aim to improve gesture comfort. Community feedback for more socially acceptable gestures is welcome!
Facial landmarks are detected using Dlib's pretrained model, enabling recognition of actions like blinking, winking, and mouth movement.
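Dlib's predictor returns 68 points in a fixed order, so each facial region lives at a known index range. A minimal sketch of slicing those regions out of a landmark array (the ranges follow the standard 68-point layout, as in `imutils.face_utils.FACIAL_LANDMARKS_IDXS`; the helper name is my own):

```python
import numpy as np

# Index ranges of the standard 68-point Dlib landmark layout.
LANDMARK_IDXS = {
    'jaw': slice(0, 17),
    'right_eyebrow': slice(17, 22),
    'left_eyebrow': slice(22, 27),
    'nose': slice(27, 36),
    'right_eye': slice(36, 42),
    'left_eye': slice(42, 48),
    'mouth': slice(48, 68),
}

def extract_region(landmarks, name):
    """Slice one facial region out of a (68, 2) array of (x, y) landmarks."""
    return landmarks[LANDMARK_IDXS[name]]
```

Blink, wink, and mouth gestures are then computed from just the `right_eye`, `left_eye`, and `mouth` slices.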
A simple metric for detecting blinks is the eye aspect ratio (EAR), which stays roughly constant while the eye is open and drops toward zero as it closes:

    if EAR <= threshold:
        EYE_STATUS = 'CLOSE'

Similar to the EAR, the mouth aspect ratio (MAR) increases as the mouth opens.
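The EAR itself can be computed from the six landmarks of one eye. A minimal sketch, assuming a `(6, 2)` NumPy array of eye points in Dlib's ordering; the formula is the one from Soukupová and Čech's blink-detection work, and the function name is my own:

```python
import numpy as np

def eye_aspect_ratio(eye):
    """EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|) for a (6, 2) eye array.

    The two vertical distances shrink as the eyelids close, while the
    horizontal distance stays nearly constant, so the ratio falls toward 0.
    """
    a = np.linalg.norm(eye[1] - eye[5])  # vertical distance p2-p6
    b = np.linalg.norm(eye[2] - eye[4])  # vertical distance p3-p5
    c = np.linalg.norm(eye[0] - eye[3])  # horizontal distance p1-p4
    return (a + b) / (2.0 * c)
```

The MAR follows the same pattern with mouth landmarks, except that the gesture of interest is the ratio rising above a threshold rather than falling below one.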
The model uses:
- HOG + linear SVM for face detection
- Ensemble of Regression Trees for landmark prediction (68 facial landmarks)
Trained on: iBUG 300-W dataset
Note: The iBUG dataset license does not allow commercial use. Contact Imperial College London for commercial licensing information.