
No Hands? No Problem — Mouse Cursor Control Using Facial Movements

This Human-Computer Interaction (HCI) project, developed using Python 3.6, enables users to control the mouse cursor with facial gestures via a regular webcam. No external hardware, sensors, or touch required.

Features

  • Completely hands-free mouse control
  • Works with standard webcams
  • No wearable gear or special sensors

Supported Gestures

  • Squinting: Partial eye closure (like in bright light)
  • Winking
  • Head movement: Detects pitch and yaw
  • Mouth opening: Small open-mouth gesture

Configuration for custom gestures is under development.

Demo

(Demo GIFs are available in the repository.)

Special Thanks

Big thanks to Adrian Rosebrock for his insightful blog posts, code snippets, and the imutils library, which made this project possible.

Requirements

  • NumPy 1.13.3
  • OpenCV 3.2.0
  • PyAutoGUI 0.9.36
  • dlib 19.4.0
  • imutils 0.4.6

Setup Instructions

  1. Install dependencies:

  2. Download the pretrained model from dlib: shape_predictor_68_face_landmarks.dat.bz2. Extract it and place the .dat file in the model/ directory.

  3. Run the application:

    python mouse-cursor-control.py
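Step 1 above doesn't list a command; assuming pip is available, the pinned versions from the Requirements section could be installed like this (the PyPI package names, in particular opencv-python, are an assumption):

```shell
# Pinned versions from the Requirements section; PyPI package names assumed.
pip install numpy==1.13.3 opencv-python==3.2.0 PyAutoGUI==0.9.36 dlib==19.4.0 imutils==0.4.6
```

Note that dlib typically builds from source, so CMake and a C++ compiler need to be installed first.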

If you run into issues, feel free to open an issue.

Usage Notes

Some gestures may feel awkward in public. As a person managing benign positional vertigo, I empathize and aim to improve gesture comfort. Community feedback for more socially acceptable gestures is welcome!

How It Works

Facial landmarks are detected using Dlib's pretrained model, enabling recognition of actions like blinking, winking, and mouth movement.

Eye-Aspect-Ratio (EAR)

A simple metric for detecting blinks: the EAR is the ratio of the eye's height to its width, computed from six landmarks per eye, and it drops sharply toward zero when the eye closes:

if EAR <= threshold:
    EYE_STATUS = 'CLOSE'
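The project's exact implementation isn't shown here, but the standard EAR computation (from Soukupová and Čech's blink-detection work, popularized by Adrian Rosebrock) can be sketched as:

```python
import numpy as np

def eye_aspect_ratio(eye):
    """EAR from the six (x, y) eye landmarks, in dlib's 68-point ordering:
    corner, two upper-lid points, corner, two lower-lid points."""
    eye = np.asarray(eye, dtype=float)
    # Vertical distances between the upper and lower eyelid landmarks.
    a = np.linalg.norm(eye[1] - eye[5])
    b = np.linalg.norm(eye[2] - eye[4])
    # Horizontal distance between the two eye corners.
    c = np.linalg.norm(eye[0] - eye[3])
    return (a + b) / (2.0 * c)
```

With this helper, the snippet above becomes a comparison of `eye_aspect_ratio(eye)` against a fixed threshold; commonly reported thresholds fall around 0.2 to 0.3.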

Mouth-Aspect-Ratio (MAR)

Analogous to EAR: MAR increases as the mouth opens, so a sustained MAR above a threshold signals the open-mouth gesture.
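A sketch of one common MAR formulation over the eight inner-mouth landmarks (points 60 to 67 of the 68-point model); the exact indices this project uses are an assumption:

```python
import numpy as np

def mouth_aspect_ratio(mouth):
    """MAR from eight (x, y) inner-mouth landmarks: left corner, three
    upper-lip points, right corner, three lower-lip points."""
    mouth = np.asarray(mouth, dtype=float)
    # Three vertical distances between upper- and lower-lip landmarks.
    a = np.linalg.norm(mouth[1] - mouth[7])
    b = np.linalg.norm(mouth[2] - mouth[6])
    c = np.linalg.norm(mouth[3] - mouth[5])
    # Horizontal distance between the mouth corners.
    d = np.linalg.norm(mouth[0] - mouth[4])
    return (a + b + c) / (2.0 * d)
```

As with EAR, the ratio is near zero when the mouth is closed and grows as it opens.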

Pretrained Model Details

The model uses:

  • HOG + linear SVM for face detection
  • Ensemble of Regression Trees for landmark prediction (68 facial landmarks)

Trained on: iBUG 300-W dataset

Note: The iBUG dataset license does not allow commercial use. Contact Imperial College London for commercial licensing information.

