
Path-planning implementation for a safe, intelligent robotic arm that users control with their eye gaze.


Grid-based dynamic planner using monocular camera vision.

Path-planning algorithms, combined with upper-limb instance segmentation (YOLOv8), generate obstacle-free trajectories to the printing locations while mitigating interference from the user's hand and forearm. A monocular camera is used to build a grid-based representation of the environment, over which trajectories for the robot are planned.

Requirements

  • Python 3.x
  • OpenCV
  • NumPy
  • Requests
  • supervision (imported as sv)
  • inference (provides inference.models.utils)

Installation

  1. Clone the repository:

    git clone https://github.com/AnujithM/Eye-Gaze-Controlled-Robot.github.io.git
    cd Eye-Gaze-Controlled-Robot.github.io/Planner
  2. Install the required Python packages (the inference package provides inference.models.utils, which the script imports):

    pip install opencv-python-headless numpy requests supervision inference

Usage

  1. Replace model_id and api_key in the script with your specific model ID and API key.

  2. Update the url variable with the correct MJPEG stream URL.

  3. Run the script:

    python DynamicAstar_V12.py
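
The script's configuration lives in the top-level variables named in the steps above; a minimal sketch of what they presumably look like near the top of DynamicAstar_V12.py (all values are placeholders):

    model_id = "your-workspace/your-model/1"   # Roboflow model ID (placeholder)
    api_key = "YOUR_API_KEY"                   # Roboflow API key (placeholder)
    url = "http://<camera-host>/stream"        # MJPEG stream URL (placeholder)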
    

How It Works

  1. Fetch and Decode Frames: The script continuously fetches frames from the specified MJPEG stream URL and decodes them using OpenCV.
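
A minimal sketch of this step, splitting the byte stream on the standard JPEG start/end markers (one common way to parse an MJPEG stream; the script's exact parsing may differ):

    import cv2
    import numpy as np
    import requests

    stream = requests.get(url, stream=True)        # `url` from the Usage section
    buffer = b""
    for chunk in stream.iter_content(chunk_size=1024):
        buffer += chunk
        start = buffer.find(b"\xff\xd8")           # JPEG start-of-image marker
        end = buffer.find(b"\xff\xd9")             # JPEG end-of-image marker
        if start != -1 and end != -1 and end > start:
            jpg = buffer[start:end + 2]
            buffer = buffer[end + 2:]
            frame = cv2.imdecode(np.frombuffer(jpg, dtype=np.uint8), cv2.IMREAD_COLOR)
            # `frame` is now a BGR image ready for the steps below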

  2. Hand Segmentation: The frames are passed through a hand segmentation model to detect and segment hands in the frame.
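
A sketch of the segmentation call, assuming the get_roboflow_model loader from inference.models.utils and supervision's from_inference converter; the script's actual model loading and post-processing may differ:

    import supervision as sv
    from inference.models.utils import get_roboflow_model

    model = get_roboflow_model(model_id=model_id, api_key=api_key)

    results = model.infer(frame)                   # instance segmentation on one frame
    detections = sv.Detections.from_inference(results[0])
    # detections.mask holds one boolean (H, W) mask per detected hand/forearm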

  3. Grid and Adjacency List: A grid overlay is created on the frame, dividing it into cells. An adjacency list is maintained to represent the connections between the grid cells.
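
One way to build the grid and its adjacency list; the 12 x 16 resolution and the 4-connected neighborhood are assumptions:

    rows, cols = 12, 16                            # grid resolution (assumed)
    h, w = frame.shape[:2]
    cell_h, cell_w = h // rows, w // cols

    adjacency = {}
    for r in range(rows):
        for c in range(cols):
            neighbors = []
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    neighbors.append((nr, nc))
            adjacency[(r, c)] = neighbors          # 4-connected neighbors of (r, c)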

  4. Dynamic A* Pathfinding: The dynamic A* algorithm finds a path from a source cell to a goal cell, avoiding cells containing hands (red centroids).
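
A compact A* over the grid cells with a Manhattan heuristic; "dynamic" here means the blocked set is rebuilt from the hand masks and the search is re-run on every frame. A sketch under those assumptions (the source and goal cells are examples), not the script's exact implementation:

    import heapq

    # Cells whose center pixel falls inside any hand mask are treated as blocked.
    blocked = set()
    if detections.mask is not None:
        hand_mask = np.any(detections.mask, axis=0)         # union of instance masks
        for r in range(rows):
            for c in range(cols):
                cy, cx = r * cell_h + cell_h // 2, c * cell_w + cell_w // 2
                if hand_mask[cy, cx]:
                    blocked.add((r, c))

    def astar(start, goal, adjacency, blocked):
        """Shortest grid path from start to goal, skipping cells in `blocked`."""
        def h(cell):                                        # Manhattan heuristic
            return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

        open_heap = [(h(start), 0, start)]                  # (f-score, g-score, cell)
        came_from, g = {}, {start: 0}
        while open_heap:
            _, cost, current = heapq.heappop(open_heap)
            if current == goal:                             # walk back to the source
                path = [current]
                while current in came_from:
                    current = came_from[current]
                    path.append(current)
                return path[::-1]
            for nxt in adjacency[current]:
                if nxt in blocked:                          # occupied by a hand; skip
                    continue
                new_cost = cost + 1
                if new_cost < g.get(nxt, float("inf")):
                    g[nxt] = new_cost
                    came_from[nxt] = current
                    heapq.heappush(open_heap, (new_cost + h(nxt), new_cost, nxt))
        return None                                         # no obstacle-free path

    path = astar((0, 0), (rows - 1, cols - 1), adjacency, blocked)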

  5. Frame Annotation: The segmented frame is annotated with the detected hands, grid cells, and the path found by the A* algorithm.
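
Annotation can be done with supervision's mask annotator plus plain OpenCV drawing calls; a sketch reusing the variables above (the colors, other than the red hand centroids named in step 4, are assumptions):

    mask_annotator = sv.MaskAnnotator()
    frame = mask_annotator.annotate(scene=frame, detections=detections)

    for r in range(rows):                                   # grid overlay
        for c in range(cols):
            cv2.rectangle(frame, (c * cell_w, r * cell_h),
                          ((c + 1) * cell_w, (r + 1) * cell_h), (200, 200, 200), 1)
    for (r, c) in blocked:                                  # red centroid per hand cell
        center = (c * cell_w + cell_w // 2, r * cell_h + cell_h // 2)
        cv2.circle(frame, center, 4, (0, 0, 255), -1)
    if path:                                                # planned path in green
        for (r1, c1), (r2, c2) in zip(path, path[1:]):
            p1 = (c1 * cell_w + cell_w // 2, r1 * cell_h + cell_h // 2)
            p2 = (c2 * cell_w + cell_w // 2, r2 * cell_h + cell_h // 2)
            cv2.line(frame, p1, p2, (0, 255, 0), 2)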

  6. Real-time Display: The annotated frame is displayed in real-time, showing the segmented hands, grid, and path.
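
The display loop is standard OpenCV; the window title here is an assumption:

    cv2.imshow("Dynamic A* planner", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):                   # press 'q' to stop
        cv2.destroyAllWindows()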

Results

Acknowledgments

Parts of this project page were adapted from the Nerfies page.

Website License

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
