Project author: GajuuzZ

Project description:
AlphaPose + ST-GCN + SORT.
Language: Python
Repository: git://github.com/GajuuzZ/Human-Falling-Detect-Tracks.git
Created: 2020-03-18T10:59:27Z
Project community: https://github.com/GajuuzZ/Human-Falling-Detect-Tracks

License:


Human Falling Detection and Tracking

Using Tiny-YOLO oneclass to detect each person in the frame, AlphaPose to get the
skeleton pose, and an ST-GCN model to predict the action from every 30 frames of
each person's track.

It currently supports 7 actions: Standing, Walking, Sitting, Lying Down, Stand up, Sit down, Fall Down.
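The per-frame flow is roughly the sketch below. This is only a minimal illustration of the idea: detect_persons, estimate_pose and classify_action are hypothetical placeholders standing in for the Tiny-YOLO, AlphaPose and ST-GCN models, the track IDs really come from the SORT tracker, and the joint layout is an assumption.

    # Minimal sketch of the detect -> pose -> track -> classify pipeline.
    # detect_persons / estimate_pose / classify_action are hypothetical
    # placeholders for the Tiny-YOLO, AlphaPose and ST-GCN models.
    from collections import defaultdict, deque

    import cv2
    import numpy as np

    WINDOW = 30  # the action is predicted from every 30 frames of a track


    def detect_persons(frame):
        """Placeholder: person bounding boxes from Tiny-YOLO oneclass."""
        return []


    def estimate_pose(frame, box):
        """Placeholder: skeleton keypoints for one person from AlphaPose."""
        return np.zeros((13, 3))  # (joints, x/y/score) -- layout is an assumption


    def classify_action(pose_window):
        """Placeholder: action label for a 30-frame pose sequence from ST-GCN."""
        return 'Standing'


    def run(source=0):
        cap = cv2.VideoCapture(source)
        pose_buffers = defaultdict(lambda: deque(maxlen=WINDOW))  # per-track history
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            for track_id, box in enumerate(detect_persons(frame)):
                # In the real project the track ID comes from the SORT tracker,
                # not from the detection index as done here for brevity.
                pose_buffers[track_id].append(estimate_pose(frame, box))
                if len(pose_buffers[track_id]) == WINDOW:
                    action = classify_action(list(pose_buffers[track_id]))
                    print('track %d: %s' % (track_id, action))
        cap.release()


    if __name__ == '__main__':
        run(0)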



Prerequisites

  • Python > 3.6
  • PyTorch > 1.3.1

Originally tested on: i7-8750H CPU @ 2.20GHz x12, GeForce RTX 2070 8GB, CUDA 10.2
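A quick way to confirm the environment matches these requirements is a check like the following (illustrative only, not part of the project):

    # Quick environment check for the prerequisites above.
    import sys

    import torch

    print('Python :', sys.version.split()[0])     # expected > 3.6
    print('PyTorch:', torch.__version__)          # expected > 1.3.1
    print('CUDA   :', torch.cuda.is_available())  # a GPU is strongly recommended
    if torch.cuda.is_available():
        print('Device :', torch.cuda.get_device_name(0))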

Data

For this project, a new Tiny-YOLO oneclass model was trained to detect only person
objects and to reduce the model size. It was trained on a rotation-augmented COCO
person keypoints dataset for more robust person detection across a variety of
camera angles and poses.

For action recognition, data from the Le2i Fall Detection Dataset (Coffee room, Home)
were used: skeleton poses were extracted with AlphaPose, and the action in each frame
was labeled by hand to train the ST-GCN model.
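ST-GCN consumes fixed-length pose sequences, so the labeled frames have to be cut into 30-frame clips before training. The sketch below illustrates one way to do that windowing; the array shapes, the stride and the majority-vote clip label are assumptions for illustration, not the project's actual preprocessing code.

    # Sketch: cut a labeled skeleton sequence into 30-frame training windows.
    # Shapes, stride and the majority-label rule are illustrative assumptions.
    import numpy as np

    WINDOW = 30


    def make_windows(poses, labels, stride=15):
        """poses: (T, joints, 3) keypoints per frame; labels: (T,) action id per frame."""
        samples, targets = [], []
        for start in range(0, len(poses) - WINDOW + 1, stride):
            clip = poses[start:start + WINDOW]
            clip_labels = labels[start:start + WINDOW]
            samples.append(clip)
            # Use the most frequent frame label as the clip label.
            targets.append(np.bincount(clip_labels).argmax())
        return np.stack(samples), np.array(targets)


    # Example with dummy data: 120 frames, 13 joints, (x, y, score) per joint.
    poses = np.random.rand(120, 13, 3).astype(np.float32)
    labels = np.random.randint(0, 7, size=120)  # 7 action classes
    X, y = make_windows(poses, labels)
    print(X.shape, y.shape)  # (7, 30, 13, 3) (7,)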

Pre-Trained Models

Basic Use

  1. Download all pre-trained models into the ./Models folder.
  2. Run main.py with a video file or camera source (see the example below):

       python main.py ${video file or camera source}
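For example, assuming main.py takes the source exactly as shown above (a camera index or a path to a video file; the file name here is only a placeholder):

    python main.py 0                        # default webcam
    python main.py ../videos/fall_clip.mp4  # a video file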

Reference