Generate Fisher vectors for video and sensor data using the Yael library.
This project is for multi-modal human activity recognition.
- Yael library
- http://yael.gforge.inria.fr/gettingstarted.html
- Set up the Python interface of Yael
- http://yael.gforge.inria.fr/python_interface.html
Note: Yael must be configured with --enable-numpy to enable functions such as fvec_to_numpy.
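As a quick sanity check of the numpy-enabled build, a minimal sketch (assuming the built Yael python directory is on PYTHONPATH; the attribute check is only illustrative):

```python
# Minimal sanity check of the numpy-enabled Yael build.
from yael import yael, ynumpy  # ynumpy needs the --enable-numpy build

# The low-level wrapper only exposes fvec_to_numpy when numpy support
# was compiled in.
print(hasattr(yael, "fvec_to_numpy"))
```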
- Trajectories code
- http://lear.inrialpes.fr/people/wang/dense_trajectories
To classify a file, run one of:
testfile_video.sh filename
testfile_sensor.sh filename
The program outputs a label indicating one of the following activities:
- walking
- walking upstairs
- walking downstairs
- eating
- drinking
- push-ups
- running in the gym
- working at PC
- reading
- sitting
Note: Not finished.
Video data processing:
- Downsample the videos
- Generate trajectory features
- Sample ten percent of the data to form the codebook (see the sketch below)
- Build a Gaussian Mixture Model (GMM)
- Generate Fisher vectors
- Classify the Fisher vectors using an SVM
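The codebook, GMM, and Fisher-vector steps can be sketched with Yael's ynumpy wrapper. This is a minimal sketch, not the project's code: the helper names (build_gmm, fisher_vector), the number of Gaussians k=64, the power/L2 normalisation, and the use of scikit-learn's LinearSVC as the SVM are all assumptions.

```python
import numpy as np
from yael import ynumpy
from sklearn.svm import LinearSVC  # stand-in SVM; the README only says "SVM"

def build_gmm(descriptors, k=64, sample_ratio=0.10):
    """Sample roughly ten percent of the descriptors and fit a GMM codebook."""
    n = descriptors.shape[0]
    idx = np.random.choice(n, max(1, int(n * sample_ratio)), replace=False)
    sample = descriptors[idx].astype(np.float32)
    return ynumpy.gmm_learn(sample, k)  # returns the fitted GMM parameters

def fisher_vector(descriptors, gmm):
    """Encode one clip's local descriptors as a single Fisher vector."""
    fv = ynumpy.fisher(gmm, descriptors.astype(np.float32), include="mu")
    fv = np.sign(fv) * np.sqrt(np.abs(fv))     # power normalisation
    return fv / (np.linalg.norm(fv) + 1e-12)   # L2 normalisation

# Hypothetical usage with per-video dense-trajectory descriptor matrices:
# gmm = build_gmm(np.vstack(train_descs))
# X = np.array([fisher_vector(d, gmm) for d in train_descs])
# clf = LinearSVC().fit(X, train_labels)
# label = clf.predict([fisher_vector(test_desc, gmm)])[0]
```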
Sensor data processing:
- Use sliding windows to pre-process the data (see the sketch below)
- Sample ten percent of the data to form the codebook
- Build a Gaussian Mixture Model (GMM)
- Generate Fisher vectors
- Classify the Fisher vectors using an SVM
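For the sensor stream, the sliding-window pre-processing could look like the sketch below; the window length, overlap, and flattening of each window into one local descriptor are assumptions, and the resulting per-window descriptors would then go through the same codebook, GMM, Fisher vector, and SVM steps shown after the video list.

```python
import numpy as np

def sliding_windows(signal, window=128, step=64):
    """Cut a (n_samples, n_channels) sensor recording into overlapping windows.

    window=128 samples with 50% overlap (step=64) is a common choice for
    inertial data; the values here are assumptions, not the project's settings.
    """
    segments = []
    for start in range(0, signal.shape[0] - window + 1, step):
        seg = signal[start:start + window]
        segments.append(seg.reshape(-1))  # flatten one window to a descriptor
    return np.asarray(segments, dtype=np.float32)

# Hypothetical usage:
# desc = sliding_windows(np.loadtxt("sensor_recording.txt"))
# desc then feeds the sampling/GMM/Fisher vector/SVM steps from the video sketch.
```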
- TBA
Singapore University of Technology and Design (SUTD)