The NICR RGB-D Orientation Data Set
is a data set for upper body orientation estimation of humans with continuous
ground truth labels for the rotation angle around the axis perpendicular to
the ground. Each data sample was recorded with a Kinect2 device and thus
consists of a depth-aligned RGB image and a depth image as well as detailed
label information. Furthermore, we already converted the persons in both
images into PCD files, i.e., colored point clouds that can be opened with the
Point Cloud Library (a loading sketch follows below). Each of the 37 persons
within the data set was tracked by an external ARTTRACK tracking system in
order to generate precise ground truth labels. In total, the data set consists
of more than 100,000 samples divided into training, validation, and test
subsets.
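Since the PCD files are ordinary colored point clouds, they can also be
inspected outside of the Point Cloud Library. The following is a minimal
sketch using the third-party open3d package; the file path is only a
placeholder, not the actual directory layout:

import numpy as np
import open3d as o3d  # third-party package, not shipped with this repository

# placeholder path; the real layout is described in the sections below
pcd = o3d.io.read_point_cloud('/path/to/downloaded/data/set/training/p0/sample.pcd')
points = np.asarray(pcd.points)  # (N, 3) xyz coordinates
colors = np.asarray(pcd.colors)  # (N, 3) rgb values in [0, 1]
print(f'{points.shape[0]} colored points loaded')

# open an interactive viewer
o3d.visualization.draw_geometries([pcd])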
The data set was captured in the virtual reality lab at our university.
For recording, we used multiple synchronized Kinect2 sensors simultaneously,
and all persons were tracked with the external ARTTRACK tracking system,
which consists of a tracking device and four infrared cameras. The lab and
the sensors used are pictured in the image below.
There is a directory for each subset of the data, i.e., a directory each for
training, validation, and test. Each of these contains several
subdirectories, and each of these subdirectories contains a directory for
each person of that subset, which holds the actual data.
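As a sketch of how this layout can be traversed; since the concrete
subdirectory names are not listed here, none are hard-coded:

from pathlib import Path

data_root = Path('/path/to/downloaded/data/set')

# one directory per subset; each of its subdirectories holds
# one directory per person with the actual data
for subset in ('training', 'validation', 'test'):
    for sub_dir in sorted(d for d in (data_root / subset).iterdir() if d.is_dir()):
        person_dirs = [p for p in sub_dir.iterdir() if p.is_dir()]
        print(f'{subset}/{sub_dir.name}: {len(person_dirs)} person directories')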
To get access to the data set, visit the NICR RGB-D Orientation Data Set web page.
The source code in this package is published under the BSD 3-Clause
license, see the license file for details. NOTE: This license only
covers the source code in this package, NOT the actual data set!
Extra terms and conditions apply to the data set, which must be agreed to!
See the Get access section and the NICR RGB-D Orientation Data Set web page.
If you use our data set in your work, please cite the following paper:
Lewandowski, B., Seichter, D., Wengefeld, T., Pfennig, L., Drumm, H., Gross, H.-M.
Deep Orientation: Fast and Robust Upper Body Orientation Estimation for Mobile Robotic Applications.
In: IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), Macau, pp. 441-448, IEEE 2019
@InProceedings{Lewandowski-IROS-2019,
author = {Lewandowski, Benjamin and Seichter, Daniel and Wengefeld, Tim and Pfennig, Lennard and Drumm, Helge and Gross, Horst-Michael},
title = {Deep Orientation: Fast and Robust Upper Body Orientation Estimation for Mobile Robotic Applications},
booktitle = {IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), Macau},
year = {2019},
pages = {441--448},
publisher = {IEEE},
}
In any publication that uses the NICR RGB-D Orientation Data Set
(including online publications and web sites), only the persons p0, p1, p8,
p11, and p12 may be depicted in images and illustrations.
Depicting any of the other persons is strictly prohibited.
Install dependencies and Python package (Python >= 3.6)
# clone repository
git clone https://github.com/tui-nicr/nicr-rgb-d-orientation-data-set.git
# option 1: install OpenCV 3 separately (optional) and data set package
pip install opencv-python==3.4.2.* [--user]
pip install /path/to/this/repository [--user]
# option 2: install data set package including OpenCV 3
pip install /path/to/this/repository[with_opencv] [--user]
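A quick way to verify the installation is to import the package, whose name
is also used in the examples below:

# minimal sanity check in a Python >= 3.6 interpreter
import nicr_rgb_d_orientation_data_set
print(nicr_rgb_d_orientation_data_set.__file__)  # location of the installed package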
Use data set (examples)
from nicr_rgb_d_orientation_data_set import load_set
import matplotlib.pyplot as plt
# load set, e.g. training set
data_set = load_set('/path/to/downloaded/data/set', set_name='training')
# extract all patches at once
data_set.extract_all_patches()
# load depth, rgb and mask patch of a single sample, e.g. the 11th sample
sample = data_set[10]
depth_patch = sample.get_depth_patch()
rgb_patch = sample.get_rgb_patch()
mask_patch = sample.get_mask_patch()
# visualize sample
fig = plt.figure()
ax = fig.add_subplot(1, 3, 1)
ax.imshow(depth_patch, cmap='gray')
ax.axis('off')
ax = fig.add_subplot(1, 3, 2)
ax.imshow(rgb_patch)
ax.axis('off')
ax = fig.add_subplot(1, 3, 3)
ax.imshow(mask_patch, cmap='binary_r')
ax.axis('off')
fig.suptitle(f'Angle: {sample.orientation}°')
plt.show()
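Building on the example above, a whole subset can be iterated to inspect the
label distribution. This is only a sketch: it assumes that the data set object
supports len() and integer indexing as used above, and that the orientations
are given in degrees in [0, 360):

import numpy as np
import matplotlib.pyplot as plt

# collect the continuous ground truth orientation of every sample
angles = [data_set[i].orientation for i in range(len(data_set))]

# histogram of the orientation labels (degrees assumed)
fig, ax = plt.subplots()
ax.hist(angles, bins=np.arange(0, 361, 10))
ax.set_xlabel('orientation in degrees')
ax.set_ylabel('number of samples')
plt.show()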
For further examples, see the Deep Orientation repository.