MuCAR-3

MuCAR-3 is based on a Volkswagen Touareg. The vehicle is fully equipped with drive-by-wire capabilities.

The GOOSE DB contains ROS bags with annotated LiDAR point clouds and annotated images from the color and near-infrared prism camera behind the windshield.

The MuCAR-3 also includes four color cameras mounted close to the roof LiDAR system, providing a 360° view around the ego vehicle. Short-range radar (SRR) and mid-range radar (MRR) sensors are mounted on the vehicle. An INS/GNSS system allows for RTK localization accuracy.

Processed Data

The bag files containing labeled data provide the following topics (a short reading example follows the table):

Topic Type Framerate
/semantic_segmentation/camera/windshield/nir/image_rect sensor_msgs/Image ≈ 0.2 Hz
/semantic_segmentation/camera/windshield/nir/image_rect/instances sensor_msgs/Image ≈ 0.2 Hz
/semantic_segmentation/camera/windshield/vis/image_rect_color sensor_msgs/Image ≈ 0.2 Hz
/semantic_segmentation/camera/windshield/vis/image_rect_color/instances sensor_msgs/Image ≈ 0.2 Hz
/semantic_segmentation/lidar/vls128_roof/velodyne_points sensor_msgs/PointCloud2 ≈ 0.2 Hz
/sensor/camera/windshield/nir/camera_info sensor_msgs/CameraInfo 10 Hz
/sensor/camera/windshield/nir/image_rect sensor_msgs/Image 10 Hz
/sensor/camera/windshield/vis/camera_info sensor_msgs/CameraInfo 10 Hz
/sensor/camera/windshield/vis/image_rect_color sensor_msgs/Image 10 Hz
/sensor/ins/oxts_rt3000/gps/fix sensor_msgs/NavSatFix 100 Hz
/sensor/lidar/vls128_roof/velodyne_points sensor_msgs/PointCloud2 10 Hz
/tf tf2_msgs/TFMessage
/tf_static tf2_msgs/TFMessage
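As a minimal reading sketch (assuming a ROS 1 Python environment with rosbag, cv_bridge, and sensor_msgs available; the bag file name is a placeholder), the labeled topics can be iterated like this:

```python
# Minimal sketch: iterate over the labeled topics of a GOOSE bag.
# Assumes a ROS 1 environment (rosbag, cv_bridge, sensor_msgs);
# the bag file name "goose_example.bag" is a placeholder.
import rosbag
import sensor_msgs.point_cloud2 as pc2
from cv_bridge import CvBridge

bridge = CvBridge()
labeled_topics = [
    "/semantic_segmentation/camera/windshield/vis/image_rect_color",
    "/semantic_segmentation/camera/windshield/vis/image_rect_color/instances",
    "/semantic_segmentation/lidar/vls128_roof/velodyne_points",
]

with rosbag.Bag("goose_example.bag") as bag:  # placeholder file name
    for topic, msg, t in bag.read_messages(topics=labeled_topics):
        if msg._type == "sensor_msgs/PointCloud2":
            # Labeled point cloud: iterate over the points
            # (the available fields depend on the bag).
            points = list(pc2.read_points(msg))
            print(t.to_sec(), topic, len(points), "points")
        else:
            # Labeled image: convert to a NumPy array via cv_bridge.
            img = bridge.imgmsg_to_cv2(msg)
            print(t.to_sec(), topic, img.shape)
```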

Raw Data

The ROS bag files containing the raw sensor data are coming soon. As of now, the GOOSE DB only contains the processed data described above.

TF

The complete TF tree of the MuCAR-3 is available for download as a Graphviz .dot file; it corresponds to the view provided by the rqt_tf_tree plugin. In the following sections, we present subtrees of the MuCAR-3 TF tree to explain its most important aspects.
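For a quick offline overview of the frames, the downloaded .dot file can also be parsed directly, for example with the pydot package (a sketch; the file name used here is a placeholder, not the actual download name):

```python
# Sketch: list parent -> child frame relations from the downloaded TF tree.
# Assumes pydot is installed; the file name is a placeholder.
import pydot

graphs = pydot.graph_from_dot_file("mucar3_tf_tree.dot")  # returns a list of graphs
tree = graphs[0]
for edge in tree.get_edges():
    parent = edge.get_source().strip('"')
    child = edge.get_destination().strip('"')
    print(f"{parent} -> {child}")
```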

The VLS128 LiDAR is parent to most sensors

All side-facing LiDAR sensors, all radar sensors, and most cameras (except the roof thermal-infrared camera) are calibrated relative to the VLS128 roof LiDAR. The MuCAR-3 TF tree therefore places these sensors as children of the VLS128 LiDAR frame.

base_link is the tree root

Having base_link as the root frame of the TF tree violates the convention specified in ROS REP 105. We prefer the frame name utm over map to make clear that these are UTM coordinates. Our dead reckoning is done in the chassis-fixed vehicle/rear_axis frame, which serves as an intermediate frame between odom and the root frame base_link. The utm frame is published based on the data from our INS system.

This TF tree structure allows for multiple odom frames at the same time (e.g. wheel odometry and visual odometry), which can be interesting for egomotion research.
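As an illustration of this structure, the following sketch rebuilds the tree from the /tf and /tf_static messages in a bag and queries the ego pose (base_link expressed in utm) at every LiDAR timestamp. The bag file name is a placeholder; the API is the standard ROS 1 tf2 Python binding:

```python
# Sketch: recover the ego trajectory (base_link expressed in utm) from a bag.
# Assumes the ROS 1 rosbag/tf2 Python APIs; the bag file name is a placeholder.
import rosbag
import rospy
import tf2_ros

buffer = tf2_ros.BufferCore(rospy.Duration(3600.0))  # large cache for offline use

with rosbag.Bag("goose_example.bag") as bag:
    # Feed all dynamic and static transforms into the buffer.
    for topic, msg, _ in bag.read_messages(topics=["/tf", "/tf_static"]):
        for tform in msg.transforms:
            if topic == "/tf_static":
                buffer.set_transform_static(tform, "bag")
            else:
                buffer.set_transform(tform, "bag")

    # Query the ego pose in UTM coordinates at every LiDAR timestamp.
    lidar_topic = "/sensor/lidar/vls128_roof/velodyne_points"
    for _, msg, _ in bag.read_messages(topics=[lidar_topic]):
        try:
            pose = buffer.lookup_transform_core("utm", "base_link", msg.header.stamp)
            p = pose.transform.translation
            print(msg.header.stamp.to_sec(), p.x, p.y, p.z)
        except tf2_ros.TransformException:
            pass  # transform not available for this timestamp
```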

Calibration

Extrinsic calibration between the cameras and the roof LiDAR scanner has been performed and can be read via the static transforms provided by TF. This fork of the velo2cam_calibration ROS package was used for the extrinsic calibration. The calibration board uses checkerboard corners to improve the AprilTag detection as described in [1]. The apriltags_tas ROS package is publicly available.
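A minimal sketch of reading such an extrinsic from /tf_static follows. The bag file name and the frame names used here are placeholders; the real frame IDs can be taken from the TF tree .dot file or from the sensor messages' header.frame_id:

```python
# Sketch: read a camera-to-LiDAR extrinsic from the /tf_static messages of a bag.
# The bag file name and the frame names below are placeholders.
import rosbag
import rospy
import tf2_ros

buffer = tf2_ros.BufferCore(rospy.Duration(60.0))

with rosbag.Bag("goose_example.bag") as bag:
    for _, msg, _ in bag.read_messages(topics=["/tf_static"]):
        for tform in msg.transforms:
            buffer.set_transform_static(tform, "bag")

# Pose of the roof LiDAR frame expressed in the windshield color camera frame
# (both frame names are placeholders, not the real frame IDs).
extrinsic = buffer.lookup_transform_core(
    "camera_windshield_vis", "lidar_vls128_roof", rospy.Time(0))
print(extrinsic.transform.translation)
print(extrinsic.transform.rotation)
```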

The side-facing LiDAR scanners were not extrinsically calibrated. Instead, their positions on the ego vehicle were determined using a handheld 3D scanner.

The intrinsic calibration of each camera is published via its camera_info topic.

The camera parameters for all cameras mounted on the MuCAR-3 are also available in an OpenCV-compatible format (conversion was done using Yoayu Hu's script). All files are available here for direct use (a small usage sketch follows the table):

Camera name ROS-compatible file formats OpenCV-compatible file formats
windshield_vis .yaml, .ini K, P, R, distortion
windshield_nir .yaml, .ini K, P, R, distortion
surround_front .yaml, .ini K, P, R, distortion
surround_right .yaml, .ini K, P, R, distortion
surround_back .yaml, .ini K, P, R, distortion
surround_left .yaml, .ini K, P, R, distortion
roof_ir .yaml, .ini K, P, R, distortion
marveye_left .yaml, .ini K, P, R, distortion
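As a small usage sketch (placeholder bag file name; the 3D point is made up for illustration), the camera_info fields map directly onto OpenCV/NumPy-style matrices:

```python
# Sketch: pull the intrinsics of the windshield color camera from its camera_info
# topic and arrange them as OpenCV-style matrices. The bag file name is a placeholder.
import numpy as np
import rosbag

with rosbag.Bag("goose_example.bag") as bag:
    _, info, _ = next(bag.read_messages(
        topics=["/sensor/camera/windshield/vis/camera_info"]))

K = np.array(info.K).reshape(3, 3)   # intrinsic matrix of the raw camera
D = np.array(info.D)                 # distortion coefficients
R = np.array(info.R).reshape(3, 3)   # rectification rotation
P = np.array(info.P).reshape(3, 4)   # projection matrix of the rectified image

# Project a 3D point given in the (rectified) camera frame into the
# image_rect_color image using P; the point below is a made-up example.
pt_cam = np.array([1.0, 0.5, 10.0, 1.0])  # homogeneous camera coordinates
u, v, w = P @ pt_cam
print("pixel:", u / w, v / w)
```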

References

  1. Jan Kallwies, Bianca Forkel, and Hans-Joachim Wuensche. "Determining and Improving the Localization Accuracy of AprilTag Detection." IEEE International Conference on Robotics and Automation (ICRA), 2020.