A collection of tools for managing ROSbag recording data from an autonomous vehicle (AV).
This script generates metadata for ROSbag MCAP files. The metadata is compiled into a `resources.json` file that complies with the EIDF requirements.
- Reads MCAP files and extracts metadata such as:
  - Duration of the log
  - Topics and message counts
  - File size
  - File hash (MD5)
- Generates a JSON file (`resources.json`) with metadata for all MCAP files in a given directory.
- Metadata includes:
  - File name
  - Identifier
  - Description
  - Format
  - License
  - Size
  - Hash
  - Issued date
  - Modified date
  - Duration
  - Topics and message counts
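The extraction step can be sketched as follows. This is a minimal illustration assuming the `mcap` Python package; `md5_of_file` and `read_mcap_metadata` are hypothetical helper names, not necessarily the script's actual functions:

```python
import hashlib


def md5_of_file(path, chunk_size=8 * 1024 * 1024):
    """Compute the MD5 hash of a file, reading it in chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def read_mcap_metadata(path):
    """Extract duration and per-topic message counts from an MCAP file."""
    from mcap.reader import make_reader  # pip install mcap

    with open(path, "rb") as f:
        summary = make_reader(f).get_summary()
        stats = summary.statistics
        # MCAP timestamps are nanoseconds since the Unix epoch.
        duration_s = (stats.message_end_time - stats.message_start_time) / 1e9
        topics = {
            channel.topic: stats.channel_message_counts.get(channel_id, 0)
            for channel_id, channel in summary.channels.items()
        }
        return {"duration_s": duration_s, "topics": topics}
```

Reading the hash in fixed-size chunks keeps memory usage flat even for multi-gigabyte rosbags.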
Ensure all dependencies are installed. You can use the following command to install the required packages:

```shell
pip install mcap
```
To generate the metadata JSON file, follow these steps:

1. Place all your MCAP files in a directory. The default directory is `/recorded_datasets/edinburgh`.
2. Run the script:

   ```shell
   python metadata_generator.py
   ```

   If you want to generate metadata for files in a specified path, run the script as:

   ```shell
   python metadata_generator.py -p path/to/file
   ```

The script will generate a `resources.json` file in the specified directory. This JSON file will contain metadata for each MCAP file in the directory.
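For illustration, an entry in the generated file might look like the sketch below. All values are invented, and the exact schema is dictated by the EIDF requirements, so treat this only as an approximation of the fields listed above:

```json
{
  "name": "2024_10_24-14_15_33_haymarket_1_0.mcap",
  "identifier": "2024_10_24-14_15_33_haymarket_1_0",
  "description": "ROSbag recording from the AV vehicle",
  "format": "mcap",
  "license": "...",
  "size": 1073741824,
  "hash": "d41d8cd98f00b204e9800998ecf8427e",
  "issued": "2024-10-24",
  "modified": "2024-10-24",
  "duration": 60.0,
  "topics": {"/sensor/lidar/points": 600}
}
```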
This script automates the process of uploading rosbags from the IPAB-RAD autonomous vehicle server to a cloud instance within the EIDF (Edinburgh International Data Facility) infrastructure. It streamlines data collection and transfer by first compressing the rosbags using the MCAP CLI, and then uploading the compressed files. This ensures efficient handling and storage of large datasets generated by vehicle sensors.
- Install Python dependencies:

  ```shell
  pip install colorlog paramiko paramiko_jump
  ```

- Install MCAP CLI v0.0.47 for rosbag compression:

  ```shell
  wget -O $HOME/mcap https://github.com/foxglove/mcap/releases/download/releases%2Fmcap-cli%2Fv0.0.47/mcap-linux-amd64
  chmod +x $HOME/mcap
  ```

- Set up an FTP server by following this guide.
- Install `lftp`:

  ```shell
  sudo apt update && sudo apt install lftp
  ```
To execute the script from the host machine, run:

```shell
# Add the --debug flag to set the DEBUG log level
python3 -m upload_rosbags.upload_rosbags \
    --config ./upload_rosbags/upload_config.yaml \
    --lftp-password <host_user_password>
```
The configuration file accepts the following parameters:

- `local_host_user` (str): Username for the host machine.
- `local_hostname` (str): IP address or hostname of the host machine (interface connected to the internet).
- `local_rosbags_directory` (str): Path to the directory on the host machine containing the rosbags.
- `cloud_ssh_alias` (str): SSH alias for the cloud server defined in `~/.ssh/config`. If unset, `cloud_user` and `cloud_hostname` must be provided.
- `cloud_user` (str): Username for the cloud target machine. Ignored if `cloud_ssh_alias` is defined and valid.
- `cloud_hostname` (str): Hostname or IP of the cloud target machine. Ignored if `cloud_ssh_alias` is defined and valid.
- `cloud_upload_directory` (str): Destination directory on the cloud server for uploading compressed files.
- `mcap_bin_path` (str): Full path to the `mcap` CLI binary.
- `mcap_compression_chunk_size` (int): Chunk size in bytes used during MCAP compression.
- `compression_parallel_workers` (int): Number of parallel worker threads for compression.
- `compression_queue_max_size` (int): Maximum number of compressed rosbags allowed in the queue at any time.

See `upload_config.yaml` for a sample configuration.
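A configuration sketch using the parameters above might look like this (all values are placeholders, not the repository's actual sample):

```yaml
local_host_user: av_operator
local_hostname: 192.168.1.10
local_rosbags_directory: /recorded_datasets/edinburgh
cloud_ssh_alias: eidf_cloud
cloud_user: ubuntu                # ignored if cloud_ssh_alias is valid
cloud_hostname: eidf.example.org  # ignored if cloud_ssh_alias is valid
cloud_upload_directory: /mnt/vdb/data
mcap_bin_path: /home/av_operator/mcap
mcap_compression_chunk_size: 4194304
compression_parallel_workers: 4
compression_queue_max_size: 2
```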
The script logs its actions to a file named `<timestamp>_rosbag_upload.log`.
This script automates the merging of ROS 2 bag files using the `ros2 bag convert` command. The merging is based on a `.yaml` file that complies with rosbag2 guidelines.
Before running the Python script, ensure that the Docker container is set up by running the `dev.sh` script:

```shell
./dev.sh
```

Note: `dev.sh` will automatically mount the `/mnt/vdb/data` directory as the `rosbags` directory in the container. This is the current default SSD location on the EIDF cloud machine. Use the `-p` flag to specify a different directory.
Once the Docker container is running, you can execute the Python script with the following command:

```shell
merge_rosbags.py --input <input_directory> --config <config_file.yaml> --range <start:end>
```

- `--input <input_directory>`: The directory containing `.mcap` files to be merged.
- `--config <config_file.yaml>`: Path to the YAML configuration file used for the merge. Please refer to the `rosbag_util` directory and the rosbag2 "Converting bags" README for further reference.
- `--range <start:end>` (optional): Range of rosbag split indices to process, in the format `start:end`.
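The way a `--range` value maps onto the numbered split files can be sketched as follows. The helper names are hypothetical, and treating the end index as inclusive is an assumption based on the example output names shown later:

```python
def parse_range(range_arg):
    """Parse a start:end range string into a pair of integers."""
    start, end = (int(part) for part in range_arg.split(":"))
    if start > end:
        raise ValueError(f"Invalid range: {range_arg}")
    return start, end


def select_splits(names, range_arg):
    """Select split files whose trailing index falls within the range.

    Split files are assumed to be named <prefix>_<index>.mcap; the end
    index is treated as inclusive here, which is an assumption.
    """
    start, end = parse_range(range_arg)
    selected = []
    for name in names:
        stem = name.rsplit(".", 1)[0]        # drop the .mcap extension
        index = int(stem.rsplit("_", 1)[1])  # trailing split index
        if start <= index <= end:
            selected.append(name)
    return selected
```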
Example:

If we have the following directory structure:

```
└── rosbags
    ├── 2024_10_24-14_15_33_haymarket_1
    │   ├── 2024_10_24-14_15_33_haymarket_1_0.mcap
    │   ├── 2024_10_24-14_15_33_haymarket_1_1.mcap
    │   ├── 2024_10_24-14_15_33_haymarket_1_2.mcap
    │   ├── 2024_10_24-14_15_33_haymarket_1_3.mcap
    │   ├── 2024_10_24-14_15_33_haymarket_1_4.mcap
    ...
```

We can run the script as:

```shell
merge_rosbags.py --input ./rosbags/2024_10_24-14_15_33_haymarket_1 --range 0:2 --config ./rosbag_util/mapping_merge_params.yaml

# Or omit the range to merge all available rosbags
merge_rosbags.py --input ./rosbags/2024_10_24-14_15_33_haymarket_1 --config ./rosbag_util/mapping_merge_params.yaml
```
The resulting merged `.mcap` file will be saved within the specified directory, under a sub-directory named the same as the value of the `uri` parameter defined in the YAML file. The file name will be in the format `<current_rosbag_name>_<uri>_<from-to>`. If no range is specified, the name will be `<current_rosbag_name>_<suffix>`.
```
└── rosbags
    ├── 2024_10_24-14_15_33_haymarket_1
    │   ├── 2024_10_24-14_15_33_haymarket_1_0.mcap
    │   ├── 2024_10_24-14_15_33_haymarket_1_1.mcap
    │   ├── 2024_10_24-14_15_33_haymarket_1_2.mcap
    │   ├── 2024_10_24-14_15_33_haymarket_1_3.mcap
    │   ├── 2024_10_24-14_15_33_haymarket_1_4.mcap
    │   └── mapping
    │       ├── 2024_10_24-14_15_33_haymarket_mapping_0-2.mcap
    │       └── 2024_10_24-14_15_33_haymarket_mapping.mcap  # Larger file
    ...
```
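The naming scheme above can be sketched as follows; `merged_name` is a hypothetical helper, and the assumption here is that the `uri` value also serves as the suffix when no range is given:

```python
def merged_name(rosbag_name, uri, split_range=None):
    """Build the merged output file name following the scheme above."""
    if split_range is not None:
        start, end = split_range
        return f"{rosbag_name}_{uri}_{start}-{end}.mcap"
    return f"{rosbag_name}_{uri}.mcap"
```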