Hello,
I would like to ask a question about converting a MongoDB database to a rosbag.
I am recording a number of topics (34, to be precise) into MongoDB using the following scenario file:
```
context: "default"
storage:
  method: "database"
  config: "default"
data: {
  topics: {
    topic_1: "topic_1_ros",
    topic_2: "topic_2_ros",
    ...
    topic_34: "topic_34_ros"
  }
}
collection:
  method: "timer"
  timer_delay: 1
```
This scenario, with its 1-second timer, generates a lot of data and documents in the DB (roughly 150 GB for 24 hours of recording).
I want to pick a subset of the documents and convert them to a rosbag based on timestamps, e.g. everything stored between 14:00 and 15:00 on a given day. To do that, I run the convert.py ROS node with a query argument like:

```
-q '{"_ts_meta.sys_time": {"$gt": 1680009972.091473, "$lt": 1680009992.091473}}'
```
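For reference, the bounds in that query are Unix epoch seconds, so a 14:00-15:00 window can be computed with the standard library before building the query string. This is a minimal sketch; the date and the UTC timezone here are placeholder assumptions, not values taken from the actual recording:

```python
import json
from datetime import datetime, timezone

# Placeholder day and timezone -- substitute the day of the recording.
start = datetime(2023, 3, 28, 14, 0, tzinfo=timezone.utc).timestamp()
end = datetime(2023, 3, 28, 15, 0, tzinfo=timezone.utc).timestamp()

# Same shape as the -q argument passed to convert.py.
query = {"_ts_meta.sys_time": {"$gt": start, "$lt": end}}
print(json.dumps(query))
```

The resulting JSON string is what gets passed to the `-q` argument.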
The problem is that executing this query and generating the corresponding bag file takes far too long and consumes most of the machine's resources. I am wondering whether this is due to the large amount of data stored in the DB, to the deployment in a Docker container, or to something else I am missing or doing wrong.
I would really appreciate some help figuring out the issue.
Thank you