Hi everyone,
We (Intempora, developers of the RTMaps and IVS software tools) need to be able to open several MCAP files simultaneously in a single Lichtblick instance, with the data from those files fed into the dashboard in a synchronized manner, as if it all came from a single file.
We could call this feature the "multi-source" capability.
The use case is as follows: consider a recording containing raw sensor data (and we actually have terabytes, even petabytes, of such recordings).
This recording later gets enriched with additional streams (e.g. ground-truth streams with 3D objects, segmented images, lane markings, or simply situational tags). When generated, these additional streams would be stored in a separate MCAP file.
For now we have to merge the streams into a single MCAP file, which introduces a lot of overhead in the post-processing workflow (particularly when some of these additional streams, e.g. ground-truth streams, only cover a short segment of the initial recording).
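For what it's worth, the playback core of such a feature boils down to a k-way merge of per-file message iterators ordered by log time. Here is a minimal TypeScript sketch, assuming each source yields messages carrying a bigint logTime; the Message shape below is hypothetical, not Lichtblick's actual type:

```typescript
// Hypothetical message shape for illustration only.
type Message = { logTime: bigint; topic: string; data: Uint8Array };

// Merge N per-file message streams into one stream ordered by logTime,
// so panels see the files as if they were a single recording.
async function* mergeByLogTime(
  sources: AsyncIterator<Message>[],
): AsyncGenerator<Message> {
  // Pull the first message from every source; undefined marks exhaustion.
  const heads: (Message | undefined)[] = await Promise.all(
    sources.map(async (s) => (await s.next()).value as Message | undefined),
  );
  for (;;) {
    // Pick the source whose pending message has the smallest logTime.
    let min = -1;
    for (let i = 0; i < heads.length; i++) {
      const h = heads[i];
      if (h !== undefined && (min === -1 || h.logTime < heads[min]!.logTime)) {
        min = i;
      }
    }
    if (min === -1) return; // every source is exhausted
    yield heads[min]!;
    heads[min] = (await sources[min]!.next()).value as Message | undefined;
  }
}
```

A real implementation would need seeking and backfill on top of this, but the ordering invariant is the same.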
A next step in the multi-source approach would be the ability to stream data via WebSockets while playing back an MCAP file.
Use case details:
- A sensor data recording is stored in a format other than MCAP (e.g. MDF4, RTMaps .rec, .dat, rosbag, ...).
- We want to display the data in Lichtblick.
- The signal streams (e.g. vehicle speed, accelerations, GPS trace, etc.) are lightweight and can easily be exported to MCAP to constitute an "IterableSource" (where the full plot or trajectory is loaded when the file is opened, so the data can be displayed throughout the full recording).
- The heavier streams (typically video and point clouds) may not need to be exported to the MCAP file at all and could instead be streamed directly from software that reads the original format and streams the data to Lichtblick. That would also save significant storage and post-processing resources.
- This would additionally require the ability to emit time-control commands (jumping in time, pausing, fast-forwarding, etc.) to said software, via topics for instance, so that the streaming source can stay synchronized with the Lichtblick dashboard and feed the right data at the right time; see the sketch after this list.
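To make that synchronization channel concrete, here is a rough sketch of what the dashboard could publish to the external streaming software. The command schema and the function below are assumptions for illustration, not an existing Lichtblick or Foxglove API:

```typescript
// Hypothetical time-control schema; nothing like this exists in Lichtblick
// today, it only sketches what the dashboard could emit to external sources.
type TimeControlCommand =
  | { kind: "play" }
  | { kind: "pause" }
  | { kind: "seek"; timeNs: bigint } // absolute recording timestamp
  | { kind: "setSpeed"; factor: number }; // e.g. 2 for fast forward

function publishTimeControl(ws: WebSocket, cmd: TimeControlCommand): void {
  // bigint is not JSON-serializable, so encode it as a decimal string.
  ws.send(
    JSON.stringify(cmd, (_key, value) =>
      typeof value === "bigint" ? value.toString() : value,
    ),
  );
}

// Example: the user scrubs the timeline and the external streamer re-seeks.
// publishTimeControl(ws, { kind: "seek", timeNs: 1_700_000_000_000_000_000n });
```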
Summary of the complete feature request:
- Display data from multiple sources simultaneously (several MCAP files plus streams via WebSockets, some being "IterableSources" and others "BufferedSources"); see the interface sketch after this list.
- Make sure the Lichtblick time control can be extended to other software running in parallel that manages part of the data reading and streaming.
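If it helps the discussion, both kinds of sources could sit behind a single interface that the player drives uniformly. This is purely illustrative; the names below are assumptions and will differ from whatever Lichtblick's internal IterableSource interface actually looks like:

```typescript
// Hypothetical message shape, as in the merge sketch above.
type Message = { logTime: bigint; topic: string; data: Uint8Array };

// Illustrative common interface for file-backed and live sources.
interface SyncedSource {
  // File-backed sources know their time extent up front; a live stream
  // may leave endNs undefined until it finishes.
  initialize(): Promise<{ startNs: bigint; endNs?: bigint }>;
  // Ordered message stream starting at a given time; the player merges
  // these streams across sources by logTime.
  messageIterator(args: { startNs: bigint }): AsyncIterable<Message>;
  // A file source re-seeks locally; a websocket source would forward the
  // command to the external streaming software (see the sketch above).
  seek(timeNs: bigint): Promise<void>;
}
```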
Do these features ring a bell for anyone?
Could we pool our efforts on this in the medium term? These may not be the most urgent features on our table, but they probably require some preparation, exchanges, and maybe training workshops on the internal architecture of the tool.
Best regards,