Tutorial 4: Slow motion replay

In this tutorial, you will learn how to manage the Controller's time model to generate slow-motion replays of RAW files.

As explained in the previous tutorials, event-based cameras do not produce frames, but a stream of independent events. To visualize these events, we must artificially build a frame by accumulating events over time.

Creating frames out of event data (for debug, display, etc.) typically requires defining three main parameters:

  • accumulation time: this is the duration used to accumulate events in a frame

  • render rate: this defines the frequency at which frames are generated. For an event-based sensor, this rate is purely an application-level parameter; it can typically range from 10 Hz to 10 kHz and beyond.

  • display rate: this defines the frequency at which frames are displayed in the application. If this rate is smaller than the render rate, then the replay of the file will be in slow motion.

The slow motion factor, display rate and render rate are linked by the following equation: \(\text{slow motion factor} = \frac{\text{render rate}}{\text{display rate}}\)
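As a quick numerical check of this relation (a standalone sketch, not part of the Metavision API; the values are illustrative):

```python
# Slow motion factor = render rate / display rate: for a fixed display
# rate, generating frames faster than we display them slows the replay down.
display_rate = 25.0   # frames shown per wall-clock second
render_rate = 125.0   # frames generated per second of event time

slow_motion_factor = render_rate / display_rate
print(slow_motion_factor)  # 5.0: the file is replayed 5 times slower
```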

While the render rate is typically defined using the event timestamps as time reference, the display rate uses the application time, which is typically an absolute external reference, also called wall clock time. To be able to perform slow motion replay, these two time references must be properly synchronized.

In this tutorial, we will show how to replay a file and parameterize accumulation time, display rate and slow motion factor while managing time synchronization between the Controller and the wall clock.

Initialization

Render rate vs accumulation time

Let’s consider the following examples:

  • Config A: 25 Hz rendering of frames accumulated over 40 ms

  • Config B: 250 Hz rendering of frames accumulated over 4 ms

In both cases, accumulation time exactly corresponds to the period of the rendering. This scheme is called full accumulation.

However, in config B, events are accumulated over a very short 4 ms period. If the scene is the same between config A and config B, frames generated in B will have fewer events, and it could be harder to see/understand what’s happening in B. To compensate for this, one could use:

  • Config C: 250 Hz rendering of frames accumulated over 10ms.

In config C, events are accumulated over a period that exceeds the rendering period. This means that a single event is displayed multiple times, once in each successive frame it belongs to. We call this scheme over accumulation.

Conversely, shorter accumulation times are also possible; this scheme is referred to as under accumulation. In this case, the display does not show all the events that occurred between two frames.

While over accumulation can help the user “understand” the scene, it leads to a loss of temporal sharpness. On the other hand, under accumulation can be used to look at very fast scenes and reduce clutter in the visualization.

In the following, to ease the configuration of accumulation time, we will use a parameter called accumulation time factor, defined as the ratio between the accumulation time and the rendering period:

  • \(1\) : full accumulation

  • \(<1\) : under accumulation

  • \(>1\) : over accumulation
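These definitions can be sketched in a few lines of standalone Python (independent of the Metavision API), reproducing the accumulation times of configs A, B and C above:

```python
def accumulation_time_us(render_rate_hz, accumulation_factor):
    """Accumulation time in microseconds, given a render rate and a factor."""
    render_period_us = 1e6 / render_rate_hz
    return int(render_period_us * accumulation_factor)

# Config A: 25 Hz rendering, full accumulation (factor 1) -> 40000 us = 40 ms
print(accumulation_time_us(25, 1.0))   # 40000
# Config B: 250 Hz rendering, full accumulation -> 4000 us = 4 ms
print(accumulation_time_us(250, 1.0))  # 4000
# Config C: 250 Hz rendering, over accumulation (factor 2.5) -> 10000 us = 10 ms
print(accumulation_time_us(250, 2.5))  # 10000
```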

Let’s configure the main parameters of the replay below.

import metavision_designer_engine as mvd_engine
import metavision_designer_core as mvd_core
import metavision_hal as mv_hal
import time  # Wall clock time synchronization

INPUT_PATH = 'PATH_TO_RAW'

# Main replay parameters for real-time, conventional replay.
# DISPLAY_RATE = 25.0
# SLOW_MO_FACTOR = 1.
# ACCUMULATION_FACTOR = 1.

# Main replay parameters for slow motion
DISPLAY_RATE = 25.0  # Screen FPS
SLOW_MO_FACTOR = 5.  # Replay 5 times slower. Change to any value (including <1 to speed up)
ACCUMULATION_FACTOR = 0.5  # Default to 'under accumulation'

# Now compute all other parameters
RENDER_RATE = DISPLAY_RATE * SLOW_MO_FACTOR
RENDER_PERIOD_US = int(1e6 / RENDER_RATE)
DISPLAY_PERIOD_US = int(1e6 / DISPLAY_RATE)
ACC_TIME = int(RENDER_PERIOD_US * ACCUMULATION_FACTOR)

print('FPS to render: ' + str(RENDER_RATE))
print('Rendering period (us): ' + str(RENDER_PERIOD_US))
print('Display period (us): ' + str(DISPLAY_PERIOD_US))
print('Frame acc time (us): ' + str(ACC_TIME))

Building the pipeline

Let’s now build the main pipeline, similar to the Simplest Player tutorial <./03-simplest-player.rst>.

# Check if RAW
is_raw = INPUT_PATH.endswith('.raw')

# Instantiate Controller
controller = mvd_engine.Controller(True)

# Instantiate custom CD producer, depending on RAW or DAT format
cd_producer = None

if is_raw:
    # Open RAW file using Metavision Hardware Abstraction Layer
    mv_reader = mv_hal.DeviceDiscovery.open_raw_file(INPUT_PATH)
    if mv_reader is None:
        raise RuntimeError('Failed to open RAW file: ' + INPUT_PATH)

    # Access the I_EventsStream interface to start reading and streaming events from the file
    i_events_stream = mv_reader.get_i_events_stream()
    i_events_stream.start()

    # Add interface to controller, to poll events from file
    mv_interface = mvd_core.HalDeviceInterface(mv_reader, 1e-3, 0)
    controller.add_device_interface(mv_interface)

    # Create Producer
    cd_producer = mvd_core.CdProducer(mv_interface)

# Else, assume .dat file
else:
    cd_producer = mvd_core.FileProducer(INPUT_PATH)
    width = cd_producer.get_width()
    height = cd_producer.get_height()

# Add producer to the pipeline
controller.add_component(cd_producer, "CD Producer")

# Create Frame Generator with provided accumulation time
frame_gen = mvd_core.FrameGenerator(cd_producer)
frame_gen.set_dt(ACC_TIME)
controller.add_component(frame_gen, "FrameGenerator")

# Create image display window
img_display = mvd_core.ImageDisplayCV(frame_gen)
img_display.set_name("Metavision Designer Events Player")
controller.add_component(img_display, "ImgDisplay")
controller.add_renderer(img_display, mvd_engine.Controller.RenderingMode.SimulationClock, RENDER_RATE)
controller.enable_rendering(True)
controller.enable_sparse_rendering(True)

Let’s now run the pipeline. In this example, we want the Controller time, referred to as the simulation time, to be different from the wall clock time. To do this, we will perform the Controller run steps in a different way compared to the previous tutorials:

  • Each step will be tuned to render exactly one frame

  • The Controller will execute asynchronously (this is done by setting the last argument of run() to False), to prevent the Controller from syncing with the wall clock.

  • Finally, we will synchronize manually to the wall clock, waiting as needed to achieve the desired display period.

Let’s define how we want the pipeline to be run:

def run_pipeline(controller, render_period_us=40000, display_period_us=40000):
    # Run pipeline & print execution statistics
    controller.set_slice_duration(render_period_us)
    controller.set_batch_duration(render_period_us)
    while not controller.is_done():
        # Sync to wall clock, to have the simulation "wait" for the display
        step_start_time = time.time()

        # Render one frame in a single step
        controller.run(False)

        # Compute how long to sleep to achieve the desired display period.
        # Depending on the time taken by the controller, the sleep may be
        # shorter, or not needed at all.
        elapsed = time.time() - step_start_time
        to_sleep = 1e-6 * display_period_us - elapsed
        if to_sleep > 0:
            time.sleep(to_sleep)

    controller.print_stats(False)

run_pipeline(controller, RENDER_PERIOD_US, DISPLAY_PERIOD_US)

Output

The expected output is the following:

We finally delete the graphical components, to stop the tutorial properly.

del img_display

Note

This tutorial was created using Jupyter Notebooks

Download the source code.