# Tutorial 1: Display events

This tutorial shows how to read events from a file and how to display them using Metavision Designer.

## Initialization

First, let’s import Metavision Designer modules:

```python
import metavision_designer_engine as mvd_engine
import metavision_designer_core as mvd_core
```


Let’s define the input_path variable, which contains the path to a DAT file. If needed, update it to point to a valid DAT file on your local filesystem.

Note

Metavision Designer supports DAT and RAW files, and streaming from a live camera. DAT files can be loaded using the core functionalities of Designer whereas RAW files and a live camera require the use of Metavision HAL. For now, we will focus on DAT files, see this tutorial for instructions on how to open RAW files and using a live camera.

```python
input_path = "PATH_TO_DAT"

from os import path

if not (path.exists(input_path) and path.isfile(input_path)):
    raise OSError("Provided input path '{}' does not exist or is not a file.".format(input_path))
if not input_path.endswith('.dat'):
    raise OSError("Provided input path '{}' must be a DAT file.".format(input_path))
```


## Building the pipeline

We will now create our first pipeline. A pipeline is a graph of components that are registered to a controller.

• component: a block that consumes and/or produces events. It can be linked to other components to build an application. Components can have multiple kinds of inputs, but only one kind of output. This output can be connected to the inputs of any number of components.

• controller: manages components of the pipeline. It is responsible for the scheduling of the pipeline, data distribution and the overall graph execution.
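The component/controller idea above can be sketched in plain Python. This is a minimal illustration, not the Metavision Designer API: the class and method names (`Component`, `link`, `Controller.run`) are made up for the example, and the real controller handles scheduling and threading that this sketch omits.

```python
class Component:
    """A processing block with one output that can feed many consumers."""
    def __init__(self, name):
        self.name = name
        self.consumers = []

    def link(self, consumer):
        # The single output of this component can be connected
        # to the input of any number of downstream components.
        self.consumers.append(consumer)
        return consumer

    def process(self, buffer):
        out = self.transform(buffer)
        for consumer in self.consumers:
            consumer.process(out)

    def transform(self, buffer):
        return buffer  # identity by default


class Doubler(Component):
    def transform(self, buffer):
        return [2 * x for x in buffer]


class Collector(Component):
    def __init__(self, name):
        super().__init__(name)
        self.received = []

    def transform(self, buffer):
        self.received.extend(buffer)
        return buffer


class Controller:
    """Registers components and drives the source through the graph."""
    def __init__(self):
        self.components = {}

    def add_component(self, comp, name=None):
        self.components[name or comp.name] = comp

    def run(self, source, buffer):
        source.process(buffer)


src = Component("source")
dbl = src.link(Doubler("doubler"))
sink = dbl.link(Collector("sink"))

ctrl = Controller()
for comp in (src, dbl, sink):
    ctrl.add_component(comp)
ctrl.run(src, [1, 2, 3])
print(sink.received)  # [2, 4, 6]
```

Each buffer flows from the source through every linked consumer, which is the same shape the Metavision pipeline takes below.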

In this tutorial we will implement a pipeline using three components: a FileProducer, a FrameGenerator and an ImageDisplayCV.

Let’s now instantiate the components of the pipeline.

The FileProducer component is used to parse the content of a DAT file. It can also be used to query the width and height of the data.

Creating a FileProducer requires one argument: the path to the DAT file to read (input_path).

```python
cd_prod = mvd_core.FileProducer(input_path)
width = cd_prod.get_width()
height = cd_prod.get_height()
print('Input stream resolution %d x %d' % (width, height))
```


Event-based cameras do not produce frames, but a stream of independent events. To visualize these events, we must artificially build a frame by accumulating events over time. This can be done in different ways, but the easiest is to create a binary frame: we start with a frame where each pixel is set to zero and we set a one on the corresponding pixel every time we receive an event.
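The binary-frame idea can be shown with a few lines of NumPy. This is a standalone sketch of the accumulation principle, not the FrameGenerator implementation: the frame size and the event tuples below are made-up illustration values.

```python
import numpy as np

# Hypothetical CD events as (x, y, polarity, timestamp) tuples,
# accumulated over one display window.
width, height = 6, 4
events = [(1, 0, 1, 100), (3, 2, 0, 250), (5, 3, 1, 900)]

# Start from an all-zero frame and set a 1 on each pixel
# that received an event during the window.
frame = np.zeros((height, width), dtype=np.uint8)
for x, y, polarity, t in events:
    frame[y, x] = 1

print(frame.sum())  # 3 pixels were touched
```

Richer accumulation schemes (e.g. keeping the polarity or decaying old events) follow the same pattern; only the value written per event changes.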

To do this, we can use the FrameGenerator, a component that accumulates events into a frame. This frame can be used for display, image writing, etc. To operate, it needs a CD event source, such as the FileProducer component cd_prod we created just before.

```python
frame_gen = mvd_core.FrameGenerator(cd_prod)
```


Finally, let’s instantiate the ImageDisplayCV component, to display the frames generated by frame_gen.

```python
img_disp = mvd_core.ImageDisplayCV(frame_gen)
```


Now we have all the components of our pipeline. Let’s create a controller and register all the components.

To register components of the pipeline to the controller, call the add_component method of the Controller class. You can add an optional string as second argument, to name each component. This can be used later if, for example, you want to print some statistics of the pipeline.

```python
controller = mvd_engine.Controller()
controller.add_component(cd_prod, "FileProducer")
controller.add_component(frame_gen, "FrameGenerator")
controller.add_component(img_disp, "ImageDisplay")
```


Each component processes data at a fixed frequency. We will see how to set this frequency in the following sections. Components work in parallel: while one component is processing a buffer of events, the previous component is already processing the next buffer. This is done in a transparent way: there is no need to deal with the complexity of threads and data sharing.

While this behavior is typically desirable, as it increases the overall performance of the pipeline, there are situations in which we need to execute a component at a different frequency than the main pipeline. The typical examples are tools for visualization: we need to produce and display images at a fixed frequency.

Components that can work at a fixed configurable frequency are called renderers. The name comes from the fact that most of these components are made for rendering, but they could actually be used any time we need to create periodic information.

In our example, we need to indicate that img_disp is a renderer, and that we need a periodic output at 25 Hz (the frame rate for displaying).

```python
controller.add_renderer(img_disp, mvd_engine.Controller.RenderingMode.SimulationClock, 25.)
```


We also need to inform the controller that we want to enable the renderers on the pipeline (in our case only img_disp, but any added renderer will be activated).

```python
controller.enable_rendering(True)
```


We are all set! Let’s now run the pipeline and display the events.

```python
# Set the time simulated in each slice
controller.set_slice_duration(10000)
# Set the maximum amount of time processed in each call of run()
controller.set_batch_duration(100000)
while not controller.is_done():
    controller.run(True)
```


## Output

The expected output is the following:

We finally delete the graphical components, to stop the tutorial properly.

```python
del img_disp
```


Note

This tutorial was created using Jupyter Notebooks.