Tutorial 2: Filter noise events

The goal of this tutorial is to build a pipeline that visually compares a stream of events with and without a noise filter. We will learn how to share the output of one component with multiple other components and how to work with components that take multiple inputs. The idea is to create two parallel streams of events and compose two frame generators into a single display.

We will use the ActivityNoiseFilter and FrameComposer components:

  • ActivityNoiseFilter - This component can be used to remove noise events: it discards isolated events, that is, events that have no other event in their neighborhood within a given time window. The neighborhood of a pixel consists of the 8 pixels surrounding it (see the sketch after this list).

  • FrameComposer - This component combines multiple image sources to generate a composite view.
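
To make the filtering principle concrete, here is a minimal pure-Python sketch of the idea. It is only an illustration of the neighborhood test, not the actual ActivityNoiseFilter implementation; the function name and the representation of events as (x, y, t) tuples are assumptions made for this example.

# Illustrative sketch only: keep an event if any of its 8 neighbors fired
# within the last `time_window_length` microseconds.
def filter_isolated_events(events, width, height, time_window_length=10000):
    """events: iterable of (x, y, t) tuples, sorted by timestamp t (in us)."""
    last_ts = [[None] * width for _ in range(height)]  # last event time per pixel
    kept = []
    for x, y, t in events:
        keep = False
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                if dx == 0 and dy == 0:
                    continue
                nx, ny = x + dx, y + dy
                if 0 <= nx < width and 0 <= ny < height:
                    ts = last_ts[ny][nx]
                    if ts is not None and t - ts <= time_window_length:
                        keep = True
        if keep:
            kept.append((x, y, t))
        last_ts[y][x] = t
    return kept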

We will implement the following pipeline:

[Pipeline diagram: FileProducer → FrameGenerator and ActivityNoiseFilter → FrameGenerator; both frame generators → FrameComposer → ImageDisplayCV]

Let’s look at the code.

Building the pipeline

import metavision_designer_engine as mvd_engine
import metavision_designer_core as mvd_core
import metavision_designer_cv as mvd_cv

# Change this path to point to a DAT file with CD events
input_path = 'PATH_TO_DAT'

from os import path
if not(path.exists(input_path) and path.isfile(input_path)):
    raise OSError("Provided input path '{}' does not exist or is not a file.".format(input_path))
if not input_path.endswith('.dat'):
    raise OSError("Provided input path '{}' must be a DAT file.".format(input_path))

# Create the controller
controller = mvd_engine.Controller()

# Setup of FileProducer
cd_prod = mvd_core.FileProducer(input_path)
width = cd_prod.get_width()
height = cd_prod.get_height()
print('Input stream resolution %d x %d' % (width, height))

# Add component to the controller
controller.add_component(cd_prod, "DAT file reader")

So far, we have created the controller and loaded the input file using the FileProducer component.

Let’s now configure ActivityNoiseFilter. It requires the following arguments:

  • event source: component generating CD events

  • time_window_length: length of the time window for activity filtering (in us)

# ActivityNoiseFilter configuration
time_window_length = 10000 # duration in us
cd_filtered = mvd_cv.ActivityNoiseFilter(cd_prod, time_window_length)
controller.add_component(cd_filtered, "Noise filter")

Let’s now configure the visualization output. We need two FrameGenerator instances and one FrameComposer instance. Each FrameGenerator creates a frame from its input: one receives the filtered events, the other the unfiltered events. The FrameComposer combines the two frames into a single image for visualization.

To instantiate the composer, we need to define the background color in BGR format. Here we want a black background, so we pass (0, 0, 0).

To compose images into a single view, we use the add_image method of the FrameComposer. We pass the (x, y) coordinates where the image should be placed in the final composite view (with (0, 0) being the top-left corner), along with the width and height of the image in the final composition. Here we place the unfiltered frame at (0, 0) and the filtered frame at (width, 0), so the two streams appear side by side and the composite view is twice as wide as a single frame. Other parameters are available; check the Designer API for more information.

# Frame generators
frame_gen = mvd_core.FrameGenerator(cd_prod)
controller.add_component(frame_gen, "Standard frame generator")
filtered_frame_gen = mvd_core.FrameGenerator(cd_filtered)
controller.add_component(filtered_frame_gen, "Filtered frame generator")

# Composer creation
bg_blue = 0
bg_green = 0
bg_red = 0
composer = mvd_core.FrameComposer(bg_blue, bg_green, bg_red)

# Composer sub-image positions
frame_x, frame_y = 0, 0
composer.add_image(frame_gen, frame_x, frame_y, width, height)
print('Will display unfiltered events at %d,%d' % (frame_x, frame_y))

filtered_x, filtered_y = width, 0
composer.add_image(filtered_frame_gen, filtered_x, filtered_y, width, height)
print('Will display filtered events at %d,%d' % (filtered_x, filtered_y))

# Composer geometry automatic computation
display_width = composer.get_total_width()
display_height = composer.get_total_height()
print('Overall display window is %d x %d pixels' % (display_width, display_height))


controller.add_component(composer, "Image composer")

Now let’s create a display taking its input from the composer, register it as a renderer with the controller, and run the application.

display = mvd_core.ImageDisplayCV(composer)
controller.add_component(display, "Display")

# Renderer: render at 25 FPS, driven by the simulation clock
controller.add_renderer(display, mvd_engine.Controller.RenderingMode.SimulationClock, 25.)
controller.enable_rendering(True)

# Run the pipeline; slice and batch durations are given in us
controller.set_slice_duration(10000)
controller.set_batch_duration(100000)
while not controller.is_done():
    controller.run(True)

Output

This is the expected output: on the left, the raw input; on the right, the filtered output:

Finally, we delete the graphical components to stop the tutorial properly.

del display

Note

This tutorial was created using Jupyter Notebooks.

Download the source code.