Getting Started

Installation

Get started with Metavision Intelligence. Sign up for Metavision Essentials and follow the installation procedure for your operating system.

If you need more features and customized commercial support, check out the Metavision Intelligence Plans.

Experience Event-Based Vision

Metavision Player

Pre-installed in Metavision Essentials, Metavision Player is a ready-to-use application with a Graphical User Interface (GUI) to help you get started with event-based vision. With Metavision Player, you can replay and record event-based data, convert it to video format, and even zoom in time to experience the high temporal resolution of event-based data!

If you don’t have an event-based camera, check out our free Datasets and choose from a large selection of recordings from various application scenarios.

Metavision Designer

When you are ready to explore the benefits of event-based vision in real applications, get started with Metavision Designer. Metavision Designer offers a simple Python 3 API that allows you to create application pipelines from a large library of pre-built components.

Get familiar with the Metavision Designer API through our selection of tutorials based on free, downloadable Jupyter Notebooks.

Metavision SDK

Ready to dive into the development of advanced event-based applications? Metavision SDK will provide you with a large set of algorithms available through a C++ API. Start with one of our code samples to discover all the features available in the SDK.

Event-Based Concepts

Event Generation

Event-based sensors are arrays of pixels that mimic the behavior of a biological retina. As opposed to traditional imaging techniques, where light is sampled in a deterministic, time-driven process, event-based pixels continuously sample the incoming light and generate a signal only when the light level changes. Each pixel relies on a Contrast Detector (CD) which emits an event each time the light level changes.

The Contrast Detector is implemented as a fast continuous-time logarithmic photoreceptor with asynchronous signal processing. It continuously monitors the photocurrent for changes in illumination and responds with an ON or OFF event representing an increase or decrease in intensity that exceeds a given threshold [Posch11].

CD event generation

At the top of the figure, the logarithm of the photocurrent (or light level) is shown in red. Each time the change in this photocurrent exceeds a given threshold, an event is generated (in blue, in the bottom part of the figure).
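
To make the principle concrete, here is a minimal Python sketch (not the Metavision API) that simulates a single CD pixel: an ON or OFF event is emitted each time the log photocurrent deviates from the level recorded at the last event by more than a threshold.

    # Minimal sketch of a single Contrast Detector pixel (illustration only,
    # not the Metavision API): events are emitted when the log of the light
    # level moves away from the last reference level by more than a threshold.
    import numpy as np

    def simulate_cd_pixel(light, timestamps_us, threshold=0.2):
        """Generate (timestamp, polarity) events from a sampled light signal."""
        log_i = np.log(light)
        reference = log_i[0]          # level at the last emitted event
        events = []
        for t, value in zip(timestamps_us, log_i):
            delta = value - reference
            if delta >= threshold:    # intensity increased -> ON event
                events.append((t, 1))
                reference = value
            elif delta <= -threshold: # intensity decreased -> OFF event
                events.append((t, 0))
                reference = value
        return events

    # Example: a sinusoidally varying light level sampled every 100 microseconds
    t_us = np.arange(0, 100_000, 100)
    light = 1.5 + np.sin(2 * np.pi * t_us / 20_000)
    print(simulate_cd_pixel(light, t_us)[:5])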

Event Numerical Format

Every CD event is labelled with:

  • position of the pixel in the sensor array (x,y)

  • polarity:

    • polarity=1 (CD ON) corresponds to light changes from darker to lighter

    • polarity=0 (CD OFF) corresponds to light changes from lighter to darker

  • timestamp, expressed in μs (microseconds)

An event data stream can be considered as a sequence of (x,y,p,t) tuples.
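
As an illustration only (the actual SDK decodes events into its own data structures, see Decoding and Data Formats), such a stream can be held in memory as a NumPy structured array of (x, y, p, t) records:

    # Illustration of the (x, y, p, t) representation with a NumPy structured
    # array; the field names are hypothetical, not the SDK's own event types.
    import numpy as np

    event_dtype = np.dtype([('x', np.uint16), ('y', np.uint16),
                            ('p', np.uint8),  ('t', np.int64)])  # t in microseconds

    events = np.array([(120, 45, 1, 1000),
                       (121, 45, 0, 1012),
                       (300, 200, 1, 1050)], dtype=event_dtype)

    # Select all ON events (polarity = 1) and print their timestamps
    on_events = events[events['p'] == 1]
    print(on_events['t'])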

For more information on the data format, please see Decoding and Data Formats.

Event-Based Vision

The first task when working with event-based vision is to find a way to visualize CD events. A 3-dimensional XYT (X, Y, Time) space is the best way to capture the temporal continuity of event-based data.

However, it is still possible to reconstruct a frame from CD events whenever needed.

Visualizing events in XYT space

Visualizing data in XYT space highlights several advantages of event-based data, such as time-space continuity, absence of motion blur, and hyper-fast reaction to tiny changes in the scene.

Here is an example of visualizing CD events acquired by a camera looking at a rotating disc in XYT space:

CD events in XYT space

CD ON events are shown in light blue and CD OFF events in dark blue. The time axis displays the CD events acquired during the last second: the most recent events are shown in front and older events at the back.
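
If you already have events as plain NumPy arrays, a rough XYT visualization can be sketched with Matplotlib; this is only an illustration under that assumption, not the Metavision Designer XYT Sample:

    # Minimal XYT scatter plot of CD events, assuming x, y, p, t are NumPy arrays
    # (t in microseconds). Illustration only, not the Metavision tooling.
    import numpy as np
    import matplotlib.pyplot as plt

    def plot_xyt(x, y, p, t, max_points=50_000):
        """Scatter CD events in XYT space, colored by polarity."""
        idx = np.random.choice(len(t), size=min(max_points, len(t)), replace=False)
        colors = np.where(p[idx] == 1, 'deepskyblue', 'navy')  # ON light blue, OFF dark blue
        fig = plt.figure()
        ax = fig.add_subplot(projection='3d')
        ax.scatter(t[idx] * 1e-6, x[idx], y[idx], s=1, c=colors)
        ax.set_xlabel('time (s)')
        ax.set_ylabel('x (pixels)')
        ax.set_zlabel('y (pixels)')
        ax.invert_zaxis()   # match image coordinates (y grows downward)
        plt.show()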

Try XYT visualization yourself with the Metavision Designer XYT Sample.

Generating frames from CD events

To generate a frame at a precise time T, CD events should be accumulated over a period of time (between time=T-dt and time=T), because the number of CD events occurring exactly at time=T (with microsecond precision) can be very limited. Note that dt is usually called the accumulation time.

To generate a frame from the accumulated events, the frame is first initialized with a background color (e.g. dark blue). Then, for each CD event occurring between time=T-dt and time=T, a white pixel is written if the polarity of the CD event is positive, and a light blue pixel if the polarity is negative.
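
The following sketch illustrates this accumulation principle with NumPy, assuming the events are available as plain arrays (x, y, p, t, with t in microseconds); it is not the SDK's frame generation algorithm, only the idea described above:

    # Accumulate CD events over [T - dt, T] into a BGR frame (illustration only).
    import numpy as np

    def events_to_frame(x, y, p, t, T, dt, width, height):
        """Accumulate events in [T - dt, T] into a BGR frame."""
        frame = np.full((height, width, 3), (80, 30, 0), dtype=np.uint8)  # dark blue background
        mask = (t >= T - dt) & (t <= T)
        xs, ys, ps = x[mask], y[mask], p[mask]
        frame[ys[ps == 1], xs[ps == 1]] = (255, 255, 255)   # ON events -> white
        frame[ys[ps == 0], xs[ps == 0]] = (255, 200, 100)   # OFF events -> light blue
        return frame

    # Example: a frame at T = 50 ms with a 10 ms accumulation time
    # frame = events_to_frame(x, y, p, t, T=50_000, dt=10_000, width=640, height=480)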

Here is an example of a frame generated from CD events acquired by a camera looking at a rotating disc:

Frame from CD events

Try frame visualization yourself, and see how to control the accumulation time with Metavision Player.