Note

This C++ sample has a corresponding Python sample.

Dense Optical Flow Sample using C++

The Computer Vision API can be used to compute dense optical flow of objects moving in front of the camera. Dense optical flow is computed for every event, unlike the Sparse Optical Flow sample, where flow is estimated on clusters of events. For a summary of the available optical flow algorithms, check the “Available Optical Flow Algorithms” section below.

The sample metavision_dense_optical_flow.cpp shows how to implement a pipeline for computing dense optical flow.
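
At a high level, the pipeline opens an event stream (a recording or a live camera), feeds the CD events to a flow algorithm, and forwards the resulting flow events to the visualization stage. The minimal sketch below illustrates that structure; the flow algorithm class, its header path and its constructor parameters are assumptions based on the usual Metavision process_events(begin, end, inserter) pattern, so refer to the sample source for the exact code.

// Minimal sketch of a dense optical flow pipeline (not the full sample).
// The flow algorithm class name, header paths and constructor arguments
// are assumptions; check metavision_dense_optical_flow.cpp for the
// exact classes and parameters.
#include <chrono>
#include <iterator>
#include <thread>
#include <vector>
#include <metavision/sdk/base/events/event_cd.h>
#include <metavision/sdk/driver/camera.h>
#include <metavision/sdk/cv/algorithms/plane_fitting_flow_algorithm.h> // assumed path
#include <metavision/sdk/cv/events/event_optical_flow.h>               // assumed path

int main(int argc, char *argv[]) {
    // Open a RAW/HDF5 recording if a path is given, otherwise a live camera
    Metavision::Camera camera = (argc > 1) ? Metavision::Camera::from_file(argv[1])
                                           : Metavision::Camera::from_first_available();

    const int width  = camera.geometry().width();
    const int height = camera.geometry().height();
    Metavision::PlaneFittingFlowAlgorithm flow_algo(width, height); // assumed constructor

    std::vector<Metavision::EventOpticalFlow> flow_events;
    camera.cd().add_callback([&](const Metavision::EventCD *begin, const Metavision::EventCD *end) {
        flow_events.clear();
        flow_algo.process_events(begin, end, std::back_inserter(flow_events));
        // ... forward flow_events to the visualization / video writer ...
    });

    camera.start();
    while (camera.is_running()) // in the real sample, a display loop runs here
        std::this_thread::sleep_for(std::chrono::milliseconds(20));
    camera.stop();
    return 0;
}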

The source code of this sample can be found in <install-prefix>/share/metavision/sdk/cv/cpp_samples/metavision_dense_optical_flow when installing Metavision SDK from installer or packages. For other deployment methods, check the page Path of Samples.

Expected Output

The sample visualizes the events and the output optical flow, using colors to indicate the direction of the edge normal and the magnitude of the motion:

Expected Output from Metavision Dense Optical Flow Sample
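
As an illustration of this kind of color wheel encoding, the sketch below maps a flow vector to a BGR color with OpenCV, encoding the direction as hue and the magnitude as brightness. This is an assumed mapping for illustration, not necessarily the exact palette used by the sample.

// Sketch: map a flow vector (vx, vy) to a BGR color, encoding the
// direction as hue and the magnitude as brightness. Illustrative only;
// the sample's exact palette may differ.
#include <algorithm>
#include <cmath>
#include <opencv2/imgproc.hpp>

cv::Vec3b flow_to_color(float vx, float vy, float max_magnitude) {
    const float angle = std::atan2(vy, vx); // [-pi, pi]
    const float mag   = std::sqrt(vx * vx + vy * vy);
    // OpenCV 8-bit hue lives in [0, 180)
    const float hue = (angle + static_cast<float>(CV_PI)) * 90.f / static_cast<float>(CV_PI);
    const float val = 255.f * std::min(1.f, mag / max_magnitude);
    cv::Mat hsv(1, 1, CV_8UC3, cv::Scalar(hue, 255, val));
    cv::Mat bgr;
    cv::cvtColor(hsv, bgr, cv::COLOR_HSV2BGR);
    return bgr.at<cv::Vec3b>(0, 0);
}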

The sample can also generate a video with the output flow.

How to start

You can directly execute the pre-compiled binary installed with Metavision SDK, or compile the source code as described in this tutorial.

To start the sample based on recorded data, provide the full path to a RAW or HDF5 event file (here, we use a file from our Sample Recordings):

Linux

./metavision_dense_optical_flow -i driving_sample.hdf5 --flow-type TripletMatching

Windows

metavision_dense_optical_flow.exe -i driving_sample.hdf5 --flow-type TripletMatching

Note

As explained in the Sparse Optical Flow Code Overview, by default a filter algorithm (Metavision::SpatioTemporalContrastAlgorithmT) is applied to reduce noise in the event stream. Depending on your input file, this might not be useful (it could even suppress most of the events if another noise filter was already applied when recording the event file). To disable this filter, use the command line option --sw-stc-threshold 0.
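
In code, such a pre-filtering stage typically sits between the camera callback and the flow algorithm. The fragment below extends the pipeline sketch above, replacing its callback; the template parameter and the constructor arguments (sensor size and a threshold in microseconds) are assumptions, so check the SDK headers for the exact signature.

// Sketch: filter noise with the STC algorithm before computing flow.
// The template parameter and constructor arguments are assumptions.
#include <metavision/sdk/cv/algorithms/spatio_temporal_contrast_algorithm.h> // assumed path

Metavision::SpatioTemporalContrastAlgorithmT<Metavision::EventCD> stc(width, height,
                                                                      10000 /* threshold, us */);
std::vector<Metavision::EventCD> filtered;

camera.cd().add_callback([&](const Metavision::EventCD *begin, const Metavision::EventCD *end) {
    filtered.clear();
    stc.process_events(begin, end, std::back_inserter(filtered));
    // Feed the filtered events (instead of the raw ones) to the flow algorithm
    flow_events.clear();
    flow_algo.process_events(filtered.cbegin(), filtered.cend(), std::back_inserter(flow_events));
});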

To start the sample based on the live stream from your camera, run:

Linux

./metavision_dense_optical_flow

Windows

metavision_dense_optical_flow.exe

To start the sample on a live stream with some camera settings (like biases, ROI, Anti-Flicker, STC, etc.) loaded from a JSON file, you can use the command line option --input-camera-config (or -j):

Linux

./metavision_dense_optical_flow -j path/to/my_settings.json

Windows

metavision_dense_optical_flow.exe -j path\to\my_settings.json

To check for additional options:

Linux

./metavision_dense_optical_flow -h

Windows

metavision_dense_optical_flow.exe -h

Available Optical Flow Algorithms

This sample enables comparing several dense optical flow algorithms: Plane Fitting flow, Triplet Matching flow and Time Gradient flow. The SDK API also offers an alternative optical flow algorithm, Sparse Optical Flow, which is demonstrated in the Sparse Flow C++ Sample.

The main differences between these flow algorithms are the following:

  • Plane Fitting optical flow:

    • is based on fitting a plane to a local neighborhood of the time surface (see the worked sketch after this list)

    • is a simple and efficient algorithm, but runs on all events and is hence costly on high event-rate scenes

    • the estimated flow is subject to noise and represents motion along the edge normal (not the full motion)

  • Triplet Matching optical flow:

    • is based on finding aligned triplets of events in a local neighborhood

    • is a simple and very efficient algorithm, but runs on all events and is hence costly on high event-rate scenes

    • the estimated flow is subject to noise and represents motion along the edge normal (not the full motion)

  • Time Gradient optical flow:

    • is based on computing a spatio-temporal gradient on the local time surface using a fixed look-up pattern (essentially a simplified version of the Plane Fitting algorithm that only considers the pixels in a cross-shaped region (x0 +/- N, y0 +/- N) instead of a full NxN area around the pixel)

    • is a simple and very efficient algorithm, but runs on all events and is hence costly on high event-rate scenes

    • the estimated flow is subject to noise and represents motion along the edge normal (not the full motion)

  • Sparse optical flow:

    • is based on tracking small edge-like features

    • is a more complex but staged algorithm, leading to higher efficiency on high event-rate scenes

    • the estimated flow represents the actual motion, but requires fine-tuning and compatible features in the scene
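
To make the plane-fitting idea concrete, the self-contained sketch below fits the plane t = a*x + b*y + c to a small time-surface patch by least squares and inverts the time gradient to obtain the normal flow v = (a, b) / (a^2 + b^2) in pixels per microsecond. It illustrates the principle only; the SDK implementations are more elaborate (e.g. with validity and outlier checks).

// Sketch: estimate normal flow at (x0, y0) by least-squares fitting the
// plane t = a*x + b*y + c to a (2N+1)x(2N+1) time-surface patch, then
// inverting the time gradient: v = (a, b) / (a^2 + b^2), in px/us.
// Self-contained illustration of the idea, not the SDK implementation.
#include <array>
#include <cmath>
#include <optional>

// ts(x, y): last event timestamp (us) at each pixel, i.e. the time surface.
template <typename TimeSurface>
std::optional<std::array<float, 2>> normal_flow(const TimeSurface &ts, int x0, int y0, int N = 2) {
    // Accumulate the normal equations for [x y 1] * [a b c]^T = t
    // (with a centered patch, Sx, Sy and Sxy are zero, but the general
    // solve below also works for asymmetric patches)
    double Sxx = 0, Sxy = 0, Sx = 0, Syy = 0, Sy = 0, S1 = 0;
    double Sxt = 0, Syt = 0, St = 0;
    for (int dy = -N; dy <= N; ++dy)
        for (int dx = -N; dx <= N; ++dx) {
            const double t = static_cast<double>(ts(x0 + dx, y0 + dy));
            Sxx += dx * dx; Sxy += dx * dy; Sx += dx;
            Syy += dy * dy; Sy += dy;      S1 += 1;
            Sxt += dx * t;  Syt += dy * t; St += t;
        }
    // Solve the 3x3 system by Cramer's rule
    const double det = Sxx * (Syy * S1 - Sy * Sy) - Sxy * (Sxy * S1 - Sy * Sx)
                     + Sx * (Sxy * Sy - Syy * Sx);
    if (std::abs(det) < 1e-9)
        return std::nullopt;
    const double a = (Sxt * (Syy * S1 - Sy * Sy) - Sxy * (Syt * S1 - Sy * St)
                    + Sx * (Syt * Sy - Syy * St)) / det;
    const double b = (Sxx * (Syt * S1 - Sy * St) - Sxt * (Sxy * S1 - Sy * Sx)
                    + Sx * (Sxy * St - Syt * Sx)) / det;
    const double g2 = a * a + b * b; // squared norm of the time gradient, (us/px)^2
    if (g2 < 1e-12)
        return std::nullopt; // flat time surface: no reliable flow here
    // Normal flow: motion along the edge normal, with magnitude 1/|grad t|
    return std::array<float, 2>{static_cast<float>(a / g2), static_cast<float>(b / g2)};
}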

See also

To learn more about these flow algorithms, you can check the paper about Plane Fitting Flow, the paper about Triplet Matching Flow and the patent about CCL Sparse Flow.