Note

This C++ sample has a corresponding Python sample.

Dense Optical Flow Sample using C++

The Computer Vision API can be used to compute dense optical flow of objects moving in front of the camera. Dense optical flow is computed for every event, unlike the Sparse Optical Flow sample, where flow is estimated on clusters of events. For a summary of the available optical flow algorithms, check the "Available Optical Flow Algorithms" section below.

The sample metavision_dense_optical_flow.cpp shows how to implement a pipeline for computing dense optical flow.

The source code of this sample can be found in <install-prefix>/share/metavision/sdk/cv/cpp_samples/metavision_dense_optical_flow when installing Metavision SDK from installer or packages. For other deployment methods, check the page Path of Samples.

Expected Output

The sample visualizes events and the output optical flow using colors indicating the edge normal direction and the magnitude of motion:

Expected Output from Metavision Dense Optical Flow Sample

The sample can also generate a video with the output flow.

How to start

You can either directly execute the pre-compiled binary installed with Metavision SDK or compile the source code as described in this tutorial.

To start the sample based on recorded data, provide the full path to a RAW or HDF5 event file (here, we use a file from our Sample Recordings):

Linux

./metavision_dense_optical_flow -i driving_sample.hdf5 --flow-type TripletMatching

Windows

metavision_dense_optical_flow.exe -i driving_sample.hdf5 --flow-type TripletMatching

Note

As explained in the Sparse Optical Flow Code Overview, a noise-filtering algorithm (Metavision::SpatioTemporalContrastAlgorithmT) is applied by default to reduce the noise in the event stream. Depending on your input file, this might not be useful (it could even suppress most of the events if another noise filter was already applied when the event file was recorded). To disable this filter, use the command line option --sw-stc-threshold 0.

To start the sample based on the live stream from your camera, run:

Linux

./metavision_dense_optical_flow

Windows

metavision_dense_optical_flow.exe

To start the sample on a live stream with some camera settings (such as biases, ROI, Anti-Flicker, or STC) loaded from a JSON file, you can use the command line option --input-camera-config (or -j):

Linux

./metavision_dense_optical_flow -j path/to/my_settings.json

Windows

metavision_dense_optical_flow.exe -j path\to\my_settings.json

To check for additional options:

Linux

./metavision_dense_optical_flow -h

Windows

metavision_dense_optical_flow.exe -h

Available Optical Flow Algorithms

This sample allows you to compare several dense optical flow algorithms: Plane Fitting flow, Triplet Matching flow, and Time Gradient flow. The SDK API also offers alternatives: Sparse Optical Flow as well as Machine Learning Flow inference, all described here.