Applications and Tools
Explore the capabilities of the Metavision SDK by choosing one of the following paths:
Applications: Discover various applications that leverage our advanced event-based optimized algorithms.
Tools: Explore our ready-to-use tools designed for working with both event-based devices and recorded files.
Technical Resources: Access detailed information on our Knowledge Center page to develop an optimal setup tailored to your specific application using Prophesee technology.
Applications
Metavision SDK can be used in a variety of application fields:
Name | Description | Samples
---|---|---
Particle Size Monitoring | Control, count, and measure the size of objects moving at very high speed in a channel or on a conveyor. Get instantaneous quality statistics on your production line to control your process. |
Object Tracking | Track moving objects in the field of view. Leverage the low data rate and sparse information provided by event-based sensors to track objects with low compute power. |
Vibration Monitoring | Monitor vibration frequencies continuously, remotely, and with pixel precision by tracking the temporal evolution of every pixel in a scene (see the sketch after this table). |
Spatter Tracking | Track small particles with spatter-like motion. |
High-Speed Counting | Count objects at very high speed and with high accuracy, while generating less data and without any motion blur. |
Edgelet Tracking | Track 3D edges and/or fiducial markers for your AR/VR application. |
Optical Flow | Understand motion through continuous pixel-by-pixel tracking rather than sequential frame-by-frame analysis. | metavision_sparse_optical_flow (C++), metavision_dense_optical_flow (C++)
Jet Monitoring | Monitor jets (i.e. dots) that are being dispensed. |
Active Marker 2D Tracking | Track Active Markers in 2D. |
Active Marker 3D Tracking | Track Active Markers in 3D. |
ArUco Marker Tracking | Track ArUco markers. |
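The Vibration Monitoring entry above illustrates a pattern common to several of these applications: because every pixel delivers its own timestamped events, per-pixel temporal statistics can be computed directly from the stream. The snippet below is a minimal sketch of that idea, assuming the Python `EventsIterator` from `metavision_core.event_io` and a placeholder recording path; it is an illustration, not the SDK's Vibration Monitoring algorithm.

```python
# Minimal sketch: estimate a per-pixel event frequency from a recording.
# "recording.raw" is a placeholder path; this illustrates the idea only,
# it is not the SDK's Vibration Monitoring algorithm.
import numpy as np
from metavision_core.event_io import EventsIterator

DELTA_T_US = 100000  # process events in 100 ms slices
mv_iterator = EventsIterator(input_path="recording.raw", delta_t=DELTA_T_US)
height, width = mv_iterator.get_size()

counts = np.zeros((height, width), dtype=np.int64)  # events accumulated per pixel
total_time_us = 0

for evs in mv_iterator:  # evs is a structured array with fields 'x', 'y', 'p', 't'
    total_time_us += DELTA_T_US
    if evs.size:
        np.add.at(counts, (evs["y"], evs["x"]), 1)

# Crude frequency proxy: a vibrating edge typically yields one positive and one
# negative event per pixel and per cycle, hence the division by 2.
if total_time_us:
    freq_hz = counts / (total_time_us * 1e-6) / 2.0
    print("Max per-pixel frequency estimate: %.1f Hz" % freq_hz.max())
```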
Note
For optimal results when working with the samples listed above, it is essential to ensure the quality of the event stream and to minimize background noise. To address these aspects, you can use:
Hardware Settings/Filters: adjust the sensor biases and configure the hardware Event Signal Processing (ESP) blocks for better performance.
Software Filters: leverage the techniques demonstrated in the Noise Filtering and Filtering samples to further enhance your results.
Regarding the ESP, note that the anti-flicker (AFK) and spatio-temporal contrast (STC) functions are available in both hardware and software versions.
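To make the software filtering idea concrete, here is a simplified, NumPy-only version of a spatio-temporal contrast filter: an event is kept only if the same pixel fired again within a short time window. This is a sketch of the principle under an assumed threshold and the standard event-array layout, not the SDK's STC implementation (see the Noise Filtering and Filtering samples for the real thing).

```python
# Simplified spatio-temporal contrast (STC-like) filter: keep an event only if
# the same pixel produced another event within the last `threshold_us`.
# Written for clarity, not speed; not the SDK's STC implementation.
import numpy as np

def stc_like_filter(events, width, height, threshold_us=10000):
    """events: structured array with fields 'x', 'y', 't' (standard Metavision layout)."""
    last_ts = np.full((height, width), -np.inf)  # last event timestamp per pixel
    keep = np.zeros(events.size, dtype=bool)
    for i, (x, y, t) in enumerate(zip(events["x"], events["y"], events["t"])):
        keep[i] = (t - last_ts[y, x]) <= threshold_us
        last_ts[y, x] = t
    return events[keep]
```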
Metavision SDK enables the manipulation of event-based datasets and the design of event-based neural networks. With our pre-trained models, you can perform inference for a variety of use cases, including:
Name | Description | Samples
---|---|---
Detection Inference | Leverage our pre-trained automotive model written in PyTorch and experiment with live detection & tracking. |
Optical Flow Inference | Predict optical flow from event-based data using our pre-trained flow model, customized data loader, and collection of loss functions and visualization tools. |
Corner Detection & Tracking Inference | Detect and track corners in an event stream with high efficiency. This method can generate stable keypoints and very long tracks. |
Gesture Classification Inference | Run a live Rock Paper Scissors game with our pre-trained model on a live stream or on event-based recordings. |
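All of these inference use cases start from the same preprocessing step: turning a slice of events into a dense tensor that a PyTorch model can consume. The sketch below builds a simple two-channel (one per polarity) event-count histogram; it is a generic illustration using the standard event fields 'x', 'y', 'p', not the SDK's ML data loader.

```python
# Generic illustration: convert an event slice into a 2-channel count histogram
# (one channel per polarity), a common input representation for event-based
# neural networks. This is not the SDK's ML data loader.
import numpy as np
import torch

def events_to_histogram(events, width, height):
    """events: structured array with fields 'x', 'y', 'p' (polarity 0 or 1)."""
    hist = np.zeros((2, height, width), dtype=np.float32)
    np.add.at(hist, (events["p"], events["y"], events["x"]), 1.0)
    return torch.from_numpy(hist)  # shape (2, H, W), ready to batch and feed to a model
```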
Note
Currently there is no sample dedicated to speed measurement. However, if you want to estimate the velocity of objects in an event stream for your application, you can have a look at the sample metavision_sparse_optical_flow (C++ and Python versions) or the sample metavision_spatter_tracking (C++ and Python versions).
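Once one of those samples gives you tracked positions over time, a speed estimate is a simple post-processing step. The helper below is hypothetical: it assumes a track given as time-ordered (timestamp_us, x, y) tuples, which is not necessarily the samples' actual output format.

```python
# Hypothetical post-processing: estimate speed (pixels per second) from a track
# given as time-ordered (timestamp_us, x, y) points. The track format is an
# assumption; adapt it to the actual output of the sample you use.
import numpy as np

def track_speed_px_per_s(track):
    t_us = np.asarray([p[0] for p in track], dtype=np.float64)
    xy = np.asarray([(p[1], p[2]) for p in track], dtype=np.float64)
    dt_s = np.diff(t_us) * 1e-6
    dist_px = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    return dist_px / dt_s  # instantaneous speed between consecutive points

# Example: a target moving ~100 px every 10 ms gives roughly 10000 px/s.
print(track_speed_px_per_s([(0, 0.0, 0.0), (10000, 100.0, 0.0), (20000, 200.0, 5.0)]))
```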
Tools
Metavision SDK comes with a set of ready-to-use tools:
Name | Description
---|---
 | Advanced Graphical User Interface to visualize and record data streamed by Prophesee-compatible event-based vision systems
 | Basic Graphical User Interface to visualize and record data streamed by Prophesee-compatible event-based vision systems
 | Displays events in a 3D space
 | Displays the event rate from an event-based camera or from a recorded file
 | Focuses an event-based camera using a blinking pattern
 | Estimates the intrinsic and extrinsic parameters of an event-based camera using a blinking pattern
 | Computes the 4x4 “world to camera” transformation matrix
 | Prints information on the OS, connected devices and installed software
 | Prints information on the installed software (version, date, etc.)
 | Prints information about a RAW, DAT or HDF5 event file
 | Converts a RAW, DAT or HDF5 event file to a CSV file
 | Converts a RAW or DAT file to an HDF5 event file
 | Converts a RAW, DAT or HDF5 event file to an AVI video
 | Converts a RAW or HDF5 event file to a DAT-formatted file
 | Detects and masks active pixels for the GenX320 sensor
Note
The source code for all these tools is available, allowing you to customize their behavior to suit your specific needs or use them as examples to kickstart your own programming projects. Hence, these tools (with the exception of Studio) are also included in our Code Samples section.
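As a flavour of what the file-conversion tools do internally, the sketch below reproduces the core of a RAW/DAT/HDF5-to-CSV conversion using the Python `EventsIterator`; the file paths are placeholders and this is an illustration, not the shipped tool's source code.

```python
# Minimal sketch of an event-file-to-CSV conversion: decode the events and write
# one "x,y,p,t" line per event. Paths are placeholders; this is an illustration,
# not the source of the shipped tool.
from metavision_core.event_io import EventsIterator

with open("events.csv", "w") as csv_file:
    csv_file.write("x,y,p,t\n")
    for evs in EventsIterator(input_path="recording.raw", delta_t=100000):
        for ev in evs:
            csv_file.write("%d,%d,%d,%d\n" % (ev["x"], ev["y"], ev["p"], ev["t"]))
```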