Metavision Studio is the perfect tool to start with, whether you have an event-based camera or not. It is included in our free evaluation version, Metavision Essentials.
It features a Graphical User Interface that lets you visualize and record data streamed by Prophesee-compatible event-based vision systems. You can try it out by playing one of the RAW files provided in our sample recordings. If you own one of our Evaluation Kits or a camera from one of our partners, you can visualize the events, adjust the display parameters and tune all the camera settings. Using Metavision Studio, you will be able to:
Stream events from a live camera
Visualize events from a recording in the frame rate of your choice (normal, slow-motion, high-speed…)
Configure display parameters (accumulation time, color theme)
Control sensor pixels settings (biases)
Set a Region Of Interest (ROI)
Access advanced sensor features (Anti-Flicker, Spatio-Temporal Contrast Filter and Event Rate Controller)
Record data from a live camera
Cut recording to keep only relevant data
Export events recording to AVI video
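To build intuition for the accumulation principle behind these display features, here is a minimal sketch (hypothetical event tuples, not Metavision Studio's actual renderer) of how a frame can be built from (x, y, p, t) events falling within one accumulation window:

```python
import numpy as np

def events_to_frame(events, t_end, accumulation_time_us, width, height):
    """Build a simple frame from (x, y, p, t) event tuples.

    Keeps only events inside the accumulation window ending at t_end;
    ON events (p=1) are drawn as 1, OFF events (p=0) as -1.
    Illustrative sketch only, not Studio's implementation.
    """
    frame = np.zeros((height, width), dtype=np.int8)
    t_start = t_end - accumulation_time_us
    for x, y, p, t in events:
        if t_start <= t <= t_end:
            frame[y, x] = 1 if p == 1 else -1
    return frame

# Hypothetical events: (x, y, polarity, timestamp in microseconds)
events = [(10, 20, 1, 900), (11, 20, 0, 950), (5, 5, 1, 100)]
frame = events_to_frame(events, t_end=1000, accumulation_time_us=200,
                        width=640, height=480)
print(frame[20, 10], frame[20, 11], frame[5, 5])  # 1 -1 0
```

A longer accumulation time keeps more events per frame (more motion blur but denser images); a shorter one gives crisper but sparser frames.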
Metavision Studio can be started by typing metavision_studio in the command line prompt of your operating system.
On Windows, you can also launch it from the Windows Menu or the Search Bar.
On your first connection, you will be offered a guided tour to discover how to use Studio:
This guide gives you the information you need to start reading and recording event-based data, as well as configuring your camera settings. Note that you can access it anytime by selecting the “Help > Getting Started Guide” menu item.
First visualization of a recording
To start with Studio, download a pre-recorded file from our Sample Recordings. For example, choose hand_spinner.raw, a recording of a spinning hand spinner captured with an EVK1 Gen31. When you open the file in Studio, it will automatically start playing. The default color theme is a black background with ON and OFF events displayed respectively as white and blue pixels.
Now, open the Settings panel on the right and look at the Information, Statistics and Display sections. In the Information section, you can see that it was recorded with a Gen31 sensor of VGA resolution (640x480). In the Statistics section, you can see a live estimation of the data rate which is quite constant for this recording (around 2.5 Mev/s). Finally, in the Display section, you can change the way the frames are built and displayed from the events. For example, change Accumulation Time to 1ms and Frame Rate to 1000 (0.03x) to see the recording in slow motion:
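The 0.03x slow-motion factor shown next to the Frame Rate setting follows from the ratio between the on-screen refresh rate and the requested frame generation rate; assuming a ~30 fps display (an assumption for illustration), a quick check:

```python
# Frames are generated at 1000 fps from the event stream but shown at
# roughly 30 fps on screen, so playback is slowed down accordingly.
generation_fps = 1000   # the "Frame Rate" setting in Studio
display_fps = 30        # assumed on-screen rendering rate
slow_motion_factor = display_fps / generation_fps
print(round(slow_motion_factor, 2))  # 0.03
```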
First visualization from live camera
If you own an event-based camera, plug it in and open it with Studio. You should now see live events streamed from your camera. You can adjust the color theme to your liking in the Display section of the settings. In the image below, we chose the Light theme: ON and OFF events are displayed respectively as blue and black pixels on a white background. If you point your camera at a static scene, you won’t see anything besides some background noise. To get some relevant events, you can point the camera at yourself and wave your hand:
The image shown above is quite sharp and does not show much background noise. To reach such a result, focus your camera by adjusting the aperture and focus distance of your objective (the availability of those settings depends on your objective). To reduce the background noise, you can open the biases settings and adjust bias_fo while monitoring the event rate and the display to see how the noise is impacted. For more information about the biases, please refer to the sensor biases page.
After focusing and bias adjustment, Studio might still show an image with many unexpected events like in this example:
This could be caused by your lighting conditions. Some artificial lights flicker, creating many events on the sensor. In the image above, you can even see some horizontal artefacts caused by the Event Rate Controller (ERC) of the sensor (Gen4.0 and newer), which is configured to limit the event rate to 20 Mev/s. In such a situation, change your light source if you have the possibility to do so. Halogen lighting is a reliably non-flickering light source; if you want an LED source, you must double-check that it does not flicker (many LEDs use PWM dimming, which produces flicker). If you cannot choose your light source, then on Gen4.1 and IMX636 sensors you can enable the Anti-Flicker sensor filter available in the Settings panel. To get more information on flicker mitigation, check our Knowledge Center.
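To see why flickering lights generate so many events, consider a lamp on 50 Hz mains: its intensity peaks twice per cycle, so every pixel sees a 100 Hz brightness swing that repeatedly crosses the contrast thresholds. A rough sketch (synthetic per-millisecond event counts, not real camera data) recovering the dominant flicker frequency from the event rate:

```python
import numpy as np

# Synthetic per-millisecond event counts over 1 s: a constant baseline
# plus a half-wave-rectified 100 Hz flicker component (made-up values).
fs = 1000  # one count bin per millisecond
t = np.arange(fs) / fs
counts = 500 + 300 * np.maximum(0, np.sin(2 * np.pi * 100 * t))

# The dominant non-DC frequency of the event rate reveals the flicker rate
spectrum = np.abs(np.fft.rfft(counts - counts.mean()))
freqs = np.fft.rfftfreq(len(counts), d=1 / fs)
print(int(freqs[np.argmax(spectrum)]))  # 100
```

This is essentially what flicker looks like in the Statistics panel: a strong periodic component riding on top of the scene's event rate.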
First recording from live camera
Now that you have started streaming from a live camera, you can do your first recordings. Note that some checks should be performed to get good data:
Camera installation: when possible, mount your camera on a tripod stand to avoid any spurious motion during acquisition
Lighting conditions: as mentioned in the previous section, make sure you don’t have unexpected events due to a flickering light source.
Focus adjustment: to help you focus your camera, you can use the metavision_blinking_pattern_focus application
Sensor pixels settings (biases): depending on your application’s requirements and conditions (higher speed, lower background activity, higher contrast sensitivity threshold, etc.), you should adjust the camera biases.
Region Of Interest: whenever possible, you should limit the area of the sensor to the pixels that might gather relevant events for your application. For example, if you are tracking vehicles on a road and you are unable to adjust your objective lens to see only the road, you can configure a ROI in Metavision Studio to exclude non-relevant areas (sidewalks, sky, etc.).
Events filtering: if your camera uses a Gen4 or IMX636 sensor, leverage the Event Signal Processing (ESP) block, which provides event filtering features: Anti-Flicker, Event Burst Filter and Event Rate Controller (ERC).
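To illustrate what a ROI does to the event stream, here is a minimal software sketch (hypothetical coordinates; note that a real sensor-side ROI discards events before readout, which also saves bandwidth, whereas this only mimics the selection):

```python
def in_roi(event, x_min, y_min, x_max, y_max):
    """Keep only events inside a rectangular Region Of Interest.

    Illustrative filter on (x, y, p, t) tuples; a sensor-side ROI
    performs the equivalent selection in hardware.
    """
    x, y, p, t = event
    return x_min <= x <= x_max and y_min <= y <= y_max

# Hypothetical events and a ROI covering the lower-right part of a VGA frame
events = [(100, 50, 1, 10), (400, 300, 0, 12), (620, 470, 1, 15)]
roi_events = [e for e in events if in_roi(e, 200, 200, 639, 479)]
print(len(roi_events))  # 2
```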
When you are satisfied with your setup, start doing some recordings with Studio. You will get RAW files that you can then play back in Studio with different FPS and accumulation times.
Out of curiosity, you can take a look at the actual events contained in those RAW files. To do so, use the metavision_raw_to_csv sample to generate a CSV file and check its content to see the x,y,p,t tuples:
$ more my_first_recording.csv
382,341,0,5012
548,716,0,5031
990,625,0,5042
1162,122,0,5043
524,665,0,5043
987,606,1,5049
209,73,0,5052
878,504,0,5062
(...)
In this example, we see that the very first event recorded was an OFF event (polarity 0) located at coordinates (x=382, y=341) and timestamped at t=5012us.
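Such a CSV can be parsed with a few lines of Python; this sketch assumes one x,y,p,t tuple per line, as shown above (the sample string stands in for reading the file):

```python
import csv
import io

# Sample lines in the x,y,p,t format written by metavision_raw_to_csv;
# in practice you would pass open("my_first_recording.csv") instead.
raw = "382,341,0,5012\n548,716,0,5031\n987,606,1,5049\n"

events = []
for row in csv.reader(io.StringIO(raw)):
    x, y, p, t = (int(v) for v in row)
    events.append((x, y, p, t))

first = events[0]
print(first)                              # (382, 341, 0, 5012)
print("ON" if first[2] == 1 else "OFF")   # OFF
```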
To go further, you can now explore our various Code Samples that show our algorithms in action. Most of them can take RAW files as input, so you will be able to see the samples’ output on your own data!
Metavision Studio leverages the Electron framework, which uses multiple third-party dependencies. A list of these licenses is included in the Metavision T&C’s located in the folder share/metavision/licensing within your installation path.