HAL Showcase Sample
The sample metavision_hal_showcase.cpp
shows how to use the Metavision HAL API to visualize an event stream.
The source code of this sample can be found in <install-prefix>/share/metavision/hal/cpp_samples/metavision_hal_showcase
when installing Metavision SDK from the installer or packages. For other deployment methods, check the page
Path of Samples.
Note
The related C++ code shows how to use Metavision SDK classes to record a RAW file as well as to set and save bias parameters. To achieve the same operations with Python, you can refer to this link.
Expected Output
The sample visualizes CD events on the screen.
How to start
First, compile the sample as described in this tutorial.
To start the sample based on the live stream from your camera, run:
Linux
./metavision_hal_showcase
Windows
metavision_hal_showcase.exe
To start the sample based on recorded data, provide the full path to a RAW file (here, we use
the file spinner.raw
from our Sample Recordings):
Linux
./metavision_hal_showcase -i spinner.raw
Windows
metavision_hal_showcase.exe -i spinner.raw
To check for additional options:
Linux
./metavision_hal_showcase -h
Windows
metavision_hal_showcase.exe -h
Code Overview
Visualization of Events
To apply processing on every received CD event, you need to register a callback on the decoder, as shown in the following code snippet:
// Get the handler of CD events
Metavision::I_EventDecoder<Metavision::EventCD> *i_cddecoder =
    device->get_facility<Metavision::I_EventDecoder<Metavision::EventCD>>();
if (i_cddecoder) {
    // Register a lambda function to be called on every buffer of CD events
    i_cddecoder->add_event_buffer_callback(
        [&event_analyzer](const Metavision::EventCD *begin, const Metavision::EventCD *end) {
            event_analyzer.process_events(begin, end);
        });
}
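The event_analyzer object passed to the lambda is an instance of a helper class defined in the sample; its implementation is not reproduced here. The following is only a minimal sketch of a compatible consumer, assuming OpenCV is used to accumulate CD events into a displayable frame (the class name, colors and locking scheme are illustrative, not the sample's exact code):

#include <mutex>
#include <opencv2/core.hpp>
#include <metavision/sdk/base/events/event_cd.h>

// Minimal sketch of an event consumer compatible with the callback registered above.
// It paints every CD event into a BGR frame that a rendering loop can display later.
class EventAnalyzer {
public:
    EventAnalyzer(int width, int height) : frame_(height, width, CV_8UC3, cv::Scalar(30, 37, 52)) {}

    // Called from the decoder callback for every buffer of decoded CD events
    void process_events(const Metavision::EventCD *begin, const Metavision::EventCD *end) {
        std::unique_lock<std::mutex> lock(mutex_);
        for (auto it = begin; it != end; ++it) {
            // Positive and negative polarities get two different (arbitrary) colors
            frame_.at<cv::Vec3b>(it->y, it->x) = it->p ? cv::Vec3b(216, 223, 236) : cv::Vec3b(64, 126, 201);
        }
    }

    // Returns a copy of the current frame, safe to call from the display thread
    cv::Mat get_frame() {
        std::unique_lock<std::mutex> lock(mutex_);
        return frame_.clone();
    }

private:
    cv::Mat frame_;
    std::mutex mutex_;
};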
To set up the display frame, retrieve the sensor size as follows:
Metavision::I_Geometry *i_geometry = device->get_facility<Metavision::I_Geometry>();
if (!i_geometry) {
    std::cerr << "Could not retrieve geometry." << std::endl;
    return 4;
}
std::cout << "Device geometry : " << i_geometry->get_width() << "x" << i_geometry->get_height() << std::endl;
Then, the data needs to be streamed from the device to the decoder:
// Data has been polled from the stream, so decoding can be launched
auto raw_data = i_eventsstream->get_latest_raw_data();

// This will trigger the callbacks set on the decoders: in our case, EventAnalyzer::process_events
if (raw_data) {
    i_eventsstreamdecoder->decode(raw_data);
}
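The snippet above shows only the decoding call itself. A minimal sketch of a surrounding polling loop, assuming the i_eventsstream and i_eventsstreamdecoder facilities retrieved earlier and a stop flag toggled elsewhere (the real sample may organize this loop differently):

#include <atomic>

std::atomic<bool> stop{false}; // e.g. set to true when the user presses a quit key

i_eventsstream->start();
while (!stop) {
    // Block until the events stream reports a new buffer; a negative value means the stream stopped
    short ret = i_eventsstream->wait_next_buffer();
    if (ret < 0) {
        break;
    } else if (ret > 0) {
        // Decoding the polled buffer triggers the callbacks registered on the decoders,
        // in our case EventAnalyzer::process_events
        auto raw_data = i_eventsstream->get_latest_raw_data();
        if (raw_data) {
            i_eventsstreamdecoder->decode(raw_data);
        }
    }
}
i_eventsstream->stop();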
Configuring Trigger In and Trigger Out
You can configure the Trigger In and Trigger Out interfaces using the Metavision::I_TriggerIn
and Metavision::I_TriggerOut
classes.
If you are not familiar with triggers, check the Trigger Interfaces page.
Essentially, the Trigger Out facility provides an output signal, while Trigger In captures an external signal and merges it (as events) with the sensor's native data stream.
Trigger In can also capture the output signal of Trigger Out with a specific loopback mechanism.
To configure the Trigger In facility, you may enable the main, auxiliary or loopback (if available) channel with the channel
parameter.
The following code snippet shows how to get the trigger facilities. The loopback mechanism is enabled when the Trigger In facility supports the
corresponding channel and the I_TriggerOut
facility is available. Otherwise, an external signal generator is required to produce external
trigger events:
// On cameras providing Trigger Out, we enable it and duplicate the signal on Trigger In using the loopback channel
// On the other cameras, we enable Trigger In, but we will need to plug a signal generator to create trigger events
// and we also set the camera as Master so that we can test the Sync Out signal if needed.
Metavision::I_TriggerOut *i_trigger_out = device->get_facility<Metavision::I_TriggerOut>();
Metavision::I_TriggerIn *i_trigger_in   = device->get_facility<Metavision::I_TriggerIn>();
if (i_trigger_in) {
    auto channels = i_trigger_in->get_available_channels();
    if (channels.find(Metavision::I_TriggerIn::Channel::Loopback) != channels.end() && i_trigger_out) {
        std::cout << "Trigger loopback enabled" << std::endl;
        i_trigger_in->enable(Metavision::I_TriggerIn::Channel::Loopback);
        i_trigger_out->set_period(100000);
        i_trigger_out->set_duty_cycle(0.5);
        i_trigger_out->enable();
    } else if (i_camera_synchronization) {
        std::cout << "Could not enable trigger loopback" << std::endl;
        i_trigger_in->enable(Metavision::I_TriggerIn::Channel::Main);
        i_camera_synchronization->set_mode_master();
    }
}
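Once a Trigger In channel is enabled, the resulting external trigger events travel in the same RAW stream as CD events and are decoded by their own decoder facility. A minimal sketch that prints them, assuming the device created earlier (the callback body is illustrative):

#include <iostream>
#include <metavision/sdk/base/events/event_ext_trigger.h>

// Register a callback on decoded external trigger events (produced by Trigger In)
Metavision::I_EventDecoder<Metavision::EventExtTrigger> *i_trigger_decoder =
    device->get_facility<Metavision::I_EventDecoder<Metavision::EventExtTrigger>>();
if (i_trigger_decoder) {
    i_trigger_decoder->add_event_buffer_callback(
        [](const Metavision::EventExtTrigger *begin, const Metavision::EventExtTrigger *end) {
            for (auto it = begin; it != end; ++it) {
                std::cout << "Trigger event: polarity=" << it->p << " t=" << it->t
                          << " id=" << it->id << std::endl;
            }
        });
}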
Recording a RAW File
The Metavision::I_EventsStream
class provides a function
Metavision::I_EventsStream::log_raw_data()
to record all events received from the camera to a RAW file.
The following code snippet shows how to start data recording:
Metavision::I_EventsStream *i_eventsstream = device->get_facility<Metavision::I_EventsStream>();
if (i_eventsstream) {
    if (out_raw_file_path != "") {
        i_eventsstream->log_raw_data(out_raw_file_path);
    }
} else {
    std::cerr << "Could not initialize events stream." << std::endl;
    return 3;
}
To stop recording, call the method Metavision::I_EventsStream::stop_log_raw_data().
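Put together, recording can simply wrap the streaming phase. A minimal sketch, assuming the i_eventsstream facility obtained above (the output file name is illustrative):

// Start logging all RAW data received from the camera, stream for a while, then stop
i_eventsstream->log_raw_data("my_recording.raw"); // illustrative output path
i_eventsstream->start();
// ... run the polling and decoding loop shown earlier ...
i_eventsstream->stop();
i_eventsstream->stop_log_raw_data();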
Note
If you use this sample to record data, you should perform some checks and configuration steps to optimize the quality of the data you collect. Please refer to the section Recording from live camera of the Metavision Studio page, where we cover camera installation, lighting conditions, focus adjustment, and multiple camera settings (biases, ROI, Anti-Flicker, etc.).