SDK Analytics Python bindings API

class metavision_sdk_analytics.CountingAlgorithm(self: metavision_sdk_analytics.CountingAlgorithm, width: int, height: int, cluster_ths: int, accumulation_time_us: int = 1) → None

Class to count objects using Metavision Counting API.

add_line_counters(self: metavision_sdk_analytics.CountingAlgorithm, rows: list) → None

Adds new lines to count objects

process_events(self: metavision_sdk_analytics.CountingAlgorithm, events_np: numpy.ndarray[metavision_sdk_base._EventCD_decode]) → None

Processes a buffer of events.

events_np

numpy structured array of events whose fields are (‘x’, ‘y’, ‘p’, ‘t’). Note that this order is mandatory

reset_counters(self: metavision_sdk_analytics.CountingAlgorithm) → None

Resets the count of all lines.

set_output_callback(self: metavision_sdk_analytics.CountingAlgorithm, arg0: object) → None

Sets a callback to get the last count.
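
A minimal usage sketch (not part of the generated reference): it assumes events arrive as EventCD numpy arrays from metavision_core.event_io.EventsIterator and that the output callback receives the timestamp and the current count; check both against your SDK version.

```python
from metavision_core.event_io import EventsIterator  # assumed available alongside the SDK
from metavision_sdk_analytics import CountingAlgorithm

width, height = 640, 480
counting = CountingAlgorithm(width, height, cluster_ths=3, accumulation_time_us=1000)
counting.add_line_counters([height // 3, 2 * height // 3])  # ordinates of the counting lines

# Callback arguments (timestamp, current count) are an assumption; verify with your SDK version.
def on_count(ts, count):
    print("t =", ts, "us, objects counted so far:", count)

counting.set_output_callback(on_count)

# Events must be numpy structured arrays with fields ('x', 'y', 'p', 't'), in this order.
for evs in EventsIterator("objects_falling.raw", delta_t=10000):
    counting.process_events(evs)
```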

class metavision_sdk_analytics.CountingCalibration(self: metavision_sdk_analytics.CountingCalibration) → None

Class representing the counting calibration.

static calibrate(width: int, height: int, object_min_size: float = 5, object_average_speed: float = 5, distance_object_camera: float = 300, horizontal_fov: float = 56.0, vertical_fov: float = 44.0, travelled_pix_distance_during_acc_time: int = 9) → tuple

Finds optimal parameters for the counting algorithm.

width

Sensor’s width in pixels

height

Sensor’s height in pixels

object_min_size

Approximate largest dimension of the smallest object (in mm). The value must be positive. It will be refined during the calibration

object_average_speed

Approximate average speed of an object to count (in m/s). It will be refined during the calibration.

distance_object_camera

Average distance between the flow of objects to count and the camera (in mm). The camera must look perpendicular to the object falling plane. It will be refined during the calibration.

horizontal_fov

Horizontal field of view (half of the solid angle perceived by the sensor along the horizontal axis, in degrees)

vertical_fov

Vertical field of view (half of the solid angle perceived by the sensor along the vertical axis, in degrees)

travelled_pix_distance_during_acc_time

Distance (in pixels) travelled during the accumulation time
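
A hedged sketch showing how the calibration output could feed a CountingAlgorithm; the assumption that the returned tuple is (cluster_ths, accumulation_time_us) should be verified against your SDK version.

```python
from metavision_sdk_analytics import CountingAlgorithm, CountingCalibration

width, height = 640, 480
# Returned tuple assumed to be (cluster_ths, accumulation_time_us); check your SDK version.
cluster_ths, accumulation_time_us = CountingCalibration.calibrate(
    width, height,
    object_min_size=6.0,            # mm
    object_average_speed=5.0,       # m/s
    distance_object_camera=300.0)   # mm

counting = CountingAlgorithm(width, height, cluster_ths, accumulation_time_us)
```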

class metavision_sdk_analytics.CountingDrawingHelper(self: metavision_sdk_analytics.CountingDrawingHelper) → None

Class that superimposes line counting results on events.

add_line_counters(self: metavision_sdk_analytics.CountingDrawingHelper, rows: list) → None

Adds new line counter ordinates

rows

list of line ordinates

draw(self: metavision_sdk_analytics.CountingDrawingHelper, ts: int, count: int, image: numpy.ndarray) → None

Updates data to display.

ts

Current timestamp

count

Last object count

image

Output image
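
A short sketch of the drawing workflow; the BGR frame here is a plain numpy array, whereas in a real pipeline it would already contain the rendered events.

```python
import numpy as np
from metavision_sdk_analytics import CountingDrawingHelper

width, height = 640, 480
helper = CountingDrawingHelper()
helper.add_line_counters([height // 3, 2 * height // 3])  # ordinates of the counting lines

frame = np.zeros((height, width, 3), dtype=np.uint8)  # BGR frame to draw on
helper.draw(100000, 42, frame)                        # ts (us), last count, output image
```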

class metavision_sdk_analytics.DominantFrequencyEventsAlgorithm(self: metavision_sdk_analytics.DominantFrequencyEventsAlgorithm, min_frequency: float, max_frequency: float, frequency_precision: float, min_count: int) → None

Class computing the dominant frequency from frequency events

compute_dominant_value(self: metavision_sdk_analytics.DominantFrequencyEventsAlgorithm, input_frequency_events_np: numpy.ndarray[Metavision::Event2dFrequency<float>]) → tuple

Computes the dominant frequency from frequency events

class metavision_sdk_analytics.DominantPeriodEventsAlgorithm(self: metavision_sdk_analytics.DominantPeriodEventsAlgorithm, min_period: float, max_period: float, period_precision: float, min_count: int) → None

Class computing the dominant period from period events

compute_dominant_value(self: metavision_sdk_analytics.DominantPeriodEventsAlgorithm, input_period_events_np: numpy.ndarray[Metavision::Event2dPeriod<float>]) → tuple

Computes the dominant period from period events

class metavision_sdk_analytics.DominantValueMapAlgorithm(self: metavision_sdk_analytics.DominantValueMapAlgorithm, min_value: float, max_value: float, precision_val: float, min_count: int) → None

Class computing the dominant value of a map

Constructor.

We split the range [min_val, max_val] to get values spaced apart by precision_val. Bins are centered around these values and are of width precision_val, so that consecutive bins touch each other. For example, given the range [3, 5] and precision_val = 1, the bin centers are {3, 4, 5} and their boundaries are {2.5, 3.5, 4.5, 5.5}.

min_val

Minimum included value (lower bound of the histogram bins)

max_val

Maximum included value (upper bound of the histogram bins)

precision_val

Width of the bins of the histogram (same unit as the value to estimate)

min_count

Minimum size of a given bin in the histogram to be eligible as dominant

note

The histogram bins are initialized during the class construction and won’t change dynamically.

compute_dominant_value(self: metavision_sdk_analytics.DominantValueMapAlgorithm, value_map: numpy.ndarray) → tuple

Computes the dominant value from a value map
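
The binning rule above can be reproduced in a few lines; the sketch below also calls compute_dominant_value, assuming the returned tuple is (success, dominant_value), which should be verified against your SDK version.

```python
import numpy as np
from metavision_sdk_analytics import DominantValueMapAlgorithm

# Binning described above: range [3, 5] with precision_val = 1
min_val, max_val, precision_val = 3.0, 5.0, 1.0
centers = np.arange(min_val, max_val + precision_val, precision_val)       # [3, 4, 5]
boundaries = np.concatenate(([centers[0] - precision_val / 2],
                             centers + precision_val / 2))                 # [2.5, 3.5, 4.5, 5.5]

algo = DominantValueMapAlgorithm(min_val, max_val, precision_val, min_count=10)
value_map = np.full((480, 640), 4.2, dtype=np.float32)  # CV_32FC1 value map
# Returned tuple assumed to be (success, dominant_value); check your SDK version.
success, dominant_value = algo.compute_dominant_value(value_map)
```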

class metavision_sdk_analytics.EventJet(self: metavision_sdk_analytics.EventJet) → None

Struct representing a detected jet event.

property count

Jet number.

property previous_jet_dt

Time since the beginning of the last jet (in us). A negative value means this time-difference isn’t defined yet because so far there has only been at most one jet.

property t

Timestamp of the beginning of the jet (in us).

class metavision_sdk_analytics.EventJetAlarm.AlarmType(self: metavision_sdk_analytics.EventJetAlarm.AlarmType, value: int) → None

Types of jet monitoring alarms.

Members:

JetNotDetected

JetTooEarly

TooManyJets

class metavision_sdk_analytics.EventSpatterClusterBuffer(self: metavision_sdk_analytics.EventSpatterClusterBuffer, size: int = 0) → None

Constructor

numpy(self: metavision_sdk_analytics.EventSpatterClusterBuffer, copy: bool = False) → numpy.ndarray[Metavision::EventSpatterCluster]

copy

If True, allocates new memory and returns a copy of the events. If False, the returned array uses the same memory

resize(self: metavision_sdk_analytics.EventSpatterClusterBuffer, size: int) → None

Resizes the buffer to the specified size

size

The new size of the buffer

class metavision_sdk_analytics.EventSpatterClusterView

numpy(self: metavision_sdk_analytics.EventSpatterClusterView, copy: bool = False) → numpy.ndarray[Metavision::EventSpatterCluster]

copy

If True, allocates new memory and returns a copy of the events. If False, the returned array uses the same memory

class metavision_sdk_analytics.EventTrackingDataBuffer(self: metavision_sdk_analytics.EventTrackingDataBuffer, size: int = 0) → None

Constructor

numpy(self: metavision_sdk_analytics.EventTrackingDataBuffer, copy: bool = False) → numpy.ndarray[Metavision::EventTrackingData]

copy

If True, allocates new memory and returns a copy of the events. If False, the returned array uses the same memory

resize(self: metavision_sdk_analytics.EventTrackingDataBuffer, size: int) → None

Resizes the buffer to the specified size

size

The new size of the buffer

class metavision_sdk_analytics.EventTrackingDataView

numpy(self: metavision_sdk_analytics.EventTrackingDataView, copy: bool = False) → numpy.ndarray[Metavision::EventTrackingData]

copy

If True, allocates new memory and returns a copy of the events. If False, the returned array uses the same memory

class metavision_sdk_analytics.FrequencyMapAsyncAlgorithm(self: metavision_sdk_analytics.FrequencyMapAsyncAlgorithm, width: int, height: int, filter_length: int = 7, min_freq: float = 10.0, max_freq: float = 150.0, diff_thresh_us: int = 1500) → None

Class that estimates the pixel-wise frequency of vibrating objects using Metavision Vibration API.

process_events(self: metavision_sdk_analytics.FrequencyMapAsyncAlgorithm, events_np: numpy.ndarray[metavision_sdk_base._EventCD_decode]) → None

Processes a buffer of events.

events_np

numpy structured array of events whose fields are (‘x’, ‘y’, ‘p’, ‘t’). Note that this order is mandatory

set_output_callback(self: metavision_sdk_analytics.FrequencyMapAsyncAlgorithm, output_cb: object) → None

Sets a callback to get the output frequency map.

output_cb

Callback to call

property update_frequency

Sets the frequency at which the algorithm generates the frequency map.

Freq

Frequency at which the frequency map will be generated
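
A hedged usage sketch; the callback arguments (timestamp, frequency map) and the EventsIterator source are assumptions to verify against your SDK version.

```python
from metavision_core.event_io import EventsIterator       # assumed available alongside the SDK
from metavision_sdk_analytics import FrequencyMapAsyncAlgorithm

width, height = 640, 480
freq_algo = FrequencyMapAsyncAlgorithm(width, height, filter_length=7,
                                       min_freq=10.0, max_freq=150.0)
freq_algo.update_frequency = 25.0   # generate a frequency map 25 times per second

# Callback arguments (timestamp, frequency map) are an assumption; check your SDK version.
def on_frequency_map(ts, freq_map):
    print("frequency map at t =", ts, "us, shape =", freq_map.shape)

freq_algo.set_output_callback(on_frequency_map)

for evs in EventsIterator("vibration.raw", delta_t=10000):
    freq_algo.process_events(evs)
```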

class metavision_sdk_analytics.HeatMapFrameGeneratorAlgorithm(self: metavision_sdk_analytics.HeatMapFrameGeneratorAlgorithm, min_value: float, max_value: float, value_precision: float, width: int, height: int, unit_str: str = '', do_invert_cmap: bool = True) → None

Class that produces a BGR image of a floating point value map.

A colormap bar at the bottom of the image shows the color convention. Pixels for which a value was not computed, as well as pixels whose value falls outside the defined minimum/maximum range, are shown in black.

Constructor.

min_value

Minimum value

max_value

Maximum value

value_precision

Precision used to determine the number of decimal digits to display (0.5, 0.01, …)

width

Sensor’s width (in pixels)

height

Sensor’s height (in pixels)

unit_str

String representation of the unit of measurement, e.g. Hz, us, m/s …

cmap

Colormap used to colorize the value map

do_invert_cmap

Flip the uchar values before applying the color map

property full_height

Returns the full generated image’s height.

property full_width

Returns the full generated image’s width.

generate_bgr_heat_map(self: metavision_sdk_analytics.HeatMapFrameGeneratorAlgorithm, value_map: numpy.ndarray, out_image_bgr: numpy.ndarray) → None

Draws the value map with a bar showing the colormap convention.

value_map

Input value map, 1 floating point channel (CV_32FC1)

out_image_bgr

Output image

get_output_image(self: metavision_sdk_analytics.HeatMapFrameGeneratorAlgorithm) → numpy.ndarray[numpy.uint8]
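
A minimal sketch of heat-map generation, assuming the output frame is a 3-channel uint8 BGR image of size full_width x full_height.

```python
import numpy as np
from metavision_sdk_analytics import HeatMapFrameGeneratorAlgorithm

width, height = 640, 480
# Colorize frequencies between 10 and 150 Hz with 0.1 Hz display precision
heat_map_gen = HeatMapFrameGeneratorAlgorithm(10.0, 150.0, 0.1, width, height, "Hz")

# The output frame is taller than the sensor because of the colormap bar at the bottom,
# hence the use of full_width/full_height. The 3-channel uint8 layout is an assumption.
out_frame = np.zeros((heat_map_gen.full_height, heat_map_gen.full_width, 3), dtype=np.uint8)

value_map = np.zeros((height, width), dtype=np.float32)  # e.g. a frequency map (CV_32FC1)
heat_map_gen.generate_bgr_heat_map(value_map, out_frame)
```
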
class metavision_sdk_analytics.JetMonitoringAlarmConfig(self: metavision_sdk_analytics.JetMonitoringAlarmConfig) → None

Jet monitoring alarm parameters.

property alarm_on_count

Activates/deactivates the alarm that is triggered when the jet count exceeds the expected_count value.

property alarm_on_cycle

Activates/deactivates alarm on cycle time.

property cycle_tol_percentage

Tolerance for estimated cycle time, in percentage of expected_cycle_ms.

property expected_cycle_ms

Expected cycle time (in ms).

property max_expected_count

Maximum expected number of jets.

set_expected_cycle_ms(self: metavision_sdk_analytics.JetMonitoringAlarmConfig, expected_cycle_ms: float, cycle_tol_percentage: float) → None

Activates alarm on cycle time.

set_max_expected_count(self: metavision_sdk_analytics.JetMonitoringAlarmConfig, max_expected_count: int) → None

Activates the alarm that is triggered when the jet count exceeds the expected_count value.

class metavision_sdk_analytics.JetMonitoringAlgorithm(self: metavision_sdk_analytics.JetMonitoringAlgorithm, algo_config: metavision_sdk_analytics.JetMonitoringAlgorithmConfig, alarm_config: metavision_sdk_analytics.JetMonitoringAlarmConfig) → None

Class that detects, counts, and timestamps jets that are being dispensed.

The algorithm starts by splitting the Region Of Interest (ROI) provided by the user into three parts: the central ROI is used to detect jets by identifying peaks in the event rate, while the two surrounding ROIs are used to analyze the background activity.

Jet monitoring results are provided through callbacks to which the user can subscribe. The first two provide the jet monitoring results:

  • JetCallback: called when a jet is detected

  • AlarmCallback: called when an alarm is raised

The other two provide contextual information on the time slice that has just been processed:

  • SliceCallback: detailed information about the time slice (see JetMonitoringSliceData)

  • AsyncCallback: end-timestamp and number of events of the time slice

Constructor.

algo_config

Jet monitoring parameters

alarm_config

Jet monitoring alarm parameters

process_events(self: metavision_sdk_analytics.JetMonitoringAlgorithm, events_np: numpy.ndarray[metavision_sdk_base._EventCD_decode]) → None

Processes a buffer of events.

events_np

numpy structured array of events whose fields are (‘x’, ‘y’, ‘p’, ‘t’). Note that this order is mandatory

reset_state(self: metavision_sdk_analytics.JetMonitoringAlgorithm) → None

Resets internal state.

set_on_alarm_callback(self: metavision_sdk_analytics.JetMonitoringAlgorithm, arg0: object) → None

Sets the callback that is called when an alarm is raised.

cb

Callback processing a const reference of EventJetAlarm

set_on_async_callback(self: metavision_sdk_analytics.JetMonitoringAlgorithm, arg0: object) → None

Sets the callback that is called at the end of each slice to provide AsyncAlgorithm-related data.

cb

Callback processing the time slice duration and the number of events processed during the time slice

set_on_jet_callback(self: metavision_sdk_analytics.JetMonitoringAlgorithm, arg0: object) → None

Sets the callback that is called when a jet is detected.

cb

Callback processing a const reference of EventJet

set_on_slice_callback(self: metavision_sdk_analytics.JetMonitoringAlgorithm, arg0: object) → None

Sets the callback that is called at the end of each slice to provide JetMonitoring-related data.

cb

Callback processing a const reference of JetMonitoringSliceData
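
A hedged end-to-end sketch; the ROI tuple follows the (Left x, Top y, width, height) convention used elsewhere in this module, and the EventsIterator source is an assumption.

```python
from metavision_core.event_io import EventsIterator          # assumed available alongside the SDK
from metavision_sdk_analytics import (JetMonitoringAlgorithm, JetMonitoringAlgorithmConfig,
                                      JetMonitoringAlarmConfig)

# Detection ROI as (Left x, Top y, width, height); values are illustrative.
detection_roi = (200, 150, 240, 180)
algo_config = JetMonitoringAlgorithmConfig(detection_roi,
                                           JetMonitoringAlgorithmConfig.Orientation.Down)

alarm_config = JetMonitoringAlarmConfig()
alarm_config.set_expected_cycle_ms(50.0, 10.0)   # expected cycle 50 ms, 10% tolerance
alarm_config.set_max_expected_count(1000)

jet_monitoring = JetMonitoringAlgorithm(algo_config, alarm_config)
jet_monitoring.set_on_jet_callback(lambda jet: print("jet", jet.count, "at t =", jet.t, "us"))
jet_monitoring.set_on_alarm_callback(lambda alarm: print("alarm raised:", alarm))

for evs in EventsIterator("dispenser.raw", delta_t=1000):
    jet_monitoring.process_events(evs)
```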

class metavision_sdk_analytics.JetMonitoringAlgorithm.ROI(self: metavision_sdk_analytics.JetMonitoringAlgorithm.ROI, value: int) → None

Members:

DETECTION

BG_NOISE_1

BG_NOISE_2

TOTAL

class metavision_sdk_analytics.JetMonitoringAlgorithmConfig(*args, **kwargs)

Jet monitoring algorithm parameters.

Overloaded function.

  1. __init__(self: metavision_sdk_analytics.JetMonitoringAlgorithmConfig, detection_roi: tuple, nozzle_orientation: Metavision::JetMonitoringAlgorithmConfig::Orientation) -> None

  2. __init__(self: metavision_sdk_analytics.JetMonitoringAlgorithmConfig, detection_roi: tuple, nozzle_orientation: Metavision::JetMonitoringAlgorithmConfig::Orientation, time_step_us: int, accumulation_time_us: int, th_up_kevps: int, th_down_kevps: int, th_up_delay_us: int, th_down_delay_us: int) -> None

get_detection_roi(self: metavision_sdk_analytics.JetMonitoringAlgorithmConfig) → tuple

Gets the detection ROI.

property nozzle_orientation

Nozzle orientation in the image reference frame. Jets are moving either upwards, downwards, leftwards or rightwards.

class metavision_sdk_analytics.JetMonitoringAlgorithmConfig.Orientation(self: metavision_sdk_analytics.JetMonitoringAlgorithmConfig.Orientation, value: int) → None

Members:

Down

Up

Left

Right

class metavision_sdk_analytics.JetMonitoringDrawingHelper(self: metavision_sdk_analytics.JetMonitoringDrawingHelper, camera_roi: tuple, jet_roi: tuple, nozzle_orientation: metavision_sdk_analytics.JetMonitoringAlgorithmConfig.Orientation) → None

Class that superimposes jet monitoring results on events.

Constructor.

camera_roi

Region of interest used by the camera (Left x, Top y, width, height)

jet_roi

Region of interest used by the jet-monitoring algorithm to detect jets (Left x, Top y, width, height)

nozzle_orientation

Nozzle orientation

draw(self: metavision_sdk_analytics.JetMonitoringDrawingHelper, ts: int, count: int, er_kevps: int, image: numpy.ndarray) → None

Updates data to display.

ts

Current timestamp

count

Last object count

er_kevps

Event rate in kilo-events per second

image

Output image
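
A short drawing sketch using the same (Left x, Top y, width, height) ROI convention; the coordinates are illustrative.

```python
import numpy as np
from metavision_sdk_analytics import JetMonitoringDrawingHelper, JetMonitoringAlgorithmConfig

camera_roi = (0, 0, 640, 480)       # (Left x, Top y, width, height)
jet_roi = (200, 150, 240, 180)
helper = JetMonitoringDrawingHelper(camera_roi, jet_roi,
                                    JetMonitoringAlgorithmConfig.Orientation.Down)

frame = np.zeros((480, 640, 3), dtype=np.uint8)   # BGR frame already filled with events
helper.draw(100000, 3, 250, frame)                # ts, jet count, event rate (kev/s), image
```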

class metavision_sdk_analytics.JetMonitoringSliceData(self: metavision_sdk_analytics.JetMonitoringSliceData) → None

Structure that holds the data obtained by processing a time slice with JetMonitoringAlgorithm.

class metavision_sdk_analytics.LineClusterDrawingHelper(self: metavision_sdk_analytics.LineClusterDrawingHelper) → None

Class that superimposes event-clusters on the horizontal lines drawn on an image filled with events.

draw(*args, **kwargs)

Overloaded function.

  1. draw(self: metavision_sdk_analytics.LineClusterDrawingHelper, image: numpy.ndarray, line_clusters: metavision_sdk_analytics.LineClustersOutputView) -> None

Draws colored segments along a horizontal line.

image

Output image

line_clusters

Line clusters to display. Each cluster provides the members x_begin, x_end and id, where the ID refers to the ordinate of the line the cluster belongs to

  2. draw(self: metavision_sdk_analytics.LineClusterDrawingHelper, image: numpy.ndarray, line_clusters: metavision_sdk_analytics.LineClustersOutputBuffer) -> None

Draws colored segments along a horizontal line.

image

Output image

line_clusters

Line clusters to display. Each cluster provides the members x_begin, x_end and id, where the ID refers to the ordinate of the line the cluster belongs to

class metavision_sdk_analytics.LineClusterTrackingConfig(self: metavision_sdk_analytics.LineClusterTrackingConfig, bitsets_buffer_size: int, cluster_ths: int = 3, num_clusters_ths: int = 4, min_inter_clusters_distance: int = 1, learning_rate: float = 1.0, max_dx_allowed: float = 5.0, max_nbr_empty_rows: int = 0) → None

Struct representing the parameters used to instantiate a LineClusterTracker inside PsmAlgorithm.

Constructor.

bitsets_buffer_size

Size of the bitset circular buffer (accumulation_time = bitsets_buffer_size * precision_time_us)

cluster_ths

Minimum width (in pixels) below which clusters of events are considered as noise

num_clusters_ths

Minimum number of cluster measurements below which a particle is considered as noise

min_inter_clusters_distance

Once small clusters have been removed, merge clusters that are closer than this distance. This helps deal with dead pixels that could cut particles in half. If set to 0, no merging is done

learning_rate

Ratio in the weighted mean between the current x position and the observation. This is used only when the particle is shrinking, because the front of the particle is always sharp while the trail might be noisy. 0.0 is conservative and does not take the observation into account, whereas 1.0 has no memory and overwrites the cluster estimate with the new observation. A value outside ]0,1] disables the weighted mean, and 1.0 is used instead.

max_dx_allowed

Caps x variation at this value. A negative value disables the clamping. This is used only when the particle is shrinking, because the front of the particle is always sharp while the trail might be noisy.

max_nbr_empty_rows

Number of consecutive empty measurements that is tolerated

property bitsets_buffer_size

Size of the bitset circular buffer (accumulation_time = bitsets_buffer_size * precision_time_us).

property cluster_ths

Minimum width (in pixels) below which clusters of events are considered as noise.

property learning_rate

Ratio in the weighted mean between the current x position and the observation. This is used only when the particle is shrinking, because the front of the particle is always sharp while the trail might be noisy. 0.0 is conservative and does not take the observation into account, whereas 1.0 has no memory and overwrites the cluster estimate with the new observation. A value outside ]0,1] disables the weighted mean, and 1.0 is used instead.

property max_dx_allowed

Caps x variation at this value. A negative value disables the clamping. This is used only when the particle is shrinking, because the front of the particle is always sharp while the trail might be noisy.

property max_nbr_empty_rows

Number of consecutive empty measurements that is tolerated.

property min_inter_clusters_distance

Once small clusters have been removed, merge clusters that are closer than this distance. This helps deal with dead pixels that could cut particles in half. If set to 0, no merging is done.

property num_clusters_ths

Minimum number of cluster measurements below which a particle is considered as noise.

class metavision_sdk_analytics.LineClustersOutputBuffer(self: metavision_sdk_analytics.LineClustersOutputBuffer, size: int = 0) → None

Constructor

numpy(self: metavision_sdk_analytics.LineClustersOutputBuffer, copy: bool = False) → numpy.ndarray[Metavision::LineClusterWithId]

copy

If True, allocates new memory and returns a copy of the events. If False, the returned array uses the same memory

resize(self: metavision_sdk_analytics.LineClustersOutputBuffer, size: int) → None

Resizes the buffer to the specified size

size

The new size of the buffer

metavision_sdk_analytics.LineClusterWithId

This is the numpy.dtype to represent numpy structured arrays of LineClusterWithId

class metavision_sdk_analytics.LineClustersOutputView

numpy(self: metavision_sdk_analytics.LineClustersOutputView, copy: bool = False) → numpy.ndarray[Metavision::LineClusterWithId]

copy

If True, allocates new memory and returns a copy of the events. If False, the returned array uses the same memory

class metavision_sdk_analytics.LineParticleTrack

Structure storing information about a track of a particle matched over several rows.

property id

Track id.

property particle_size

Estimated size of the particle.

property t

Track timestamp.

property traj_coef_a

Coefficient a in the Linear Model X = a*Y + b.

property traj_coef_b

Coefficient b in the Linear Model X = a*Y + b.

class metavision_sdk_analytics.LineParticleTrackDrawingHelper(self: metavision_sdk_analytics.LineParticleTrackDrawingHelper, persistence_time_us: int) → None

Class that superimposes Particle Size Measurement results on an image filled with events.

Constructor.

persistence_time_us

Time interval (in the events-clock) during which particle contours remain visible in the visualization. Since a track is sent only once, it would appear on a single frame if results were not kept in memory; this parameter specifies for how long detected tracks remain displayed

draw(self: metavision_sdk_analytics.LineParticleTrackDrawingHelper, ts: int, image: numpy.ndarray, tracks: metavision_sdk_analytics.LineParticleTrackingOutput) → None

Stores and draws particle tracks.

ts

Detection timestamp

image

Output image

tracks

Particle tracks to display (LineParticleTrackingOutput)

class metavision_sdk_analytics.LineParticleTrackingConfig(self: metavision_sdk_analytics.LineParticleTrackingConfig, is_going_down: bool, dt_first_match_ths: int, tan_angle_ths: float = 1.0, matching_ths: float = 0.5) → None

Struct representing the parameters used to instantiate a LineParticleTracker inside PsmAlgorithm.

Constructor.

is_going_down

True if the particle is falling, false if it’s going upwards

dt_first_match_ths

Maximum allowed duration to match the 2nd particle of a track

tan_angle_ths

Tangent of the angle with the vertical beyond which two particles on consecutive lines can’t be matched

matching_ths

Minimum similarity score in [0,1] needed to match two particles

property dt_first_match_ths

Maximum allowed duration to match the 2nd particle of a track.

property is_going_down

True if the particle is falling, false if it’s going upwards.

property matching_ths

Minimum similarity score in [0,1] needed to match two particles.

property tan_angle_ths

Tangent of the angle with the vertical beyond which two particles on consecutive lines can’t be matched.

class metavision_sdk_analytics.LineParticleTrackingOutput(self: metavision_sdk_analytics.LineParticleTrackingOutput) → None

Class collecting information about LineParticle tracks.

property global_counter

Number of particles that have been matched over several lines.

property last_count_ts

Timestamp of the last count (in us).

class metavision_sdk_analytics.PeriodMapAsyncAlgorithm(self: metavision_sdk_analytics.PeriodMapAsyncAlgorithm, width: int, height: int, filter_length: int = 7, min_period: float = 6500, max_period: float = 100000.0, diff_thresh_us: int = 1500) → None

Class that estimates the pixel-wise period of vibrating objects using Metavision Vibration API.

process_events(self: metavision_sdk_analytics.PeriodMapAsyncAlgorithm, events_np: numpy.ndarray[metavision_sdk_base._EventCD_decode]) → None

Processes a buffer of events.

events_np

numpy structured array of events whose fields are (‘x’, ‘y’, ‘p’, ‘t’). Note that this order is mandatory

set_output_callback(self: metavision_sdk_analytics.PeriodMapAsyncAlgorithm, output_cb: object) → None

Sets a callback to get the output period map.

output_cb

Callback to call

property update_frequency

Sets the frequency at which the algorithm generates the period map.

Freq

Frequency at which the period map will be generated

class metavision_sdk_analytics.PsmAlgorithm(self: metavision_sdk_analytics.PsmAlgorithm, width: int, height: int, rows: list, detection_config: Metavision::LineClusterTrackingConfig, tracking_config: Metavision::LineParticleTrackingConfig, num_process_before_matching: int) → None

Class that both counts objects and estimates their size using Metavision Particle Size Measurement API.

process_events(self: metavision_sdk_analytics.PsmAlgorithm, events: numpy.ndarray[metavision_sdk_base._EventCD_decode], ts: int, tracks: metavision_sdk_analytics.LineParticleTrackingOutput, line_clusters: metavision_sdk_analytics.LineClustersOutputBuffer) → None

Processes a buffer of events and retrieves the detected particles in one call.

events

Numpy structured array of EventCD events to process

ts

Upper bound timestamp of the events time slice

tracks

Detected particles for each line

line_clusters

Detected clusters for each line

reset(self: metavision_sdk_analytics.PsmAlgorithm) → None

Resets line cluster trackers and line particle trackers.
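
A hedged sketch of a Particle Size Measurement pipeline; driving the algorithm with 100 us slices and a manually advanced timestamp is an assumption, as is the EventsIterator source.

```python
from metavision_core.event_io import EventsIterator          # assumed available alongside the SDK
from metavision_sdk_analytics import (PsmAlgorithm, LineClusterTrackingConfig,
                                      LineParticleTrackingConfig, LineParticleTrackingOutput,
                                      LineClustersOutputBuffer)

width, height = 640, 480
rows = [150, 200, 250, 300]          # ordinates of the measurement lines
precision_time_us = 100              # duration of each processed time slice (assumed)

detection_config = LineClusterTrackingConfig(bitsets_buffer_size=64, cluster_ths=3)
tracking_config = LineParticleTrackingConfig(is_going_down=True, dt_first_match_ths=2000)
psm = PsmAlgorithm(width, height, rows, detection_config, tracking_config,
                   num_process_before_matching=3)

tracks = LineParticleTrackingOutput()
line_clusters = LineClustersOutputBuffer()

ts = 0
for evs in EventsIterator("particles.raw", delta_t=precision_time_us):
    ts += precision_time_us
    psm.process_events(evs, ts, tracks, line_clusters)
    if tracks.global_counter:
        print("particles counted:", tracks.global_counter)
```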

class metavision_sdk_analytics.SpatterTrackerAlgorithm(self: metavision_sdk_analytics.SpatterTrackerAlgorithm, width: int, height: int, cell_width: int, cell_height: int, untracked_threshold: int = 5, activation_threshold: int = 10, apply_filter: bool = True, max_distance: int = 50, min_size: int = 1, max_size: int = 2147483647, min_track_time: int = 0, static_memory_us: int = 0, max_size_variation: int = 100, min_dist_moving_obj_pxl: int = 0, filter_type: metavision_sdk_analytics.SpatterTrackingConfig.FilterType = <FilterType.FilterNegative: 1>) → None

Class that tracks spatter clusters using Metavision SpatterTracking API.

Builds a new SpatterTrackerAlgorithm object.

width

Sensor’s width (in pixels)

height

Sensor’s height (in pixels)

config

Spatter tracker’s configuration

add_nozone(self: metavision_sdk_analytics.SpatterTrackerAlgorithm, center: numpy.ndarray[numpy.int32], radius: int, filter_inside: bool = True) → None

Adds a region that isn’t used for tracking.

center

Center of the region

radius

Radius of the region

filter_inside

True if the region to filter is inside the defined shape, false otherwise

property get_cluster_count

Returns the current number of clusters.

Returns

The current number of clusters

process_events(*args, **kwargs)

Overloaded function.

  1. process_events(self: metavision_sdk_analytics.SpatterTrackerAlgorithm, events: numpy.ndarray[metavision_sdk_base._EventCD_decode], ts: int, clusters: metavision_sdk_analytics.EventSpatterClusterBuffer) -> None

Processes a buffer of events and retrieves the detected clusters in one call.

events

Numpy structured array of EventCD events to process

ts

Upper bound timestamp of the events time slice

clusters

Detected clusters

  2. process_events(self: metavision_sdk_analytics.SpatterTrackerAlgorithm, events_np: numpy.ndarray[metavision_sdk_base._EventCD_decode]) -> None

Processes a buffer of events.

events_np

numpy structured array of events whose fields are (‘x’, ‘y’, ‘p’, ‘t’). Note that this order is mandatory

set_nozone(self: metavision_sdk_analytics.SpatterTrackerAlgorithm, arg0: numpy.ndarray[numpy.int32], arg1: int) → None

Sets a no-track zone so that events in this region are not considered
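
A hedged sketch combining the tracker with draw_tracking_results; the no-track zone coordinates, the slice duration and the EventsIterator source are illustrative assumptions.

```python
import numpy as np
from metavision_core.event_io import EventsIterator       # assumed available alongside the SDK
from metavision_sdk_analytics import (SpatterTrackerAlgorithm, EventSpatterClusterBuffer,
                                      draw_tracking_results)

width, height = 640, 480
tracker = SpatterTrackerAlgorithm(width, height, cell_width=8, cell_height=8)

# Ignore a circular region around a known hot spot (hypothetical coordinates)
tracker.add_nozone(np.array([320, 240], dtype=np.int32), 30, True)

clusters = EventSpatterClusterBuffer()
frame = np.zeros((height, width, 3), dtype=np.uint8)

ts = 0
for evs in EventsIterator("spatter.raw", delta_t=5000):
    ts += 5000
    tracker.process_events(evs, ts, clusters)
    draw_tracking_results(ts, clusters.numpy(), frame)
    print("clusters so far:", tracker.get_cluster_count)
```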

class metavision_sdk_analytics.TrackingAlgorithm(self: metavision_sdk_analytics.TrackingAlgorithm, sensor_width: int, sensor_height: int, tracking_config: metavision_sdk_analytics.TrackingConfig) → None

Class that tracks objects using Metavision Tracking API.

Builds a new TrackingAlgorithm object.

sensor_width

Sensor’s width.

sensor_height

Sensor’s height.

config

Tracking’s configuration.

static get_empty_output_buffer() → metavision_sdk_analytics.EventTrackingDataBuffer

property max_size

Size of the biggest trackable object

property max_speed

Speed of the fastest trackable object

property min_size

Size of the smallest trackable object

property min_speed

Speed of the slowest trackable object

process_events(*args, **kwargs)

Overloaded function.

  1. process_events(self: metavision_sdk_analytics.TrackingAlgorithm, arg0: metavision_sdk_core.RollingEventCDBuffer, arg1: metavision_sdk_analytics.EventTrackingDataBuffer) -> None

  2. process_events(self: metavision_sdk_analytics.TrackingAlgorithm, arg0: numpy.ndarray[metavision_sdk_base._EventCD_decode], arg1: metavision_sdk_analytics.EventTrackingDataBuffer) -> None

  3. process_events(self: metavision_sdk_analytics.TrackingAlgorithm, arg0: metavision_sdk_base.EventCDBuffer, arg1: metavision_sdk_analytics.EventTrackingDataBuffer) -> None
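
A hedged sketch of the tracking workflow; the default TrackingConfig, the size limits (assumed to be in pixels and assumed settable) and the EventsIterator source are illustrative assumptions.

```python
import numpy as np
from metavision_core.event_io import EventsIterator      # assumed available alongside the SDK
from metavision_sdk_analytics import TrackingAlgorithm, TrackingConfig, draw_tracking_results

sensor_width, sensor_height = 640, 480
config = TrackingConfig()                      # default tracking configuration
tracker = TrackingAlgorithm(sensor_width, sensor_height, config)
tracker.min_size, tracker.max_size = 10, 300   # trackable object sizes (assumed to be pixels)

output = TrackingAlgorithm.get_empty_output_buffer()
frame = np.zeros((sensor_height, sensor_width, 3), dtype=np.uint8)

ts = 0
for evs in EventsIterator("objects.raw", delta_t=10000):
    ts += 10000
    tracker.process_events(evs, output)        # overload 2: numpy array of EventCD
    draw_tracking_results(ts, output.numpy(), frame)
```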

class metavision_sdk_analytics.TrackingConfig(self: metavision_sdk_analytics.TrackingConfig) → None

class metavision_sdk_analytics.TrackingConfig.ClusterMaker(self: metavision_sdk_analytics.TrackingConfig.ClusterMaker, value: int) → None

Members:

SimpleGrid

MedoidShift

class metavision_sdk_analytics.TrackingConfig.DataAssociation(self: metavision_sdk_analytics.TrackingConfig.DataAssociation, value: int) → None

Members:

Nearest

IOU

class metavision_sdk_analytics.TrackingConfig.KalmanModel(self: metavision_sdk_analytics.TrackingConfig.KalmanModel, value: int) → None

Members:

ConstantVelocity

ConstantAcceleration

class metavision_sdk_analytics.TrackingConfig.KalmanPolicy(self: metavision_sdk_analytics.TrackingConfig.KalmanPolicy, value: int) → None

Members:

AdaptiveNoise

MeasurementTrust

class metavision_sdk_analytics.TrackingConfig.MotionModel(self: metavision_sdk_analytics.TrackingConfig.MotionModel, value: int) → None

Members:

Simple

Instant

Smooth

Kalman

class metavision_sdk_analytics.TrackingConfig.Tracker(self: metavision_sdk_analytics.TrackingConfig.Tracker, value: int) → None

Members:

Ellipse

ClusterKF

class metavision_sdk_analytics.TrackingConfig.EllipseUpdateFunction(self: metavision_sdk_analytics.TrackingConfig.EllipseUpdateFunction, value: int) → None

Members:

Uniform

Gaussian

SignedGaussian

TruncatedGaussian

class metavision_sdk_analytics.TrackingConfig.EllipseUpdateMethod(self: metavision_sdk_analytics.TrackingConfig.EllipseUpdateMethod, value: int) → None

Members:

PerEvent

EllipseFitting

GaussianFitting

EllipseFittingFull

GaussianFittingFull

metavision_sdk_analytics.draw_tracking_results(*args, **kwargs)

Overloaded function.

  1. draw_tracking_results(ts: int, spatter_clusters: numpy.ndarray[Metavision::EventSpatterCluster], image: numpy.ndarray) -> None

Draws spatter cluster events.

Drawing function used to draw tracking results from the SpatterTrackerAlgorithm.

  2. draw_tracking_results(ts: int, tracking_results: numpy.ndarray[Metavision::EventTrackingData], image: numpy.ndarray) -> None

Draws tracking data events.

Drawing function used to draw tracking results from the TrackingAlgorithm (i.e. EventTrackingData). Results are drawn as bounding boxes with the tracked objects’ IDs displayed beside them.