SDK Analytics Python bindings API

class metavision_sdk_analytics.CountingAlgorithm

Class to count objects using Metavision Counting API.

add_line_counters(self: metavision_sdk_analytics.CountingAlgorithm, rows: list) -> None

Adds new lines to count objects.

process_events(self: metavision_sdk_analytics.CountingAlgorithm, events_np: numpy.ndarray[metavision_sdk_base._EventCD_decode]) -> None

Processes a buffer of events.

events_np

numpy structured array of events

reset_counters(self: metavision_sdk_analytics.CountingAlgorithm) -> None

Resets the count of all lines.

set_output_callback(self: metavision_sdk_analytics.CountingAlgorithm, arg0: object) -> None

Sets a callback to get the last count.
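A minimal usage sketch of the class above. The constructor arguments are not documented here, so the helper below assumes an already constructed algorithm; the `(timestamp, count)` callback signature and the exact EventCD field layout are likewise assumptions:

```python
import numpy as np

# Assumed EventCD structured dtype (fields x, y, p, t); check
# metavision_sdk_base for the authoritative definition.
EVENT_DTYPE = np.dtype([("x", "<u2"), ("y", "<u2"), ("p", "<i2"), ("t", "<i8")])

def make_events(xs, ys, ps, ts):
    """Build a structured event array suitable for process_events."""
    ev = np.zeros(len(xs), dtype=EVENT_DTYPE)
    ev["x"], ev["y"], ev["p"], ev["t"] = xs, ys, ps, ts
    return ev

def count_with(algo, events, rows):
    """Feed events to a CountingAlgorithm and collect the counts it reports."""
    results = []
    algo.add_line_counters(rows)             # ordinates of the counting lines
    algo.set_output_callback(lambda ts, count: results.append((ts, count)))
    algo.process_events(events)              # counts arrive via the callback
    return results
```

reset_counters() can be called between batches to restart all line counters from zero.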

class metavision_sdk_analytics.CountingCalibration

Class representing the counting calibration.

static calibrate(width: int, height: int, object_min_size: float = 5, object_average_speed: float = 5, distance_object_camera: float = 300, horizontal_fov: float = 56.0, vertical_fov: float = 44.0, travelled_pix_distance_during_acc_time: int = 9) -> tuple

Finds optimal parameters for the counting algorithm.

width

Sensor’s width in pixels

height

Sensor’s height in pixels

object_min_size

Approximate largest dimension of the smallest object (in mm). The value must be positive. It will be refined during the calibration.

object_average_speed

Approximate average speed of an object to count (in m/s). It will be refined during the calibration.

distance_object_camera

Average distance between the flow of objects to count and the camera (in mm). The camera must look perpendicular to the plane in which the objects fall. It will be refined during the calibration.

horizontal_fov

Horizontal field of view (half of the solid angle perceived by the sensor along the horizontal axis, in degrees)

vertical_fov

Vertical field of view (half of the solid angle perceived by the sensor along the vertical axis, in degrees)

travelled_pix_distance_during_acc_time

Distance (in pixels) travelled during the accumulation time
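The geometry behind these parameters can be sketched with a pinhole model: given the half-angle horizontal FOV, the apparent size of the smallest object in pixels follows from the focal length. The first helper below is purely illustrative and not part of the SDK; in the second, the unpacking of calibrate's returned tuple is an assumption:

```python
import math

def approx_object_size_px(object_size_mm, distance_mm, sensor_width_px, horizontal_fov_deg):
    # Pinhole approximation: focal length in pixels from the half-angle FOV,
    # then projection of the object size at the given distance.
    focal_px = (sensor_width_px / 2.0) / math.tan(math.radians(horizontal_fov_deg))
    return object_size_mm * focal_px / distance_mm

def calibrate_counting(width, height):
    # Sketch of a calibrate() call using the documented defaults; what the
    # returned tuple contains is an assumption.
    from metavision_sdk_analytics import CountingCalibration
    return CountingCalibration.calibrate(
        width, height,
        object_min_size=5, object_average_speed=5, distance_object_camera=300)
```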

class metavision_sdk_analytics.CountingDrawingHelper

Class that superimposes line counting results on events.

add_line_counters(self: metavision_sdk_analytics.CountingDrawingHelper, rows: list) -> None

Adds new line counter ordinates.

rows

list of line ordinates

draw(self: metavision_sdk_analytics.CountingDrawingHelper, ts: int, count: int, image: numpy.ndarray) -> None

Updates data to display.

ts

Current timestamp

count

Last object count

image

Output image

class metavision_sdk_analytics.DominantFrequencyEventsAlgorithm

Class computing the dominant frequency from frequency events

compute_dominant_value(self: metavision_sdk_analytics.DominantFrequencyEventsAlgorithm, input_frequency_events_np: numpy.ndarray[Metavision::Event2dFrequency<float>]) -> tuple

Computes the dominant frequency from frequency events

class metavision_sdk_analytics.DominantPeriodEventsAlgorithm

Class computing the dominant period from period events

compute_dominant_value(self: metavision_sdk_analytics.DominantPeriodEventsAlgorithm, input_period_events_np: numpy.ndarray[Metavision::Event2dPeriod<float>]) -> tuple

Computes the dominant period from period events

class metavision_sdk_analytics.DominantValueMapAlgorithm

Class computing the dominant value of a map

compute_dominant_value(self: metavision_sdk_analytics.DominantValueMapAlgorithm, value_map: numpy.ndarray) -> tuple

Computes the dominant value from a value map
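As a pure-NumPy illustration of what these dominant-value algorithms compute (assuming a histogram-style extraction; the SDK's exact binning is not documented here):

```python
import numpy as np

def dominant_value(value_map, bin_width):
    # Histogram the non-zero values of the map and return the center of the
    # fullest bin -- a simplified stand-in for compute_dominant_value().
    vals = value_map[value_map > 0]
    if vals.size == 0:
        return None  # nothing measured
    bins = np.floor(vals / bin_width).astype(int)
    best = np.bincount(bins).argmax()
    return (best + 0.5) * bin_width
```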

class metavision_sdk_analytics.EventJet

Struct representing a detected jet event.

property count

Jet number.

property previous_jet_dt

Time since the beginning of the last jet (in us). A negative value means this time-difference isn’t defined yet because so far there has only been at most one jet.

property t

Timestamp of the beginning of the jet (in us).

class metavision_sdk_analytics.EventJetAlarm.AlarmType

Types of jet monitoring alarms.

Members:

JetNotDetected

JetTooEarly

TooManyJets

class metavision_sdk_analytics.EventSpatterClusterBuffer

numpy(self: metavision_sdk_analytics.EventSpatterClusterBuffer, copy: bool = False) -> numpy.ndarray[Metavision::EventSpatterCluster]

copy

if True, allocates new memory and returns a copy of the events. If False, the returned array uses the same memory

class metavision_sdk_analytics.EventSpatterClusterView

numpy(self: metavision_sdk_analytics.EventSpatterClusterView, copy: bool = False) -> numpy.ndarray[Metavision::EventSpatterCluster]

copy

if True, allocates new memory and returns a copy of the events. If False, the returned array uses the same memory

class metavision_sdk_analytics.EventTrackingDataBuffer

numpy(self: metavision_sdk_analytics.EventTrackingDataBuffer, copy: bool = False) -> numpy.ndarray[Metavision::EventTrackingData]

copy

if True, allocates new memory and returns a copy of the events. If False, the returned array uses the same memory

class metavision_sdk_analytics.EventTrackingDataView

numpy(self: metavision_sdk_analytics.EventTrackingDataView, copy: bool = False) -> numpy.ndarray[Metavision::EventTrackingData]

copy

if True, allocates new memory and returns a copy of the events. If False, the returned array uses the same memory
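The `copy` flag has the usual NumPy view-versus-copy semantics, illustrated here with a plain array (the buffer/view classes above behave analogously):

```python
import numpy as np

base = np.arange(4)
view = base[:]       # shares memory, like numpy(copy=False)
dup = base.copy()    # independent storage, like numpy(copy=True)

base[0] = 99
# The view reflects the change; the copy does not.
```

With copy=False the returned array typically stays valid only as long as the underlying buffer does, so copy when keeping events around past a callback.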

class metavision_sdk_analytics.FrequencyMapAsyncAlgorithm

Class that estimates the pixel-wise frequency of vibrating objects using Metavision Vibration API.

process_events(self: metavision_sdk_analytics.FrequencyMapAsyncAlgorithm, events_np: numpy.ndarray[metavision_sdk_base._EventCD_decode]) -> None

Processes a buffer of events.

events_np

numpy structured array of events

set_output_callback(self: metavision_sdk_analytics.FrequencyMapAsyncAlgorithm, output_cb: object) -> None

Sets a callback to get the output frequency map.

output_cb

Callback to call

property update_frequency

Sets the frequency at which the algorithm generates the frequency map.

freq

Frequency at which the frequency map will be generated
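A wiring sketch for the asynchronous frequency-map output, assuming an already constructed algorithm; the `(timestamp, map)` callback signature is an assumption:

```python
latest = {}

def on_frequency_map(ts, freq_map):
    # Keep the most recent pixel-wise frequency map for later display.
    latest["ts"], latest["map"] = ts, freq_map

def attach(algo, update_freq_hz=50.0):
    algo.update_frequency = update_freq_hz      # how often the map is produced
    algo.set_output_callback(on_frequency_map)  # assumed (ts, map) signature
```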

class metavision_sdk_analytics.HeatMapFrameGeneratorAlgorithm

Class that produces a BGR image of a floating point value map.

A colormap bar at the bottom of the image shows the color convention. Pixels for which no value was computed, as well as pixels outside the defined minimum/maximum range, are shown in black.

property full_height

Returns the full generated image’s height.

property full_width

Returns the full generated image’s width.

generate_bgr_heat_map(self: metavision_sdk_analytics.HeatMapFrameGeneratorAlgorithm, value_map: numpy.ndarray, out_image_bgr: numpy.ndarray) -> None

Draws the value map with a bar showing the colormap convention.

value_map

Input value map, 1 floating point channel (CV_32FC1)

out_image_bgr

Output image

get_output_image(self: metavision_sdk_analytics.HeatMapFrameGeneratorAlgorithm) -> numpy.ndarray[numpy.uint8]

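Rendering a value map, sketched under the assumption of an already constructed HeatMapFrameGeneratorAlgorithm; the output buffer is sized from the documented full_width/full_height properties:

```python
import numpy as np

def render_heat_map(generator, value_map):
    # Allocate a BGR image matching the generator's full output size
    # (value map plus the colormap bar at the bottom), then draw into it.
    out = np.zeros((generator.full_height, generator.full_width, 3), dtype=np.uint8)
    generator.generate_bgr_heat_map(value_map.astype(np.float32), out)
    return out
```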
class metavision_sdk_analytics.JetMonitoringAlarmConfig

Jet monitoring alarm parameters.

property alarm_on_count

Activates/deactivates the alarm that is triggered when the jet count exceeds the max_expected_count value.

property alarm_on_cycle

Activates/deactivates alarm on cycle time.

property cycle_tol_percentage

Tolerance for estimated cycle time, in percentage of expected_cycle_ms.

property expected_cycle_ms

Expected cycle time (in ms).

property max_expected_count

Maximum expected number of jets.

set_expected_cycle_ms(self: metavision_sdk_analytics.JetMonitoringAlarmConfig, expected_cycle_ms: float, cycle_tol_percentage: float) -> None

Activates alarm on cycle time.

set_max_expected_count(self: metavision_sdk_analytics.JetMonitoringAlarmConfig, max_expected_count: int) -> None

Activates the alarm that is triggered when the jet count exceeds the max_expected_count value.

class metavision_sdk_analytics.JetMonitoringAlgorithm

Class that detects, counts, and timestamps jets that are being dispensed.

The algorithm splits the Region Of Interest (ROI) provided by the user into three parts: the central ROI is used to detect jets by identifying peaks in the event rate, while the two surrounding ROIs are used to analyze the background activity.

Jet Monitoring results are provided through callbacks to which the user can subscribe. The first two provide the Jet Monitoring results:

  • JetCallback: called when a jet is detected

  • AlarmCallback: called when an alarm is raised

The other two provide contextual information on the time slice that has just been processed:

  • SliceCallback: detailed information about the time slice (see JetMonitoringSliceData)

  • AsyncCallback: end-timestamp and number of events of the time slice

process_events(self: metavision_sdk_analytics.JetMonitoringAlgorithm, events_np: numpy.ndarray[metavision_sdk_base._EventCD_decode]) -> None

Processes a buffer of events.

events_np

numpy structured array of events

reset_state(self: metavision_sdk_analytics.JetMonitoringAlgorithm) -> None

Resets internal state.

set_on_alarm_callback(self: metavision_sdk_analytics.JetMonitoringAlgorithm, arg0: object) -> None

Sets the callback that is called when an alarm is raised.

cb

Callback processing a const reference of EventJetAlarm

set_on_async_callback(self: metavision_sdk_analytics.JetMonitoringAlgorithm, arg0: object) -> None

Sets the callback that is called at the end of each slice to provide AsyncAlgorithm-related data.

cb

Callback processing the time slice duration and the number of events processed during the time slice

set_on_jet_callback(self: metavision_sdk_analytics.JetMonitoringAlgorithm, arg0: object) -> None

Sets the callback that is called when a jet is detected.

cb

Callback processing a const reference of EventJet

set_on_slice_callback(self: metavision_sdk_analytics.JetMonitoringAlgorithm, arg0: object) -> None

Sets the callback that is called at the end of each slice to provide JetMonitoring-related data.

cb

Callback processing a const reference of JetMonitoringSliceData
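The four callbacks can be wired as below, assuming an already constructed JetMonitoringAlgorithm; the exact Python callback signatures are assumptions based on the descriptions above:

```python
def attach_jet_monitoring(algo, log=print):
    # Subscribe to the four result channels described above.
    algo.set_on_jet_callback(lambda jet: log("jet", jet.count, jet.t))
    algo.set_on_alarm_callback(lambda alarm: log("alarm", alarm))
    algo.set_on_slice_callback(lambda s: log("slice", s))
    algo.set_on_async_callback(lambda ts, n_events: log("slice end", ts, n_events))
```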

class metavision_sdk_analytics.JetMonitoringAlgorithm.ROI

Members:

DETECTION

BG_NOISE_1

BG_NOISE_2

TOTAL

class metavision_sdk_analytics.JetMonitoringAlgorithmConfig

Jet monitoring algorithm parameters.

get_detection_roi(self: metavision_sdk_analytics.JetMonitoringAlgorithmConfig) -> tuple

Gets the detection ROI.

property nozzle_orientation

Nozzle orientation in the image reference frame. Jets are moving either upwards, downwards, leftwards or rightwards.

class metavision_sdk_analytics.JetMonitoringAlgorithmConfig.Orientation

Members:

Down

Up

Left

Right

class metavision_sdk_analytics.JetMonitoringDrawingHelper

Class that superimposes jet monitoring results on events.

draw(self: metavision_sdk_analytics.JetMonitoringDrawingHelper, ts: int, count: int, er_kevps: int, image: numpy.ndarray) -> None

Updates data to display.

ts

Current timestamp

count

Last object count

er_kevps

Event rate in kilo-events per second

image

Output image

class metavision_sdk_analytics.JetMonitoringSliceData

Structure that holds the data obtained by processing a time slice with JetMonitoringAlgorithm.

class metavision_sdk_analytics.LineClusterDrawingHelper

Class that superimposes event-clusters on the horizontal lines drawn on an image filled with events.

draw(*args, **kwargs)

Overloaded function.

  1. draw(self: metavision_sdk_analytics.LineClusterDrawingHelper, image: numpy.ndarray, line_clusters: metavision_sdk_analytics.LineClustersOutputView) -> None

Draws colored segments along a horizontal line.

image

Output image

line_clusters

Line clusters to display (e.g. LineClusterWithId, whose members x_begin, x_end and id are required; the ID refers to the ordinate of the line cluster)

  2. draw(self: metavision_sdk_analytics.LineClusterDrawingHelper, image: numpy.ndarray, line_clusters: metavision_sdk_analytics.LineClustersOutputBuffer) -> None

Draws colored segments along a horizontal line.

image

Output image

line_clusters

Line clusters to display (e.g. LineClusterWithId, whose members x_begin, x_end and id are required; the ID refers to the ordinate of the line cluster)

class metavision_sdk_analytics.LineClusterTrackingConfig

Struct representing the parameters used to instantiate a LineClusterTracker inside PsmAlgorithm.

property bitsets_buffer_size

Size of the bitset circular buffer (accumulation_time = bitsets_buffer_size * precision_time_us).

property cluster_ths

Minimum width (in pixels) below which clusters of events are considered as noise.

property learning_rate

Ratio in the weighted mean between the current x position and the observation. This is used only when the particle is shrinking, because the front of the particle is always sharp while the trail might be noisy. 0.0 is conservative and does not take the observation into account, whereas 1.0 has no memory and overwrites the cluster estimate with the new observation. A value outside the range (0, 1] disables the weighted mean, and 1.0 is used instead.

property max_dx_allowed

Caps x variation at this value. A negative value disables the clamping. This is used only when the particle is shrinking, because the front of the particle is always sharp while the trail might be noisy.

property max_nbr_empty_rows

Number of consecutive empty measurements that are tolerated.

property min_inter_clusters_distance

Once small clusters have been removed, clusters that are closer than this distance are merged. This helps deal with dead pixels that could cut particles in half. If set to 0, no merging is performed.

property num_clusters_ths

Minimum number of cluster measurements below which a particle is considered as noise.

property precision_time_us

Time duration between two asynchronous processes (us).
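The documented relation accumulation_time = bitsets_buffer_size * precision_time_us makes it easy to pick a buffer size for a target accumulation time (an illustrative helper, not part of the SDK):

```python
def bitsets_buffer_size_for(accumulation_time_us, precision_time_us):
    # Invert accumulation_time = bitsets_buffer_size * precision_time_us,
    # rounding up so the requested accumulation time is fully covered.
    return -(-accumulation_time_us // precision_time_us)

# e.g. a 2 ms accumulation time at 100 us precision needs a buffer of 20 bitsets
```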

class metavision_sdk_analytics.LineClustersOutputBuffer

numpy(self: metavision_sdk_analytics.LineClustersOutputBuffer, copy: bool = False) -> numpy.ndarray[Metavision::LineClusterWithId]

copy

if True, allocates new memory and returns a copy of the events. If False, the returned array uses the same memory

class metavision_sdk_analytics.LineClustersOutputView

numpy(self: metavision_sdk_analytics.LineClustersOutputView, copy: bool = False) -> numpy.ndarray[Metavision::LineClusterWithId]

copy

if True, allocates new memory and returns a copy of the events. If False, the returned array uses the same memory

class metavision_sdk_analytics.LineParticleTrack

Structure storing information about a track of a particle matched over several rows.

property id

Track id.

property particle_size

Estimated size of the particle.

property t

Track timestamp.

property traj_coef_a

Coefficient a in the Linear Model X = a*Y + b.

property traj_coef_b

Coefficient b in the Linear Model X = a*Y + b.

class metavision_sdk_analytics.LineParticleTrackDrawingHelper

Class that superimposes Particle Size Measurement results on an image filled with events.

draw(self: metavision_sdk_analytics.LineParticleTrackDrawingHelper, ts: int, image: numpy.ndarray, tracks: metavision_sdk_analytics.LineParticleTrackingOutput) -> None

Stores and draws particle tracks.

ts

Detection timestamp

image

Output image

tracks

Particle tracks to display (see LineParticleTrack)

class metavision_sdk_analytics.LineParticleTrackingConfig

Struct representing the parameters used to instantiate a LineParticleTracker inside PsmAlgorithm.

property dt_first_match_ths

Maximum allowed duration to match the second particle of a track.

property is_going_down

True if the particle is falling, false if it’s going upwards.

property matching_ths

Minimum similarity score in [0,1] needed to match two particles.

property tan_angle_ths

Tangent of the angle with the vertical beyond which two particles on consecutive lines can’t be matched.

class metavision_sdk_analytics.LineParticleTrackingOutput

Class collecting information about LineParticle tracks.

property global_counter

Number of particles that have been matched over several lines.

property last_count_ts

Timestamp of the last count (in us).

class metavision_sdk_analytics.PeriodMapAsyncAlgorithm

Class that estimates the pixel-wise period of vibrating objects using Metavision Vibration API.

process_events(self: metavision_sdk_analytics.PeriodMapAsyncAlgorithm, events_np: numpy.ndarray[metavision_sdk_base._EventCD_decode]) -> None

Processes a buffer of events.

events_np

numpy structured array of events

set_output_callback(self: metavision_sdk_analytics.PeriodMapAsyncAlgorithm, output_cb: object) -> None

Sets a callback to get the output period map.

output_cb

Callback to call

property update_frequency

Sets the frequency at which the algorithm generates the period map.

freq

Frequency at which the period map will be generated

class metavision_sdk_analytics.PsmAlgorithm

Class that both counts objects and estimates their size using Metavision Particle Size Measurement API.

process_events(self: metavision_sdk_analytics.PsmAlgorithm, events_np: numpy.ndarray[metavision_sdk_base._EventCD_decode]) -> None

Processes a buffer of events.

events_np

numpy structured array of events

reset(self: metavision_sdk_analytics.PsmAlgorithm) -> None

Resets line cluster trackers and line particle trackers.

set_output_callback(self: metavision_sdk_analytics.PsmAlgorithm, arg0: object) -> None

Sets a callback to get updated instances of LineParticleTrackingOutput (particle count, sizes and trajectories) and of LineCluster (the event-clusters aggregated along the rows).

output_cb

Function to call

note

The generated objects will be passed to the callback as non-constant references, meaning that the user is free to copy them or swap them using std::swap. In case of a swap with a non-initialized object, it will be automatically initialized.
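A callback sketch for PsmAlgorithm, assuming an already constructed instance; a callback receiving (timestamp, tracks, line_clusters) is an assumption based on the description above:

```python
def attach_psm(algo, results):
    def on_output(ts, tracks, clusters):
        # tracks: LineParticleTrackingOutput (counts, sizes, trajectories);
        # clusters: event-clusters aggregated along the monitored rows.
        results.append((ts, tracks.global_counter))
    algo.set_output_callback(on_output)
```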

class metavision_sdk_analytics.SpatterTrackerAlgorithm

Class that tracks spatter clusters using Metavision SpatterTracking API.

property get_cluster_count

Returns the current number of clusters.

Returns

The current number of clusters

process_events(self: metavision_sdk_analytics.SpatterTrackerAlgorithm, events_np: numpy.ndarray[metavision_sdk_base._EventCD_decode]) -> None

Processes a buffer of events.

events_np

numpy structured array of events

set_nozone(self: metavision_sdk_analytics.SpatterTrackerAlgorithm, center: numpy.ndarray[numpy.int32], radius: int) -> None

Sets a circular region that is excluded from processing.

center

Center of the region

radius

Radius of the region

set_output_callback(self: metavision_sdk_analytics.SpatterTrackerAlgorithm, arg0: object) -> None

Sets a callback to get the output cluster events.

set_output_callback_read_only(self: metavision_sdk_analytics.SpatterTrackerAlgorithm, arg0: object) -> None

Sets a callback to get the output cluster events. The output cluster events buffer is passed as a “view” on the algorithm’s internal buffer, which means that the user doesn’t take ownership of the underlying data and that no copy is involved.

class metavision_sdk_analytics.TrackingAlgorithm

Class that tracks objects using Metavision Tracking API.

property max_size

Size of the biggest trackable object

property max_speed

Speed of the fastest trackable object

property min_size

Size of the smallest trackable object

property min_speed

Speed of the slowest trackable object

process_events(self: metavision_sdk_analytics.TrackingAlgorithm, events_np: numpy.ndarray[metavision_sdk_base._EventCD_decode]) -> None

Processes a buffer of events.

events_np

numpy structured array of events

set_output_callback(self: metavision_sdk_analytics.TrackingAlgorithm, output_cb: object) -> None

Sets a callback to retrieve the list of tracked objects (see EventTrackingData) when the tracker is updated (see set_update_frequency).

note

The generated vector will be passed to the callback as a non-constant reference, meaning that the client is free to copy it or swap it. In case of a swap, the swapped vector will be automatically cleaned.

output_cb

Function to call

property update_frequency

Sets the frequency at which the algorithm generates the output.
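A configuration-and-callback sketch for TrackingAlgorithm, assuming an already constructed instance; the callback signature and the units of the bounds are assumptions:

```python
def configure_tracking(algo, results):
    # Constrain the trackable objects and collect tracking updates.
    algo.min_size, algo.max_size = 10, 300     # size bounds (assumed unit)
    algo.min_speed, algo.max_speed = 1, 1000   # speed bounds (assumed unit)
    algo.update_frequency = 100.0              # output generation rate (Hz)
    algo.set_output_callback(lambda ts, tracked: results.append((ts, tracked)))
```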

class metavision_sdk_analytics.TrackingConfig.ClusterMaker

Members:

SimpleGrid

MedoidShift

class metavision_sdk_analytics.TrackingConfig.DataAssociation

Members:

Nearest

IOU

class metavision_sdk_analytics.TrackingConfig.EllipseUpdateFunction

Members:

Uniform

Gaussian

SignedGaussian

TruncatedGaussian

class metavision_sdk_analytics.TrackingConfig.EllipseUpdateMethod

Members:

PerEvent

EllipseFitting

GaussianFitting

EllipseFittingFull

GaussianFittingFull

class metavision_sdk_analytics.TrackingConfig.KalmanModel

Members:

ConstantVelocity

ConstantAcceleration

class metavision_sdk_analytics.TrackingConfig.KalmanPolicy

Members:

AdaptiveNoise

MeasurementTrust

class metavision_sdk_analytics.TrackingConfig.MotionModel

Members:

Simple

Instant

Smooth

Kalman

class metavision_sdk_analytics.TrackingConfig.Tracker

Members:

Ellipse

ClusterKF

metavision_sdk_analytics.draw_tracking_results(*args, **kwargs)

Overloaded function.

  1. draw_tracking_results(ts: int, spatter_clusters: numpy.ndarray[Metavision::EventSpatterCluster], image: numpy.ndarray) -> None

Draws spatter cluster events.

Drawing function used to draw tracking results from the SpatterTrackerAlgorithm.

  2. draw_tracking_results(ts: int, tracking_results: numpy.ndarray[Metavision::EventTrackingData], image: numpy.ndarray) -> None

Draws tracking data events.

Drawing function used to draw tracking results from the TrackingAlgorithm (i.e. EventTrackingData). Results are drawn as bounding boxes with the tracked objects’ IDs beside them.
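A display-loop sketch tying tracker output to draw_tracking_results; drawing on a blank BGR frame (rather than an event frame) is an assumption for brevity:

```python
import numpy as np

def show_tracking(ts, tracking_results, height, width, draw=None):
    # Draw tracking results (bounding boxes with track IDs) on a blank BGR frame.
    frame = np.zeros((height, width, 3), dtype=np.uint8)
    if draw is None:
        from metavision_sdk_analytics import draw_tracking_results as draw
    draw(ts, tracking_results, frame)
    return frame
```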