SDK ML Utils API

Convenience class used to save features to an HDF5 tensor file.

class metavision_ml.utils.h5_writer.HDF5Writer(filename, dataset_name, shape, dtype=<class 'numpy.uint8'>, attrs={}, mode='w', store_as_uint8=False)

Convenience class used to save features to an HDF5 tensor file.

https://docs.h5py.org/en/stable/high/dataset.html

Parameters
  • filename (string) – Path to the destination file

  • dataset_name (string) – name of the dataset to write.

  • shape (int List) – shape of the features written to disk; the actual shape of the dataset is [-1,] + shape, since the total number of features written to disk is not known at initialisation time.

  • dtype (np.dtype) – dtype specifying the features precision.

  • attrs (dictionary) – dictionary of attributes for the dataset. It consists of metadata that needs to be stored in the result file.

  • mode (string) – mode for opening the file. Defaults to write “w”.

  • store_as_uint8 (boolean) – if True, casts the features to uint8 before storing them to save space. Only supports data normalized to [0, 1].

index

corresponds to the number of features already written to the HDF5 file.

Type

int

Examples

>>> f = HDF5Writer("example.h5", "data", [2, 480, 320], dtype=np.uint8)
>>> f.write(np.empty((15, 2, 480, 320), dtype=np.uint8))
>>> f.write(np.zeros((12, 2, 480, 320), dtype=np.uint8))
>>> f.close()
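
The attrs and store_as_uint8 options can be combined to store 0-1 normalized float features compactly together with metadata. The following sketch is illustrative only; the attribute keys used (“delta_t”, “event_input_height”) are hypothetical, not names required by the class.

>>> f = HDF5Writer("normalized.h5", "data", [2, 480, 320], dtype=np.float32,
...                attrs={"delta_t": 50000, "event_input_height": 480},
...                store_as_uint8=True)  # floats are cast to uint8 on disk
>>> f.write(np.random.rand(8, 2, 480, 320).astype(np.float32))
>>> f.close()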
set_cursor(index)

Sets the position at which the next features will be written. Use with caution!

Can be used to overwrite or drop some already written data.

Parameters

index (int) – new cursor position.

Examples

>>> # ... some features were written
>>> hdf5_writer.set_cursor(hdf5_writer.index - 1)  # drop the last frame
>>> hdf5_writer.set_cursor(0)  # drop all previously written features!
write(array, last_timestamps=None)

Appends an array of features to the dataset.

The underlying HDF5 dataset gets extended when necessary.

Parameters
  • array (np.ndarray) – feature array; its shape must be [*,] + self.shape and its dtype must be convertible to the dtype of the dataset.

  • last_timestamps (np.ndarray) – timestamps of the last events for each feature slice; its length must be equal to array.shape[0].
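
Examples

A hedged sketch of the optional last_timestamps argument, assuming one integer timestamp (in us) per feature slice.

>>> f = HDF5Writer("example.h5", "data", [2, 480, 320], dtype=np.uint8)
>>> features = np.zeros((2, 2, 480, 320), dtype=np.uint8)
>>> last_ts = np.array([10000, 20000], dtype=np.int64)  # one timestamp per slice
>>> f.write(features, last_timestamps=last_ts)
>>> f.close()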

Tools common to training main functions.

metavision_ml.utils.main_tools.check_input_power_2(event_input_height, event_input_width, height=None, width=None)

Checks that the provided height and width are the sensor input dimensions downscaled by a power of two.

Parameters
  • event_input_height (int) – height of the sensor in pixels.

  • event_input_width (int) – width of the sensor in pixels.

  • height (int) – desired height of features after rescaling.

  • width (int) – desired width of features after rescaling.
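
Examples

A hedged usage sketch; how the function reports an invalid size (exception, assertion, ...) is not specified here.

>>> from metavision_ml.utils.main_tools import check_input_power_2
>>> # a 640x480 sensor rescaled to 160x120: both dimensions divided by 4, a power of two
>>> check_input_power_2(480, 640, height=120, width=160)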

metavision_ml.utils.main_tools.get_original_size_file(path)

Returns the (height, width) pair of a file.

Parameters

path (string) – File path, either a DAT, RAW or HDF5 file.
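
Examples

A minimal sketch; the file path below is hypothetical.

>>> from metavision_ml.utils.main_tools import get_original_size_file
>>> height, width = get_original_size_file("recording.raw")  # DAT, RAW or HDF5 file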

metavision_ml.utils.main_tools.infer_preprocessing(params, h5path=None)

Infers the preprocessing parameters by reading the attributes of the first HDF5 file found.

Parameters
  • params – struct containing training parameters.

  • h5path (string) – optional path of an HDF5 file from the dataset, its attributes are used to override the preprocessing parameters.

Returns
  • array_dim (int List) – tensor shape of a single item of a batch (num time bins, num channels, height, width).

  • preprocess (string) – name of the preprocessing used.

  • delta_t (int) – duration of a temporal bin in us.

  • mode (string) – mode used to generate the HDF5 data (“delta_t” or “n_events”).

  • n_events (int) – number of events in the slice (if mode == “n_events”), 0 otherwise.

  • preprocess_kwargs – preprocessing args used to generate the HDF5 data.
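
Examples

A hedged sketch, assuming params is the training parameter struct used by the training scripts and that the values above are returned as a tuple in the listed order.

>>> from metavision_ml.utils.main_tools import infer_preprocessing
>>> array_dim, preprocess, delta_t, mode, n_events, preprocess_kwargs = infer_preprocessing(params)
>>> print(array_dim, preprocess)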