stytra package

Submodules

stytra.utilities module

class stytra.utilities.Database[source]

Bases: object

inset_experiment_data(exp_data)[source]

Parameters

exp_data – the data collector dictionary

Returns

index of database entry

class stytra.utilities.FramerateRecorder(n_fps_frames=5)[source]

Bases: object

update_framerate()[source]

Calculate the framerate every n_fps_frames frames.
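
As an illustration of that logic (not the stytra implementation itself), a rolling framerate estimate over blocks of n_fps_frames frames could look like this:

    import time

    class RollingFramerateSketch:
        """Illustrative only: estimate the framerate once every n_fps_frames frames."""

        def __init__(self, n_fps_frames=5):
            self.n_fps_frames = n_fps_frames
            self.i_frame = 0
            self.previous_time = None
            self.current_framerate = None

        def update_framerate(self):
            # Once per block of n_fps_frames frames, divide the block size
            # by the time elapsed since the previous block finished.
            if self.i_frame == self.n_fps_frames - 1:
                now = time.monotonic()
                if self.previous_time is not None:
                    self.current_framerate = self.n_fps_frames / (now - self.previous_time)
                self.previous_time = now
            self.i_frame = (self.i_frame + 1) % self.n_fps_frames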

class stytra.utilities.FrameProcess(name='', n_fps_frames=10)[source]

Bases: multiprocessing.context.Process

A basic class for a process that deals with frames. It provides framerate calculation.

Parameters

n_fps_frames – the maximum number of frames used to calculate the framerate

update_framerate()[source]
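
A minimal subclass sketch, assuming only the constructor arguments and the update_framerate() method documented above; the frame loop and its contents are placeholders:

    from stytra.utilities import FrameProcess

    class DummyFrameProcess(FrameProcess):
        """Hypothetical subclass that handles frames and tracks its own framerate."""

        def run(self):
            for _ in range(100):
                # ... acquire or process one frame here ...
                self.update_framerate()  # refresh the framerate estimate

    if __name__ == "__main__":
        p = DummyFrameProcess(name="frames", n_fps_frames=10)
        p.start()
        p.join()
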
stytra.utilities.prepare_json(it, **kwargs)[source]

Create a dictionary that is safe to insert into MongoDB.

Parameters
  • it – the item which will be recursively sanitized

  • **kwargs

    convert_datetime: bool

    whether datetimes should be converted to strings for JSON serialization

    eliminate_df: bool

    remove dataframes from the dictionary
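
A usage sketch based on the parameters above; the example dictionary is made up:

    from datetime import datetime
    import pandas as pd
    from stytra.utilities import prepare_json

    raw = dict(
        start_time=datetime.now(),              # not JSON/BSON-serializable as-is
        tracking=pd.DataFrame(dict(x=[0.1])),   # large table we may not want in the database
        protocol=dict(name="flash", n_repeats=5),
    )

    # Recursively sanitize the dictionary before inserting it into MongoDB
    # or dumping it to JSON.
    clean = prepare_json(raw, convert_datetime=True, eliminate_df=True)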

stytra.utilities.get_default_args(func)[source]

Find the default arguments of a function.

Parameters

func
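
For illustration, an equivalent can be built on the standard inspect module (the actual implementation may differ):

    import inspect

    def get_default_args_sketch(func):
        """Illustrative equivalent: map parameter names to their default values."""
        return {
            name: param.default
            for name, param in inspect.signature(func).parameters.items()
            if param.default is not inspect.Parameter.empty
        }

    def example(a, b=2, c="x"):
        pass

    print(get_default_args_sketch(example))  # {'b': 2, 'c': 'x'}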

stytra.utilities.strip_values(it)[source]
Parameters

it

stytra.utilities.interpolate_nan(a)[source]
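
interpolate_nan carries no description here; assuming from its name that it fills NaN values in a 1-D array by linear interpolation, its intended behaviour would be along these lines:

    import numpy as np

    def interpolate_nan_sketch(a):
        """Assumed behaviour: linearly interpolate over NaN values in a 1-D array."""
        a = np.array(a, dtype=float)
        nans = np.isnan(a)
        a[nans] = np.interp(np.flatnonzero(nans), np.flatnonzero(~nans), a[~nans])
        return a

    print(interpolate_nan_sketch([0.0, np.nan, 2.0, np.nan, 4.0]))  # [0. 1. 2. 3. 4.]
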
stytra.utilities.get_classes_from_module(input_module, parent_class)[source]

Find all the classes in a module that are subclasses of a given parent class.

Parameters
  • input_module – module object

  • parent_class – parent class object

Returns

OrderedDict of subclasses found

Return type

type
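
A usage sketch with a standard-library module, only to show the call pattern; the exact contents of the returned OrderedDict depend on the implementation:

    import builtins
    from stytra.utilities import get_classes_from_module

    # Collect the Exception subclasses defined in the builtins module,
    # keyed by class name (e.g. 'ArithmeticError': <class 'ArithmeticError'>).
    exception_classes = get_classes_from_module(builtins, Exception)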

stytra.utilities.recursive_update(d, u)[source]

Simple recursive update of dictionaries, from StackOverflow

Parameters
  • d – dict to update

  • u – new values

Returns
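
A small worked example (with made-up dictionaries) of how this differs from a plain dict.update:

    from stytra.utilities import recursive_update

    defaults = dict(camera=dict(type="ximea", rotation=0), tracking=dict(method="tail"))
    overrides = dict(camera=dict(rotation=1))

    recursive_update(defaults, overrides)
    # defaults["camera"] is now {"type": "ximea", "rotation": 1}: only the nested key
    # changed, whereas dict.update would have replaced the whole "camera" sub-dictionary.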

stytra.utilities.reduce_to_pi[source]

Puts an angle or array of angles inside the (-pi, pi) range
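
For example:

    import numpy as np
    from stytra.utilities import reduce_to_pi

    print(reduce_to_pi(3 * np.pi / 2))                     # -pi/2, i.e. about -1.5708
    print(reduce_to_pi(np.array([0.0, 2 * np.pi, -7.0])))  # each angle wrapped into (-pi, pi)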

stytra.utilities.save_df(df, path, fileformat)[source]

Saves the dataframe in one of the supported formats

Parameters
  • df

  • path

  • fileformat
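
A usage sketch; the accepted fileformat strings and whether path should include the extension are not documented here, so treat both as assumptions to check against the source:

    import pandas as pd
    from stytra.utilities import save_df

    df = pd.DataFrame(dict(t=[0.0, 0.1, 0.2], tail_sum=[0.0, 0.3, -0.1]))

    # "csv" is assumed to be among the supported formats; verify the exact
    # accepted strings and the extension handling in the source.
    save_df(df, "behavior_log", fileformat="csv")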

Module contents

class stytra.Stytra(recording=None, exec=True, app=None, **kwargs)[source]

Bases: object

Stytra application instance. Contains the QApplication and constructs the appropriate experiment object for the specified parameters; a usage sketch follows the parameter list below.

Parameters
  • protocol (Protocol) – the protocols to be made available from the dropdown

  • display (dict) –

    full_screen: bool

    displays the stimulus full screen on the secondary monitor, otherwise it is in a window

    window_size: tuple(int, int)

    optional specification of the size of the stimulus display area

  • camera (dict) –

    video_file: str

    or

    type: str

    supported cameras are “ximea” (with the official API), “avt” (with the Pymba API), “spinnaker” (PointGrey/FLIR) and “mikrotron” (via the NI Vision C API)

    rotation: int

    how many times to rotate the camera image by 90 degrees to get the right orientation, matching the projector

    downsampling: int

    how many times to downsample the image (for some ximea cameras)

    roi: tuple of int (x, y, w, h)

    ROI for cameras that support it

    max_buffer_length: int, default 1000

    the maximum length of the replay buffer in frames; it can be adjusted depending on the memory of the computer and on the camera resolution and framerate

  • tracking (dict) –

    preprocessing_method: str, optional

    “prefilter” or “bgsub”

    method: str

    one of “tail”, “eyes” or “fish”

    estimator: str or class

    for closed-loop experiments: either “vigor” for embedded experiments or “position” for freely-swimming ones. A custom estimator can be supplied.

  • recording (bool (False) or dict) –

    for video-recording experiments

    extension: mp4 (default) or h5

    be careful: if saving as h5, all frames are first stored in memory, potentially filling it up

    kbit_rate: int

    for mp4 format, target kilobits per second of video

  • embedded (bool) – if not embedded, use circle calibrator to match the camera and projector

  • dir_assets (str) – the location of assets used for stimulation (pictures, videos, models for closed loop etc.)

  • dir_save (str) – directory where the experiment data will be saved

  • metadata_animal (class) – subclass of AnimalMetadata adding information from a specific lab (species, genetic lines, pharmacological treatments etc.)

  • metadata_general (class) – subclass of GeneralMetadata, containing lab-specific information (setup names, experimenter names…)

  • record_stim_framerate (int) – if non-zero, records the displayed stimuli into an array which is saved alongside the other data.

  • trigger (object) – a trigger object, synchronising stimulus presentation to imaging acquisition

  • n_tracking_processes (int) – number of tracking processes to be used. Using more than 1 can improve performance but also cause issues in state-dependent tracking functions.
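
A minimal launch sketch based on the parameters above. The protocol class, stimulus and paths are placeholders; the Protocol and Pause imports follow the pattern used in the stytra examples and should be checked against the installed version:

    from stytra import Stytra, Protocol
    from stytra.stimulation.stimuli import Pause

    class EmptyProtocol(Protocol):
        name = "empty_protocol"  # shown in the protocol dropdown

        def get_stim_sequence(self):
            return [Pause(duration=5.0)]  # a 5 s blank stimulus as a placeholder

    if __name__ == "__main__":
        Stytra(
            protocol=EmptyProtocol(),
            camera=dict(video_file=r"path/to/a/video.mp4"),  # or e.g. dict(type="ximea")
            tracking=dict(method="tail"),
            dir_save=r"path/to/save/folder",
        )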