stytra.tracking package

Submodules

stytra.tracking.eyes module

Authors: Andreas Kist, Luigi Petrucco

class stytra.tracking.eyes.EyeTrackingMethod(*args, **kwargs)[source]

Bases: stytra.tracking.pipelines.ImageToDataNode

General eye tracking method.

name = 'eyes'

stytra.tracking.fish module

class stytra.tracking.fish.FishTrackingMethod(*args, **kwargs)[source]

Bases: stytra.tracking.pipelines.ImageToDataNode

changed(vals)[source]
reset()[source]
stytra.tracking.fish.points_to_angles[source]
stytra.tracking.fish.fish_start[source]

stytra.tracking.online_bouts module

class stytra.tracking.online_bouts.BoutState(state, vel, i_inbout, i_below, n_after)

Bases: tuple

i_below

Alias for field number 3

i_inbout

Alias for field number 2

n_after

Alias for field number 4

state

Alias for field number 0

vel

Alias for field number 1
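BoutState is a plain namedtuple, so the detector's state between calls is an immutable value that is replaced rather than mutated. A minimal sketch of constructing and advancing one (the field meanings are inferred from the names and should be treated as assumptions):

```python
from collections import namedtuple

# Mirror of the BoutState container, with the field order documented above:
# state (0), vel (1), i_inbout (2), i_below (3), n_after (4).
BoutState = namedtuple("BoutState", "state vel i_inbout i_below n_after")

# Initial state: not in a bout, zero velocity, all counters at 0.
s = BoutState(state=0, vel=0.0, i_inbout=0, i_below=0, n_after=0)

# _replace returns a fresh tuple; the original is left unchanged.
s2 = s._replace(i_inbout=s.i_inbout + 1)
```

Because the tuple is immutable, the detector can safely hand the previous state back to the caller while building the next one.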

stytra.tracking.online_bouts.find_bouts_online[source]

Online bout detection

Parameters
  • velocities

  • coords

  • initial_state

  • bout_coords
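The actual detection logic lives in the compiled function above. Purely as an illustration of the state-machine shape such an online detector takes, here is a hypothetical threshold-based version that carries a BoutState-like tuple across chunks of samples (the threshold and padding logic are assumptions, not stytra's):

```python
from collections import namedtuple

BoutState = namedtuple("BoutState", "state vel i_inbout i_below n_after")

def find_bouts_sketch(velocities, state, threshold=1.0, pad_after=2):
    """Hypothetical online detector: a bout starts when velocity rises
    above `threshold` and ends once it has stayed at or below it for
    `pad_after` consecutive samples. Returns the detected bout lengths
    and the state to pass into the next chunk of samples."""
    bouts = []
    for v in velocities:
        if state.state == 0:                       # waiting for a bout
            if v > threshold:
                state = state._replace(state=1, i_inbout=1, i_below=0)
        else:                                      # inside a bout
            i_below = state.i_below + 1 if v <= threshold else 0
            if i_below >= pad_after:               # bout has ended
                bouts.append(state.i_inbout)
                state = state._replace(state=0, i_inbout=0, i_below=0)
            else:
                state = state._replace(i_inbout=state.i_inbout + 1,
                                       i_below=i_below)
        state = state._replace(vel=v)
    return bouts, state

# One clear velocity burst; the returned state would seed the next call.
s0 = BoutState(0, 0.0, 0, 0, 0)
bouts, s1 = find_bouts_sketch([0.1, 2.0, 3.0, 2.5, 0.2, 0.1, 0.05], s0)
```

Returning the state alongside the results is what makes the detector "online": it can be fed velocities in arbitrary chunks as frames arrive.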

stytra.tracking.pipelines module

class stytra.tracking.pipelines.NodeOutput(messages, data)

Bases: tuple

data

Alias for field number 1

messages

Alias for field number 0

class stytra.tracking.pipelines.PipelineNode(*args, **kwargs)[source]

Bases: anytree.node.node.Node

reset()[source]
changed(vals)[source]
setup()[source]
output_type_changed
strpath
process(*inputs) → stytra.tracking.pipelines.NodeOutput[source]
class stytra.tracking.pipelines.ImageToImageNode(*args, **kwargs)[source]

Bases: stytra.tracking.pipelines.PipelineNode

output_type_changed
acknowledge_changes()[source]
class stytra.tracking.pipelines.SourceNode(*args, **kwargs)[source]

Bases: stytra.tracking.pipelines.ImageToImageNode

class stytra.tracking.pipelines.ImageToDataNode(*args, **kwargs)[source]

Bases: stytra.tracking.pipelines.PipelineNode

output_type_changed
acknowledge_changes()[source]
class stytra.tracking.pipelines.Pipeline[source]

Bases: object

headers_to_plot
setup(tree=None)[source]

Due to multiprocessing limitations, the setup is run separately from the constructor

diagnostic_image
serialize_changed_params()[source]
serialize_params()[source]
deserialize_params(rec_params)[source]
recursive_run(node: stytra.tracking.pipelines.PipelineNode, *input_data)[source]
run(input)[source]
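A pipeline is a tree of nodes: an image flows down through image-to-image parents and fans out into terminal image-to-data leaves, whose outputs are collected. A stripped-down sketch of that recursion, without anytree or the parameter machinery (the class and function names here are placeholders, not the stytra API):

```python
class Node:
    """Minimal tree node: `func` transforms the input, children consume
    the transformed output."""
    def __init__(self, func, children=()):
        self.func = func
        self.children = list(children)

def recursive_run(node, data):
    """Apply this node's function, then recurse into its children.
    Leaf results are gathered into a flat dict, mimicking how one
    frame becomes a record of tracked quantities."""
    out = node.func(data)
    if not node.children:
        return {node.func.__name__: out}
    results = {}
    for child in node.children:
        results.update(recursive_run(child, out))
    return results

# A toy source transform feeding two leaf "trackers".
def invert(x): return -x
def double(x): return x * 2     # hypothetical "tail" branch
def incr(x): return x + 1       # hypothetical "eyes" branch

root = Node(invert, children=[Node(double), Node(incr)])
out = recursive_run(root, 3)    # invert first, then fan out to leaves
```

The shared parent runs once per frame, so preprocessing done near the root is never duplicated across the tracking branches.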

stytra.tracking.preprocessing module

Preprocessing functions take the current image, optional state (used for background subtraction), and parameters, and return the processed image.

class stytra.tracking.preprocessing.Prefilter(*args, **kwargs)[source]

Bases: stytra.tracking.pipelines.ImageToImageNode

class stytra.tracking.preprocessing.BackgroundSubtractor(*args, **kwargs)[source]

Bases: stytra.tracking.pipelines.ImageToImageNode

reset()[source]
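A background subtractor keeps a slowly updated estimate of the static background and returns each frame's difference from it; reset() discards the stored background so it is re-learned. A minimal running-average sketch with NumPy (the learning-rate parametrization is an assumption, not stytra's actual one):

```python
import numpy as np

class BackgroundSubtractorSketch:
    """Running-average background model: on each frame,
    background <- (1 - a) * background + a * frame, and the output is
    the absolute difference between frame and background."""
    def __init__(self, learning_rate=0.04):
        self.learning_rate = learning_rate
        self.background = None

    def reset(self):
        # Forget the background; the next frame re-initializes it.
        self.background = None

    def process(self, frame):
        frame = frame.astype(np.float64)
        if self.background is None:
            self.background = frame.copy()
        else:
            a = self.learning_rate
            self.background = (1 - a) * self.background + a * frame
        return np.abs(frame - self.background)

sub = BackgroundSubtractorSketch()
still = np.full((4, 4), 10.0)
out0 = sub.process(still)              # first frame: zero difference
fish = still.copy()
fish[1, 1] = 200.0                     # a bright moving object appears
out1 = sub.process(fish)               # it stands out against the model
```

A small learning rate makes the model robust to the fish itself, which moves too quickly to be absorbed into the background.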

stytra.tracking.simple_kalman module

stytra.tracking.simple_kalman.predict_inplace[source]
stytra.tracking.simple_kalman.update_inplace[source]
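These functions implement Kalman predict/update steps that write into preallocated arrays rather than returning new ones. As an illustration of the same idea, a scalar constant-position filter whose two-element state (mean, variance) is modified in place (the model and noise values are assumptions, not stytra's):

```python
def predict_inplace(state, process_var=1e-2):
    """Predict step for a constant-position model: the estimate is
    unchanged and the variance grows by the process noise."""
    state[1] += process_var          # state = [mean, variance]

def update_inplace(state, measurement, meas_var=1e-1):
    """Update step: blend prediction and measurement, weighted by the
    Kalman gain, writing the result back into `state`."""
    gain = state[1] / (state[1] + meas_var)
    state[0] += gain * (measurement - state[0])
    state[1] *= 1 - gain

state = [0.0, 1.0]                   # vague prior: mean 0, variance 1
for z in [1.0, 1.1, 0.9, 1.0]:       # noisy measurements near 1.0
    predict_inplace(state)
    update_inplace(state, z)
```

After a few measurements the mean converges toward 1.0 and the variance shrinks; in-place updates avoid per-frame allocations, which matters in a tracking loop running at camera framerate.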

stytra.tracking.tail module

class stytra.tracking.tail.TailTrackingMethod(*args, **kwargs)[source]

Bases: stytra.tracking.pipelines.ImageToDataNode

General tail tracking method.

changed(vals)[source]
reset()[source]
class stytra.tracking.tail.CentroidTrackingMethod(*args, **kwargs)[source]

Bases: stytra.tracking.tail.TailTrackingMethod

Center-of-mass method to find consecutive segments.

stytra.tracking.tail.find_fish_midline[source]

Finds a midline for a fish image, with the starting point and direction

Parameters
  • im – param xm:

  • ym – param angle:

  • r – param m: (Default value = 9)

  • n_points – return: (Default value = 20)

  • xm

  • angle

  • m – (Default value = 3)

class stytra.tracking.tail.AnglesTrackingMethod[source]

Bases: stytra.tracking.tail.TailTrackingMethod

Angular sweep method to find consecutive segments.

detect(im, tail_start: lightparam.Param, n_segments: lightparam.Param, tail_length: lightparam.Param, **extraparams)[source]

Tail tracing based on min (or max) detection on arches. Wraps _tail_trace_core_ls. Speed testing: 20 us for a 514x640 image without smoothing, 300 us with smoothing.

Parameters
  • im – input image

  • tail_start – tail starting point (x, y) (Default value = (0, 0))

  • tail_length – tail length (Default value = (1, 1))

  • n_segments – number of segments (Default value = 7)

  • dark_tail – True for inverting image colors (Default value = False)

  • image_scale – (Default value = 1)
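The angular sweep walks segment by segment from tail_start: at each step it samples candidate directions on an arc around the previous heading and steps toward the minimum (darkest) intensity. A toy sketch of that core loop (the arc width, candidate count, and function name are illustrative assumptions):

```python
import numpy as np

def trace_tail_sketch(im, tail_start, seg_length, n_segments,
                      start_angle=0.0, arc=np.pi / 2, n_candidates=9):
    """Follow a dark tail through `im`: from each point, test
    `n_candidates` directions within +/- arc/2 of the current heading
    and step toward the darkest pixel. Returns the traced angles."""
    x, y = tail_start
    angle = start_angle
    angles = []
    for _ in range(n_segments):
        candidates = angle + np.linspace(-arc / 2, arc / 2, n_candidates)
        ys = np.clip((y + seg_length * np.sin(candidates)).astype(int),
                     0, im.shape[0] - 1)
        xs = np.clip((x + seg_length * np.cos(candidates)).astype(int),
                     0, im.shape[1] - 1)
        angle = candidates[np.argmin(im[ys, xs])]   # darkest direction
        y += seg_length * np.sin(angle)
        x += seg_length * np.cos(angle)
        angles.append(float(angle))
    return angles

# A bright image with a dark horizontal stripe standing in for a tail.
im = np.full((20, 40), 255.0)
im[10, :] = 0.0
angles = trace_tail_sketch(im, tail_start=(2, 10), seg_length=3,
                           n_segments=5)
```

Restricting each step to a narrow arc around the previous heading is what keeps the trace on the tail even where the image is noisy, and sampling only a handful of pixels per segment is why the real routine runs in tens of microseconds.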

stytra.tracking.tracking_process module

class stytra.tracking.tracking_process.TrackingProcess(in_frame_queue, finished_signal: multiprocessing.context.BaseContext.Event = None, pipeline=None, processing_parameter_queue=None, output_queue=None, recording_signal=None, gui_framerate=30, max_mb_queue=100, **kwargs)[source]

Bases: stytra.utilities.FrameProcess

A class which handles taking frames from the camera and processing them, as well as dispatching a subset for display.

process_internal(frame)[source]

Apply processing function to current frame with self.processing_parameters as additional inputs.

Parameters

frame – frame to be processed

Returns

processed output


retrieve_params()[source]
run()[source]

Loop where the tracking function runs.

send_to_gui(frametime, frame)[source]

Sends the current frame to the GUI queue at the appropriate framerate
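Forwarding every camera frame would flood the GUI queue, so frames are passed on only when enough time has elapsed since the last one sent. A minimal sketch of that rate limiting, keyed on the frame timestamps (the class and attribute names are illustrative, not stytra's):

```python
class GuiThrottle:
    """Forward frames to a sink at most `framerate` times per second,
    judged by the timestamps attached to the frames themselves."""
    def __init__(self, framerate=30):
        self.min_interval = 1.0 / framerate
        self.last_sent = None
        self.sent = []            # stands in for the GUI queue

    def send_to_gui(self, frametime, frame):
        # Pass the frame through only if the previous one is old enough.
        if (self.last_sent is None
                or frametime - self.last_sent >= self.min_interval):
            self.sent.append(frame)
            self.last_sent = frametime

t = GuiThrottle(framerate=30)     # at most one frame per 1/30 s
for i in range(10):
    t.send_to_gui(frametime=i * 0.01, frame=i)   # frames arrive at 100 Hz
```

Using the frame timestamps rather than wall-clock time keeps the displayed subset evenly spaced even when processing latency jitters.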

class stytra.tracking.tracking_process.DispatchProcess(in_frame_queue, finished_evt: multiprocessing.context.BaseContext.Event = None, dispatching_set_evt: multiprocessing.context.BaseContext.Event = None, gui_framerate=30, gui_dispatcher=False, **kwargs)[source]

Bases: stytra.utilities.FrameProcess

A class which handles taking frames from the camera and dispatching them both to a separate process (e.g. for saving a movie) and to a GUI for display.

run()[source]

Loop where the tracking function runs.

send_to_gui(frametime, frame)[source]

Sends the current frame to the GUI queue at the appropriate framerate

Module contents