stytra.tracking package¶
Submodules¶
stytra.tracking.eyes module¶
Authors: Andreas Kist, Luigi Petrucco
-
class stytra.tracking.eyes.EyeTrackingMethod(*args, **kwargs)[source]¶
Bases: stytra.tracking.pipelines.ImageToDataNode
General eyes tracking method.
-
name = 'eyes'¶
-
stytra.tracking.fish module¶
stytra.tracking.online_bouts module¶
stytra.tracking.pipelines module¶
-
class stytra.tracking.pipelines.NodeOutput(messages, data)¶
Bases: tuple
-
data¶
Alias for field number 1
-
messages¶
Alias for field number 0
-
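NodeOutput is a plain named tuple: messages at position 0 and data at position 1. Below is a minimal usage sketch; the message string and the NumPy payload are illustrative assumptions, not values prescribed by the API.

```python
import numpy as np

from stytra.tracking.pipelines import NodeOutput

# Build an output as a pipeline node might: a list of diagnostic messages
# plus the actual payload (an arbitrary array here, purely for illustration).
out = NodeOutput(messages=["tracking OK"], data=np.zeros(3))

# Fields are accessible by name or by position, since NodeOutput is a tuple subclass.
assert out.messages is out[0]
assert out.data is out[1]
messages, data = out  # tuple unpacking also works
```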
-
class stytra.tracking.pipelines.PipelineNode(*args, **kwargs)[source]¶
Bases: anytree.node.node.Node
-
output_type_changed¶
-
strpath¶
-
-
class stytra.tracking.pipelines.ImageToImageNode(*args, **kwargs)[source]¶
Bases: stytra.tracking.pipelines.PipelineNode
-
output_type_changed¶
-
-
class stytra.tracking.pipelines.ImageToDataNode(*args, **kwargs)[source]¶
Bases: stytra.tracking.pipelines.PipelineNode
-
output_type_changed¶
-
stytra.tracking.preprocessing module¶
Preprocessing functions take the current image, an optional state (used e.g. for background subtraction) and parameters, and return the processed image.
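As a concrete illustration of this contract, here is a minimal sketch of a background-subtraction preprocessor. The function name, the interpretation of state (a previously accumulated background estimate) and the threshold parameter are hypothetical, not part of the stytra API.

```python
import numpy as np


def subtract_background(im, state=None, threshold=0):
    # Hypothetical preprocessing function following the contract above:
    # current image in, optional state (a background estimate) and
    # parameters in, processed image out.
    if state is None:
        # No background accumulated yet: return the frame unchanged.
        return im
    # Foreground = absolute deviation of the frame from the background estimate.
    diff = np.abs(im.astype(np.int16) - state.astype(np.int16))
    diff[diff < threshold] = 0
    return diff.astype(im.dtype)
```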
stytra.tracking.simple_kalman module¶
stytra.tracking.tail module¶
-
class stytra.tracking.tail.TailTrackingMethod(*args, **kwargs)[source]¶
Bases: stytra.tracking.pipelines.ImageToDataNode
General tail tracking method.
-
class stytra.tracking.tail.CentroidTrackingMethod(*args, **kwargs)[source]¶
Bases: stytra.tracking.tail.TailTrackingMethod
Center-of-mass method to find consecutive segments.
-
stytra.tracking.tail.find_fish_midline(im, xm, ym, angle, r=9, m=3, n_points=20)[source]¶
Finds a midline for a fish image, given a starting point and direction.
- Parameters
im – input image
xm – x coordinate of the starting point
ym – y coordinate of the starting point
angle – initial direction of the midline search
r – (Default value = 9)
m – (Default value = 3)
n_points – (Default value = 20)
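A minimal usage sketch follows. It assumes the parameter order shown above and that the function returns a sequence of midline points; the synthetic stripe image and the starting coordinates are illustrative only.

```python
import numpy as np

from stytra.tracking.tail import find_fish_midline

# Synthetic frame: a bright horizontal stripe standing in for a fish body.
frame = np.zeros((120, 160), dtype=np.uint8)
frame[58:62, 20:140] = 255

# Start near the left end of the stripe, searching to the right (angle 0);
# r, m and n_points keep the defaults reported above (9, 3, 20).
midline = find_fish_midline(frame, 25, 60, 0.0)

# Assumption: each element describes one point along the traced midline.
for point in midline:
    print(point)
```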
-
class stytra.tracking.tail.AnglesTrackingMethod[source]¶
Bases: stytra.tracking.tail.TailTrackingMethod
Angular sweep method to find consecutive segments.
-
detect(im, tail_start: lightparam.Param, n_segments: lightparam.Param, tail_length: lightparam.Param, **extraparams)[source]¶
Tail tracing based on min (or max) detection on arches. Wraps _tail_trace_core_ls. Speed testing: 20 us for a 514x640 image without smoothing, 300 us with smoothing.
- Parameters
im – input image
tail_start – tail starting point (x, y) (Default value = (0, 0))
tail_length – tail length (Default value = (1, 1))
n_segments – number of segments (Default value = 7)
dark_tail – True for inverting image colors (Default value = False)
image_scale – (Default value = 1)
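A minimal usage sketch of the angular-sweep detector follows. It assumes that AnglesTrackingMethod can be constructed without arguments, that tail_start and tail_length are given as fractions of the image size, and that detect returns the traced segment angles; none of these details are stated on this page.

```python
import numpy as np

from stytra.tracking.tail import AnglesTrackingMethod

# Synthetic frame with a dark tail-like stripe on a bright background.
frame = np.full((120, 160), 255, dtype=np.uint8)
frame[58:62, 30:150] = 0

method = AnglesTrackingMethod()  # assumption: no constructor arguments required

angles = method.detect(
    frame,
    tail_start=(0.2, 0.5),   # assumed to be relative (x, y) coordinates
    tail_length=(0.7, 0.0),  # assumed to be a relative (dx, dy) extent
    n_segments=7,
    image_scale=1,
)
print(angles)
```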
-
stytra.tracking.tracking_process module¶
-
class stytra.tracking.tracking_process.TrackingProcess(in_frame_queue, finished_signal: multiprocessing.context.BaseContext.Event = None, pipeline=None, processing_parameter_queue=None, output_queue=None, recording_signal=None, gui_framerate=30, max_mb_queue=100, **kwargs)[source]¶
Bases: stytra.utilities.FrameProcess
A class which handles taking frames from the camera and processing them, as well as dispatching a subset for display.
-
class stytra.tracking.tracking_process.DispatchProcess(in_frame_queue, finished_evt: multiprocessing.context.BaseContext.Event = None, dispatching_set_evt: multiprocessing.context.BaseContext.Event = None, gui_framerate=30, gui_dispatcher=False, **kwargs)[source]¶
Bases: stytra.utilities.FrameProcess
A class which handles taking frames from the camera and dispatching them both to a separate process (e.g. for saving a movie) and to a GUI for display.
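To show how the constructor parameters of these process classes fit together, here is a heavily hedged wiring sketch. It assumes that plain multiprocessing queues and events are acceptable for the queue and signal arguments, that pipeline normally takes a stytra tracking pipeline (None is used here only as a placeholder), and that FrameProcess behaves like a multiprocessing.Process with the usual start/join lifecycle; none of this is confirmed by this page.

```python
import multiprocessing as mp

from stytra.tracking.tracking_process import TrackingProcess

if __name__ == "__main__":
    frame_queue = mp.Queue()   # frames coming from the camera process
    output_queue = mp.Queue()  # tracked data going to the rest of the application
    finished = mp.Event()      # set to request shutdown

    tracker = TrackingProcess(
        in_frame_queue=frame_queue,
        finished_signal=finished,
        pipeline=None,             # placeholder; normally a stytra tracking pipeline
        output_queue=output_queue,
        gui_framerate=30,
    )
    tracker.start()  # assumed multiprocessing.Process-style lifecycle

    # ... feed camera frames into frame_queue and read results from output_queue ...

    finished.set()
    tracker.join()
```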