Experimental protocols in Stytra are defined as sequences of timed stimuli presented to the animal through a projector or external actuators. A sequence of stimuli, represented as a Python list of Stimulus objects, is defined in a Protocol object. This structure enables straightforward design of new experimental protocols, requiring very little knowledge of the general structure of the library and only basic Python syntax. A dedicated class coordinates the timed execution of the protocol, relying on a QTimer from the PyQt5 library and ensuring a temporal resolution on the order of 15-20 ms (around the response time of a normal monitor, see inset). Drawing very complex stimuli consisting of many polygons, or stimuli requiring online computation of large arrays, can decrease the stimulus display performance.
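To illustrate this structure, the sketch below shows a minimal protocol: a named object whose get_stim_sequence() method returns a plain Python list of stimuli. The Pause and Flash classes here are simplified, hypothetical stand-ins for Stytra's Stimulus subclasses, so the example runs without the library installed.

```python
# Illustrative sketch only: Stimulus, Pause, and Flash are toy stand-ins,
# not the actual Stytra classes.
class Stimulus:
    def __init__(self, duration=1.0):
        self.duration = duration  # seconds this stimulus stays active


class Pause(Stimulus):
    pass  # blank screen


class Flash(Stimulus):
    pass  # full-field white flash


class FlashProtocol:
    """A protocol is just a named object returning a list of stimuli."""
    name = "flash_protocol"  # name shown in the user interface

    def get_stim_sequence(self):
        # three repetitions of a 4 s pause followed by a 1 s flash
        return [Pause(duration=4.0), Flash(duration=1.0)] * 3


sequence = FlashProtocol().get_stim_sequence()
print(len(sequence), sum(s.duration for s in sequence))  # → 6 15.0
```

The protocol runner then steps through this list in order, activating each stimulus for its stated duration.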


Interval duration when flickering a white stimulus on every update of the display loop. The screen was recorded at 2 kHz.

The stimulus display framerate can be monitored online from the user interface while the protocol is running (see the lower left corner of the window in the interface figure). Milli- or microsecond precision, which might be required, for example, for optogenetic experiments, is currently not supported. Each Stimulus has methods that are called at its start and at every subsequent time step while it is displayed. In this way one can generate dynamically changing stimuli or trigger external devices. New Stimulus types can easily be added to the library by subclassing Stimulus and re-defining the start() and update() methods.
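The start()/update() pattern can be sketched with toy stand-ins (these are illustrative classes, not the actual Stytra base class): start() runs once when the stimulus becomes active, and update() runs on every subsequent frame of the display loop.

```python
# Illustrative sketch: a minimal stand-in for the Stimulus base class and a
# subclass that overrides the start() and update() hooks.
class Stimulus:
    def __init__(self, duration=1.0):
        self.duration = duration

    def start(self):
        pass  # called once, when the stimulus becomes active

    def update(self):
        pass  # called at every subsequent time step


class FlashStimulus(Stimulus):
    """Toy stimulus: turns 'white' at onset and logs its state each frame."""

    def __init__(self, duration=1.0):
        super().__init__(duration)
        self.luminance = 0.0
        self.log = []

    def start(self):
        self.luminance = 1.0  # switch on at stimulus onset

    def update(self):
        self.log.append(self.luminance)  # record state once per frame


stim = FlashStimulus(duration=0.5)
stim.start()
for _ in range(3):  # the protocol runner would call update() every frame
    stim.update()
print(stim.log)  # → [1.0, 1.0, 1.0]
```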

A large number of stimuli are included in the package. In particular, a library of visual stimuli has been implemented as VisualStimulus objects using the QPainter object, part of the Qt GUI library, which enables efficient drawing with OpenGL. Relying on a set of high-level drawing primitives keeps the code readable and maintainable. Stytra already includes stimuli commonly used in visual neuroscience, such as moving bars, dots, and whole-field translations or rotations of patterns on a screen, as well as movie playback and the presentation of images from a file (which can be generated by packages such as Imagen [ima]). The classes describing visual stimuli can be combined, and new stimuli in which these patterns are moved or masked can be quickly defined by combining the appropriate Stimulus types. New stimuli can also easily be created by redefining the paint() method in a new VisualStimulus object. Multiple stimuli can be presented simultaneously using StimulusCombiner. Presenting different stimuli depending on animal behavior or external signals can be achieved with the ConditionalWrapper container, or with similarly designed custom objects. Visual stimuli are usually displayed on a secondary screen, so Stytra provides a convenient interface for positioning and calibrating the stimulation window (visible on the right-hand side of the interface figure). Although our experiments use a single stimulation monitor, stimuli can be displayed on multiple screens with virtual-desktop technology or screen-splitting hardware boards. Importantly, all stimulus parameters are specified in physical units and are therefore independent of the display hardware. Finally, the timed execution of code inside Stimulus objects can be used to control hardware via I/O boards or through serial communication with microcontrollers such as an Arduino or a MicroPython PyBoard.
In this way one can, for example, deliver odors, temperature stimuli, or optogenetic stimulation. Examples of a few different kinds of stimuli are provided below. For a description of how to synchronize the stimulus with an external data-acquisition device such as a microscope, see the triggering section of the developer documentation.
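The hardware-control idea above can be sketched as a stimulus whose start() hook writes a trigger byte to a serial connection. This is a hedged illustration rather than the Stytra API: `port` stands for any object with a write() method, such as a pyserial Serial instance in a real experiment; here a fake port is substituted so the sketch is self-contained.

```python
# Hedged sketch (not the actual Stytra API): a stimulus that signals external
# hardware over serial when it starts.
class TriggerStimulus:
    def __init__(self, port, message=b"T", duration=0.0):
        self.port = port        # any object with a write() method
        self.message = message  # byte(s) sent to the microcontroller
        self.duration = duration

    def start(self):
        # called once by the protocol runner when this stimulus becomes active
        self.port.write(self.message)

    def update(self):
        pass  # nothing to draw; this stimulus only signals hardware


class FakePort:
    """Stand-in for serial.Serial that records what was written."""

    def __init__(self):
        self.sent = []

    def write(self, data):
        self.sent.append(data)


port = FakePort()
TriggerStimulus(port, message=b"T").start()
print(port.sent)  # → [b'T']
```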

Stimuli examples

Full-field luminance

    def get_stim_sequence(self):
        lum = pd.DataFrame(dict(t=[0, 1, 2], luminance=[0.0, 1.0, 0.0]))
        return [
            DynamicLuminanceStimulus(df_param=lum, clip_mask=(0.0, 0.0, 0.5, 0.5)),
            DynamicLuminanceStimulus(df_param=lum, clip_mask=(0.5, 0.5, 0.5, 0.5)),
        ]


Moving gratings

    def get_stim_sequence(self):
        Stim = type("stim", (InterpolatedStimulus, GratingStimulus), dict())
        return [
            Stim(df_param=pd.DataFrame(dict(t=[0, 2], vel_x=[10, 10], theta=np.pi / 4)))
        ]

OKR inducing rotating windmill stimulus

    def get_stim_sequence(self):
        Stim = type(
            "stim", (InterpolatedStimulus, WindmillStimulus), {}  # order is important!
        )
        # parameter values below are illustrative: rotate the windmill by 90 degrees
        return [
            Stim(df_param=pd.DataFrame(dict(t=[0, 2], theta=[0, np.pi / 2])))
        ]

Seamlessly-tiled image

    def get_stim_sequence(self):
        Stim = type("stim", (SeamlessImageStimulus, InterpolatedStimulus), {})
        return [
            Stim(
                background="background.png",  # path to the image to tile (illustrative)
                df_param=pd.DataFrame(dict(t=[0, 2], vel_x=[10, 10], vel_y=[5, 5])),
            )
        ]

Radial sine (freely-swimming fish centering stimulus)

    def get_stim_sequence(self):
        return [RadialSineStimulus(duration=2, period=10, velocity=5)]

Random dot kinematograms

    def get_stim_sequence(self):
        # RandomDotKinematogram: class name taken from the stimulus library
        return [
            RandomDotKinematogram(
                df_param=pd.DataFrame(
                    dict(
                        t=[0, 1, 1, 5, 5, 9],
                        coherence=[0, 0, 0.5, 0.5, 1, 1],
                        frozen=[1, 1, 0, 0, 0, 0],
                    )
                )
            )
        ]

Set voltage of an NI board

This example sets a voltage on an external National Instruments (NI) board. Note that running it requires an installed NI board and the nidaqmx library.

    def get_stim_sequence():
        return [
            SetVoltageStimulus(duration=10, dev="Dev1", chan="ao0", voltage=0),
            SetVoltageStimulus(duration=1, dev="Dev1", chan="ao0", voltage=3.5),
        ]