
XenLoom: Custom looming stimuli and behavior tracking for Xenopus tadpoles

XenLoom is a software suite for presentation of customized looming stimuli, collection of visual-evoked responses, and automated tracking of Xenopus tadpoles with computer vision. It builds on the beta version (https://github.com/tonykylim/XenLoom_beta) by allowing greater variety and control over the looming stimuli presented. Additionally, the data analysis interface has been simplified and made more user-friendly.

New features of XenLoom include:

Different looming stimulus types

Dark looming stimulus
Bright looming stimulus
Isoluminant looming stimulus

Non-looming control stimuli with only a change in screen luminance

Darkening luminance stimulus
Brightening luminance stimulus
Isoluminant luminance stimulus

Looming stimulus velocity

Low speed
Medium speed
High speed

Looming stimulus contrast

Low contrast
Intermediate contrast
High contrast

Isoluminant stimulus noise grain

Fine noise grain
Intermediate noise grain
Coarse noise grain

All stimuli are programmatically generated, so combinations of the above are also possible.

Getting Started

The code has been partitioned into three modules (stim-present, loom-decision, tad-tracker).
'stimpresent-videocapture.py' presents customized looming stimuli while recording responses from a webcam.
'escape-decision.py' evaluates the tadpole response to looming stimuli.
'tadpole-tracker.py' extracts tracking data from the recorded video.

Prerequisites and setup

Please see the prerequisites and setup listed for XenLoom beta. The requirements for the updated XenLoom scripts are the same as those found at:
https://github.com/tonykylim/XenLoom_beta#prerequisites
https://github.com/tonykylim/XenLoom_beta#installing
https://github.com/tonykylim/XenLoom_beta#experimental-setup

Additional prerequisite, contrast calibration: for proper contrast calibration, a luminance meter is required.
Fill the glass bowl with buffer and aim the projector at the paper screen.
Aim the luminance meter at the side of the screen that the animal will see.
PsychoPy (the library XenLoom uses to display stimuli) uses a brightness scale from -1 to +1. Use the 'contrast-measure' script to vary the screen brightness from -1 to +1 and record the resulting absolute luminance (in cd/m²).

Plot the absolute luminance against the PsychoPy screen brightness value.

Fit the luminance-versus-brightness data to a nonlinear four-parameter logistic (4PL) equation.

Enter the following values into the 'contrast calibration variables' section of the 'stimpresent-videocapture.py' script:
cc_low = luminance value when the screen was set to -1
cc_high = luminance value when the screen was set to +1
cc_bottom = the minimum value (obtained by curve fit)
cc_top = the maximum value (obtained by curve fit)
cc_halfway = the point of inflection (obtained by curve fit)
cc_hs = the Hill slope (obtained by curve fit)
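As a rough illustration of how the fitted parameters relate screen brightness and luminance, here is a minimal sketch in Python. It assumes one common 4PL parameterization, and the cc_* values shown are made up; the actual equation form and values come from your own curve fit.

```python
import math

# Illustrative 4PL parameters; names mirror the script's
# 'contrast calibration variables', but these values are made up.
cc_bottom = 2.0      # minimum luminance asymptote (cd/m^2), from curve fit
cc_top = 180.0       # maximum luminance asymptote (cd/m^2), from curve fit
cc_halfway = 0.1     # inflection point on the PsychoPy brightness axis
cc_hs = 2.5          # Hill slope, from curve fit

def brightness_to_luminance(b):
    """4PL: PsychoPy brightness (-1..+1) -> luminance (cd/m^2)."""
    return cc_bottom + (cc_top - cc_bottom) / (1.0 + 10 ** ((cc_halfway - b) * cc_hs))

def luminance_to_brightness(lum):
    """Inverse 4PL: desired luminance (cd/m^2) -> PsychoPy brightness."""
    return cc_halfway - math.log10((cc_top - cc_bottom) / (lum - cc_bottom) - 1.0) / cc_hs
```

With a calibration like this, any target luminance between cc_bottom and cc_top can be converted to the brightness value the stimulus script should draw.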

How to use the new customized looming functions

Experiment type

The 'experiment_types' variable sets the type of looming stimulus.
experiment_types[0] = dark looming stimulus
experiment_types[1] = bright looming stimulus
experiment_types[2] = isoluminant looming stimulus

Contrast only mode

As a control, you can average the change in luminance over the entire screen and disable the looming animation. To use this, turn on contrast-only mode by setting the variable 'contrast_only_mode' to True. Set it to False to disable this mode.

Looming speed

Looming speed is set by altering the 'loom_speed_modulation' variable.
The relationship between 'loom_speed_modulation' and the animation time is:
looming animation time in seconds = 0.5 seconds / loom_speed_modulation

For example, setting loom_speed_modulation to 0.5 results in a looming animation time of 1 second. Conversely, setting loom_speed_modulation to 2 results in a looming animation time of 0.25 seconds.
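The relation above can be captured in a one-line helper (illustrative only; the script computes this internally):

```python
def loom_animation_time(loom_speed_modulation):
    """Looming animation time in seconds: 0.5 s divided by the modulation factor."""
    return 0.5 / loom_speed_modulation
```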

Relative contrast

Relative contrast according to the Michelson equation is set by altering the 'contrast_percent' variable from 0 to 100.
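For reference, Michelson contrast is (Lmax - Lmin) / (Lmax + Lmin). The sketch below shows one way 'contrast_percent' could map to the luminance of a dark stimulus against a fixed background; the function names and this exact mapping are assumptions for illustration, not the script's internals.

```python
def michelson_contrast(l_max, l_min):
    """Michelson contrast of two luminances."""
    return (l_max - l_min) / (l_max + l_min)

def dark_stim_luminance(l_background, contrast_percent):
    """Luminance of a dark stimulus whose Michelson contrast against
    the background equals contrast_percent (0-100).
    Derived from C = (Lbg - Lstim) / (Lbg + Lstim)."""
    c = contrast_percent / 100.0
    return l_background * (1.0 - c) / (1.0 + c)
```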

Isoluminant looming noise size

Isoluminant looming stimuli are programmatically generated, and the size of the noise blocks can be varied using the 'noise_size' variable. This variable must be a perfect square between 1 and 36 (i.e. 1, 4, 9, 16, 25, or 36) and corresponds to the size of the noise blocks.
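The idea of isoluminant block noise can be sketched as follows: half the blocks are dark and half bright, so the mean luminance is constant regardless of block size. This is a minimal illustration, not the script's actual generator, and all names here are hypothetical.

```python
import random

def isoluminant_noise(grid_blocks, block_px, lum_dark, lum_bright, seed=None):
    """Square block-noise pattern with a fixed mean luminance.

    grid_blocks: number of blocks per side (grid_blocks**2 must be even here)
    block_px:    pixel side length of one noise block (cf. 'noise_size')
    Returns a 2D list of pixel luminances.
    """
    rng = random.Random(seed)
    # Equal numbers of dark and bright blocks, randomly arranged.
    values = [lum_dark, lum_bright] * (grid_blocks * grid_blocks // 2)
    rng.shuffle(values)
    side = grid_blocks * block_px
    image = [[0.0] * side for _ in range(side)]
    for i, v in enumerate(values):
        by, bx = divmod(i, grid_blocks)
        for y in range(by * block_px, (by + 1) * block_px):
            for x in range(bx * block_px, (bx + 1) * block_px):
                image[y][x] = v
    return image
```

Coarser grain (larger block_px) changes the pattern's spatial scale while the overall luminance stays the same, which is what makes the stimulus isoluminant.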

Additional interval time

The default amount of time between looming stimuli is approximately 20 seconds. This time between trials can be extended by changing the 'trial_interval' value. For example, setting the value to 5 would increase the time between trials by 5 seconds, for a total of 25 seconds.

Recording data

Prepare the setup with the tadpole in the petri dish. The glass bowl is filled with buffer to prevent refractive index aberrations.

Run 'videocapture-test.py' to ensure the webcam captures the view of the entire petri dish.

When ready, run 'stimpresent-videocapture.py'. You will be prompted for the following information:

Animal ID -- Enter a unique alphanumeric string to identify the animal or group being tested, e.g. 'E13'
Timepoint -- Enter any timepoint information, if required, e.g. 'day2'
Treatment -- Enter any treatment information, if required, e.g. 'vehicle'
XenLoom then presents stimuli according to the settings above and generates an avi video file for each trial, along with a csv file containing timing data.

Data analysis

Escape probability

To determine escape probability, use the 'escape-decision.py' script. Place all video files and their timing csv files to be analyzed together in a folder containing 'escape-decision.py'.
Run 'escape-decision.py' and follow the prompts.

If the tadpole exhibited escape behaviour, press the green button. If it did not, press the red button. If it is unclear, press "Unable to determine".
You can also replay the video, or alter the playback speed.

Once all the trials have been evaluated for escape behaviour by the user, open 'response_to_loom_sorted' to find a csv file containing all the escape behaviour data. The response rate can be calculated by dividing the number of positive responses by the combined number of positive and negative responses. Responses where it was not clear whether or not the animal responded are discarded.
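The calculation above can be sketched as a small helper. The 'yes'/'no'/'unclear' labels are assumptions for illustration, not the script's actual output format:

```python
def response_rate(responses):
    """Escape probability from per-trial outcomes:
    'yes' (escape), 'no' (no escape); 'unclear' trials are discarded."""
    positive = responses.count("yes")
    negative = responses.count("no")
    if positive + negative == 0:
        return None  # no scorable trials
    return positive / (positive + negative)
```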

Example data examining the effect of different looming stimulus types on escape probability.

Tadpole tracker

The 'tadpole-tracker.py' script allows the user to gather data from individual trials, such as escape velocity, escape angle, and distance traveled. Instantaneous velocity from 3 seconds before to 3 seconds after the looming stimulus is also gathered. Contrails are also produced, which are a convenient way of visually summarizing escape behaviour.

Copy the 'tadpole-tracker.py' script to the folder with the trial avi video files. After running this script, you will be prompted for the video file to be analyzed.

Select a video file for analysis and continue.

The script will use the petri dish to set the scale of the video. You should see a green circle along the circumference of the petri dish.

If the circle is too small, stop the script, increase the 'pdmaxrad' variable, and try again. If the circle is too big, try reducing 'pdmaxrad'. Press spacebar to continue.

Next, a background subtraction image is required in order to automatically track the tadpole. This background image is programmatically generated by removing the tadpole from one frame of the video and splicing in a different video frame; this works as long as the tadpole has moved to a new location between the two frames.
Select the tadpole using the mouse (click, drag, release) to define a region around the tadpole. Then press spacebar.

If successful, the tadpole will be removed from the image.

Press space to continue.

However, if the tadpole doesn't move between the two frames, the tadpole will not be removed from the image.

Press "No", and XenLoom will instead try to create a subtraction image by filling in the area where the tadpole is located with the surrounding pixels. Place another selection box as tightly around the tadpole as possible, and press spacebar.
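The first, frame-splicing strategy can be sketched on plain 2D pixel arrays. This is an illustrative sketch with hypothetical names; the script presumably operates on real video frames:

```python
def splice_background(frame_a, frame_b, roi):
    """Replace the tadpole's region in frame_a with the same region
    from frame_b (works when the tadpole has moved between frames).

    frame_a, frame_b: 2D lists of pixel values
    roi: (x0, y0, x1, y1) bounding box around the tadpole
    Returns a new frame; the inputs are left unchanged.
    """
    x0, y0, x1, y1 = roi
    out = [row[:] for row in frame_a]  # copy so frame_a is untouched
    for y in range(y0, y1):
        out[y][x0:x1] = frame_b[y][x0:x1]
    return out
```

Subtracting such a background from each frame leaves (ideally) only the tadpole, which is what makes automatic tracking possible.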

Now XenLoom will track the tadpole and prompt the user to confirm the start and end escape angles by clicking on the image and pressing space.

Then tell XenLoom whether the tadpole rotated clockwise or counterclockwise, and add in any extra rotations to confirm the escape angle.

Now XenLoom will produce a csv file called 'data.csv' that includes the escape distance, escape velocity, and escape angle. The data from each additional video file analyzed within the folder is appended to this csv file.
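As an illustration of how escape distance and velocity follow from a tracked trajectory (the function and the (x, y, t) sample format are hypothetical, not the script's internals):

```python
import math

def escape_metrics(track):
    """Path length and mean velocity from a list of (x, y, t) centroid samples."""
    distance = 0.0
    for (x0, y0, _), (x1, y1, _) in zip(track, track[1:]):
        distance += math.hypot(x1 - x0, y1 - y0)  # per-segment displacement
    duration = track[-1][2] - track[0][2]
    velocity = distance / duration if duration > 0 else 0.0
    return distance, velocity
```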

Instantaneous velocity is output to a folder called 'output_speed'. Example results for dark looming stimuli:

XenLoom will also produce annotated video files with tracking in the 'output_video' folder. Example:

Finally, in the 'output_contrails' folder, contrails are saved from each trial. Copy the 'contrails-merger.py' script to this folder and run it to merge contrails from the different trials to form one merged contrail file per animal. Example:

Versioning

The beta version of XenLoom is available at https://github.com/tonykylim/XenLoom_beta
This newer version of XenLoom was developed in the Ruthazer lab in 2020 by Tony Lim. QA and bug reporting were performed by Jade Ho.

License

This project is licensed under the MIT License - see the LICENSE file for details.
