Read the preprint!
DeepLabCut 1,2,3, SLEAP 4, and OpenPose 5 have revolutionized the way behavioral scientists analyze data. These algorithms utilize recent advances in computer vision and deep learning to automatically estimate poses. Knowing the positions of an animal's body parts can be useful in studying behavior; however, position alone does not encompass the whole dynamic range of naturalistic behaviors.
Behavior identification and quantification techniques have undergone rapid development. Supervised and unsupervised methods (such as B-SOiD6) are chosen based on their intrinsic strengths and weaknesses (e.g., user bias, training cost, complexity, action discovery).
Here, a new active learning platform, A-SOiD, blends these strengths and, in doing so, overcomes several of their inherent drawbacks. A-SOiD iteratively learns user-defined groups from a fraction of the usual training data while attaining expansive classification through directed unsupervised classification.
To facilitate use, A-SOiD comes as an intuitive, open-source interface for efficient segmentation of user-defined behaviors and discovered subactions.
A-SOiD is a Streamlit-based application that integrates the core features of A-SOiD into a user-friendly, no-coding-required GUI that can be downloaded and used on custom data.
For this, we developed a multi-step pipeline that guides users, independent of their previous machine learning experience, through the process of generating a well-trained, semi-supervised classifier for their own use case.
In general, users are required to provide a small labeled data set (ground truth) with behavioral categories of their choice, created using one of the many available labeling tools (e.g. BORIS7), or to import their previous supervised machine learning data sets. Following the upload of data (see Fig. above, a), an A-SOiD project is created, including several parameters that further enable users to select individual animals (in social data) and exclude body parts from the feature extraction.
Based on the configuration, the feature extraction (see Fig. below, b top) can be further customized by defining a "bout length", referring to the temporal resolution at which single motifs are expected to appear (i.e. the shortest duration a definable component of the designated behavior is expected to last). The extracted features are then used in combination with the labeled ground truth to train a baseline model. Here, an initial evaluation gives users insight into the performance on their base data set (see Fig. above, b bottom). Note that different splits are used to allow for a more thorough analysis (see Publication Methods for further details).
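To make the role of the bout length concrete, here is a minimal sketch of binning per-frame features into bout-level feature vectors. This is not A-SOiD's actual feature code; the framerate, bout length, feature matrix, and the mean-pooling choice are all assumptions for illustration:

```python
import numpy as np

FRAMERATE = 30          # Hz; assumed camera framerate for this example
BOUT_LENGTH_MS = 200    # hypothetical user-defined "bout length"

# Number of frames that make up one bout window.
frames_per_bout = int(FRAMERATE * BOUT_LENGTH_MS / 1000)  # 6 frames

# Toy per-frame feature matrix (e.g. distances/angles between body parts).
features = np.random.default_rng(0).normal(size=(300, 8))

# Drop the trailing remainder and average features within each bout window,
# yielding one feature vector per bout at the chosen temporal resolution.
n_bouts = features.shape[0] // frames_per_bout
bout_features = (features[:n_bouts * frames_per_bout]
                 .reshape(n_bouts, frames_per_bout, -1)
                 .mean(axis=1))

print(bout_features.shape)  # (50, 8)
```

A shorter bout length yields more, finer-grained training samples; a longer one smooths over brief motifs.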
The baseline classification is then used as the basis for the first active learning iteration, in which users are prompted by the app to view and refine bouts that were classified with low confidence by the baseline model (see Fig. above, c left). Bouts are visualized as an animated sequence of the provided pose information and designated body parts, and the viewer offers several display options, including increased/decreased speed, reversed playback, and frame-by-frame view. After submission of a refined bout, a new bout is shown in its place, and refinement continues for a user-defined number of low-confidence bouts. Following refinement, a new iteration of the model is trained and its performance can be viewed (see Fig. above, c right) in comparison to previous iterations. This process is repeated until the user is satisfied with the model's performance or a plateau has been reached (see publication).
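The loop described above can be sketched generically. This is not A-SOiD's actual implementation: the classifier choice, query size, and toy data are assumptions, and the user's GUI refinement is simulated here with the model's own predictions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Toy stand-ins: bout-level feature vectors and a small labeled ground truth.
X_labeled = rng.normal(size=(40, 5))
y_labeled = rng.integers(0, 3, size=40)   # three hypothetical behavior classes
X_pool = rng.normal(size=(200, 5))        # unlabeled bouts

for iteration in range(3):
    # 1. Train (or retrain) the classifier on the current ground truth.
    clf = RandomForestClassifier(random_state=0).fit(X_labeled, y_labeled)

    # 2. Find the bouts the model is least confident about.
    confidence = clf.predict_proba(X_pool).max(axis=1)
    query = np.argsort(confidence)[:10]   # 10 lowest-confidence bouts

    # 3. In A-SOiD the user refines these bouts in the GUI; here we
    #    simulate that refinement with the model's own predictions.
    new_labels = clf.predict(X_pool[query])

    # 4. Fold the refined bouts back into the training set and repeat.
    X_labeled = np.vstack([X_labeled, X_pool[query]])
    y_labeled = np.concatenate([y_labeled, new_labels])
    X_pool = np.delete(X_pool, query, axis=0)
```

Because each iteration queries only the least-confident bouts, the classifier improves where it is weakest while keeping the total labeling effort small.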
Finally, users can upload and classify new data using the app and the previously trained classifier (see Fig. above, d). To gain further insight into the classification results, the app offers a reporting tab (see Fig. above, d).
A-SOiD supports the following input types:
- BORIS -> exported as binary files in 0.1 s time steps (10 Hz): Read the docs
- any annotation files in this style (one-hot encoded), including an index that specifies time steps in seconds. You can see an example of this using pandas in our docs: Convert annotations to binary format
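As a minimal illustration of that format, the snippet below one-hot encodes per-frame labels with pandas and adds a 10 Hz index in seconds. The behavior names are made up for the example; see our docs for the full conversion:

```python
import pandas as pd

# Hypothetical per-frame annotations: one behavior label per 0.1 s time step.
labels = pd.Series(["walk", "walk", "groom", "rest", "groom"],
                   name="behavior")

# One-hot encode the labels; each column is a behavior category.
binary = pd.get_dummies(labels).astype(int)

# Index in seconds at 10 Hz (0.1 s time steps).
binary.index = [round(i * 0.1, 1) for i in range(len(binary))]

print(binary)
```

Each row sums to one, i.e. exactly one behavior is active per time step.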
There are two ways to install A-SOiD. In either case, we recommend using a fresh environment to avoid installation conflicts.

Option 1 (recommended): To simplify the process, we provide an `asoid.yml` file that will do everything for you. Clone this repository directly from GitHub:

```
git clone https://github.com/YttriLab/A-SOID.git
```

or download the ZIP and unpack it where you want. Then create a new environment (requires Anaconda/Python 3) in which A-SOiD will be installed automatically:

```
cd path/to/A-SOiD
conda env create --file asoid.yml
```

Option 2: Install A-SOiD into your own environment using `setup.py`. In the directory you saved the repo in:

```
cd path/to/A-SOiD
```

activate the environment you want to install A-SOiD in:

```
conda activate MYENVIRONMENT
```

and install:

```
pip install .
```

In either case, A-SOiD is installed alongside all its dependencies.
- Download or clone the latest version of A-SOiD from this repository.
- Activate the environment you installed A-SOiD in:

  ```
  conda activate asoid
  ```

- Go to the location where you unpacked the latest version:

  ```
  cd path/to/A-SOiD
  ```

- Install the new version on top of the old one using `setup.py`:

  ```
  pip install .
  ```
The console output should look like this:

```
Successfully built asoid
Installing collected packages: asoid
  Attempting uninstall: asoid
    Found existing installation: asoid 0.1
    Uninstalling asoid-0.1:
      Successfully uninstalled asoid-0.1
Successfully installed asoid-0.2.0
```
You can start A-SOiD again and use the new version just like before:

```
conda activate asoid
```

You can now run A-SOiD from inside your environment (you no longer have to change directories):

```
asoid app
```
A-SOiD was developed as a collaboration between the Yttri Lab and Schwarz Lab by:

- Jens Schweihoff, University of Bonn
- Alex Hsu, Carnegie Mellon University
- Martin Schwarz, University of Bonn
- Eric Yttri, Carnegie Mellon University

Martin K. Schwarz, SchwarzLab
Eric A. Yttri, YttriLab
For recommended changes that you would like to see, open an issue.
There are many exciting avenues to explore based on this work. Please do not hesitate to contact us for collaborations.
If you are having issues, please refer to our issue page first, to see whether a similar issue was already solved. If this does not apply to your problem, please submit an issue with enough information that we can replicate it. Thank you!
A-SOiD is released under a Clear BSD License and is intended for research/academic use only.
If you are using our work, please make sure to cite us and any additional resources you used:
A-SOiD, an active learning platform for expert-guided, data efficient discovery of behavior.
Jens F. Schweihoff, Alexander I. Hsu, Martin K. Schwarz, Eric A. Yttri
bioRxiv 2022.11.04.515138; doi: https://doi.org/10.1101/2022.11.04.515138
or see Cite Us