Closed-loop optogenetic system for deep behavioral phenotyping
Artificial Intelligence
Computer Vision
Vision Science
Behavioral Analysis
CLOSER provides an accessible, reproducible, fully automated platform for closed-loop optogenetic behavior experiments coupled with content-rich behavioral analysis. By standardizing stimulus delivery and using unbiased machine vision for behavior classification, CLOSER improves reproducibility while reducing experimenter labor and subjective bias in interpreting stimulus-evoked responses. The system's closed-loop design can be applied to study any limb or body region of interest in a variety of model organisms. Automatic, comprehensive behavioral phenotyping with CLOSER could be scaled for high-throughput screening of novel therapeutics optimized to mitigate spontaneous and stimulus-evoked maladaptive forms of pain. The integration of real-time optogenetic control with computer-vision behavior tracking is a powerful, versatile tool for dissecting the neural circuits underlying complex behaviors. Additionally, CLOSER will be open-source, shipping with its algorithms and a collection of pretrained models so it can be applied to new experimental configurations without additional annotation.
Technologies Used: OpenCV, PyTorch, Transformers, video processing.
My Role:
- Trained the complete behavior classification model from scratch using self-supervised pre-training.
- Modified the existing VGG Image Annotator so collaborators could annotate videos.
- Applied explainability techniques (gradient analysis) to diagnose confounds, and performed background subtraction and segmentation on the videos using Transformers and optical flow.
- Annotated over 20,000 frames for DeepLabCut and bootstrapped per-animal DeepLabCut models.
- Built an animal tracker, using signal processing (Kalman and Savitzky-Golay filters) to smooth trajectories and eliminate jitter; fine-tuned YOLOv5 and SSD for object detection.
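The gradient-analysis explainability mentioned above can be illustrated with a vanilla-gradients saliency map in PyTorch: backpropagate the top class score to the input frame and take the per-pixel gradient magnitude. This is a minimal sketch, not the CLOSER implementation; `saliency_map` and the tiny linear model are illustrative stand-ins.

```python
import torch
import torch.nn as nn

def saliency_map(model, frame):
    """Vanilla-gradients saliency: |d(top class score) / d(input)| per pixel.

    frame: tensor of shape (C, H, W). Returns a tensor of the same shape.
    (Illustrative helper, not part of the actual CLOSER codebase.)
    """
    x = frame.clone().detach().requires_grad_(True)
    score = model(x.unsqueeze(0)).max()  # score of the highest-scoring class
    score.backward()                     # gradients flow back to the input pixels
    return x.grad.abs()

# Toy stand-in classifier over 8x8 single-channel frames, 3 behavior classes
torch.manual_seed(0)
model = nn.Sequential(nn.Flatten(), nn.Linear(64, 3))
frame = torch.randn(1, 8, 8)
sal = saliency_map(model, frame)  # same shape as the input frame
```

High-saliency pixels off the animal's body are a quick way to spot confounds (e.g. the classifier attending to cage background rather than the limb of interest).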
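Background subtraction of the kind used in the video pipeline can be sketched with a simple running-average background model in NumPy: pixels far from the slowly adapting background estimate are flagged as foreground. This is a hedged sketch under stated assumptions (grayscale frames scaled to [0, 1]); `subtract_background` and its parameters are illustrative, not the production implementation.

```python
import numpy as np

def subtract_background(frames, alpha=0.05, thresh=0.1):
    """Running-average background subtraction over a stack of grayscale frames.

    frames: array of shape (T, H, W) with values in [0, 1] (assumed).
    Returns a boolean foreground mask for each frame.
    (Illustrative helper, not part of the actual CLOSER codebase.)
    """
    bg = frames[0].astype(float)                 # initialize background from frame 0
    masks = np.empty(frames.shape, dtype=bool)
    for i, f in enumerate(frames):
        masks[i] = np.abs(f - bg) > thresh       # pixels far from background = foreground
        bg = (1 - alpha) * bg + alpha * f        # slowly adapt the background model
    return masks

# Synthetic clip: static gray background with a bright square moving right
frames = np.full((5, 20, 20), 0.2)
for t in range(5):
    frames[t, 5:10, t * 3:t * 3 + 4] = 0.9
masks = subtract_background(frames)
```

A small `alpha` keeps slow lighting drift out of the foreground while the moving animal stays above threshold.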
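The jitter-removal step in the tracker can be sketched with the two filters named above: a Savitzky-Golay filter for offline smoothing of a coordinate trace, and a minimal one-dimensional Kalman filter for online smoothing. This is a simplified sketch (1-D, constant-position motion model, hand-picked noise variances), not the actual tracker; `smooth_track` and `kalman_1d` are illustrative names.

```python
import numpy as np
from scipy.signal import savgol_filter

def smooth_track(xs, window=11, poly=3):
    """Offline smoothing of a 1-D coordinate trace with a Savitzky-Golay filter."""
    return savgol_filter(xs, window_length=window, polyorder=poly)

def kalman_1d(zs, q=0.01, r=0.01):
    """Minimal online 1-D Kalman filter with a constant-position model.

    zs: noisy measurements; q: process-noise variance; r: measurement-noise
    variance (both illustrative values, tuned per camera in practice).
    """
    x, p = float(zs[0]), 1.0     # state estimate and its variance
    out = []
    for z in zs:
        p += q                   # predict: uncertainty grows by process noise
        k = p / (p + r)          # Kalman gain
        x += k * (z - x)         # update toward the measurement
        p *= (1 - k)
        out.append(x)
    return np.array(out)

# Noisy sinusoidal trajectory as a stand-in for a tracked keypoint coordinate
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200)
clean = np.sin(t)
noisy = clean + rng.normal(0, 0.1, t.size)
sm = smooth_track(noisy)
kf = kalman_1d(noisy)
```

In practice the Kalman filter also carries velocity state and handles missed detections; the constant-position form above is the smallest version that shows the predict/update loop.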
Project Poster
