Applying machine learning to explore the neural code of body posture during landing, takeoff, and reward acquisition in the hippocampus of Egyptian fruit bats.
Michael Yartsev, Professor
Bioengineering
Closed. This professor is continuing with Fall 2023 apprentices on this project; no new apprentices needed for Spring 2024.
In Full:
High-quality video is an immensely powerful way to track and quantify the behavior of animals in an environment. However, curating and processing this video is time-consuming and subject to individual bias. This project aims to leverage state-of-the-art tools, primarily DeepLabCut (see website below) and Anipose, to extract posture and position information from high-quality videos of Egyptian fruit bats performing a spatial navigation task. Bats have highly complex and unique body postures because they navigate and behave in 3D space (unlike many animals, which cannot fly and thus navigate primarily in 2D).
This project aims to overcome these challenges by curating video from three different laboratory environments (both free interaction and task-based) and training a neural network to accurately and consistently detect the bat's body parts. The result of this project, ideally, will be a neural network tailored to the postures and movements of our animal model, which can be used to extract the position, posture, and orientation of each individual. We will then use this information to investigate the effects of postural change on neural tuning in the hippocampus.
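To give a flavor of the kind of post-processing involved: DeepLabCut outputs, for each frame and each body part, an (x, y) position estimate plus a likelihood score, and a common first step is to discard low-confidence points before computing posture features. The sketch below illustrates this with a simple heading-angle feature; the keypoint names, threshold, and feature choice are illustrative assumptions, not the lab's actual pipeline.

```python
import math

# Assumed confidence cutoff for trusting a keypoint (a common but arbitrary choice).
LIKELIHOOD_THRESHOLD = 0.9

def body_orientation(frame, head="head", tail="tailbase"):
    """Heading angle in radians from the tail-base keypoint to the head keypoint,
    or None if either keypoint falls below the confidence threshold.
    `frame` maps keypoint name -> (x, y, likelihood), mimicking per-frame
    DeepLabCut output. Keypoint names here are hypothetical."""
    hx, hy, hp = frame[head]
    tx, ty, tp = frame[tail]
    if hp < LIKELIHOOD_THRESHOLD or tp < LIKELIHOOD_THRESHOLD:
        return None  # tracking too uncertain; drop this frame from analysis
    return math.atan2(hy - ty, hx - tx)

# Example frame: a bat oriented along the diagonal (45-degree heading -> pi/4).
frame = {"head": (10.0, 10.0, 0.98), "tailbase": (0.0, 0.0, 0.95)}
print(body_orientation(frame))
```

In practice, features like this (orientation, limb angles, wing extension) would be computed per frame across the session and then aligned with the neural recordings.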
In Short:
This project aims to leverage state-of-the-art deep learning tools to automatically detect the position and posture of Egyptian fruit bats freely exploring an environment. The primary tools are DeepLabCut and Anipose, which are well-suited for labeling and tracking animals in a laboratory setting. Here, we aim to tailor this approach to our animal model, which has a highly complex and unique set of body postures. Subsequent statistical approaches can then be used to separate behaviors based on this rigorous and principled labeling methodology. These behaviors can be correlated with neural signals to probe the effect of spatially local behavioral change on activity in the hippocampus.
Role: As an undergraduate on this project, you would help process videos, accurately label frames, build an intuition for the array of body postures and mechanics of fruit bat movement, and come up with creative solutions to refine and improve the performance of DeepLabCut and Anipose for this unique animal model. If time permits, some exploration of the neural data and learning about neural recording techniques may be incorporated into the project.
Qualifications: Technical: Experience with coding (Python or MATLAB) and machine learning, and ideally some experience with convolutional neural networks and animal behavior.
Non-technical: This project will be a collaborative effort, ideally with 2-3 undergraduates working closely with one another and with their supervisor, and effective communication and cooperation skills will be crucial.
Day-to-day supervisor for this project: Madeleine Snyder, Ph.D. candidate
Hours: 6-8 hrs
Off-Campus Research Site: Can be remote work. There will be a weekly meeting in person on campus for project guidance, mentorship, collaborating with other students on the project, and discussing papers and analysis.
Related website: http://www.mackenziemathislab.org/deeplabcut
Biological & Health Sciences; Engineering, Design & Technologies