Investigating the correlation between cholinergic neuron activity and real-time behavior in mice using machine learning techniques

Authors

  • Yash Sreepathi, Department of Bioengineering and Center for Neural Informatics, Neural Structures, and Neural Plasticity, George Mason University, Fairfax, VA
  • Myah Gross, Department of Bioengineering and Center for Neural Informatics, Neural Structures, and Neural Plasticity, George Mason University, Fairfax, VA
  • Fatemeh Farokhi Moghadam, Department of Bioengineering and Center for Neural Informatics, Neural Structures, and Neural Plasticity, George Mason University, Fairfax, VA
  • Holger Dannenberg, Department of Bioengineering and Center for Neural Informatics, Neural Structures, and Neural Plasticity, George Mason University, Fairfax, VA

Abstract

Cholinergic modulation, mediated by the neurotransmitter acetylcholine, is critical for cognitive functions and behaviors associated with spatial memory and navigation. The activity of cholinergic projection neurons is correlated with changes in cognitive function, brain state, and behavior. However, there is limited research investigating the real-time activity of cholinergic neurons in relation to behavior. In our study, we aim to explore this relationship by recording cholinergic activity in mice during an object location memory task, specifically focusing on the mice's ability to recall the locations of objects within a given environment. To measure the activity of cholinergic neurons, we use fiber photometry, a technique that employs fiber optics to transmit and receive light signals. A Cre-dependent recombinant adeno-associated virus (rAAV) coding for GCaMP is injected into the medial septum of ChAT-Cre mice, so that the neurons fluoresce when they generate action potentials. The object location memory task consists of a probe phase and a test phase. During the probe phase, the mice explore an environment containing two identical objects for 15 minutes. After a 1-hour delay, one of the objects is relocated to a novel position near the opposite corner, and the mice again explore both objects for 15 minutes (test phase). Video tracking is used throughout both phases to capture the mice's movements and behavior. The video-tracked pose and behavior are analyzed using deep learning applications, including DeepLabCut and DeepEthogram. DeepLabCut is used for precise markerless tracking of the mice's body positions and for calculating running speed. DeepEthogram applies machine learning algorithms to classify behaviors such as walking, grooming, and rearing. Our research approach provides valuable insights into the cholinergic modulation of the cognitive processes underpinning spatial learning, with potential implications for understanding neurological diseases.
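
As an illustration of the pose-based speed analysis described above, the sketch below shows how running speed could be derived from DeepLabCut pose estimates. It is a minimal example (Python), not the authors' analysis code; the file name, body-part label, frame rate, and pixel-to-centimeter calibration are assumptions chosen for illustration.

    # Minimal sketch: estimating running speed from DeepLabCut output.
    # POSE_FILE, BODYPART, FPS, and CM_PER_PIXEL are hypothetical values.
    import numpy as np
    import pandas as pd

    POSE_FILE = "test_phase_videoDLC.h5"   # assumed DeepLabCut output file
    BODYPART = "body_center"               # assumed tracked body part
    FPS = 30.0                             # assumed camera frame rate (frames/s)
    CM_PER_PIXEL = 0.05                    # assumed arena calibration (cm/pixel)
    LIKELIHOOD_CUTOFF = 0.9                # drop low-confidence pose estimates

    # DeepLabCut stores poses as a DataFrame with MultiIndex columns
    # (scorer, bodypart, coordinate); read the scorer name from the file.
    poses = pd.read_hdf(POSE_FILE)
    scorer = poses.columns.get_level_values(0)[0]

    x = poses[(scorer, BODYPART, "x")].to_numpy(dtype=float)
    y = poses[(scorer, BODYPART, "y")].to_numpy(dtype=float)
    likelihood = poses[(scorer, BODYPART, "likelihood")].to_numpy(dtype=float)

    # Mask frames where the network was uncertain about the body-part location.
    x[likelihood < LIKELIHOOD_CUTOFF] = np.nan
    y[likelihood < LIKELIHOOD_CUTOFF] = np.nan

    # Frame-to-frame displacement (pixels) converted to speed (cm/s).
    displacement_px = np.hypot(np.diff(x), np.diff(y))
    speed_cm_per_s = displacement_px * CM_PER_PIXEL * FPS

    print(f"Median running speed: {np.nanmedian(speed_cm_per_s):.2f} cm/s")

A frame-by-frame speed trace of this kind can then be aligned with the fiber photometry signal to relate cholinergic activity to movement during the task.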

Published

2024-10-13

Section

College of Engineering and Computing: Department of Bioengineering