Neurons in mammalian V1 exhibit tuning to visual stimulus features, non-visual task features, and self-motion. Recent work shows that rodent V1 responds to head-orienting movements and exhibits similar receptive field structure during free locomotion, but to date no other visual tuning properties have been studied under freely moving conditions. We examined the interaction of direction/orientation tuning and self-motion representation in binocular V1 by employing a virtual reality arena to present drifting-grating Gabor patches to freely moving mice while recording calcium transients with wireless 1-photon miniscopes. This allows us to fix the visual stimulus in the mouse’s field of view while the animal moves unrestrained. We image the same field of view under both freely moving and head-fixed conditions to directly compare responses of the same neurons. We find that neurons exhibit direction/orientation selectivity under freely moving conditions, with similar fractions of significantly tuned cells between freely moving and head-fixed sessions. Further, we find that in freely moving sessions nearly all cells are tuned to both visual features and self-motion, whereas under head-fixed conditions cells are primarily unimodally visually tuned. For cells matched between freely moving and head-fixed sessions, we find one subpopulation in which the preferred direction/orientation shifts and another in which tuning is maintained. Analyzing the tuning vectors of all neurons in a low-dimensional space reveals putative clusters of cells strongly tuned to self-motion or to visual stimulus features.
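As an illustration of the kind of analysis described above (not the authors’ actual pipeline), here is a minimal sketch that computes per-neuron direction-tuning vectors and selectivity indices from trial-averaged responses and then embeds the tuning curves in a low-dimensional space; the response matrix, cell counts, and choice of PCA are all assumptions made for the example:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical data: mean calcium response of each neuron to each
# drifting-grating direction (n_neurons x n_directions).
n_neurons, n_directions = 200, 8
directions = np.deg2rad(np.arange(0, 360, 360 / n_directions))
responses = rng.gamma(shape=2.0, scale=1.0, size=(n_neurons, n_directions))

# Direction-tuning vector: circular (vector) mean of the responses.
tuning_vec = responses @ np.exp(1j * directions)          # one complex number per neuron
pref_direction = np.rad2deg(np.angle(tuning_vec)) % 360    # preferred direction (deg)
dsi = np.abs(tuning_vec) / responses.sum(axis=1)           # direction selectivity, 0-1

# Orientation selectivity: the same computation on the doubled angle.
osi = np.abs(responses @ np.exp(2j * directions)) / responses.sum(axis=1)

# Low-dimensional embedding of the normalised tuning curves, in which
# strongly visual- vs. self-motion-tuned cells could appear as clusters.
norm_curves = responses / responses.max(axis=1, keepdims=True)
embedding = PCA(n_components=2).fit_transform(norm_curves)
print(pref_direction[:5], dsi[:5].round(2), osi[:5].round(2), embedding.shape)
```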
Responding appropriately to environmental cues is crucial for survival. This ability is influenced by multiple factors, including experience as well as hormonal and developmental changes. This study focuses on how these dynamic behavioral shifts occur in an age-dependent manner and on the correlated changes in neural activity. To investigate age-related changes, we explored two aspects in laboratory mice. First, the changing nature of innate appetitive and defensive behavior across life stages. Second, whether neural activity within the superior colliculus, a midbrain structure important for sensorimotor integration, reflects circuit refinement and correlates with behavioral differences. We exposed juvenile (P21) and adult (P90) mice to visual stimuli known to trigger different types of innate behaviors. These stimuli were presented simultaneously in the lower and upper visual fields to observe behavioral changes under conditions with conflicting cues. We found age differences in exploratory-like (approach, pursuit) and defensive responses to overhead stimuli. To examine the relationship between neural activity and the behavioral differences observed across life stages, we have started performing high-density electrophysiology recordings (Neuropixels) in the superior colliculus of freely moving adult mice during the presentation of conflicting stimuli. Preliminary analyses show that different populations of neurons respond to ‘threatening’ or ‘appetitive’ visual stimuli. These findings suggest that changes in neural activity in the early stages of visual processing could underlie some of the differences observed throughout an animal's life.
How does an animal’s brain state affect retinal visual processing? The structure and function of the vertebrate retina have been extensively studied across species in isolated, ex vivo preparations. Recent studies, however, demonstrated that the retina operates differently in vivo, especially in the awake condition. Here we identified histaminergic modulation of retinal output kinetics, but not firing rate, via projections from histidine-decarboxylase-positive (HDC+) neurons to the optic chiasm. Pharmacologically blocking the postsynaptic histamine receptor (H1) accelerated response kinetics, while blocking the H3 receptor – an autoreceptor that presynaptically inhibits histamine release – prolonged response latencies. Furthermore, chemogenetic activation of HDC+ neurons in the tuberomammillary nucleus of the mouse hypothalamus led to slower responses in retinal ganglion cells (RGCs). Finally, we examined circadian modulation of RGC responses and found that their kinetics were faster during the daytime (subjective night) than at night (subjective day), but only when the mouse stayed still. Taken together, our data indicate that retinal responses are faster in the absence of histamine, when animals are dormant, and this is likely beneficial for responding faster to potential visual threats.
A central interest in neuroscience is understanding how the brain produces and orchestrates behavior. Our group studies these topics in rats, utilizing recent advances in tracking technology and analytic tools to quantify and explore behavior in naturalistic, unrestrained contexts. Previous publications from the group describe robust postural and behavioral representations in the posterior parietal cortex (PPC), frontal motor cortex, and primary sensory cortices (Mimica et al., 2018; Mimica, Tombaz et al., 2023). To better understand how neuronal populations integrate active sensory input to inform and update ongoing posture, we are now expanding our recording paradigm to include tracking of facial features – namely whiskers and eye movements. We therefore developed a custom head-mounted set-up combining two facial-tracking cameras, housing for Neuropixels recording probes (allowing for insertion at relevant angles), and a retroreflective rigid body for 3D posture tracking. We are currently employing this integrated approach to investigate sensory-motor integration in the barrel cortex, as well as the combined representation of multiple body-schema effectors (e.g. the head, vibrissae, and eyes) in the PPC during spontaneous behavior. Preliminary analyses have uncovered stable encoding of whisker position in S1, and ongoing work will extend existing findings on behavioral representation in the PPC.
Across animal species, novel stimuli elicit arousal and evoke sensory inspection and exploration. This type of spontaneous curiosity critically relies on the animals’ ability to identify the novelty of a sensory stimulus against previously experienced stimuli. Behavioural novelty responses have been used to investigate the neural mechanisms underlying novelty detection. However, since familiarity is characterized by the absence of a behavioral response, passive novelty detection tasks involve asymmetric motor responses, which can confound the analysis of neural signals. To address this limitation, I established a two-alternative forced-choice novelty detection task, which offers the opportunity to investigate novelty detection in a volitional context with symmetric motor responses for novel and familiar stimuli. Mice were trained to report the familiarity of olfactory stimuli by collecting rewards at either of two reward ports associated with novel or familiar stimuli, respectively. Mice learned the task, as indicated by the correct selection of the novelty-associated reward port on the first occurrence of a novel stimulus. Four mice that performed the operant task with above 70% success rate were recorded simultaneously in the LEC, CA1, CA2 and SUB with chronically implanted 4-shank Neuropixels probes. I further performed recordings in 5 mice in a passive task, in which head-restrained mice were exposed to familiar and novel stimuli. I used a linear classifier to determine when novelty and odor-identity information was present in the recorded regions and found time-dependent differences between the two tasks. In the passive task, both novelty and odor identity could be discriminated before the onset of the behavioral response. In the operant task, on the other hand, decoding of odor identity was delayed until after the decision.
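A minimal sketch of the kind of time-resolved linear decoding described above, assuming hypothetical binned population activity and a logistic-regression classifier (the actual classifier, bin size, and cross-validation scheme are not specified in the abstract):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Hypothetical data: trials x neurons x time bins of binned spike counts,
# with a binary label per trial (e.g. novel vs. familiar odour).
n_trials, n_neurons, n_bins = 120, 80, 40
X = rng.poisson(2.0, size=(n_trials, n_neurons, n_bins)).astype(float)
y = rng.integers(0, 2, size=n_trials)

# Decode the label independently in each time bin to estimate when
# novelty (or odour identity) information becomes available.
accuracy = np.empty(n_bins)
for t in range(n_bins):
    clf = LogisticRegression(max_iter=1000)
    accuracy[t] = cross_val_score(clf, X[:, :, t], y, cv=5).mean()

# Earliest bin at which decoding exceeds an (arbitrary) above-chance threshold.
above_chance = np.where(accuracy > 0.6)[0]
print(accuracy.round(2), above_chance[:1])
```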
Virtual environments have been implemented to gain insight into visual processing in rodents, commonly in head-fixed settings. Virtual environments are advantageous as they provide tight control over visual scenes. However, it is known that head motion and other locomotion cues can influence visual processing. In this paper, we present a cylindrical augmented reality (CAR) system that brings the advantages of virtual environments to freely moving rats. Furthermore, a virtual object recognition task was performed to assess whether the novel augmented reality system can elicit visually evoked behavioural responses. The CAR system consisted of a 360-degree display screen with which animals could freely interact. Animals were tracked in real time, enabling closed-loop interaction with virtual scenes. The virtual object recognition task, analogous to a novel object recognition task, consisted of a habituation, a familiarization, and a test phase. During the familiarization phase, the same two virtual objects were depicted on the circular screen; during the test phase, a novel object was introduced. Different sides of the 3D virtual objects could be explored in real time depending on the location of the animal. All animals spent proportionally more time at the novel object than at the familiar object during the test phase. To conclude, the setup provides a novel way to implement virtual environments during unrestricted behaviour, which can provide insight into the relation between visual processing and intrinsically motivated behaviours.
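The preference for the novel object described above is commonly quantified with a discrimination index; here is a small illustrative sketch using made-up exploration times (not the study’s data, and not necessarily its exact metric):

```python
# Hypothetical per-animal exploration times (seconds) at each virtual object
# during the test phase of the virtual object recognition task.
time_novel = [42.0, 37.5, 51.2, 29.8]
time_familiar = [25.1, 30.0, 33.4, 22.6]

def discrimination_index(novel: float, familiar: float) -> float:
    """(novel - familiar) / (novel + familiar); values > 0 indicate novelty preference."""
    return (novel - familiar) / (novel + familiar)

indices = [discrimination_index(n, f) for n, f in zip(time_novel, time_familiar)]
print([round(d, 2) for d in indices])
```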
Extraction of relevant stimulus features from the dynamic sensory scene needs to be coupled to the execution of appropriate adaptive responses to ensure survival. A predator needs to evaluate its position with respect to that of moving prey, define an approach strategy, and carry out the proper motor commands to execute it. The most efficient strategy requires an estimation of the future position of the target and should account for the predator’s sensorimotor processing delays to enable predictive interception. Mice can hunt moving prey and have been established as a successful model for studying visually guided pursuit and capture behaviours. However, their ability to adapt their pursuit strategy to the direction of travel and speed of the target is poorly understood. To study this, we developed a new behavioural paradigm in which mice were trained to pursue and catch a moving target displayed on a touch screen. We ensured behaviourally consistent pursuit approaches by implementing a maze-like arena design and closed-loop stimulus presentation. Mice perform hundreds of trials per session, and their interception success depends on the speed and contrast of the target. By modifying their running speed and the trajectory followed to reach the target, mice can adapt their pursuit strategy to the demands of the task. We are currently investigating the role of the superior colliculus in the tracking and interception of moving targets during pursuit behaviour.
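To make the predictive-interception idea concrete, here is a toy sketch that extrapolates the target’s position by a sensorimotor delay plus the travel time and solves for the intercept point by fixed-point iteration; all positions, speeds, and the delay are assumed values, and this is not the authors’ model of mouse behaviour:

```python
import numpy as np

# Predictive interception: aim at the target's future position, extrapolated
# by the pursuer's sensorimotor delay plus the time needed to reach it.
target_pos = np.array([0.3, 0.0])    # target position (m), assumed
target_vel = np.array([0.0, 0.15])   # target velocity (m/s), assumed
mouse_pos = np.array([0.0, 0.0])     # pursuer position (m)
mouse_speed = 0.25                   # pursuer speed (m/s), assumed
delay = 0.08                         # sensorimotor processing delay (s), assumed

# Iteratively solve for the time-to-contact t at which the pursuer can reach
# the target's extrapolated position.
t = 0.0
for _ in range(50):
    future = target_pos + target_vel * (t + delay)
    t = np.linalg.norm(future - mouse_pos) / mouse_speed
print(f"intercept point {future.round(3)}, time to contact {t:.2f} s")
```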
Animals combine estimates of their position and motion with visual inputs to perform visually guided behaviour. Recent studies in rodents have shown that motor signals modulate activity in the primary visual cortex (V1), suggesting that visuo-motor integration could already occur early in the visual pathway. However, largely due to technical limitations, the mechanisms by which visual inputs are combined with self-motion signals remain unclear. To address this, we combined a head-fixed virtual reality (VR) system for rodents, which allows precise control of visual inputs, with an augmented reality (AR) system that allows perturbation of visuo-motor coupling in freely moving animals. Using the Open Ephys ONIX system and a head-mounted camera, we continuously track head position and orientation, gaze direction, and neural activity in V1 in both VR and AR. We find that this approach simplifies the calibration of the system and allows precise characterization of both visual and motor responses in the same neurons. We are currently using this system to study how depth selectivity from motion parallax emerges from visuo-motor integration in V1.
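For intuition on the motion-parallax cue mentioned above, here is a short sketch of the standard geometric relation ω = v·sin(θ)/d, in which the retinal angular velocity of a stationary point falls off with its distance during self-translation; the speed, eccentricity, and distances are arbitrary example values, not parameters from the study:

```python
import numpy as np

# Motion parallax: during translation at speed v, a stationary point at
# distance d and eccentricity theta from the heading direction sweeps across
# the retina at angular velocity omega = v * sin(theta) / d, so nearer points
# move faster on the retina.
v = 0.2                      # translation speed (m/s), assumed
theta = np.deg2rad(45.0)     # eccentricity from heading, assumed
for d in (0.1, 0.5, 2.0):    # object distances (m), assumed
    omega = np.rad2deg(v * np.sin(theta) / d)
    print(f"d = {d:.1f} m -> {omega:.1f} deg/s")
```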
The recent emergence of markerless pose estimation tools, such as DeepLabCut, SLEAP, and LightningPose, has revolutionised the study of animal behaviour. However, despite their popularity, there is currently no user-friendly, general-purpose approach for processing and analysing the pose tracks that these tools generate. To address this, we are developing movement, an open-source Python package that offers a unified interface for analysing pose data from multiple major pose estimation packages. During movement’s early development, we are focusing on implementing versatile and efficient methods for data cleaning, filtering, and kinematic analysis. However, we plan to eventually include modules for more specialised applications, such as pupillometry and gait analysis. Other planned features involve analysing pose tracks within the spatial context of an animal's environment and integrating movement with neurophysiological data analysis tools. Importantly, movement is being designed to accommodate researchers with varying coding skills and computational resources, and will soon feature an intuitive graphical user interface. In addition, movement’s development is transparent and robust, with dedicated engineers ensuring its long-term maintenance. Ultimately, we envision movement evolving into a comprehensive, all-around software suite for analysing animal behaviour.
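To illustrate the kind of cleaning, filtering, and kinematic analysis such a package targets, here is a generic sketch using plain NumPy on a hypothetical single-keypoint track; this is deliberately not movement’s own API, whose exact interface is not described above:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical pose track: (n_frames, 2) x/y positions of one keypoint plus a
# per-frame confidence score, as typically exported by pose-estimation tools.
n_frames, fps = 1000, 50
position = np.cumsum(rng.normal(0, 1, size=(n_frames, 2)), axis=0)
confidence = rng.uniform(0, 1, size=n_frames)

# 1) Cleaning: drop low-confidence frames and linearly interpolate the gaps.
position[confidence < 0.6] = np.nan
t = np.arange(n_frames)
for dim in range(2):
    good = ~np.isnan(position[:, dim])
    position[:, dim] = np.interp(t, t[good], position[good, dim])

# 2) Filtering: simple moving-average smoothing of the cleaned track.
kernel = np.ones(5) / 5
smoothed = np.column_stack(
    [np.convolve(position[:, d], kernel, mode="same") for d in range(2)]
)

# 3) Kinematics: frame-to-frame velocity and speed.
velocity = np.gradient(smoothed, axis=0) * fps       # position units / second
speed = np.linalg.norm(velocity, axis=1)
print(speed.mean().round(2))
```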