Toward Real-Time Animal Tracking with Integrated Stimulus Control for Automated Conditioning in Aquatic Eco-Neurotoxicology

Yutao Bai, Jason Henry, Eva Cheng, Stuart Perry, David Mawdsley, Bob B.M. Wong, Jan Kaslin, Donald Wlodkowic

Research output: Contribution to journal › Article › Research › peer-review


Aquatic eco-neurotoxicology is an emerging field that requires new analytical systems to study the effects of pollutants on animal behaviors. This is especially true if we are to gain insights into one of the least studied aspects: the potential perturbations that neurotoxicants can have on cognitive behaviors. The paucity of experimental data is partly caused by a lack of low-cost technologies for the analysis of higher-level neurological functions (e.g., associative learning) in small aquatic organisms. Here, we present a proof-of-concept prototype that utilizes new real-time animal tracking software for on-the-fly video analysis and closed-loop communication with external hardware to deliver stimuli based on specific behaviors in aquatic organisms spanning three animal phyla: chordates (fish, frog), platyhelminthes (flatworm), and arthropods (crustacean). The system’s open-source software features an intuitive graphical user interface and advanced adaptive threshold-based image segmentation for precise animal detection. We demonstrate the precision of animal tracking across multiple aquatic species with varying modes of locomotion. The presented technology interfaces easily with low-cost and open-source hardware such as the Arduino microcontroller family for closed-loop stimulus control. The new system has potential future applications in eco-neurotoxicology, where it could enable new opportunities for cognitive research in diverse small aquatic model organisms.
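The closed-loop pipeline described above (adaptive threshold-based segmentation, position tracking, behavior-conditioned stimulus delivery) can be illustrated with a minimal sketch. This is not the authors' code: the function names, the local-mean thresholding variant, the synthetic frame, and the zone-based trigger rule are all illustrative assumptions; in the real system the trigger would drive external hardware such as an Arduino over a serial link rather than return a boolean.

```python
# Illustrative sketch (assumed, not the published implementation):
# adaptive thresholding segments a dark animal against a brighter
# background, its centroid is tracked, and a closed-loop rule decides
# whether a stimulus should fire when the animal enters a defined zone.

def adaptive_threshold(frame, block=3, offset=10):
    """Mark pixels darker than their local neighbourhood mean minus offset."""
    h, w = len(frame), len(frame[0])
    mask = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            ys = range(max(0, y - block), min(h, y + block + 1))
            xs = range(max(0, x - block), min(w, x + block + 1))
            vals = [frame[j][i] for j in ys for i in xs]
            local_mean = sum(vals) / len(vals)
            mask[y][x] = frame[y][x] < local_mean - offset
    return mask

def centroid(mask):
    """Centroid (x, y) of segmented pixels, or None if nothing detected."""
    pts = [(x, y) for y, row in enumerate(mask)
           for x, v in enumerate(row) if v]
    if not pts:
        return None
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))

def in_zone(pos, zone):
    """Closed-loop rule: True if the tracked position lies inside zone."""
    x0, y0, x1, y1 = zone
    return pos is not None and x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1

# Synthetic 12x12 grayscale frame: bright background (200) with a
# dark 2x2 "animal" near the centre.
frame = [[200] * 12 for _ in range(12)]
for y in (4, 5):
    for x in (7, 8):
        frame[y][x] = 40

pos = centroid(adaptive_threshold(frame))
trigger = in_zone(pos, zone=(6, 3, 10, 7))  # would e.g. pulse an Arduino pin
```

The local-mean threshold makes detection robust to uneven illumination across the arena, which is why adaptive (rather than global) thresholding is attractive for multi-well aquatic assays; per-frame centroids then feed the conditioning logic in real time.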

Original language: English
Pages (from-to): 19453-19462
Number of pages: 10
Journal: Environmental Science and Technology
Issue number: 48
Publication status: Published - 13 Nov 2023


  • animal tracking
  • behavior
  • conditioning
  • ecotoxicology
  • real time
