Object recognition analysis in mice using nose-point digital video tracking

T. S. Benice, Jacob Raber

Research output: Contribution to journal › Article › peer-review

62 Scopus citations


Preferential exploration of novel locations and objects by rodents has been used to test the effects of various manipulations on object recognition memory. However, manual scoring is time-consuming, requires extensive training, and is subject to inter-observer variability. Since rodents explore primarily by sniffing, we assessed the ability of new nose-point video tracking software (NPVT) to collect object recognition data automatically. Mice performed a novel object/novel location (NO/NL) task, and data collected by NPVT, two expert observers, and one inexperienced observer were compared. The percent time spent exploring the objects was correlated between the two expert observers and between NPVT and each expert observer. In contrast, the inexperienced observer's scores showed no correlation with those of either expert observer or NPVT; thus, NPVT collected more reliable data than the inexperienced observer. NPVT and the expert observers also gave similar group averages for arbitrarily assigned groups of mice, whereas the inexperienced observer's analysis gave different results. Finally, NPVT generated valid results in a NO/NL experiment comparing mice expressing human apolipoprotein E3 versus E4, a risk factor for age-related cognitive decline. Video tracking with nose-point detection thus generates useful analyses of rodent object recognition task performance and may be useful for other behavioral tests as well.
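The validation approach described above, comparing percent exploration time per mouse across scorers via correlation, can be sketched as follows. This is a minimal illustrative reconstruction, not the authors' actual analysis code; the function names and all numeric values are hypothetical.

```python
# Hypothetical sketch: compare per-mouse percent exploration times from
# two scorers (e.g., an expert observer vs. NPVT) via Pearson correlation.
# All values below are made up for illustration.
from statistics import mean, stdev

def percent_exploration(object_secs, total_secs):
    """Percent of the trial spent exploring the objects."""
    return 100.0 * object_secs / total_secs

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length lists of scores."""
    mx, my = mean(xs), mean(ys)
    n = len(xs)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    return cov / (stdev(xs) * stdev(ys))

# Per-mouse object-exploration times (s) in a 300 s trial, as scored
# by two different methods (illustrative numbers only).
scorer_a = [percent_exploration(t, 300) for t in (42, 55, 31, 60, 48)]
scorer_b = [percent_exploration(t, 300) for t in (40, 58, 29, 63, 45)]
r = pearson_r(scorer_a, scorer_b)  # high r indicates the scorers agree
```

Under the study's logic, a high correlation (as between the two expert observers, or between NPVT and an expert) indicates reliable scoring, while a near-zero correlation (as for the inexperienced observer) indicates unreliable scoring.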

Original language: English (US)
Pages (from-to): 422-430
Number of pages: 9
Journal: Journal of Neuroscience Methods
Issue number: 2
State: Published - Mar 15 2008


Keywords

  • Apolipoprotein E
  • Automation
  • Novel location
  • Novel object
  • Rodent behavior
  • Video tracking

ASJC Scopus subject areas

  • Neuroscience (all)


