Model-based sensor fusion for aviation

Misha Pavel, Ravi K. Sharma

Research output: Contribution to journal › Conference article › Peer-review

5 Scopus citations


We describe a sensor fusion algorithm based on a set of simple assumptions about the relationships among the sensors. Under these assumptions we estimate the common signal component in each sensor output, and the optimal fusion is then approximated by a weighted sum of those common components at each pixel. We then examine a variety of techniques for mapping the sensor signals onto perceptual dimensions, so that the human operator can benefit from the enhanced fused image and, at the same time, identify the source of the information. We examine several color mapping schemes.
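The per-pixel weighted-sum fusion described above can be sketched as follows. This is a minimal illustration, not the authors' actual model: the weight maps here are taken as given inputs (in the paper they would come from the model-based estimate of each sensor's common signal component), and the function names are hypothetical.

```python
import numpy as np

def fuse_weighted(sensors, weights=None):
    """Fuse co-registered sensor images by a per-pixel weighted sum.

    sensors -- list of 2-D arrays, one per sensor, same shape
    weights -- optional list of per-pixel weight maps (one per sensor);
               in the model-based scheme these would reflect the estimated
               common signal in each sensor. If omitted, sensors are
               weighted equally (a crude fallback, not the paper's method).
    """
    stack = np.stack([np.asarray(s, dtype=float) for s in sensors])
    if weights is None:
        w = np.ones_like(stack)  # equal weighting fallback
    else:
        w = np.stack([np.asarray(m, dtype=float) for m in weights])
    w = w / w.sum(axis=0)        # normalize weights at each pixel
    return (w * stack).sum(axis=0)
```

For example, fusing a zero image with a constant image of 4.0 under equal weights yields a constant image of 2.0; supplying a weight map three times larger for the second sensor shifts the result toward 3.0 at every pixel.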

Original language: English (US)
Pages (from-to): 169-176
Number of pages: 8
Journal: Proceedings of SPIE - The International Society for Optical Engineering
State: Published - 1997
Externally published: Yes
Event: Enhanced and Synthetic Vision 1997 - Orlando, FL, United States
Duration: Apr 21 1997


Keywords

  • Color fusion
  • Color mapping
  • Color vision
  • Enhanced vision
  • Image fusion
  • Multisensor fusion
  • Sensor fusion

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Computer Science Applications
  • Applied Mathematics
  • Electrical and Electronic Engineering


