Multitarget/Multisensor Data Fusion Techniques for Target Detection, Classification, and State Estimation

EC ENGR 810.80

This course describes sensor and data fusion methods that improve the probability of correct target detection, classification, identification, and state estimation.

What you can learn.

  • Advantages of multisensor data fusion for object discrimination and state estimation
  • JDL-DFIG model for data fusion processing levels
  • Taxonomies for target detection, classification, identification, and state estimation algorithms
  • Appreciation of skills needed to develop and apply data fusion algorithms to complex situations
  • Bayesian and Dempster–Shafer approaches to object and event identification
  • Sequential probability ratio test to initiate target tracks
  • Kalman filter operation for updating the state estimate of a target (see the sketch after this list)
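
As a rough illustration of the last point, here is a minimal predict-and-update cycle of a linear Kalman filter in Python/NumPy. It is a generic textbook sketch, not the course's implementation; the constant-velocity model and the matrix values below are hypothetical.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.

    x : prior state estimate (n,)        P : prior state covariance (n, n)
    z : new measurement (m,)
    F : state transition model           H : measurement model
    Q : process noise covariance         R : measurement noise covariance
    """
    # Predict: propagate the state and its uncertainty forward in time
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q

    # Update: weight the measurement against the prediction via the Kalman gain
    y = z - H @ x_pred                      # innovation (measurement residual)
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Example: constant-velocity target in one dimension, position-only measurements
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state = [position, velocity]
H = np.array([[1.0, 0.0]])              # only position is observed
Q = 0.01 * np.eye(2)
R = np.array([[0.5]])
x, P = np.array([0.0, 1.0]), np.eye(2)
x, P = kalman_step(x, P, np.array([1.2]), F, H, Q, R)
```

The Kalman gain K sets how far the estimate moves toward each new measurement: the larger the measurement noise R relative to the predicted uncertainty, the more the filter trusts its prediction instead.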

About this course:

This revised two-day course introduces the student to sensor and data fusion methods that improve the probability of correct target detection, classification, identification, and state estimation. These techniques combine information from collocated or dispersed sensors that use either similar or different technologies to generate target signatures or imagery.

The course begins by describing the effects of the atmosphere and countermeasures on millimeter-wave and infrared sensors to illustrate how the use of different phenomenology-based sensors enhances the effectiveness of a data fusion system. It then introduces the Data Fusion Information Group (DFIG) enhancements to the JDL data fusion processing model, several methods for describing sensor and data fusion architectures, and the taxonomies for the data fusion algorithms used for detection, classification, identification, and state estimation and tracking. This is followed by descriptions of the higher-level data fusion processes of situation and threat assessment, which are considered part of situation awareness. Process refinement, now deemed part of resource management, and user refinement, which deals with human-computer interaction and human decision making, are treated next.

Subsequent sections of the course more fully develop the Bayesian and Dempster–Shafer algorithms, radar tracking system design concerns, multiple-sensor registration issues, track initiation in clutter, Kalman filtering and the alpha-beta filter, interacting multiple models, data fusion maturity, and several of the topics that drive the need for continued data fusion research. Examples demonstrate the advantages of multisensor data fusion in systems that use microwave and millimeter-wave detection and tracking radars, laser radars (imagery and range data), and forward-looking IR sensors (imagery data). You can also apply many of these data fusion techniques to almost any grouping of sensors, as long as their outputs are conditionally independent of one another and supply the input data required by the fusion algorithm.
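
As a rough illustration of that conditional-independence requirement, the sketch below fuses classification likelihoods from two sensors with a simple Bayesian combination. It is a generic example, not the course's specific algorithm; the target classes, priors, and likelihood values are hypothetical.

```python
import numpy as np

# Hypothetical target classes and prior beliefs
classes = ["tracked vehicle", "wheeled vehicle", "clutter"]
prior = np.array([0.2, 0.3, 0.5])

# Likelihoods p(report | class) from two sensors with different phenomenology,
# e.g. an MMW radar signature classifier and a FLIR image classifier
# (all values are illustrative only)
likelihood_mmw  = np.array([0.70, 0.20, 0.10])
likelihood_flir = np.array([0.60, 0.30, 0.10])

# Bayesian fusion: multiply the per-sensor likelihoods and renormalize.
# This product form is valid only if the sensor reports are conditionally
# independent given the true target class.
posterior = prior * likelihood_mmw * likelihood_flir
posterior /= posterior.sum()

for c, p in zip(classes, posterior):
    print(f"P({c} | both reports) = {p:.3f}")
```

Multiplying the per-sensor likelihoods is justified only when each sensor's report is conditionally independent of the others given the true target class; correlated sensors would require a joint likelihood model instead.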
