We are co-organizing a course on Multi-INT Fusion for Activity Recognition for Uncoordinated Sensors.

Course Outline

This interactive course will consist of a 'classroom' component and a 'panel' component. The panel will be an open discussion with students, moderated by experts who have worked with Multi-INT Fusion and Activity Recognition technologies on either the development or operational side.

0700 – 0720:   Overview of Challenges in Multi-INT Fusion (Aptima, Inc.)

0720 – 0740:   Video-Based Multi-INT Fusion (Kitware, Inc.)

0740 – 0800:   Coordinating Uncoordinated Space Sensors (Air Force Research Labs)

0800 – 0820:   TBD (National Air and Space Intelligence Center)

0820 – 0840:   TBD or panel

0840 – 0900:   Challenges in Multi-INT Fusion Panel

Course Description

Expensive air and space sensors were critical intelligence assets during the first part of this century, when Allied Forces dominated both air and cyber space. However, as missions shift to restricted airspaces, collection is moving away from single-source, high-altitude platforms that require persistent communication links to intelligence data streams. To operate in environments without guaranteed air rights or communications, the defense community has turned to integrating novel intelligence sources, such as Open Source Intelligence (OSINT) and commercial data, with traditional collection techniques. Further complicating the goal of harnessing uncoordinated sensor networks is the trustworthiness of non-DoD sensors.

In this interactive workshop, Aptima and Kitware will explore adapting the existing pipeline of sensor exploitation tools to reflect this shift in tasking and collection processes. We will begin by addressing the challenges of extracting and fusing knowledge for Activity Based Intelligence from uncoordinated, asynchronous sensors, focusing on correlating non-standardized metadata, deconflicting inconsistent observations, disambiguating uncertain data, and identifying gaps in knowledge. We will then explore extending existing video exploitation technologies, developed for persistent, reliable data sources such as Full Motion Video (FMV) and Wide Area Motion Imagery (WAMI), to support intermittent data from untrusted sources. Finally, we will address how the fusion of multiple sources can be exploited for automated target tracking, activity recognition, and threat forecasting to support intelligence analysts in their mission. We will ground these discussions in motivating examples from the GEOINT and Space Situational Awareness (SSA) communities.

Students attending this course will benefit from the knowledge of experts and operational users who develop and apply Multi-INT fusion and Activity Based Intelligence tools. Students will learn about emerging trends in this area and their role in multiple domains, both inside and outside the GEOINT community.

Physical Event