Matt Brown, Ph.D.

Principal Engineer

Computer Vision

Kitware North Carolina
Carrboro, NC

Ph.D. in Mechanical and Aerospace Engineering
Princeton University

M.S. in Mechanical and Aerospace Engineering
Princeton University

B.S. in Mechanical and Aerospace Engineering
Rutgers University

Matt Brown, Ph.D. is a principal engineer on Kitware’s Computer Vision Team located in Carrboro, North Carolina. He has over 14 years of experience developing advanced imaging systems and video-exploitation algorithms. His expertise spans from the fundamental physics of imaging to the applied aspects of designing and integrating hardware and artificial intelligence software to solve challenging problems. At Kitware, he has led and made key contributions to various cyber-physical projects. 

As Kitware’s chief scientist on the Defense Advanced Research Projects Agency (DARPA) URSA program, Matt designed and led the development of a distributed surveillance system, supported by a network of autonomous ground and aerial vehicles, to detect threats within complex urban environments. The system fuses analytics from a suite of state-of-the-art deep neural networks to detect, track, and re-identify entities while assessing their activities and relationships. Matt was the principal investigator (PI) on multiple Night Vision and Electronic Sensors Directorate (NVESD) Panoptic Guardian programs, developing similar capabilities for infrared video. He was also the lead algorithm developer on the DARPA Squad X program, where he wrote software for camera modeling and calibration, multi-modal detection fusion, and person tracking.

In addition to his national defense projects, Matt has led several projects funded by the National Oceanic and Atmospheric Administration (NOAA). These projects have developed autonomous imaging systems to support automated aerial surveys of animal populations and environmental conditions, providing real-time, deep learning-based computer vision onboard both manned and unmanned aircraft. Matt leads the open source ADAPT Multi-Mission Payload project, making these real-time capabilities accessible to various environmentally focused groups.

Prior to joining Kitware, Matt worked at Logos Technologies, where he developed Wide Area Motion Imagery (WAMI) sensor systems (Kestrel, Simera, Serenity, Redkite) for civilian and military surveillance applications. His contributions included optimizing the design and calibration of complex, multi-camera, optomechanical sensor systems, as well as prototyping the associated control and image processing software. At Logos, Matt was the principal investigator in the development of real-time camera-pose estimation and georegistered EO–IR video rendering algorithms deployed with the sensor systems. He also developed algorithms to demonstrate automated, near-real-time target detection from hyperspectral imagery.

Matt received his Ph.D. in mechanical and aerospace engineering from Princeton University in 2011. His doctoral dissertation explored a novel, laser-actuated printing process, involving time-resolved imaging experiments coupled with image processing and computational modeling of complex fluid-structure interactions. He also received his master’s degree in mechanical and aerospace engineering from Princeton in 2007. Matt received his bachelor’s degree in mechanical and aerospace engineering from Rutgers University in 2006, graduating summa cum laude.

Publications

  1. D. Davila, J. VanPelt, A. Lynch, A. Romlein, P. Webley, and M. Brown, "ADAPT: An Open-Source sUAS Payload for Real-Time Disaster Prediction and Response with AI," in Workshop on Practical Deep Learning in the Wild, 2022. [URL]
  2. M. Brown, K. Fieldhouse, E. Swears, P. Tunison, A. Romlein, and A. Hoogs, "Multi-Modal Detection Fusion on a Mobile UGV for Wide-Area, Long-Range Surveillance," in 2019 IEEE Winter Conference on Applications of Computer Vision (WACV), 2019. [URL]
  3. M. Brown, K. Fieldhouse, E. Swears, P. Tunison, A. Romlein, and A. Hoogs, "Multi-Modal Detection Fusion on Mobile UGV for Squad-Level Threat Alerting," in Proceedings of the MSS National Symposium on Sensor and Data Fusion, 2018.
  4. M. Brown, C. Brasz, Y. Ventikos, and C. Arnold, "Impulsively actuated jets from thin liquid films for high-resolution printing applications," Journal of Fluid Mechanics, vol. 709, pp. 341-370, Oct. 2012. [URL]
  5. M. Brown, E. Glaser, S. Grassinger, A. Slone, and M. Salvador, "Development of an efficient automated hyperspectral processing system using embedded computing," in SPIE Defense, Security, and Sensing, 2012. [URL]
  6. N. Kattamis, M. Brown, and C. Arnold, "Finite element analysis of blister formation in laser-induced forward transfer," Journal of Materials Research, vol. 26, no. 18, pp. 2438-2449, Sep. 2011. [URL]
  7. M. Brown, N. Kattamis, and C. Arnold, "Time-resolved dynamics of laser-induced micro-jets from thin liquid films," Microfluidics and Nanofluidics, vol. 11, no. 2, pp. 199-207, Aug. 2011. [URL]
  8. M. Brown, N. Kattamis, and C. Arnold, "Time-resolved study of polyimide absorption layers for blister-actuated laser-induced forward transfer," Journal of Applied Physics, vol. 107, no. 8, pp. 083103, Apr. 2010. [URL]
  9. M. Brown and C. Arnold, "Fundamentals of Laser-Material Interaction and Application to Multiscale Surface Modification," in Laser Precision Microfabrication. Springer Berlin Heidelberg, 2010, pp. 91-120. [URL]
  10. M. Brown, J. Shan, C. Lin, and F. Zimmermann, "Electrical polarizability of carbon nanotubes in liquid suspension," Applied Physics Letters, vol. 90, no. 20, pp. 203108, May 2007. [URL]

