International Conference on Computer Vision

October 15, 2025

October 19 – 23, 2025, in Honolulu, HI

ICCV’25 brings together leading researchers and practitioners from across the computer vision community. Kitware is proud to contribute to the technical program with accepted papers that showcase our expertise in computer vision, multi-camera imaging, and automated environmental monitoring, spanning both marine and event-based vision.

  • Anthony Hoogs, Ph.D. is serving as the Senior Area Chair and Workshop Co-Chair, curating sessions that highlight emerging topics and innovations in marine vision research.
  • Christopher Funk, Ph.D. and Scott McCloskey, Ph.D. are serving as Area Chairs, overseeing the peer review process and helping to shape the program for the ICCV Joint Workshop on Marine Vision.

Kitware’s Activities and Involvement

KAMERA: Knowledge-guided Image Acquisition Manager and Archiver for Aerial Surveys of Ice-associated Seals in Arctic Environments

Workshop Paper | Joint Workshop on Marine Vision | Sunday, October 19
Authors: Adam Romlein (Kitware), Benjamin X. Hou (NOAA), Yuval Boss (University of Washington), Cynthia L. Christman (University of Washington), Stacie Koslovsky (NOAA), Erin E. Moreland (NOAA), Jason Parham (Kitware), Anthony Hoogs (Kitware)

Tracking how Arctic ecosystems evolve is an increasingly important task as we work to preserve these fragile environments and species. NOAA’s Marine Mammal Laboratory (MML) Polar Ecosystems Program has been tackling this problem by examining population trends and the distribution of seals in Arctic and sub-Arctic ecosystems. This paper introduces KAMERA, a multi-camera, multi-spectral system for real-time aerial surveys. KAMERA reduces dataset processing time by up to 80%, enables detection across multiple spectra, and maps survey results for accurate environmental monitoring. KAMERA includes deep-learning-based AI models for seal detection and classification. All data are annotated with metadata for easy reference, and the software, models, and schematics are fully open source, supporting the wider scientific community. We discuss the challenges that arose during development of the system, lessons learned, and sample results from operating the system above the Arctic Circle.

Steller Sea Lion Counting Across Multiple Aerial Cameras

Workshop Paper | Joint Workshop on Marine Vision | Sunday, October 19
Authors: Matthew Dawkins (Kitware), Jon Crall (Kitware), Katie Sweeney, Burlyn Birkemeier, Neal Siekierski, Dawei Du

Steller sea lions, which inhabit the rocky shores and coastal waters of the subarctic, have faced declining populations since the 1970s, and the western distinct population segment was listed as endangered in 1997. Monitoring their populations is therefore crucial for future management and for setting government regulations. In this paper, we present a novel dataset for sea lion detection and identification across multiple cameras, which includes labels for age and sex sub-classification alongside related distractor species such as fur seals. The data was collected from mixed one- and three-camera systems mounted on both fixed-wing airplanes and unmanned aerial systems. We compare the performance of several algorithms on this dataset, including novel detector ensembles and multi-sensor temporal tracking methods developed to improve the accuracy of population counts.

Quantifying Accuracy of an Event-Based Star Tracker via Earth’s Rotation

Workshop Paper | Workshop on Neuromorphic Vision (NeVi): Advantages and Applications of Event Cameras | Monday, October 20
Authors: Dennis Melamed (Kitware), Connor Hashemi (Kitware), Scott McCloskey (Kitware)

Event-based cameras (EBCs) are a promising new technology for star-tracker-based attitude determination, but prior studies have struggled to obtain accurate ground truth for real data. We analyze the accuracy of an EBC star tracking system using the Earth’s rotation as ground truth. By keeping an event camera static and pointing it through a ground-based telescope at the night sky, the only camera motion in the celestial reference frame is that induced by the Earth’s rotation. The resulting event stream is processed to generate orientation estimates, which we compare against the orientation measured by the International Earth Rotation and Reference Systems Service (IERS). The system achieves a root-mean-squared across-boresight error of 18.47 arcseconds and an about-boresight error of 78.84 arcseconds. Combined with the other benefits of event cameras over framing sensors (reduced computation, higher dynamic range, lower energy consumption, faster update rates), this accuracy suggests their utility for low-cost, low-latency star tracking. We provide all code and data used to generate our results.
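The ground-truth idea is simple to quantify: a camera held static on the ground rotates with the Earth, so in the celestial frame its boresight sweeps the sky at the sidereal rate. The following minimal sketch (not the paper's actual pipeline; the sidereal-day length below is the standard value) shows the apparent angular rate that serves as the known motion against which orientation estimates can be checked:

```python
# Apparent rotation rate of a static, ground-based camera in the
# celestial reference frame, induced purely by Earth's rotation.
# Illustrative sketch only, not the paper's processing pipeline.

SIDEREAL_DAY_S = 86164.0905  # one sidereal day, in seconds (standard value)

# A full 360-degree rotation per sidereal day, expressed in arcseconds/second.
rate_arcsec_per_s = 360.0 * 3600.0 / SIDEREAL_DAY_S  # ~15.04 arcsec/s

# So over one second a static boresight sweeps ~15 arcsec of sky; reported
# errors of ~18-79 arcsec are on the order of a few seconds of this motion.
print(f"Earth's sidereal rate: {rate_arcsec_per_s:.3f} arcsec/s")
```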

Leaders in Computer Vision

With decades of experience, Kitware is a recognized leader in computer vision, scientific imaging, and open source software development. Our areas of technical focus include:

  • Multi-camera and multi-spectral imaging.
  • Automated image and video analysis for environmental monitoring.
  • Event-based and high-frame-rate vision systems.
  • Open source software development for scientific research.

Through our commitment to academic-level R&D and open source technologies, Kitware enables researchers, ecologists, and engineers to leverage cutting-edge tools for solving real-world challenges in marine and event-based vision. Contact our team to discuss your project.
