Computation has become a vital component of research in the applied areas of mathematics and, through them, all areas of science and engineering. Academic publications and industrially relevant mathematical results that do not involve some aspect of computational analysis are few and far between. Unfortunately, the software and data that drive this computation are too often developed and managed in a haphazard fashion, prone to error and difficult to replicate or build upon.
In this workshop we aim to gather speakers to discuss best practices for ‘reproducible research’: the idea that research contributions in the computational sciences involve not only the publication of an article in an academic venue, but also the release of sufficient components of the software and data that the results claimed in the publication can be reproduced and extended by other scientists.
At the workshop, Dr. Luis Ibanez will be giving a talk on ‘Open Science Tools for Reproducible Research.’
As a collaborative, open source company, we have developed a set of Open Science tools and practices that facilitate engagement with large communities and promote practices consistent with reproducible research. These practices start with embracing open licenses to encourage collaboration and create a space of shared resources, including open source software, open data, and open access documents. On top of this foundation, we provide open source software tools for data dissemination and analysis in a reproducible context. These tools are developed using a rigorous, test-driven software process that performs quality control on a continuous basis and enables the adoption of contributions from community members. In particular, we encourage contribution through a novel open access journal, the Insight Journal, which supports contributions of documents, data, and/or software, and provides an automated scoring process in conjunction with human review.
The Insight Journal is used when a community contribution reaches the level of a promising algorithm or evaluation study. Authors are required to submit their source code, data, and parameters along with their paper. The Journal compiles and runs the code automatically and publicly posts the results as a first review of the paper. This initial cycle, from author submission to online publication, takes no more than 24 hours.

The submitted materials are available online for any member of the community to download, enabling readers to replicate the results described in the paper and to post public reviews back on the site. Authors are encouraged to post revisions and improvements of their papers and accompanying software, so that the comments of reviewers and readers can be addressed to the benefit of the entire community. Because all the materials required for reproducibility are included, readers are empowered as reviewers in an authentic ‘peer-review’ system where transparency is the rule. Reviewers’ comments are public and non-anonymous, and the community has the opportunity to rate the reviewers. Using the Insight Journal, we have been able to publish high-quality, reproducible software that meets the needs of our open source communities in a timely manner.
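The automated first-review cycle described above, in which submitted code is run and the outcome is posted alongside the paper, can be sketched as a small script. This is a minimal illustration under stated assumptions, not the Insight Journal's actual implementation: the function name, report format, and the use of a Python script as a stand-in for a compiled submission are all hypothetical.

```python
import json
import subprocess
import sys
import tempfile
from pathlib import Path

def review_submission(source: str, run_cmd: list, workdir: Path) -> dict:
    """Run a submitted script in isolation and record the outcome as a
    machine-generated first review. (Hypothetical sketch; a real pipeline
    would also compile native code, manage data files, and enforce quotas.)"""
    script = workdir / "submission.py"
    script.write_text(source)
    proc = subprocess.run(
        run_cmd + [str(script)],
        capture_output=True,
        text=True,
        timeout=60,  # bound the run so a hung submission cannot stall the queue
    )
    report = {
        "status": "pass" if proc.returncode == 0 else "fail",
        "stdout": proc.stdout,
        "stderr": proc.stderr,
    }
    # Publicly posting the report would happen here; we just persist it.
    (workdir / "review.json").write_text(json.dumps(report, indent=2))
    return report

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as d:
        result = review_submission(
            "print('result: 42')",  # a toy submitted script
            [sys.executable],       # how the journal would invoke it
            Path(d),
        )
        print(result["status"], result["stdout"].strip())  # prints: pass result: 42
```

Persisting the raw stdout/stderr alongside a pass/fail status mirrors the idea that the automated run itself serves as the paper's first, fully transparent review.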