Editor’s Note: Article by SC18 Student Cluster Competition Reproducibility Chair Christopher Bross who is a researcher and PhD student at Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU).
Replication and reproducibility of experimental computer science results are essential for peer-reviewed, high-quality papers. In recent years, replication and reproducibility have become increasingly relevant in the HPC community.
The IEEE/ACM International Conference for High Performance Computing, Networking, Storage, and Analysis (SC), with its Technical Program and Student Cluster Competition (SCC), has been a pioneer in evaluating HPC applications in terms of their replicability.
For the past three years, SC has promoted the adoption of the Artifact Description (AD) policy endorsed by ACM. Each year, the conference selects one paper from the previous edition to serve as the benchmark for the SC SCC reproducibility challenge. The competitive selection includes a review of all papers with an AD from the previous SC edition and in-person interviews with the finalist papers’ authors.
The SCC committee is proud to announce the winning paper for the reproducibility challenge at this year’s SCC at SC18: “Extreme scale multi-physics simulations of the tsunamigenic 2004 Sumatra megathrust earthquake”. The paper is co-authored by Carsten Uphoff, Sebastian Rettenberger, and Michael Bader from the Technical University of Munich, and Elizabeth H. Madden, Thomas Ulrich, Stephanie Wollherr, and Alice-Agnes Gabriel from Ludwig-Maximilians-Universität München.
More Background on the SCC
Sixteen teams of undergraduate and high school students will reproduce the results of this paper in a real-time, non-stop, 48-hour challenge at the SC18 Exhibit Hall in Dallas, Texas. The challenge will also include other benchmarking and throughput tasks.
For the students, this is a great opportunity to engage with the process of scientific work and to learn from outstanding papers and researchers. The students will need a deep understanding of the paper in order to interpret the presented results.
At the competition, the teams will compare the results reported in the paper with their own experimental results, obtained on their own clusters, which are not necessarily the same system used in the selected paper.
Replicating the paper provides the entire HPC community with valuable insight into the portability of HPC applications across platforms, outlining the strengths and weaknesses of individual systems. Ultimately, the students will hand in a report containing their interpretation and analysis of their results compared to the original results in the paper.
The overall process will give students the opportunity to learn how to critically read publications, analyze results, and leverage peers’ work to generate new knowledge. As in past years, the top-scoring teams from the SCC reproducibility challenge will have the opportunity to publish their competition results in a special issue of the Parallel Computing journal. For most students on the teams, this article will be the first publication of their research careers.
The authors of the selected paper work closely with the SCC committee throughout the process of building the SCC benchmark. Their effort is acknowledged with the Results Replicated badge in the ACM Digital Library and the SIGHPC Certificate of Appreciation at the SC award ceremony.
By joining forces, the Technical Program and the SCC at SC aim to promote the high quality of the conference’s publications through their ADs and to provide a training opportunity for the next generation of HPC scientists through the SCC reproducibility challenge.