Research Reproducibility 2016: Call for Proposals

This conference will feature prominent speakers and opportunities to explore the concept of reproducibility.

Important Dates

July 31
  • Submission deadline
September 2
  • Acceptance notification
September 13
  • Abstracts posted
November 14
  • Conference Poster Session

Notifications of Acceptance went out September 2, 2016

The deadline for abstract submissions has passed. Please contact us if you have questions. Thank you for your interest, and we look forward to seeing you this fall.

Call for Posters

The Research Reproducibility Conference will bring together researchers, students, and administrators for a frank discussion of how institutions can support research reproducibility and make more research true. We hope to further the dialogue around open science, open data, transparency, and good research practices. The conference will be held November 14-15, 2016, at The University of Utah S. J. Quinney College of Law.

The Poster Session will showcase cutting-edge research and works-in-progress in pursuit of making research true. Presenting a poster is a great opportunity, especially for students and new researchers, to obtain interesting and valuable feedback on ongoing research from conference attendees. We strongly encourage student and industry submissions.

A committee of faculty and librarians will review all proposals. Accepted poster abstracts will be published on the conference website. Authors of accepted posters are expected to be present during the scheduled poster session. A Best Poster Award will be presented based on the quality of research work, poster design, and oral presentation.

Suggested Topics

  • Case study of large-scale collaborative research
  • Adoption of replication culture and its impact
  • Research illustrating a lack of reproducibility in your field
  • Registration of studies, protocols, analysis codes, datasets, raw data, and results
  • Examples of use or sharing of data, protocols, materials, software, and other tools
  • Reproducibility practices
  • More appropriate statistical methods
  • Work in standardization of definitions and analyses
  • Improvement of study design standards
  • Improvements in peer review, reporting, and dissemination of research
  • Better training of the scientific workforce in methods and statistical literacy