SIPS 2018 is an action-oriented meeting, serving our mission to improve psychological science. There will be no symposia or keynote speakers: the meeting will focus on initiating and conducting projects. In other words: no passive listening to boring talks, but instead getting your hands dirty and immediately starting to improve psychological science! All meeting attendees are expected to abide by our code of conduct.

Because not everyone is familiar with this format, and some of it might sound a bit abstract, you can check out last year’s program, the SIPS meeting OSF page, and this great blog post to learn more.

The conference programming committee is hard at work planning workshops, hackathons, unconference sessions, and lightning talks for this year’s meeting. Don’t have any idea what any of that stuff means? Don’t worry – we barely do either! All that is required to attend this meeting is a desire to improve psychology and a willingness to get to work.

In-Progress Plans (2018)


Effect Sizes and Power Analysis

Erin Hennes and Sean Lane


Jin Goh and Joe Hilgard

Fundamentals of R

Loek Brinkman and Nicholas Michalak

Doing Replications

Rich Lucas and Rolf Zwaan


Sara Weston

Introduction to Bayesian Analysis with JASP

Johnny van Doorn

Checking Yourself (and Others)

Michèle Nuijten and Nick Brown

Agent-Based Modeling

Paul Smaldino

Defining Your Constructs

Jessica Flake and Eiko Fried

Multi-Site Collaborations

Chris Chartier and Randy McCarthy


Facts, Fictions, and Ongoing Disputes in Replicability-Related Methods Reform

Laura Scherer, Alexandra Sarafoglou,

Hannah Moshontz, and Marcel van Assen

Creating and Finding Open Science Jobs

Liz Page-Gould and Mickey Inzlicht

Sharing Replicable Research

Neil Lewis Jr. and Patrick Forscher

Teaching Replicable and Reproducible Science

Kristin Lane and Heather Urry

Self Correction

Lee Jussim

Diversity (+ Re-Hackathon)

Joanne Chung + TBD

Increasing Uptake of Reform Proposals by Psychology Journals

Steve Lindsay and TBD

Lightning Talks

An OSF-Inspired Project: The Research Protocol that Brings Accountability into Our Labs and Our Field

Karly Cochran

Recording citation purposes to build a more cumulative science

Carol Iskiwitch

Does competition incentivize low-quality science?

Leonid Tiokhin

An Attempt (of sorts) at Creating Open and Accessible Virtual Reality Experiments

Oliver Clark

Assessing the validity of widely used ideological instruments

Flavio Azevedo

Positive Controls To Make Research Results More Interpretable

Bob Calin-Jageman

Augmented publishing – A Proof of Concept

Antonio Schettino

Robust, Replicable and Reproducible Analysis of Existing Data Sets

Julia Rohrer

Machine learning techniques as a means of enhancing replicability of psychological science

Andrew Hall

Beyond p-values: Utilizing Multiple Methods to Evaluate Evidence

KD Valentine

Training as a Pathway to Improving Reproducibility in Psychology

Elizabeth Williams

The day after methods are fixed: Pervasive problems of weak theories

Ivan Grahek

Motivation and Academic Scholarship

Nathan Hall

Using information-theoretic approaches for model selection: moving from explanation toward prediction

Ladislas Nalborczyk

…and More Lightning Talks to Come!

(call for subsequent rounds will be released closer to conference date)

Unconference Sessions

Unconference sessions will form as the meeting progresses. Depending on their progress, these sessions may lead to additional hackathons and/or lightning talks, which will also be announced during the meeting.