The past few years have witnessed much debate regarding research practices that can potentially undermine the accuracy of reported research findings (e.g., p-hacking, lack of direct replication, low statistical power; Ioannidis, Munafo, Fusar-Poli, Nosek, & David, 2014; O’Boyle, Banks, & Gonzalez-Mule, 2014; Simmons, Nelson, & Simonsohn, 2011), and some leading journals that publish research in the field of social psychology have made editorial changes to address these issues (e.g., Eich, 2014; Funder et al., 2014; Journal of Experimental Social Psychology, 2014). Pre-registration of study hypotheses and methods has been suggested as one way to enhance the accuracy of reported research findings by making the research process more transparent (e.g., Campbell, Loving, & LeBel, 2014; Chambers, 2014; De Groot, 1956/2014; Krumholz & Peterson, 2014; Miguel et al., 2014; The PLOS Medicine Editors, 2014). Many journals now have a registered reports section in which editors and reviewers focus on the strength of pre-registered methods and data-analytic plans for testing proposed hypotheses, and accept articles for publication in advance of data collection (e.g., Perspectives on Psychological Science). A new journal, Comprehensive Results in Social Psychology (CRSP), supported by the European Association of Social Psychology as well as the Society of Australasian Social Psychologists, is the first social psychology journal to publish only pre-registered papers. Are researchers in the field of social psychology, however, presently following these suggestions by adopting the practice of pre-registering details of their studies?
There are different ways to answer this question, and the approach I adopted here was to cross-reference the current membership of the Society of Experimental Social Psychology (SESP; accessed October 1, 2014) with all current users of the Open Science Framework (OSF; accessed October 1-2, 2014). The membership of SESP was selected to represent the field of social psychology for the following reasons: (1) membership is open to any researcher regardless of disciplinary affiliation, (2) individuals are eligible to be considered for membership only after holding a PhD for five years and following a committee’s evaluation of the degree to which their publication record advances the field of social psychology, and (3) there are presently over 1,000 members at institutions all over the world. Members of SESP therefore represent a cross-section of recognized social psychological researchers. I selected the user list of the OSF because, since its launch in 2011, the OSF has positioned itself as the most recognized third-party website for posting study details in the social sciences.
To conduct the cross-referencing, I first recorded all of the names listed in the membership directory of SESP (http://sesp.org/memlist.htm) in a spreadsheet. I then typed each name into the search window of the OSF website (https://osf.io) to identify current user status. If an individual was listed as a user, I navigated to his/her user page to determine (a) the number of projects the user had posted to the OSF website, and (b) how many of these projects were public (i.e., fully accessible by any visitor to the site). User status, number of projects, and number of public projects were entered into the spreadsheet. It is important to note that posted projects refer to studies already conducted or currently being conducted, given that project details remain on the site over time.
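The tallying described above was done by hand, but the same bookkeeping can be sketched programmatically. The sketch below uses hypothetical member names and OSF records; none of the names or counts are actual SESP or OSF data.

```python
# Sketch of the cross-referencing tally: for each member, record OSF user
# status, total posted projects, and public projects.
# All names and counts below are hypothetical placeholders.

# Hypothetical SESP membership list
members = ["A. Author", "B. Author", "C. Author"]

# Hypothetical OSF user records: name -> (total projects, public projects)
osf_users = {
    "A. Author": (2, 1),
    "C. Author": (0, 0),
}

rows = []
for name in members:
    if name in osf_users:
        total, public = osf_users[name]
        rows.append({"name": name, "user": True,
                     "projects": total, "public": public})
    else:
        rows.append({"name": name, "user": False,
                     "projects": 0, "public": 0})

n_users = sum(r["user"] for r in rows)
print(f"{n_users} of {len(rows)} members have OSF accounts")
```

Each row mirrors one spreadsheet entry from the procedure above (user status, project count, public-project count).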
Descriptive analyses revealed that of the 1,002 current members of SESP, 98 (9.8%) had created accounts on the OSF website. The two highest frequencies were for posting zero projects (i.e., having an account only; 26.5%) and for posting one project (35.7%); the frequencies for posting more than one project decreased rapidly thereafter. Overall, 44% of posted projects were public, meaning that the details of a slight majority of projects (56%) were not publicly available. This is perhaps understandable given that researchers may prefer to wait to share pre-registered study details until a manuscript containing data from a given study has been accepted for publication.
On the one hand, it is a positive development that in a relatively short period of time (i.e., since 2011) close to 10% of researchers identified by their peers as making significant contributions to the study of social psychology (i.e., members of SESP) have created a user account on the OSF, the most prominent online site devoted to increasing transparency in the research process. On the other hand, over 90% of SESP members are not currently users of the OSF, and the individuals who are users have posted very few projects. It is very likely that the low number of posted projects does not reflect the actual number of research projects (active or completed) conducted in the labs of those SESP members who have posted projects on the OSF. It can therefore be concluded that pre-registration of study details is currently a very uncommon practice in the field of social psychology, at least among SESP members using the OSF. There presently exists a gap, therefore, between the suggestions to pre-register study details to enhance the transparency of the research process and the adoption of this practice among active researchers in social psychology.
This practice is likely to become more common going forward, but one potential explanation for the current low rate of pre-registering study details is the concern that the process is cumbersome, that not all study hypotheses are established at the time of data collection, and that other researchers may “scoop” posted hypotheses and methods (see Campbell et al., 2014). To the extent that these pose real risks to researchers adopting pre-registration, the act of pre-registration itself could be argued to hurt the advancement of ideas in our field. This argument is largely philosophical at this time given that there is simply not enough empirical evidence upon which to evaluate this possibility.
Campbell, L., Loving, T.J., & LeBel, E.P. (2014). Enhancing transparency of the research process to increase accuracy of findings: A guide for relationship researchers. Personal Relationships. doi:10.1111/pere.12053
Chambers, C. (2014). Psychology’s ‘registration revolution’. The Guardian. Retrieved from http://www.theguardian.com/science/head-quarters/2014/may/20/psychology-registration-revolution
De Groot, A. D. (1956/2014). The meaning of “significance” for different types of research. Translated and annotated by Eric-Jan Wagenmakers, Denny Borsboom, Josine Verhagen, Rogier Kievit, Marjan Bakker, Angelique Cramer, Dora Matzke, Don Mellenbergh, and Han L. J. van der Maas. Acta Psychologica, 148, 188-194.
Eich, E. (2014). Business not as usual. Psychological Science, 25, 3-6.
Funder, D.C., Levine, J.M., Mackie, D.M., Morf, C.C., Vazire, S., & West, S.G. (2014). Improving the dependability of research in personality and social psychology: Recommendations for research and educational practice. Personality and Social Psychology Review, 18, 3-12.
Ioannidis, J.P., Munafo, M.R., Fusar-Poli, P., Nosek, B.A., & David, S.P. (2014). Publication and other reporting biases in cognitive sciences: Detection, prevalence, and prevention. Trends in Cognitive Science, 18, 235-241.
Journal of Experimental Social Psychology (2014). JESP editorial guidelines. Retrieved from http://www.journals.elsevier.com/journal-of-experimental-social-psychology/news/jesp-editorial-guidelines/
Krumholz, H.M., & Peterson, E.D. (2014). Open access to clinical trials data. The Journal of the American Medical Association, 312, 1002-1003.
Miguel, E., Camerer, C., Casey, K., Cohen, J., Esterling, K.M., Gerber, A….Van der Laan, M. (2014). Promoting transparency in social science research. Science, 343(6166), 30-31.
O’Boyle, Jr., E.H., Banks, G.C., & Gonzalez-Mule, E. (2014). The Chrysalis effect: How ugly initial results metamorphosize into beautiful articles. Journal of Management. doi:10.1177/0149206314527133
Simmons, J.P., Nelson, L.D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22, 1359-1366.
The PLOS Medicine Editors (2014). Observational studies: Getting clear about transparency. PLoS Medicine, 11(8), e1001711. doi:10.1371/journal.pmed.1001711
DOI: 10.15200/winn.143509.91514 provided by The Winnower, a DIY scholarly publishing platform