Why I love preprints

I believe that preprints are a great medium for communicating and disclosing new research in the life sciences. Recently, Science magazine asked me why I am so enthusiastic about preprints and published a feature that includes some of my responses. Below are more reasons why I believe that preprints hold much promise for improving the communication of biomedical research:

What do you get out of it? Have you gotten useful feedback (if so via comments, twitter email etc.)?
Preprints are great for sharing my latest research findings broadly. My first bioRxiv preprint has been viewed over 11,000 times, and my preprint on single-cell proteomics (SCoPE-MS) of stem cells well over 19,000 times in just a few months. These numbers, alongside comments from leading scientists and prominent news coverage of my preprints, suggest that preprints can be very effective in communicating my results to the community and establishing priority. I believe preprints and journal articles can be equally effective (or ineffective) at establishing priority, depending on their quality, visibility, and percolation through the community.

I have received feedback from senior professors, multiple suggestions for collaborations (some of which materialized), news coverage of my preprints, editorial invitations to submit my preprints to journals, and an invitation to apply for a faculty position at an elite institution.

Have you modified a manuscript as a result? Or is the benefit more getting your work out there/sharing earlier?
The benefits are many. One is modifying and improving a manuscript based on the feedback, which I have done. Another is getting a timestamp on my work that makes me more willing to broadly share and present my results without concerns for being scooped.

Do you worry that posting a preprint could jeopardize publication in a journal?
No. Most prominent journals accept preprints; journals that do not, lose. In my opinion, the benefits of using preprints far outweigh the limited opportunity to publish in some journals. I believe that journals that do not embrace preprints will decline in prominence over time.

Do you have papers you have not posted as preprints? if not why not?
I am committed to posting all papers from my laboratory as preprints. I am coauthor on papers that were not posted as preprints but I was not the lead author for them and did not make the final decision.

Do you think preprints will make journals obsolete, or do we still need peer reviewed journals?
Preprints are not aiming to side-step peer review. We need good peer review, more than ever, and preprints provide more opportunities for peer review, not fewer. We still need a formal system that can ensure peer review with minimal bias and maximal transparency, and successful journals will adapt to fulfill this need.

How do colleagues react when you try to get them to share preprints? Are some more receptive than others? Are there differences by age or field? Why do you think some are still reluctant?
Many of my colleagues are receptive, others less so. Importantly, I have not heard a cogent argument why preprints are bad for science. The most frequent argument revolves around fear of losing priority of discovery. My usual question is: Have you ever felt that one of your peer-reviewed papers is not cited and given credit when it should have been? Publishing a paper in a peer-reviewed journal does not make it immune to scooping. I think that the quality and the visibility of a timestamped research article are more important for establishing priority than the exact time when it is peer reviewed. Of course, peer review and independent replication are essential for establishing the validity of any research article, but these can be separated in time from the first disclosure of the results.

I have noticed that in many fields, a lot of the papers are quantitative, modeling, etc. and not wet biology. Will it take longer for those doing lab experiments to embrace preprints? Why would they be more reluctant?
I have observed these differences across disciplines. I think they stem from differences in culture and technical skills. Preprints will percolate more slowly in some communities, but I am confident that they will continue to spread quickly and will eventually be adopted by all.

What impedes substantive progress?

It is widely discussed, most prominently by Alberts et al., that the current academic culture is increasingly suppressing the creativity, risk-taking, and original thinking required to make fundamental discoveries. A similar sentiment reverberates in Peter Thiel's article, Technology Stalled in 1970, on the incremental development of technologies by entrepreneurs, a group traditionally associated with bold, original, and ambitious undertakings. Why is that? What hinders the bold, creative undertakings required for substantive progress?

This simple why question has a complex answer. It could be that at the current stage of technological development, many incremental improvements are easily accessible while the big conceptual questions and the substantial technological improvements are rather inaccessible. It could be that our culture has an increasingly myopic time horizon and prioritizes incremental, risk-averse projects with short-term returns over longer-term visionary and riskier projects. It could be the increased competition, though I would argue that creative, non-obvious projects may be a winning competitive strategy. It could be a combination of all these factors and many more not mentioned above. Any suggestions?

The factor that I want to focus on is this: the tendency to portray rather incremental progress as a great breakthrough. A common example of this tendency in biomedical research is performing a well-established type of measurement on a larger scale than ever before and advertising the results as a “great breakthrough”, sometimes codified in prizes. It is usually decent research resulting in a large and useful dataset. It also saps resources and talent away from the more creative research that is more justifiably described as a breakthrough and that is more likely to result in a non-trivial conceptual advance in our understanding. The examples of mediocre improvements of commercial technologies that are advertised as great breakthroughs are even more numerous; in fact, developed nations spend sizable fractions of their GDPs on this kind of make-believe advertisement. It is this misleading advertisement, both in basic scientific research and in commercial technologies, that makes incremental, sometimes mediocre, improvements a viable strategy for achieving make-believe greatness. The more we dismiss and suppress misleading advertisement, the more we will encourage the creativity, risk-taking, and original thinking required to make fundamental discoveries and to significantly improve commercial technologies; this way we can incentivize and facilitate substantive progress.

Can mere hype succeed in the long term?

Michael I. Jordan, one of the most prominent statisticians of our time, warns that “The overeager adoption of big data is likely to result in catastrophes…”. This comment raises a question: If much of the enthusiasm for big data is indeed mere hype, is the big data movement likely to deliver results in the long term?

I believe that the big data movement will deliver many significant results, but not because it is a particularly good new idea substantiated by real progress in statistics, machine learning, or useful data collection. I agree that we “… are no further along with computer vision than we were with physics when Isaac Newton sat under his apple tree.” Much of big data nowadays seems to be about manipulating people for political and mercenary purposes based on simplistic statistical models.

I think that any hype is likely to “succeed” if it attracts enough attention and resources and if it is stated broadly/vaguely enough. Resources and prominence attract great people, and they bring many ideas, most of which are far beyond the scope and the imagination of the people who started the hype. Some of these ideas are likely to be good and result in significant progress. A decade from now, we will be able to look back and see that many significant methodological developments in statistics were enabled by the big data hype and most likely would not have happened without it. What we will not be able to see are all the other, perhaps much more significant and beneficial, developments that would have happened if the resources and the attention had been allocated to other areas.

Yes, mere hype can succeed in the long term. Attracting resources and attention is the key to the success of any huge initiative, and hype is perhaps the easiest way to attract them. Huge initiatives attract at least some good people, who are likely to come up with at least some good ideas that deliver something useful. If the hype is stated broadly/vaguely enough, these results will seem to validate the vision of the people who started the movement. The hype is declared a success.

Tell me about the science, not the prizes!

The more we focus on awards and advertise career building, the more we attract people seeking awards and glamorous careers, and the bigger the burden on the peer review system.

The independent and critical assessment of data and of analysis is at the core of the scientific method. Yet, the rapid growth of the scientific enterprise and the explosion of the scientific literature have made it not only hard but impossible to read, think deeply about, and assess independently all published papers, or even the subset of papers relevant to one’s research. This is alarming. It has alarmed many people, who are thinking of creative and effective ways to evaluate the quality of scientific research. This exceptionally hard endeavor has attracted much-needed attention, and I am hopeful that progress will be made.

In this essay, I suggest another approach to alleviating the problem, starting with two related questions: Why is low-quality “science” written up and submitted for publication, and what can we do to curb such submissions? These questions touch upon the poorly quantifiable subject of human motivation. Scientists have a complex set of incentives that include understanding nature, developing innovative solutions to important problems, and aspirations for social status, prestige, and successful careers. All these incentives are part of our human nature; they have always existed and always will. Yet, the balance among them can powerfully affect the problems that we approach and the level of evidence that we demand to convince ourselves of the truths about nature.

In my opinion, scientific culture can powerfully affect the incentives of scientists and in the process harness the independent thought of individual scientists — not only of external reviewers — in raising the standards and rigor of their own work. I see a culture focused on prizes and career building as inimical to science. If the efforts of bright young people are focused on building careers, they will find ways to game the system. Many already have. As long as the motivation of scientists is dominated by factors other than meeting one’s own high standards of scientific rigor, finding the scientific results worthy of our attention will remain a challenge even with the best heuristics for ranking research papers. However, if “the pleasure of finding things out” — to use Feynman’s memorable words — is a dominant incentive, the reward, the pleasure, cannot be achieved unless one can convince oneself of the veracity of the findings. The higher the prominence of this reward intrinsic to scientific discovery, the lower the tendency to game the system and the need for external peer review.

A scientific culture that emphasizes the research results — not their external reflections in prizes and career advancement — is likely to diminish the tendency to use publishing primarily as a means of career advancement, and thus enrich the scientific literature with papers worthy of our attention. We know that racial stereotypes can be very destructive, and we have lessened their destructive influences by changing the popular culture. How can we apply this lesson to our scientific culture to focus on the critical and independent assessment of research and thus lessen the negative aspects of career building and glamour seeking?

A great place to begin is by replacing the headlines focused on distinctions and building careers with headlines focused on factual science. For example, the “awards” section in CVs, faculty profiles, and applications for grants and tenure-track faculty positions can be replaced by a “discoveries” section that outlines, factually, significant research findings. Similarly, great scientists should be introduced at public meetings with their significant contributions rather than with long lists of prizes and grants they received. One might introduce Egas Moniz as the great Nobel laureate and Dmitri Mendeleev as a chemist with few great awards. Much more informatively, however, one should introduce Egas Moniz as an influential protagonist of lobotomy and Dmitri Mendeleev as the co-inventor of the periodic table of elements.

Admittedly, Mendeleev and Moniz are prominent outliers, but they are far from the only examples of a discrepancy between awarded prizes and scientific contributions. Still, the worst aspect of focusing on prizes, grants, and career building is not the reinforcement of political misattribution of credit; far worse is the insidious influence of this excessive focus on the scientific culture. The more we celebrate awards, the more we attract people seeking awards and glamorous careers, and the bigger the burden on the peer review system.

We should celebrate research and examples like those of Einstein and Feynman, not the prizes that purport to reflect such research. Focusing on the work and not the prize would hardly diminish the credit. Indeed, the Nobel prize derives its prestige from scientists like Einstein and Feynman, not the other way around. A prize may or may not reflect significant contributions, and we should be given the opportunity to evaluate the contributions independently. We should focus on the scientific contributions not only because critical and independent evaluation of data is the mainstay of science but also because it nourishes a constructive scientific culture, a culture focused on understanding nature and not on gaming the system. Only such a culture of independent assessment can give the best ideas a chance to prevail over the most popular ideas.

The next time you have a chance to introduce an accomplished colleague, respect their contributions with an explicit reference to their work, not their prizes. With this act of respect you will help realign our scientific culture with its north star: the independent and critical evaluation of experiments, data, ideas, and conceptual contributions.

An edited version of this opinion essay was published by The Scientist as Accomplishments Over Accolades.

The Best Projects Are Least Obvious

We are fortunate to live in an exciting time. Today, new technologies enable the design and execution of straightforward experiments, many of which were not possible just a few years ago. These experiments hold the potential to bring new discoveries and to improve medical care. An abundance of obvious-next-step experiments creates a buzz of activities and excitement that is quite palpable among graduate students, postdocs, and professors alike.

Such enthusiasm permeates the air and stimulates; it also overwhelms. It seems there is always so much to do and never enough time to do it. Recent findings have opened up many new research avenues, and emerging technologies are ever-alluring. How are investigators to pursue all of these things, given our limited time? Or, failing that, how can we at least choose the best leads to follow?

Much of the aforementioned buzz is the result of an overabundance of next-step projects that are obvious to most researchers. Many of these projects are quite good, but rarely are they exceptional — at least in the sense that they result in a nontrivial connection. It’s not often that these projects help researchers advance their fields. Many such projects use novel, fashionable technologies, but bring little new perspective to the scientific community. Yet I have seen colleagues become so busy pursuing such experiments that they lack the time to complete most of their projects, or even to think conceptually and creatively.

Of course, some next-step experiments are poised to become major landmarks, as were the first gene expression measurement by RNA-seq, the first comprehensive mass spectrometry-based quantification of a eukaryotic proteome, the first gene deletion collection, the first analysis of conserved DNA sequences in mammalian genomes, and the first induction of pluripotent stem cells. If I do not pursue the obvious experiments likely to become landmarks, someone else will, and science will progress without delay. These tempting experiments typically lure multiple independent groups, at least some of which abandon the projects once their competitors’ first big paper has been published.

Thus, none of the many tempting next-step experiments — even among those that are poised to be landmarks — is likely the best to do if I want to make a difference. After all, the many experiments that are obvious to me are likely to be obvious to most of my colleagues. Few of the most tempting experiments are likely to bring genuinely new perspectives to standing problems or to find important new problems. In fact, I find that the more obvious an experiment is to me, the less likely it is to evoke a new perspective, no matter what new and fashionable technologies are used. What’s more, the more tied up I become with next-step experiments, the less time I have to think of truly great ones.

The overabundance of stimulating next-step experiments contrasts strikingly with a dearth of genuinely new perspectives. Focusing on the genuinely creative ideas rephrases the original question of “How can I possibly follow all of the many tempting avenues?” to a harder, but potentially much more fruitful question: “How can I chart a course that is truly worth following?”

An edited version of this opinion essay was published by The Scientist as The Best Projects Are Least Obvious.

The mission of MIT

I still remember very clearly the key reason behind my decision to attend MIT about a decade ago. It was a statement that set MIT apart from the other top schools. On one of the MIT webpages, I read that an MIT education is a calling about understanding nature and not about building a career. Throughout my time at MIT, both as an undergrad and as a postdoc, I have seen many examples to support this mission statement that have always made MIT special for me.

Recently, however, I have been hearing more voices of an alternative culture, one that puts career first and science second. I often hear my colleagues being more concerned about “spinning” and “selling” a paper than about understanding nature. I hear MIT students and postdocs for whom the “impact factor” (IF) of the magazine/journal in which they publish is more important than the substance of what they publish. This worship of the IF, computed and published by the Thomson corporation, is particularly odd for scientists given the methods of computing the IF, and particularly out of place at MIT (see this excellent editorial for more information). I still believe that MIT is a special place; I have met too many students and faculty passionate about science to think otherwise. Yet, I also think that we as a community should make a concerted effort to counteract the cancerous spread of IF worship and preserve what makes MIT special. The personal example of senior members of the community who put science first can be a particularly effective and inspiring part of such an effort. I know from personal experience because I have benefited tremendously from the example of my mentors.

This emphasis on the IF can be seen as a particular example of the general trend of decoupling merit from social reward. Such decoupling is rather widespread in all realms of life, whether actively fostered by specious advertising or passively allowed by hiring and promotion committees focusing excessively on the IF. The decoupling is perhaps more common in business than in science, perhaps more common in other academic institutions than at MIT. Yet, I find it particularly unacceptable in science and completely incongruous with MIT’s culture and mission.

This opinion appeared in The MIT Tech.