Anyone who talks with young economists entering academia about their career prospects and those of their peers cannot fail to note the importance they place on publication in the so-called Top Five journals in economics: the American Economic Review, Econometrica, the Journal of Political Economy, the Quarterly Journal of Economics, and the Review of Economic Studies. The discipline’s preoccupation with the Top Five is reflected in the large number of scholarly papers that study aspects of these journals, many of which acknowledge the Top Five’s de facto role as arbiter in tenure and promotion decisions (e.g. Ellison 2002, Frey 2009, Card and DellaVigna 2013, Anauti et al. 2015, Hamermesh 2013, 2018, Colussi 2018).
While anecdotal evidence suggests that the Top Five has a strong influence on tenure and promotion decisions, actual evidence on such influence is sparse. Our paper (Heckman and Moktan 2018) fills this gap in the literature. We find that the Top Five has a large impact on tenure decisions within the top 35 US departments of economics, dwarfing the impact of publications in non-Top Five journals. A survey of current tenure-track faculty hired by the top 50 US economics departments confirms the Top Five’s outsize influence.
Our empirical and survey-based findings of the Top Five’s influence raise an obvious question: is the Top Five an adequate filter of quality? Extending the analysis of Hamermesh (2018), we show that appearance of an article in the Top Five is a poor predictor of quality as measured by citations. Substantial variation in the citations accrued by papers published in the Top Five, and overlap in article quality with journals outside the Top Five, make aggregate measures of journal quality such as the Top Five label and Impact Factors poor measures of individual article quality. This is a view expressed by many economists and non-economists alike.1
There are many consequences of the discipline’s reliance on the Top Five. It subverts the essential process of assessing and rewarding original research. Using the Top Five to screen the next generation of economists incentivises professional incest and creates clientele effects whereby career-oriented authors appeal to the tastes of editors and the biases of journals. It diverts scholars’ attention away from basic research toward strategising about formats, lines of research, and the favoured topics of journal editors, many of whom have long tenures. It raises entry costs for new ideas and for persons outside the orbits of the journals and their editors. An over-emphasis on Top Five publications perversely incentivises scholars to pursue follow-up and replication work at the expense of creative pioneering research, since follow-up work is easier to judge, is more likely to produce clean publishable results, and is hence more likely to be published.2 This behaviour is consistent with basic common sense: you get what you incentivise.
In light of the many adverse and potentially severe consequences of current practices, we believe it is unwise for the discipline to continue using publication in the Top Five as a measure of research achievement and as a predictor of future scholarly potential. The call to abandon measures of journal influence in career advancement decisions has already gained momentum in the sciences. At the time of writing, 667 organisations and 13,019 individuals have signed the San Francisco Declaration on Research Assessment, which denounces the use of journal metrics in hiring, career advancement, and funding decisions within the sciences.3 Economists should take heed of these actions. We provide suggestions for change in the concluding portion of this column.
Documenting the power of the Top Five
We find strong evidence of the influence of the Top Five: publication there is, without doubt, a powerful determinant of tenure and promotion in academic economics. We analyse longitudinal data on the employment and publication histories of tenure-track faculty hired by the top 35 US economics departments between 1996 and 2010. We find that Top Five publications greatly increase the probability of receiving tenure during the first spell of tenure-track employment (see Figure 1). This remains true when we limit the sample to the first seven years of employment. Estimates from duration analyses of time to tenure show that publishing three Top Five articles is associated with a 370% increase in the rate of receiving tenure, compared to candidates with similar levels of publication in non-Top Five journals. The estimated effects of publication in non-Top Five journals pale in comparison.
Figure 1 Predicted probabilities for receipt of tenure in the first spell of tenure-track employment
Notes: The figures plot the predicted probabilities associated with different levels of publication by authors in different journal categories, where the predictions are obtained from a logit model. White diamonds on the bars indicate that the prediction is significantly different from zero at the 5% level.
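To see how the duration estimates above are read, note that in a proportional-hazards model a coefficient β on a covariate translates into a (exp(β) − 1) × 100% change in the hazard of receiving tenure. The following is an illustrative calculation only, not the authors’ estimation code; the hazard ratio of 4.7 is simply the ratio implied by the reported 370% figure:

```python
import math

def hazard_pct_change(beta):
    """Percentage change in the hazard rate implied by a
    proportional-hazards coefficient beta (a log hazard ratio)."""
    return (math.exp(beta) - 1.0) * 100.0

# The reported 370% increase in the tenure rate corresponds to a
# hazard ratio of 4.7 (an implied log coefficient of about 1.55).
beta = math.log(4.7)
print(round(hazard_pct_change(beta)))  # 370
```

The same conversion reads in reverse: any reported percentage increase in the tenure rate of x% corresponds to a hazard ratio of 1 + x/100.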
A survey of current assistant and associate professors hired by the top 50 US economics departments corroborates these findings. On average, junior faculty rank Top Five publications as being the single most influential determinant of tenure and promotion outcomes (see Figure 2).4
Figure 2 Ranking of performance areas based on their perceived influence on tenure and promotion decisions
Notes: The figure summarises respondents’ rankings of eight performance areas. Responses are summarised by type of career advancement: tenure receipt, promotion to assistant professor, and promotion to associate professor. The bars present mean responses for each performance area. Respondents were given the option not to rank any or all of the eight performance areas. As a result, the number of respondents varies across the performance areas.
Responses to our survey reveal a widespread belief among junior faculty that the effect of the Top Five on career advancement operates independently of differences in article quality. To separate quality effects from a Top Five placement effect, we ask respondents to report the probability that their department awards tenure or promotion to an individual with Top Five publications rather than to a second individual who is identical in every way, except that he/she has published the same number of articles, of the same quality, in non-Top Five journals. If the Top Five’s influence operated solely through differences in article impact and quality, the expected reported probability would be 0.5. The results in Figure 3 show large and statistically significant deviations from 0.5 in favour of Top Five publication. On average, respondents from top 10 departments believe that the Top Five candidate would receive tenure with a probability of 0.89. The mean probability increases slightly for lower-ranked departments.
Figure 3 Probability that a candidate with Top Five publications receives tenure or promotion instead of an identical candidate with non-Top Five publications, ceteris paribus
Notes: The figure summarises respondents’ perceptions about the probability that a candidate with Top Five publications is granted tenure or promotion by the respondent’s department instead of a candidate with non-Top Five publications, ceteris paribus. Responses are summarised by type of career advancement: tenure receipt, promotion to assistant professor, and promotion to associate professor. The bars present mean responses for each performance area. White diamonds indicate that the mean response is significantly different from 50% at the 10% level.
The Top Five as a filter of quality
The current practice of relying on the Top Five has weak empirical support if judged by its ability to identify impactful papers as measured by citation counts. Extending the citation analysis of Hamermesh (2018), we find considerable heterogeneity in citations within journals and overlap in citations across Top Five and non-Top Five journals (see Figure 4). Moreover, the overlap increases considerably when non-Top Five journals are compared to the less-cited Top Five journals. For instance, while the median Review of Economics and Statistics article ranks in the 38th percentile of the overall Top Five citation distribution, the same article outranks the median-cited article in the combined Journal of Political Economy and Review of Economic Studies distributions.
Figure 4 Distribution of residual log citations for articles published between 2000 and 2010 (as of July 2018)
Source: Scopus.com (accessed July 2018)
Note: The figure plots distributions of residual log citations obtained from a model that estimates log(citations+1) as a function of a third-degree polynomial in the years elapsed between the date of publication and 2018, the year citations were measured. This residualisation adjusts log citations for exposure effects, thereby allowing comparison of citations received by papers from different publication cohorts.
Definition of journal abbreviations: QJE–Quarterly Journal Of Economics, JPE–Journal Of Political Economy, ECMA–Econometrica, AER–American Economic Review, ReStud–Review Of Economic Studies, JEL–Journal Of Economic Literature, JEP–Journal Of Economic Perspectives, ReStat–Review Of Economics And Statistics, JEG–Journal Of Economic Growth, JOLE–Journal Of Labor Economics, JHR–Journal Of Human Resources, EJ–Economic Journal, JHE–Journal Of Health Economics, ICC–Industrial And Corporate Change, WBER–World Bank Economic Review, RAND–Rand Journal Of Economics, JDE–Journal Of Development Economics, JPub–Journal Of Public Economics, JOE–Journal Of Econometrics, HE–Health Economics, ILR–Industrial And Labor Relations Review, JEEA–Journal Of The European Economic Association, JME–Journal Of Monetary Economics, JRU–Journal Of Risk And Uncertainty, JInE–Journal Of Industrial Economics, JOF–Journal Of Finance, JFE–Journal Of Financial Economics, ReFin–Review Of Financial Studies, JFQA–Journal Of Financial And Quantitative Analysis, and MathFin–Mathematical Finance.
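The residualisation described in the note can be sketched as follows. This is a minimal illustration, not the authors’ code: it regresses log(citations + 1) on a cubic in years of exposure by OLS and keeps the residuals, so that articles from different publication cohorts are compared net of citation-age effects.

```python
import numpy as np

def residualize_citations(citations, pub_years, obs_year=2018):
    """Return residual log citations: log(citations + 1) net of a
    third-degree polynomial in years of exposure (obs_year - pub_year).
    Removing this age profile puts papers from different publication
    cohorts on a common citation scale."""
    exposure = obs_year - np.asarray(pub_years, dtype=float)
    y = np.log1p(np.asarray(citations, dtype=float))
    coefs = np.polyfit(exposure, y, deg=3)   # OLS cubic fit (includes intercept)
    return y - np.polyval(coefs, exposure)   # residuals, mean zero by construction
```

Because the cubic fit includes an intercept, the residuals are mean zero by construction; comparing their distributions across journals reproduces the kind of exposure-adjusted comparison plotted in Figure 4.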
Restricting the citation analysis to the top of the citation distribution produces the same conclusion. Among the top 1% most-cited articles in our citations database,5 13.6% were published by three non-Top Five journals.6
Low editorial turnover and incest
Figure 5 Density plot of the number of years served by editors between 1996 and 2016
Source: Brogaard et al. (2014) for data up to 2011. Data for subsequent years collected from journal front pages.
Compounding the privately rational incentive to curry favour with editors is the longevity of editorial terms, especially at house journals (see Figure 5). Low turnover in editorial boards creates the possibility of clientele effects surrounding both journals and their editors. We corroborate the literature that documents the inbred nature of economics publishing (Laband and Piette 1994, Brogaard et al. 2014, Colussi 2018) by estimating incest coefficients that quantify the degree of inbreeding in Top Five publications. We show that network effects are empirically important – editors are likely to select the papers of those they know.7
Table 1 Incest coefficients: Publications in Top Five between 2000 and 2016, by author affiliation listed during publication
Source: Elsevier, Scopus.com.
Notes: The table reports three columns for each Top Five journal. The leftmost column reports the number of articles affiliated with each university. The middle column presents the percentage of articles published in the journal that were affiliated with the university, out of all articles affiliated with the listed top universities. The rightmost column presents the percentage of articles published in the journal that were affiliated with the university, out of all articles published in the journal. An author is defined as being affiliated with a university in a given year if he/she listed the university as an affiliation in any publication that appeared during that year. An article is defined as being affiliated with a university in a given year if at least one author was affiliated with the university during that year.
Reliance on the Top Five as a screening device raises serious concerns. Our findings should spark a serious conversation in the economics profession about developing implementable alternatives for judging the quality of research. Such alternatives would necessarily de-emphasise the role of the Top Five in tenure and promotion decisions and redistribute the signalling function more broadly across a range of high-quality journals.
However, a proper solution to the tyranny will likely involve more than a simple redefinition of the Top Five to include a handful of additional influential journals. A better solution must address the flaw inherent in the practice of judging a scholar's potential for innovative work by a track record of publications in a handful of select journals. The appropriate solution requires a significant shift from the current publications-based system of deciding tenure to one that emphasises departmental peer review of a candidate's work. Such a system would give serious consideration to unpublished working papers and to the quality and integrity of a scholar's work. By closely reading published and unpublished papers, rather than counting publication placements, departments would signal that they both acknowledge and adequately account for the greater risk borne by scholars working at the frontiers of the discipline.
A more radical proposal would be to shift publication away from the current journal system, with its long delays in refereeing and publication and its possibility of incest and favouritism, towards an open source arXiv or PLOS ONE format.8 Such formats accelerate the dissemination of new ideas and provide online real-time peer review of them. Discussion sessions would vet criticisms and provide both authors and their readers with different perspectives within much faster time frames. Shorter, more focused papers would stimulate dialogue and break editorial and journal monopolies. Ellison (2011) notes that online publication is already being practiced by prominent scholars. Why not broaden the practice across the profession and encourage spirited dialogue and rapid dissemination of new ideas? This evolution has begun with a recently launched economics version of arXiv.
In any event, the profession should reduce incentives for crass careerism and promote creative activity. Short tenure clocks and reliance on the Top Five to certify quality do just the opposite. In the long run, the profession will benefit from more creativity-sensitive screening of its next generation.
Anauti, V, S Galiani, and R Galvez (2015), "Quantifying the life cycle of scholarly articles across fields of economic research", Economic Inquiry 1339-1356.
Bertuzzi, S, and D Drubin (2013), "No shortcuts for research assessment", Molecular Biology of the Cell 1505-1506.
Brogaard, J, J Engelberg, and C Parsons (2014), "Networks and productivity: Causal evidence from editor rotations", Journal of Financial Economics 251-270.
Card, D, and S DellaVigna (2013), "Nine facts about top journals in economics", Journal of Economic Literature 144-161.
Colussi, T (2018), "Social ties in academia: A friend is a treasure", Review of Economics and Statistics 45-50.
Eisen, M (2013), "The past, present and future of scholarly publishing", remarks by Michael Eisen, co-founder of the Public Library of Science (PLOS), at the Commonwealth Club in San Francisco.
Ellison, G (2002), "The slowdown of the economics publishing process", Journal of Political Economy 947-993.
Ellison, G (2011), "Is peer review in decline?", Economic Inquiry 635-657.
Frey, B (2009), "Economists in the PITS?", International Review of Economics 335-346.
Hamermesh, D (2013), "Six decades of top economics publishing: Who and how?", Journal of Economic Literature 162-172.
Hamermesh, D (2018), "Citations in economics: Measurement, uses, and impacts", Journal of Economic Literature 115-156.
Heckman, J J, and S Moktan (2018), "Publishing and promotion in economics: The tyranny of the Top Five", INET Working Paper 82 and NBER Working Paper 25093.
Laband, D, and M Piette (1994), "Favoritism versus search for good papers: Empirical evidence regarding the behavior of journal editors", Journal of Political Economy 194-203.
Schekman, R (2013), "How journals like Nature, Cell and Science are damaging science", The Guardian, 9 December.
Vale, R D (2015), "Accelerating scientific publication in biology", Proceedings of the National Academy of Sciences 13439-13446.
[1] See https://www.aeaweb.org/webcasts/2017/curse for a roundtable discussion on this topic by prominent economists; see Bertuzzi and Drubin (2013) for comments by biologists; see Schekman (2013) for comments by Randy Schekman, Nobel Laureate in Physiology or Medicine; for statements by Nobel Laureates in Chemistry, see Martin Chalfie’s comments here and Brian Kobilka’s comments here.
[2] See the discussion at https://www.aeaweb.org/webcasts/2017/curse
[3] The San Francisco Declaration on Research Assessment (DORA) has garnered signatures from 667 organisations and 13,019 individuals as of the writing of this column (see https://sfdora.org/signers for the full list of signatories, which include prominent scientists such as Nobel Laureate Martin Chalfie). DORA presents recommendations for judging research output in hiring, advancement, and funding decisions within the sciences. Chief among its recommendations is the avoidance of journal-based metrics when assessing individual research articles and the contributions of individual scientists. DORA was developed by “a group of editors and publishers of scholarly journals […] during the Annual Meeting of The American Society for Cell Biology (ASCB) in San Francisco, CA, on December 16, 2012” (see https://sfdora.org/read for the full declaration).
[4] Pairwise Wilcoxon tests comparing the distribution of rankings provided by respondents for the eight different performance areas reject the null hypothesis of equality between the ranking distribution for Top Five publication and each of the other seven performance areas at the 10% level.
[5] The database comprises citations to all articles published by 25 top economics journals between 2000 and 2010.
[6] Each of the three journals produced more top 1% articles than the Review of Economic Studies, and two of the three journals produced at least as many top 1% articles as the Journal of Political Economy. The Review of Economic Studies is outranked by six additional non-survey non-Top Five journals, which together contributed a further 16% to the pool of top 1% articles.
[7] Whether this practice capitalises on the benefits of using inside information that improves journal quality as measured by citations, or whether it is unproductive cronyism, is much discussed. The evidence on this issue is not conclusive, but it appears to favour the story of net benefits to insider knowledge.
[8] See Vale (2015) for a discussion of the use of arXiv in physics. See Eisen (2013) for remarks on PLOS ONE by Michael Eisen, its co-founder.