You may have heard that Chinese researchers are not very well compensated, compared to their Western counterparts. What you might not know is that they can increase their income by a factor of 10 with a single publication. The better the journal they publish in, as judged by the average number of times that its papers are cited, the more money they make. According to an anonymous source specializing in science evaluation in China, some research institutions follow a simple formula for determining cash rewards: 10,000 yuan, multiplied by one plus the journal impact factor (the impact factor reflects average citation levels). For example, publication in The Lancet, whose impact factor was 39.06 in 2012, would fetch 400,600 yuan (about $65,000). By comparison, the average yearly income of Chinese scientific researchers was 39,850 yuan in 2007, according to a survey by the China Association for Science and Technology.1
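The reward formula described above is simple enough to state in a few lines. The sketch below is a hypothetical illustration, not an official policy document; the 10,000-yuan base and the Lancet figure come from the source, and the function name is invented here.

```python
def cash_reward(impact_factor, base_yuan=10_000):
    """Reward formula reported by the anonymous source:
    base amount times (1 + journal impact factor)."""
    return base_yuan * (1 + impact_factor)

# The Lancet's 2012 impact factor was 39.06:
print(round(cash_reward(39.06)))  # 400600 yuan, about $65,000
```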
The role of famous journals is firmly ensconced in the life of Chinese academia. This reporter carried out an informal survey of 17 scientists working in China, and found that 15 agreed that publication in high-impact-factor journals is decisive for promotion. In addition, 65 percent agreed that it has a remarkable influence on their income, and 76 percent said it was very common to see research institutions granting cash rewards to researchers for publication. These results are consistent with a 2004 survey which found that 43 percent of scientific researchers said this kind of performance-linked pay accounted for more than 50 percent of their income.2
But the requirement to publish in these journals is also widely resented. Thirteen of 17 survey respondents said that the impact factor was not a reasonable way to judge a researcher’s potential, and 12 were against cash rewards based on journal title. These opinions reflect a groundswell of opposition to the role of the impact factor in professional success. A watershed moment came in 2011, when the well-known chemist Nai-Xing Wang published an editorial in Nature claiming that “chemistry research in China has turned into a Vanity Fair” because of the obsession with impact factors.3 Entrenched yet unpopular, the impact factor is a window onto the history of Chinese science, and seemingly inextricable from its near-term future.
Forty-three percent of scientific researchers said this kind of performance-linked pay accounted for more than 50 percent of their income.
A turning point in that history came in 1983, when Nanjing University’s president, Kuang Yaming, joined three other university presidents in writing a letter to the Chinese Communist Party’s central committee. The letter urged them to choose 50 universities to be developed to world-class standards. The good news was that the central committee agreed with the goal suggested in the letter, and selected five. The bad news was that Kuang’s own university was not among those chosen. Kuang’s successor, Qu Qinyue, was left with the task of competing with these universities, each of which would be receiving substantial support from the central government.
Qu turned to the Science Citation Index. The Index is an authoritative, international system for assessing research, and Nanjing worked it into its own evaluation systems. “In the eyes of the university presidents,” Nanjing University’s website reads, “the Science Citation Index is the best system because it does not rely on interpersonal relationships.” Now owned by Thomson Reuters, the Index counts the number of times a given paper, published in one of about 8,500 academic journals (the SCI journals), appears in the reference list of another paper published in an SCI journal. This is called a citation. Nanjing University began to award 1,000 yuan (about $163 at today’s exchange rate) to every author of every paper published in an SCI journal, with the intention of developing an objective indicator of assessment.
The strategy worked. From 1992 to 1998, Nanjing University published more SCI papers than any other university in mainland China. In 1994, it was chosen for special government sponsorship as part of the prestigious “Project 211” run by the National Education Commission. Other universities took notice, and began incorporating the Thomson Reuters SCI list and metrics into their own yearly reviews of researchers’ performance. Cash incentives to individual researchers grew.
The good news was that the central committee agreed with the goal suggested in the letter. The bad news was that Kuang’s own university was not among the chosen five.
Today, the power of SCI papers and citations is visible everywhere. It is used not only to dole out rewards, but also punishments. Yu Ying started her career as an emergency room doctor at the prestigious Peking Union Medical College Hospital in Beijing in 2004, and in 2011 opened an account on China’s version of Twitter, called Weibo. Thousands were attracted by the possibility of asking the advice of a College Hospital physician without waiting overnight to register at the hospital. Within a month, she had 110,000 followers. A little more than a year later, she had two million, and had earned the title “ER superwoman” for her ability to balance her professional duties with online counseling. However, in mid-June of this year, Yu quit her job. “Goodbye, my love,” read one Weibo post. She went on to explain that she had become tired of the evaluation system in Chinese academia. “I have been an attending doctor for seven or eight years, and the reason I did not receive promotion is that I did not publish enough papers in [the right] journals. This is unfair,” she wrote.
Yu Ying is not alone in her opinion. Opposition to the impact factor is broad, and stems in part from the fact that it is a blunt instrument. “Impact factors are designed to evaluate the influence of a journal, but they are not suitable for evaluating a discipline or a scientist,” said Jiang Gaoming, a leading botanist in China. For one thing, he told me, it is meaningless to compare the citation rates among different research fields, which may have a different number of researchers, and different typical publication frequencies. “Climate change and energy,” he said, “might be the hottest topics now, but papers on these issues, which get more citations, are not necessarily more important than other papers.”
A focus on citations encourages scientists to change their research focus to easier and more fashionable areas, in pursuit of publications in journals with high impact factors. You might think that this would hurt scientists’ careers, by hampering the independence of their work—but it is a peculiarity of citation metrics that they do not. Suppose you’ve reached the summit of Mount Everest. Few would question your ability to climb to the top of a smaller mountain—say, Mount Logan. Yet this is not the case in the world of impact factor-based scientific assessment. “You may have to spend a decade finishing some Everest-level work,” says Yu Biao, a well-known research chemist, and deputy director of the Shanghai Institute of Organic Chemistry, Chinese Academy of Sciences. “But you will get few citations for it, because few other people would try to climb Everest. You will spend less time climbing Logan, and perhaps get more followers and citations.”
Yu’s Everest was a four-year effort to develop a 63-step method to synthesize landomycin A, which is both an antibiotic and an anti-tumor agent. Yu says the synthesis is important because it allows scientists to compose many related, less-complex chemical compounds. He also told me that it was time-consuming work, and a “diseconomy,” especially for young scientists. That’s because the research resulted in just a single paper, and one that has been cited just 14 times since its publication in 2011. Two of Yu’s graduate students spent six years on the synthesis, but failed to secure permanent positions after graduation despite the project’s success, because they had published just that single paper. “This work is, of course, not enough for a professorship,” Yu said. Had Yu been chasing the citations that modern tenure and funding committees look for, he would have elected not to do this work.
Suppose you’ve reached the summit of Mount Everest. Few would question your ability to climb to the top of a smaller mountain.
Content, then, is not necessarily king when science is judged by citations. Even the form of the research report can matter as much as its content. A variety of manuals circulate in China teaching scientists how to write papers geared to the tastes of the editors of high-impact journals. A search for “how to write SCI papers” on Baidu.com, a popular search engine in China, returns tens of thousands of results. One document advises that the “Materials and Methods” section of a paper be neither too long, “because that makes your results and discussion seem too thin,” nor so short that “others might doubt your result.” The document also urges the writer to adjust his or her style to match that of the journal being submitted to. “Do not bore your editor,” it instructs.4
There are a host of other problems with an overreliance on impact factor, as has been documented by a variety of international researchers.5,6 For example, journal impact factors are influenced strongly by a small number of very highly cited papers that raise the journal’s average citation rates. This average, in turn, raises the profile of every other paper published in that journal. But the citations these other papers garner may never reach anywhere near the journal average. Researchers have also found that publication in a high-impact journal is a sort of self-fulfilling prophecy: The same paper published in a lower-impact journal receives about half the number of citations.7 In addition, the impact factor uses citations that occur within a two-year window, but the significance of some papers is not clear until this window has passed.
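The two-year window mentioned above is the heart of the metric: a journal’s impact factor for a given year is the number of citations received that year by the papers it published in the previous two years, divided by the number of citable items published in those two years. A minimal sketch, using made-up figures for a hypothetical journal:

```python
def impact_factor(citations_this_year, papers_prev_two_years):
    """Journal impact factor for a given year: citations received this
    year to papers published in the previous two years, divided by the
    number of citable items published in those two years."""
    return citations_this_year / papers_prev_two_years

# Hypothetical journal: 400 papers over the prior two years drew
# 2,000 citations this year.
print(impact_factor(2000, 400))  # 5.0
```

Because the numerator is a raw sum, a single blockbuster paper with thousands of citations can lift the whole journal’s average, which is the skew the paragraph above describes.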
The limitations of impact factor are being recognized, even if slowly. On May 13, 2013, more than 150 scientists and 75 scientific organizations signed the San Francisco Declaration on Research Assessment, which traces its roots to the December 2012 conference of The American Society for Cell Biology. The declaration described Journal Impact Factors as having “a number of well-documented deficiencies as a tool for research assessment,” and listed 18 individual recommendations encouraging funding agencies, institutions, researchers, and publishers to move away from using impact factors in scientists’ hiring, tenure, and promotion. Research, the declaration stated, should be judged on its own merits, and not by its place of publication.
Nai-Xing Wang agrees that the academic community is coming around. “The world scientific community is starting to realize the problem of impact factors, two years after I pointed them out,” he told me. Even in China, the beginnings of a withdrawal from the overuse of impact factors are visible. In 2010, the Institute of Scientific and Technical Information of China, which helps universities and institutes with research orientation and decision making, changed its evaluation criteria: Up until 2010, ISTIC ranked the country’s universities and institutes entirely according to how many Science Citation Index papers they published annually. Since 2010, this criterion has been dropped in favor of the total citations of all papers. Similarly, last year, the China Academic Degrees & Graduate Education Development Center under the Ministry of Education published its evaluation of more than 390 universities in China. The center’s director, Li Jun, told media that it was the first time the center stopped using Science Citation Index papers as its sole assessment criterion.
But while China’s reliance on the impact factor is cooling, it is a long way from dying. Cash incentives remain in place, and tenure committees continue to expect SCI paper publications from candidates. Joseph Jen, an editor of the journal Food Control, told me that this is in part a result of the way science is run in China. “Research projects are defined by the granting agencies. Therefore, scientists do research to generate data for publication purposes, instead of pursuing scientific advancements for their own sake.” The numbers support the centralized nature of science funding in China. A ten-country survey conducted in 2011 shows that 78 percent of China’s SCI papers received some kind of government sponsorship, compared to 53 percent in the United States and 51 percent in the United Kingdom.8 A centralized competition requires a single, unified performance metric, and citations fill that need nicely. Wu Yishan, deputy director of ISTIC, explains his agency’s thinking: “The space of a high impact factor journal is a scarce resource. It proves your competitive advantage over competitors. Project funds, which are also a scarce resource, should also be given to people with this advantage.”
Had Yu been chasing the citations that modern tenure and funding committees look for, he would have elected not to do this work.
Also relevant is the relative immaturity of Chinese academia. Le Xie, a graduate of China’s Tsinghua University and now an assistant professor at Texas A&M University, told me China’s quantitative metric system “may be a good differentiator between the ‘bad’ and ‘merely good.’ However, in order to encourage and differentiate ‘great’ from ‘merely good,’ a quantitative metric system may not work.” Le says that the U.S. peer review system is more effective from this perspective, but a quality peer review system requires a healthy, competitive academic environment, which takes time to build up. China, he says, is not quite there yet: “The reasons why are complex, and related to cultural background, social institutions, and the phase of economic development. It probably will take time.”
Liu Qingsheng, a retired professor who spent ten years evaluating research projects in China, agrees. “In China’s science academe, there is not yet a complete credibility system, so some quantity index is still needed,” he said. Liu points to a paucity of punishments for scientists who break professional ethics, for example by misrepresenting data. “Only a few institutions take any measures.”
Whatever the future may hold, today citations and well-known journals continue to wield enormous power. Looking back, Yu Biao considers himself lucky: “I got into the field early, when impact factors were not as important as they are today.” He chose to focus on problems that were tough and less popular, without worrying about publication or promotion. But Yu is not happy to see the evaluation and promotion of young researchers today rely so heavily on impact factors. “I know how hard it is for them,” he said, “professorship, money, and fame, everybody in society chases after these.” His advice to students is simple: “Don’t work because you need to publish. Work because what you are doing is meaningful.”
Naomi Ching is a science writer based in Beijing. She has won national and international awards for her work.
1. China Association for Science and Technology. Second survey report on science and technology workers released. Xinhua News Agency (2009). Retrieved from http://news.xinhuanet.com/newmedia/2009-07/11/content_11689396_1.htm.
2. Gao, X.Y. & Han, J.H. A review of the salary status of research institutions. China Soft Science (2004). Retrieved from http://d.wanfangdata.com.cn/periodical_zgrkx200408022.aspx.
3. Wang, N.X. China’s chemists should avoid the Vanity Fair. Nature 476, 253 (2011).
5. Jiang, G. Chinese scientists: For whom do they work? Science Times (2008). Retrieved from http://news.sciencenet.cn/htmlnews/2008314811931203566.html?id=203566.
6. Lawrence, P.A. Lost in publication: how measurement harms science. Ethics in Science and Environmental Politics 8, 9-11 (2008).
7. Larivière, V. & Gingras, Y. The impact factor’s Matthew effect: a natural experiment in bibliometrics. Journal of the American Society for Information Science and Technology 61, 424-427 (2010).
8. Sun, J.W., et al. Science Foundation and SCI Paper Output: 10 Countries for comparative analysis. Studies in Science of Science 31, (2013). Retrieved from http://d.wanfangdata.com.cn/periodical_kxxyj201301006.aspx.