Combination of Scientometrics and Peer-Review in Research Evaluation: International Experiences and Inspirations
Zhang Lin1, Sivertsen Gunnar2
1. School of Information Management, Wuhan University, Wuhan 430072, China  2. Nordic Institute for Studies in Innovation, Research and Education (NIFU), Oslo NO-0880, Norway
|
|
Abstract  Scientometrics and peer review are the two most widely used methods of research evaluation. Both in theory and in practice, each method has distinct merits and drawbacks, and the two can complement each other when applied jointly. Focusing on the guideline of "encouraging comprehensive evaluation methods that combine quantitative and qualitative approaches", put forward jointly in recent research evaluation policy documents released by the Ministry of Science and Technology and the Ministry of Education of China, this paper reviews the state of international research on, and practical applications of, scientometrics and peer review in research evaluation. Targeted implementation suggestions are then proposed, drawing on international experiences and the specific context of China.
|
Received: 04 May 2020
|
|
|
|
|
|
|