Reviewers’ scores do not predict impact: bibliometric analysis of the proceedings of the human–robot interaction conference
Abstract
© 2016, Akadémiai Kiadó, Budapest, Hungary. The peer review process is an essential component of the progress of science. The ACM/IEEE International Conference on Human–Robot Interaction is the prime publication channel for the field, and this study evaluates its peer review process. The results show that the number of accepted papers is unevenly distributed across countries, organizations and authors, with contributions from the US outweighing all others. A binary logistic regression analysis showed that the reviewers’ scores accurately predict a paper’s acceptance or rejection for only 85.5% of the papers. Moreover, there was no significant correlation between the reviewers’ scores and the citations the papers later attracted. 73% of the authors only ever submitted one paper, and the proportion of newcomers at the conferences ranges from 63% to 77%.
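The abstract's 85.5% figure comes from a binary logistic regression of acceptance decisions on reviewers' scores. The sketch below illustrates that kind of analysis on synthetic data; the paper's actual dataset, feature set, and model details are not reproduced here, and all numbers in the sketch are invented for illustration.

```python
import math
import random

random.seed(0)

# Synthetic data (illustrative only, not from the study): a mean
# reviewer score per paper on a 1-5 scale, and an acceptance decision
# that mostly, but not perfectly, follows the score.
n = 500
scores = [random.uniform(1.0, 5.0) for _ in range(n)]
accepted = [1 if 2.0 * (s - 3.0) + random.gauss(0.0, 1.5) > 0 else 0
            for s in scores]

# Fit a one-feature logistic regression by plain gradient descent:
# P(accept | score) = sigmoid(w * score + b).
w, b = 0.0, 0.0
lr = 0.1
for _ in range(3000):
    grad_w = grad_b = 0.0
    for s, y in zip(scores, accepted):
        p = 1.0 / (1.0 + math.exp(-(w * s + b)))
        grad_w += (p - y) * s
        grad_b += (p - y)
    w -= lr * grad_w / n
    b -= lr * grad_b / n

# In-sample share of decisions the model reproduces -- the analogue of
# the 85.5% accuracy figure reported in the abstract.
correct = sum(
    ((1.0 / (1.0 + math.exp(-(w * s + b)))) > 0.5) == bool(y)
    for s, y in zip(scores, accepted)
)
accuracy = correct / n
print(f"in-sample accuracy: {accuracy:.3f}")
```

Because the synthetic acceptance decisions include noise unrelated to the score, the fitted model cannot reach 100% accuracy, which mirrors the study's finding that scores predict the decision for only part of the papers.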
ANZSRC fields of research
Fields of Research::46 - Information and computing sciences::4608 - Human-centred computing::460806 - Human-computer interaction
Fields of Research::46 - Information and computing sciences::4610 - Library and information studies::461005 - Informetrics