Chapter 7 Discussion and conclusions
This dissertation aimed to provide a quantitative account of academic work in the 21st century. We investigated this multi-faceted phenomenon from different points of view. Across different datasets and contexts, our research design included observational units ranging from the individual, meso (community) and macro levels to multi-level combinations of these.
Our ambition was to shed light on the individual efforts of scientists to publish their research results (Chapter 3). On the one hand, we considered the bottom-up efforts of academics trying to compete and survive in a hyper-competitive context. We accounted for nested structure and organizational embeddedness in our hierarchical linear models, bearing in mind that institutional embeddedness can have a crossed-membership nature: sociologists working in similar departments at different universities may be exposed to similar unwritten rules, while affiliation with the same university may impose similar constraints, facilities or limitations on sociologists in different departments (e.g., those working in an economics department compared to those in medical sciences departments).
Afterwards, we looked at top-down processes, with a particular interest in certain national policies and their capacity (or failure) to inspire higher research productivity (Chapter 4). Here, we did not want to limit the scope of our research to the individual or macro level alone, since research collaboration is a social process (Garvey 1979; Zhang et al. 2018). At the individual level, we found that academics working more internationally and with a specific group of collaborators had higher productivity, whereas at the macro level, we found that higher productivity was not solely inspired by macro-level policy changes. Cultural changes in academia cannot occur overnight or simply reflect policy initiatives (Melin 2000). When considering macro aspects, we found that certain ambiguities in the reward system of academia (Merton 1968; Boffo and Moscati 1998) can generate confusion for individual researchers, depending on their visions, interests and goals.
For instance, if policy signals are not univocally directed at stimulating high-quality research and reward systems are not directly linked to assessment, it is reasonable to expect a variety of responses from scientists. This lack of coherence between performance evaluation, promotion and career advancement (Abramo, D’Angelo, and Rosati 2014) has even led certain observers to question whether the Italian academic system is prepared to reward any performance-based scheme (Bertocchi et al. 2015). Moreover, the absence of a unified, standard evaluation procedure for assessment and promotion creates further ambiguities (e.g., the case of national habilitation reports in Marzolla (2016)).
We used a crossed-membership random-effects structure, which enabled us to study organizational context alongside other fixed effects while accounting for individual subjects’ random noise (see Baayen, Davidson, and Bates (2008) for a discussion). We must emphasize again, however, that there are other methodological approaches to studying the effect of policy changes on scientists’ behavior, notably difference-in-differences and regression discontinuity designs (see, e.g., Seeber et al. (2017)), which we did not use here. Those analytical strategies might yield further insight into policy effects.
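Schematically, the crossed-membership specification can be sketched as follows; the notation here is ours and purely illustrative, not taken verbatim from the chapters:

```latex
% Outcome y of sociologist i, affiliated with department j and university k.
% Departments and universities are crossed, not nested: similar departments
% recur across universities, and each university hosts many departments.
y_{i(j,k)} = \mathbf{x}_{i}^{\top}\boldsymbol{\beta}
           + u_{j} + v_{k} + \varepsilon_{i(j,k)},
\qquad
u_{j} \sim \mathcal{N}(0,\sigma_{u}^{2}),\quad
v_{k} \sim \mathcal{N}(0,\sigma_{v}^{2}),\quad
\varepsilon_{i(j,k)} \sim \mathcal{N}(0,\sigma_{\varepsilon}^{2})
```

The two variance components $\sigma_{u}^{2}$ and $\sigma_{v}^{2}$ capture departmental and university-level heterogeneity respectively, so organizational context enters the model without being confounded with individual-level noise.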
Furthermore, we looked at the diversity of the sociological community. We wanted to see whether considering the embeddedness of academics in different hierarchies (Granovetter 1985) would help us understand scientists’ performance. In Chapter 5, we discussed certain unwritten rules characterizing the academic system that can explain the differential success of academics. For instance, graduating from or working in Ivy League institutions can have different effects for different groups of authors, e.g., male versus female academics.
Here, our findings support feminist narratives (e.g., Teele and Thelen (2017); Maliniak, Powers, and Walter (2013); González-Álvarez and Cervera-Crespo (2017)) suggesting that the current winner-takes-all academic system (Frank and Cook 2010) penalizes women by offering them less attractive and more unstable careers (Lomperis 1990; Hancock and Baum 2010). Furthermore, there is a structural process leading women to prefer specific fields of science (non-STEM), acquire specific skills and qualifications (non-quantitative) and perform more qualitative research (Kretschmer et al. 2012), which is less appreciated by prestigious journals. Recent research suggests that women are treated unfairly in peer review (Hengel and others 2017) and that, even when published, their research is less valued (Krawczyk and Smyk 2016) and less cited (Maliniak, Powers, and Walter 2013; Beaudry and Larivière 2016).
It is worth noting that studies of this type, including ours, cannot establish clear causal effects. We cannot evaluate which causal mechanisms were at work, nor to what extent. In our case, we only observed that these distortions, probably due to structural effects, were also detectable in our sociological community. For instance, while we found gender differences in the substantive research focus of sociologists, it is difficult to trace these differences back to single causal mechanisms; more in-depth analysis is required to disentangle the processes underlying these complex outcomes.
In Chapter 6, we explored the formation and evolution of research groups. We aimed to investigate the temporal dimension of scientific activities by using a rather sophisticated temporal community detection configuration (Mucha et al. 2010). We found that individual researchers join certain research communities and stay in or leave these communities over time, and that this process affected the existence of coauthorship ties (i.e., the number of publications). Our aim was to demonstrate the usefulness of a multi-level approach to studying coauthorship evolution. Note that reducing scientific collaboration to a cross-sectional image by removing temporal changes would neglect important factors. Accordingly, we concluded that the cross-sectional view is reductionist and that any study of research activity should seriously consider these diverse dimensions. We are aware of more advanced multi-level network models that capture tie formation and persistence among individual (micro), community (meso) and organizational (macro) entities, and our approach has limitations compared to these more advanced frameworks.
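For reference, the quality function optimized by this temporal community detection approach is the multislice modularity of Mucha et al. (2010), which couples each node’s copies across time slices:

```latex
Q = \frac{1}{2\mu} \sum_{ijsr}
    \left[ \left( A_{ijs} - \gamma_{s}\,\frac{k_{is}k_{js}}{2m_{s}} \right)\delta_{sr}
           + \delta_{ij}\,C_{jsr} \right]
    \delta\!\left(g_{is}, g_{jr}\right)
```

where $A_{ijs}$ is the adjacency matrix of slice $s$, $k_{is}$ the strength of node $i$ in slice $s$, $\gamma_{s}$ the resolution parameter, $C_{jsr}$ the inter-slice coupling of node $j$ between slices $s$ and $r$, $g_{is}$ the community assignment of node $i$ in slice $s$, and $2\mu$ the total edge weight across all slices. The coupling terms $C_{jsr}$ are what allow a community, and a researcher’s membership in it, to persist or dissolve over time.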
The dynamics of citation, and the academic reward system’s heavy reliance on citing previous work, is one aspect of academic work that we have not studied sufficiently. Explanatory models taking into account complex factors, from a researcher’s seniority and an article’s age, research potential and subjective rigor to connections and embeddedness in coauthorship, departmental and affiliation networks, could help uncover what has been called citation cartels (Fister Jr, Fister, and Perc 2016). We did not build such models, nor did we take citation networks into account. This is a significant limitation, since few studies on sociologists focus on citations.
There are extensive studies of CV data and fine-grained analyses of academic pedigree, but these rarely focus on the sociological community (the most recent exception being Warren’s (2019) study). We gathered background information on an international community of sociologists, and these data hold greater potential for further analysis than what we presented in Chapter 5.
In Chapters 5 and 6, we examined the research focus and content of sociological articles. Indeed, scientific collaboration is as much substantively oriented as it is socially constructed. Scientists choose their collaborators not merely on social, substantive or scientific grounds, but probably due to a mixture of all these factors. Here, we are at the beginning of a line of research that requires further advancement before any comprehensive conclusions can be drawn. Among the next steps, more robust and detailed analyses of the quantitative–qualitative divide in sociology and of certain gendered attitudes would be important.
At the end of each chapter, we discussed certain limitations that constrained our work. Here, we want to emphasize that even the mere process of tie formation (whether friendship, scientific collaboration or coauthorship) is a complex process whose study requires mixed methods research designs, including a qualitative understanding of trajectories and experiences. This is, for instance, the type of study that Small (2017) conducted on early career researchers and postgraduate students, focusing on how they coped with the hardships of academic life. Such mixed methods research would better represent the process of individual decision-making by revealing certain motivations behind scientists’ collaboration, which here were left in the background.

We only briefly discussed the fact that academics face dualities in their everyday work, such as contributing to the community by accepting peer review tasks, which is not directly reflected in the academic reward system, versus focusing on one’s own research and publication activity. New developments such as Publons (see Sammour (2016) for a description) promise to trace voluntary scholarly activity; future researchers might combine bibliometric information on scientists’ publications with their Publons records of peer review activity to see how scientists cope with this duality. This type of new data might help researchers and evaluators answer criticism of the prevalence of a purely quantitative view in research evaluation, and it promises a more fine-grained mixture of quantitative and qualitative evidence in research assessment. Furthermore, it might move us some steps closer to the ideal of giving voice to those under evaluation (i.e., researchers) so that they can take part in their own evaluation (see Cole and Zuckerman (1987) for a classic example of mixing demographic, bibliometric and interview data in measuring research productivity).
In our work, we could not reconstruct why and how scientists decided to collaborate with one another. We merely observed the outcome of these collaborations, i.e., the digital traces left by publications. We could neither reconstruct collaboration failures nor explore the potential pool of collaborators and the role of redundancy (e.g., the gap in potential offer between academics of different institutes). We believe that a deeper look into these factors surrounding collaboration is a necessary step towards a serious study of the social dimension of science.
References
Garvey, WD. 1979. “Communication, the Essence of Science: Facilitating Information Exchange Among Librarians, Scientists, Engineers and Students.”
Zhang, Chenwei, Yi Bu, Ying Ding, and Jian Xu. 2018. “Understanding Scientific Collaboration: Homophily, Transitivity, and Preferential Attachment.” Journal of the Association for Information Science and Technology 69 (1): 72–86.
Melin, Göran. 2000. “Pragmatism and Self-Organization: Research Collaboration on the Individual Level.” Research Policy 29 (1): 31–40.
Merton, Robert K. 1968. “The Matthew Effect in Science: The Reward and Communication Systems of Science Are Considered.” Science 159 (3810): 56–63.
Boffo, Stefano, and Roberto Moscati. 1998. “Evaluation in the Italian Higher Education System: Many Tribes, Many Territories... Many Godfathers.” European Journal of Education 33 (3): 349–60.
Abramo, Giovanni, Ciriaco Andrea D’Angelo, and Francesco Rosati. 2014. “Career Advancement and Scientific Performance in Universities.” Scientometrics 98 (2): 891–907.
Bertocchi, Graziella, Alfonso Gambardella, Tullio Jappelli, Carmela A Nappi, and Franco Peracchi. 2015. “Bibliometric Evaluation Vs. Informed Peer Review: Evidence from Italy.” Research Policy 44 (2): 451–66.
Marzolla, Moreno. 2016. “Assessing Evaluation Procedures for Individual Researchers: The Case of the Italian National Scientific Qualification.” Journal of Informetrics 10 (2): 408–38.
Baayen, R Harald, Douglas J Davidson, and Douglas M Bates. 2008. “Mixed-Effects Modeling with Crossed Random Effects for Subjects and Items.” Journal of Memory and Language 59 (4): 390–412.
Seeber, Marco, Mattia Cattaneo, Michele Meoli, and Paolo Malighetti. 2017. “Self-Citations as Strategic Response to the Use of Metrics for Career Decisions.” Research Policy.
Granovetter, Mark. 1985. “Economic Action and Social Structure: The Problem of Embeddedness.” American Journal of Sociology 91 (3): 481–510.
Teele, Dawn Langan, and Kathleen Thelen. 2017. “Gender in the Journals: Publication Patterns in Political Science.” PS: Political Science & Politics 50 (2): 433–47.
Maliniak, Daniel, Ryan Powers, and Barbara F Walter. 2013. “The Gender Citation Gap in International Relations.” International Organization 67 (4): 889–922.
González-Álvarez, Julio, and Teresa Cervera-Crespo. 2017. “Research Production in High-Impact Journals of Contemporary Neuroscience: A Gender Analysis.” Journal of Informetrics 11 (1): 232–43.
Frank, Robert H, and Philip J Cook. 2010. The Winner-Take-All Society: Why the Few at the Top Get so Much More Than the Rest of Us. Random House.
Lomperis, Ana Maria Turner. 1990. “Are Women Changing the Nature of the Academic Profession?” The Journal of Higher Education 61 (6): 643–77.
Hancock, Kathleen J, and Matthew Baum. 2010. “Women and Academic Publishing: Preliminary Results from a Survey of the ISA Membership.” In The International Studies Association Annual Convention, New Orleans, LA.
Kretschmer, Hildrun, Ramesh Kundra, Theo Kretschmer, and others. 2012. “Gender Bias in Journals of Gender Studies.” Scientometrics 93 (1): 135–50.
Hengel, Erin, and others. 2017. “Publishing While Female. Are Women Held to Higher Standards? Evidence from Peer Review.” Faculty of Economics, University of Cambridge.
Beaudry, Catherine, and Vincent Larivière. 2016. “Which Gender Gap? Factors Affecting Researchers’ Scientific Impact in Science and Medicine.” Research Policy 45 (9): 1790–1817.
Mucha, Peter J, Thomas Richardson, Kevin Macon, Mason A Porter, and Jukka-Pekka Onnela. 2010. “Community Structure in Time-Dependent, Multiscale, and Multiplex Networks.” Science 328 (5980): 876–78.
Fister Jr, Iztok, Iztok Fister, and Matjaž Perc. 2016. “Toward the Discovery of Citation Cartels in Citation Networks.” Frontiers in Physics 4: 49.
Warren, John Robert. 2019. “How Much Do You Have to Publish to Get a Job in a Top Sociology Department? Or to Get Tenure? Trends over a Generation.” Sociological Science 6 (7): 172–96. https://doi.org/10.15195/v6.a7.
Small, Mario Luis. 2017. Someone to Talk to. Oxford University Press.
Sammour, Tarik. 2016. “Publons.com: Credit Where Credit Is Due.” ANZ Journal of Surgery 86 (6): 512–13.
Cole, Jonathan R, and Harriet Zuckerman. 1987. “Marriage, Motherhood and Research Performance in Science.” Scientific American 256 (2): 119–25.