Abstract
This study analyzed how ranking status changes at various levels of higher education systems when different definitions of international faculty are applied. Among the four measures examined (birthplace, current citizenship, and the countries of bachelor and doctoral education), international faculty measured by the country of doctoral studies produced significantly different international outlook scores, and thus different ranking statuses, from those based on birthplace or citizenship. Specifically, major English-speaking systems such as the UK, Canada, and Australia hire large numbers of faculty who are foreign citizens, while non-English speaking systems (Italy, Portugal, China, Korea, and Brazil) hire more local academics who earned their doctoral degrees abroad. This suggests that these non-English speaking countries are systematically underrated in their international outlook scores when birthplace-based or citizenship-based measures of international faculty are adopted. As an alternative, this study proposes updating the international faculty measure with a combination of citizenship at current employment and doctoral training to minimize this systemic bias.
Introduction
Internationalization indicators such as the share of international faculty and international students are included in global rankings because rankers believe that these indicators represent institutional competitiveness in the global market. However, critics argue that English-speaking systems are advantaged by these internationalization indicators while non-English speaking countries are disadvantaged (e.g., Marginson & van der Wende, 2007; Teichler, 2011; Gantman, 2012). In practice, higher education institutions (hereafter, HEIs) respond to the global ranking game in different ways to improve their ranking status (e.g., Dowsett, 2020; Lee et al., 2020). One university might put weight on international outlook while another puts more weight on research productivity. Improving the international outlook, however, is not simple because these scores depend on how "international" is defined (Teichler, 2015; Huang & Welch, 2021). This study addresses how much the international outlook status of higher education systems varies depending on the definition of "international" faculty.
The term "international faculty" can be defined in various ways, as extensively discussed by Teichler (2015) and Kim and Jiang (2021). A first approach defines international faculty by foreign-born status, which captures the large number of immigrants crossing national borders. A second approach defines international faculty by citizenship at current employment. Foreign-born status is based on where a person was born and corresponds to "nationality", while citizenship refers to whether a person has met the legal requirements for citizenship in the country where they are living (US Immigration Office, Jan. 28, 2021). Data for both the foreign-born and the citizenship-based approaches are relatively easy to obtain from immigration offices, and most ranking data rely on them. Times Higher Education (hereafter, THE) defines international faculty by nationality (THE, 2021), while Quacquarelli Symonds (hereafter, QS) defines it by citizenship (QS, 2021). A third approach is academic training-based, in which international faculty are defined by the country of their education, i.e., the country where they earned their first higher education degree (hereafter, bachelor degree) or their doctoral degree, or where they undertook post-doctoral studies.
Each definition has its own focus and emphasis. Foreign-born status focuses on birthplace, citizenship places more emphasis on faculty members' legal status in the host country, and academic degrees (post-doctoral, doctoral, and bachelor) emphasize academic training (e.g., Kim et al., 2011). These different definitions make measuring international faculty complex (Teichler, 2015). Among these approaches, rankers such as THE and QS use birthplace (this study uses birthplace and nationality interchangeably) and citizenship rather than academic training because they believe that citizenship represents the degree of internationalization of HEIs. However, birthplace or citizenship might not represent the global competitiveness of HEIs because some of these faculty members are local in terms of their academic training. Therefore, one question raised by this research is how much an international faculty outlook score based on citizenship differs from one based on academic training.
There are many criticisms of global rankings for evaluating the quality of universities on the basis of clumsy indicators (Dill & Soo, 2005; Altbach, 2006; Marginson & van der Wende, 2007; Kehm & Stensaker, 2009; Aguillo et al., 2010; Shin et al., 2011; van Raan et al., 2011; Brankovic et al., 2018; Pietrucha, 2018; Safon, 2019; Safon & Docampo, 2021). However, there has been little academic discussion or empirical research on how international faculty are measured in relation to global rankings. This study addresses this issue by comparing changes in international outlook at the system level when different measures of international faculty are applied. Global rankings such as THE and QS assess the share of international faculty by nationality or citizenship, but these data provide no insight into academic training. Fortunately, international comparative surveys such as the Changing Academic Profession survey (hereafter, CAP) provide data on faculty members' academic training as well as their nationality and citizenship, drawing on 25,282 academic staff across 19 higher education systems (Teichler et al., 2013). A methodological challenge, however, is how to combine institution-level data (ranking data) with national-level data (the CAP data). This study transformed the international outlook scores of individual HEIs into a national average international outlook score and then analyzed possible changes in rankings under different definitions of international faculty, in this study definitions based on academic training.
This study developed a hypothetical ranking for individual HEIs that makes it possible to trace changes in their ranking status when different definitions of international faculty are applied. This simulation shows how much ranking status changes at the level of individual HEIs and of higher education systems under the different definitions. In addition, this study proposes possible directions for updating the international outlook measure. The study poses two questions:
- Research question 1: How much does ranking status change when global rankings apply different definitions of international faculty?
- Research question 2: Are there similarities among the higher education systems that are under- (or over-) estimated by the current measure of international faculty?
Research background
This section overviews various definitions of international faculty, followed by how these definitions are measured, and how the measures are weighted in global rankings.
Definitions of international faculty
The underlying logic of using international faculty as a ranking indicator is that having more international faculty represents the global competitiveness of individual HEIs, because global talent moves around the world in search of better working places (Altbach, 2009; Yudkevich et al., 2017). Researchers use different terms such as foreign faculty, international faculty, expatriate faculty, foreign-born faculty, mobile academics, and immigrant academics depending on their research contexts (e.g., Kim et al., 2011; Teichler, 2015; Huang & Welch, 2021). These terms highlight different dimensions of international faculty. For example, "foreign" faculty highlights a faculty member who is not a local; "expatriate" faculty refers to academics developing their careers outside their home country (Trembath, 2016); "foreign-born" refers to birthplace; and "mobile" academics refers to frequent mobility across countries (Teichler, 2015). This study focuses on three concepts of international faculty: birthplace-based, citizenship-based, and academic training-based.
In research practice, international faculty can be defined according to the research context. Current global ranking mechanisms use birthplace and/or citizenship to define international faculty members. THE defines international faculty as those whose "nationality differs from the country where the institution is based" (THE, 2021), while QS states: "The term 'international' should be determined by citizenship. For EU countries, this includes all foreign nationals, even if from another EU state. In Hong Kong SAR and Macau SAR, this includes professors from Mainland China" (QS, 2021). Nationality or citizenship is used because it is relatively easy to count and the data are publicly available through immigration offices. However, international faculty determined by nationality or citizenship has limitations as a measure of the institutional competitiveness of HEIs. For example, some countries such as the UK, with its many former colonial states, have a higher baseline number of international faculty. In addition, citizenship depends on a country's immigration policy rather than its academic competitiveness. Australia, Canada, the USA (before the Trump presidency), and the UK (before Brexit) have, or used to have, flexible immigration policies that favored naturalization for skilled workers (Barbaric & Jones, 2016; Huang & Welch, 2021; Kim & Jiang, 2021). Because of this, the number of foreign citizens does not necessarily represent the academic competitiveness of HEIs.
Given these limitations, academic training might be a better proxy for measuring international faculty as a ranking indicator. Competitive higher education systems such as those in the USA, the UK, and Australia recruit talented academics from the global market rather than relying only on locally trained human resources. In this sense, recruiting faculty globally implies that HEIs are competitive and open to global talent. Globally mobile academics might bring different areas of specialization, new networks, and even different cultures to their host universities (e.g., Tung, 2008; Teichler, 2015). These advantages are in line with the benefits a host university might gain through the internationalization of higher education, as discussed in de Wit (2017) and Altbach and Knight (2007).
This study proposes an alternative basis for defining international faculty: academic training. Academic training is already an important factor in faculty hiring policy aimed at minimizing academic inbreeding, where state policy encourages the hiring of candidates who graduated from other universities. Some countries discourage the hiring of a university's own graduates because academic inbreeding is widespread in systems such as Portugal, Russia, Japan, and Korea (Horta et al., 2011; Altbach et al., 2015; Shin et al., 2016). Borrowing this terminology, this study defines international faculty as faculty members who earned their academic degree from a university outside the country where they are currently employed. The degree could be a bachelor or a doctoral degree. A master degree is not considered in this study because it is integrated into the doctoral degree in many countries (Shin et al., 2018).
According to the Changing Academic Profession (CAP) data, international outlook defined by nationality or citizenship differs considerably from that defined by academic training at the higher education system level (Teichler et al., 2013). Non-English speaking systems such as Argentina, Brazil, Italy, Malaysia, Mexico, Portugal, and South Korea are underestimated in their international faculty rate when it is measured by nationality or citizenship, because these systems show a much higher rate when international faculty is measured by academic training. This suggests that the current definition of international faculty in THE and QS favors some countries while understating the extent of international faculty in others, mostly non-English speaking systems. If academic training is a valid measure of the institutional competitiveness of HEIs, the follow-up analysis in this study shows how much ranking status fluctuates according to the measure used.
Composition of ranking indicators and sensitivity of ranking status
The THE ranking indicators consist of five areas: teaching (30%), research (30%), citations (30%), industry income (2.5%), and international outlook (7.5%). Similarly, the QS ranking assigns 10% to international outlook (5% for international faculty and 5% for international students). However, the assigned weighting may or may not represent the actual contribution of each area, because that contribution also depends on the variance of each indicator (Hou & Jacob, 2017). For example, if there is high variance in teaching scores, a small increase in teaching performance may not affect total scores much because there is a relatively large gap between HEIs in the area of teaching. Thus, a high weight does not necessarily mean that HEIs are inclined to focus more on the highly weighted indicators. In this context, HEIs might pay more attention to the indicators where they can easily and rapidly catch up with competing HEIs, possibly within one or two years.
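This interplay between weight and variance can be illustrated with a small simulation. The sketch below uses hypothetical indicator scores and the THE pillar weights quoted above; it is not the rankers' actual computation, only a stylized standardize-and-weight composite.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 281  # number of HEIs analysed in this study

# Hypothetical raw indicator scores: teaching varies widely across HEIs,
# international outlook varies little
teaching = rng.normal(50, 20, n)
intl_outlook = rng.normal(50, 5, n)

weights = {"teaching": 0.30, "intl_outlook": 0.075}  # THE weights cited above

def z(x):
    """Standardize an indicator before weighting."""
    return (x - x.mean()) / x.std()

def total(t, i):
    return weights["teaching"] * z(t) + weights["intl_outlook"] * z(i)

base = total(teaching, intl_outlook)

bump = np.zeros(n)
bump[0] = 1.0  # a one-point raw improvement for institution 0

gain_teaching = total(teaching + bump, intl_outlook)[0] - base[0]
gain_outlook = total(teaching, intl_outlook + bump)[0] - base[0]

print(f"+1 raw point in teaching:     {gain_teaching:.4f}")
print(f"+1 raw point in intl outlook: {gain_outlook:.4f}")
# Despite a weight four times smaller, the low-variance indicator yields a
# comparable gain in the composite, since 0.30/20 equals 0.075/5.
```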
HEIs often use simulations to develop their strategy for institutional development in global rankings (e.g., Hazelkorn, 2007; Locke, 2011). In most cases, an institutional researcher analyzes research and citation indicators to develop a strategy for the ranking game (e.g., Dowsett, 2020). In addition, current ranking mechanisms are themselves based on such simulations, and there have been academic disputes over the calculation methods as well as the definitions of individual indicators (Locke et al., 2008). Most rankers do not calculate publication scores simply by counting the number of papers; rather, they normalize article counts and citation counts (e.g., Leydesdorff & Bornmann, 2011; Waltman et al., 2011). Reflecting this complexity, the Leiden Ranking provides various scores according to different measures of the research and citation indicators. These simulation efforts have contributed to ranking mechanisms based on a more scientific methodology.
Despite this, the international outlook area has not evolved much since global rankings first emerged. With economic globalization, academic mobility has been considered a core factor in social development and innovation. International mobility in fact has a long history, and there are well-known lessons from it. For example, the aggressive study-abroad policy of the Meiji period in Japan has been considered a key to its social and technological innovation (Kashioka, 1982), while the Spanish King Philip II's policy of forbidding study abroad has been criticized as contributing to the decline of Spanish science (Goodman, 1983). Although it has been argued that ranking status simply reflects the dominance of the local language rather than international competitiveness (e.g., Marginson & van der Wende, 2007; Teichler, 2011; van Raan et al., 2011), this argument has not been supported by empirical analysis with a solid methodology. HEIs that hire academics trained in countries other than the current host country might better represent institutional openness and diversity (Altbach & Knight, 2007). For example, many Hong Kong universities that have recruited foreign scholars from globally competitive markets are positioned as key international players, especially in Asian higher education (Mok, 2005). Hiring "locally trained" international academics does not represent institutional openness or diversity because these academics are already part of the host country's higher education system (e.g., Pustelnikovaite, 2021). From this point of view, academic training might be a better proxy measure of international outlook in terms of institutional competitiveness. However, very little data support this view. The following method section describes how this study simulates it using empirical data.
Research method
This section briefly explains the data that this study is based on and introduces the analytical strategy for addressing the research questions.
Data
The data for this study come from four major sources: THE ranking data, UNESCO data, national statistical data, and the CAP data. This study uses THE global ranking data because QS did not disclose scores by indicator before its 2019 ranking. The THE global ranking provides data on the rankings of individual HEIs with scores for international outlook, teaching, research, citations, and industry income. The UNESCO data provide international student numbers at the national level. The data on international faculty at the national level were collected from various sources such as national statistical reports, national data systems, and several case-study books on internationalization (Appendix 1). The CAP data provide international faculty by academic training (bachelor and doctoral degree) as well as by birthplace and citizenship at current employment; they were collected in 2007/2008 from 19 higher education systems through common survey items. Because the CAP data are based on 2007/2008, this study analyzed the THE Global Ranking of 2011/2012, the oldest ranking data available on the Internet with a list of the top 400 universities. Although this study draws on international faculty data from different time periods, we believe this is not a serious issue for the simulation because the proportion of international faculty by nationality or citizenship does not change much within a short period. For example, international outlook scores over the last 10 years have not changed much, as shown in the correlation analysis in Appendix 2.
The 2015 UNESCO data were used because they include most of the higher education systems represented in the THE top 400 and cover 18 of the 19 CAP participating countries (Australian data were released in 2015 and German data in 2013; only Argentina was omitted). No university in Argentina, Malaysia, or Mexico was ranked in the THE 2011/2012 top 400; therefore, this comparative study covers 16 of the 19 CAP systems, comprising 310 HEIs and 77.11% of the HEIs in the THE 2011/2012 top 400. In the final analysis, 281 of these 310 universities were analyzed because the industry income score of 29 universities was not reported by THE. The data underlying our analysis are summarized in Table 1.
International outlook scores in the THE ranking are calculated from the number of international faculty members, international students, and international collaborations, but the raw scores are not released to the public. THE provides only a total international outlook score without releasing each indicator's raw score. Because of this, researchers outside the ranking organization can access only the total international outlook score rather than separate scores for international faculty and international students. However, we can gauge how well the international outlook represents international faculty and students by comparing the national average of international outlook scores with the national average of international faculty in national statistics and of international students in the UNESCO data.
The THE ranking normalizes indicator scores with a Z score, which is then turned into a cumulative probability score to assign scores on each indicator. To analyze the correlations between different measures, this study transformed these measures (the rate of international faculty in national statistical data, the rate of international students in UNESCO data, and the rates of international faculty under the different CAP measures) into percentile ranks. The correlations between international outlook in THE, international faculty in national statistical data, international students in UNESCO data, and the different CAP measures of international faculty are presented in Table 2. The UNESCO data on international students and the national data on international faculty are highly and significantly correlated with the international outlook in THE.
In addition, this study extended the correlation analysis to the different measures of international faculty across the 16 higher education systems in the CAP study. Table 2 shows that international outlook scores are highly and significantly correlated with international faculty measured by bachelor degree and by citizenship, but the correlation is low for the doctoral degree measure in the CAP data. Moreover, there is a high correlation between international outlook scores and the number of international students in the UNESCO data. This correlation analysis shows that the national average of international outlook among the top 281 HEIs is highly correlated with the CAP measures (except the doctoral degree measure) and with the UNESCO data.
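The percentile-rank transformation and correlation step described above can be reproduced along the following lines. The values and column names below are hypothetical placeholders rather than the study's data, and Pearson correlations on the percentile ranks are assumed.

```python
import pandas as pd
from scipy import stats

# Hypothetical national-level values (percentages), for illustration only
df = pd.DataFrame({
    "the_intl_outlook":         [83.4, 71.2, 45.0, 30.5, 28.1],
    "intl_faculty_citizenship": [38.1, 32.1, 6.6, 4.7, 2.3],
    "intl_faculty_phd_abroad":  [20.0, 25.0, 15.0, 48.0, 22.0],
    "intl_students_unesco":     [21.0, 18.0, 7.0, 2.0, 1.0],
}, index=["A", "B", "C", "D", "E"])

# Transform every measure into a percentile rank (0-100) across systems
ranks = df.rank(pct=True) * 100

# Correlate each measure with the THE international outlook score
for col in ranks.columns[1:]:
    r, p = stats.pearsonr(ranks["the_intl_outlook"], ranks[col])
    print(f"{col:26s} r = {r:5.2f}, p = {p:.3f}")
```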
Analysis and clustering of higher education systems
This study analyzed the changes in ranking status by applying different measures of international faculty: birthplace, citizenship at current employment, and two academic training measures (bachelor degree and doctoral degree). For the analysis, this study produced a ranking of the 281 HEIs (hereafter, "reference ranking") by replacing the international outlook scores in THE with the national average of international students in the UNESCO data and the international faculty rate in national statistics. The reference ranking assigns 50% of the international outlook score to international faculty and 50% to international students. This reference ranking is highly correlated (0.996) with the original THE ranking and serves as the reference for assessing the hypothetical rankings developed for the simulation. The hypothetical rankings are produced by plugging in the rate of international students from UNESCO and the rate of international faculty from the CAP for the 281 HEIs included in this study. For the analysis, we assume that international collaboration does not vary and remains fixed across HEIs. Hypothetical ranking 1 is based on international faculty by birthplace; hypothetical ranking 2 on citizenship at current employment; hypothetical ranking 3 on the country of the bachelor degree; and hypothetical ranking 4 on the country of the doctoral degree. This process allows us to estimate how much global rankings fluctuate under different measures of international faculty. The procedure is summarized in Fig. 1.
Analytical procedure. (1) The reference ranking produces rankings for 281 higher education institutions by plugging in national data from official statistics (international students and international staff) for each university. (2) The hypothetical rankings produce rankings for the 281 institutions by plugging in the CAP data (international staff) under the different definitions (birthplace, citizenship, and two academic training statuses)
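A sketch of the procedure in Fig. 1 is given below. The column names, and the assumption that the total score can be rebuilt as a weighted sum with the pillar weights quoted earlier, are ours; the rankers' exact normalization is not public.

```python
import pandas as pd

# THE pillar weights cited in the text
W = {"teaching": 0.30, "research": 0.30, "citations": 0.30,
     "industry": 0.025, "intl_outlook": 0.075}

def total_score(df, intl_col):
    """Weighted total with a simulated international outlook column swapped in."""
    return (W["teaching"] * df["teaching"] + W["research"] * df["research"]
            + W["citations"] * df["citations"] + W["industry"] * df["industry"]
            + W["intl_outlook"] * df[intl_col])

def build_ranking(hei, national, faculty_col):
    """hei: institution-level THE pillar scores with a 'country' column.
    national: country-level percentile ranks of international faculty and students.
    faculty_col: which international faculty measure to plug in."""
    nat = national.copy()
    # 50% international faculty + 50% international students, as in the text
    nat["intl_sim"] = 0.5 * nat[faculty_col] + 0.5 * nat["intl_students"]
    merged = hei.merge(nat[["country", "intl_sim"]], on="country")
    merged["total"] = total_score(merged, "intl_sim")
    return merged.sort_values("total", ascending=False).reset_index(drop=True)

# Reference ranking: faculty_col = "intl_faculty_national_stats"
# Hypothetical rankings 1-4: "intl_faculty_birthplace", "intl_faculty_citizenship",
#                            "intl_faculty_bachelor_abroad", "intl_faculty_phd_abroad"
```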
The ranking by citizenship corresponds to the measure used in the current THE international outlook, and the gaps between the birthplace, citizenship, and two academic training measures show which systems are underestimated (or overestimated) by the current measures of international faculty. This study quantifies the changes in the ranking status of individual HEIs so that the simulation results can be further analyzed and discussed. This study also classified the 16 higher education systems on the basis of the four international faculty measures by applying k-means cluster analysis. K-means is a non-hierarchical clustering method that calculates the centroid of each cluster and assigns the n observations to the k clusters whose centers are nearest. One-way analysis of variance (ANOVA) was then applied to determine whether the differences between clusters are statistically significant. These analytical approaches show how much ranking status increases or decreases under the different measures of international faculty.
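The clustering and ANOVA steps can be sketched as follows, assuming a 16 x 4 table of national international faculty rates (birthplace, citizenship, bachelor abroad, doctorate abroad); the random matrix stands in for the actual data.

```python
import numpy as np
from scipy import stats
from sklearn.cluster import KMeans

X = np.random.default_rng(1).uniform(0, 50, size=(16, 4))  # placeholder data

# Elbow method: inspect the within-cluster sum of squares (inertia) for k = 1..6
inertia = [KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).inertia_
           for k in range(1, 7)]
print(inertia)  # the study settles on k = 3 at the "elbow"

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# One-way ANOVA per measure: do the three clusters differ significantly?
for j, name in enumerate(["birthplace", "citizenship", "bachelor", "doctorate"]):
    groups = [X[labels == g, j] for g in np.unique(labels)]
    f, p = stats.f_oneway(*groups)
    print(f"{name:12s} F = {f:6.2f}, p = {p:.3f}")
```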
Results
This section summarizes major findings on how ranking status changes by applying different measures of international faculty and whether some systems are systematically over or underestimated by the different measures of international faculty.
Changes of international outlook score and rankings
We constructed reference scores and hypothetical international outlook scores at the national level to examine how international outlook scores change under different measures of international faculty. Table 3 shows the differences in international outlook scores across the measures. The international outlook score for each country was calculated by averaging the international outlook scores of the universities in that country ranked within the THE top 281. Australia shows the highest international outlook score, followed by the UK and Hong Kong. The standard deviation of international outlook scores ranges from 2.69 to 16.43, and nine countries have a standard deviation under 10.00, indicating that the variation between universities within a country is not very large. By contrast, the differences in international outlook scores between countries (SD: 19.94) are larger than those between universities within a country.
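The national averaging step can be expressed as a simple group-by; the column names and values below are hypothetical.

```python
import pandas as pd

# hei: one row per THE top-281 institution, with its country and its
# international outlook score (hypothetical values)
hei = pd.DataFrame({
    "country": ["AU", "AU", "UK", "UK", "HK"],
    "intl_outlook": [92.0, 88.0, 90.0, 85.0, 87.0],
})

national = hei.groupby("country")["intl_outlook"].agg(["mean", "std"])
print(national)  # country-level mean and standard deviation, as in Table 3
```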
This simulation found that international outlook scores for international faculty fluctuate considerably depending on whether international faculty is measured by birthplace, citizenship status, or academic training. For example, scores based on birthplace and on citizenship status differ, as shown in Table 1, where the gaps are calculated. The largest gap is in Australia, at 26.0 percentage points (37.9–11.9%), followed by Canada at 19.9 percentage points (31.9–12.0%) and Hong Kong at 12.5 percentage points (30.2–42.7%). In addition, scores based on academic training differ considerably in some systems from those based on birthplace or current citizenship. Table 3 shows the gaps between the citizenship-based measure and the Ph.D. training-based one. The gap between the two measures is largest in Korea (43.6 percentage points), followed by Italy (33.8), Portugal (32.4), and China (20.8). This finding implies that some systems are systematically underestimated by the choice of a specific measure of international faculty.
Systemic differences in the changes to global rankings
This study further analyzed international faculty outlook scores for similarities and differences across the 16 higher education systems using k-means cluster analysis. The number of clusters, k = 3, was determined as optimal using the elbow method, which selects the point beyond which additional clusters yield little further reduction in the total within-cluster sum of squares. Table 4 presents the results of the k-means cluster analysis, which is based on similarities and differences between groups on the four measures of international faculty. The cluster analysis shows that international outlook scores in five higher education systems are very stable across the different measures of international faculty while the scores fluctuate in the remaining systems.
In brief, the five systems of Australia, Canada, Hong Kong, Norway, and the UK show relatively stable scores across the different measures of international faculty, while the scores fluctuate considerably in the five systems of Brazil, China, Italy, Portugal, and South Korea. The remaining six systems of Finland, Germany, Japan, the Netherlands, South Africa, and the USA fall between these two clusters. The five stable systems are not much affected by the choice of measure, whereas the other systems are. Interestingly, other than Norway, the systems with stable international outlook scores are based on the UK model of higher education. On the other hand, the five systems with considerable fluctuation in their rankings are non-English speaking countries.
In addition, this study performed an ANOVA to examine whether the four international faculty measures differ across the clusters. The differences between the three clusters are statistically significant for the four measures of international faculty.
These differences might be closely associated with the internationalization policy of each higher education system. The high-fluctuation systems (e.g., Brazil, Italy, Portugal, and South Korea) encourage their researchers to study abroad; as a result, they report a higher rate of doctoral training abroad while hiring few foreign-born academics or foreign citizens. By contrast, the five low-fluctuation systems (Australia, Canada, Hong Kong, Norway, and the UK) prefer to recruit researchers from the global market and report a high rate of international academics. The six mid-fluctuation systems (Finland, Germany, Japan, the Netherlands, South Africa, and the USA) largely train their faculty members locally, with some academics from abroad earning their doctoral degrees within these systems. For example, the share of "temporary visa holders", that is, non-citizens or non-residents, among doctoral degree recipients in the USA was 32% in 2015 (US National Science Foundation, 2017).
Discussion
This study found that international outlook scores in THE change significantly when different measures of international faculty are used. Among the four measures included in this study, international faculty measured by the country of doctoral studies produced quite different outlook scores, and thus different ranking statuses, from those based on nationality or citizenship. Major English-speaking systems such as the UK, Canada, and Australia hire a large percentage of foreign citizens as faculty members, but many of these foreign citizens earned their doctoral degrees in the country of current employment. In contrast, Italy, Portugal, Korea, and Brazil hire a large number of faculty members who hold local citizenship, but many of them earned their doctoral degrees abroad.
This finding implies that the current measure of international faculty favors major English-speaking systems while other systems are disadvantaged. The current internationalization indicators are particularly favorable to English-speaking systems influenced by the British educational system (Luque-Martinez & Faraoni, 2020). This suggests a possible bias in international outlook scores and global rankings. Although there have been continuous debates on research measures such as publication and citation indicators (e.g., Lee et al., 2020; Safon & Docampo, 2021), we tend to accept international outlook measures as given. However, there are different definitions of international faculty in research and policy practice, and the measures differ accordingly (Kim et al., 2011; Teichler, 2015). In this regard, one critical issue is whether citizenship status is a valid measure of international outlook.
Hiring faculty who are not citizens indicates how open the systems and/or HEIs are to different cultural backgrounds, as discussed in Altbach and Knight (2007). Without this openness, HEIs might be unable to recruit the global talent that is key to global competitiveness, as discussed in world-class university initiatives (e.g., Altbach, 2009), because diversity in thought and ideas brings fresh perspectives for breakthrough research (Altbach & Yudkevich, 2017; Yudkevich et al., 2017). However, hiring foreign citizens does not necessarily produce this diversity in thought and ideas if those international faculty were "naturalized" after being academically trained in the local system (Kim et al., 2011). In this case, international faculty are not very different from local faculty members.
Instead, hiring faculty who earned their doctoral degree abroad might be a better measure of global outlook. The international mobility of trained individuals returning to their home countries can improve academic performance there through academic exchange, language, and culture (Saxenian, 2005). Returnees from study abroad also maintain social and professional relationships with their domestic and international colleagues and actively collaborate and disseminate research between the two countries. Similarly, in policy and institutional practice, a study visit abroad is one of the requirements for researchers in Switzerland (Sautier, 2021), and in the Netherlands a large number of researchers are returnees from study abroad (de Jonge, 2021).
In addition, the motivation for hiring international faculty differs across higher education systems. In the USA, the UK, Australia, and the Netherlands, a large proportion of international faculty are affiliated with science, technology, engineering, and mathematics (STEM) fields; for example, about 77% of international faculty in Australian universities are in the sciences, engineering, and IT (Welch, 2021). In Japan, Korea, Taiwan, and Hong Kong, by contrast, a large proportion of international faculty are in the arts and humanities and social sciences, where the rate of scientific publication is lower (Chang, 2021; Chen, 2021; Huang, 2021; Shin, 2021). In Taiwan in 2015, for example, 60% of international faculty members belonged to the arts and humanities and social sciences (Chang, 2021). These different recruitment patterns imply that some systems recruit international faculty for higher research productivity while others do not.
Because of these hiring patterns, the share of international faculty has plateaued after rapid growth over the last couple of decades in Japan, Korea, and Taiwan (e.g., Huang et al., 2019; Chang, 2021; Shin, 2021). It seems unlikely that the share of international faculty will increase over a short time frame in these systems. Nevertheless, the current measure of international faculty pushes HEIs to hire faculty with a foreign passport.
In light of these differences across systems, citizenship status might not be a valid measure of international outlook, and it is necessary to redefine international outlook beyond birthplace, nationality, or citizenship. Internationalization in higher education is related to recovering the cosmopolitan nature of higher education and improving its quality through mutual learning and varied international experiences (Knight, 2014). This study proposes that a more comprehensive construction of internationalization is needed than the current model based solely on nationality or citizenship. One possible approach is to combine citizenship-based and doctoral training-based data on international faculty.
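One possible operationalization of this combined measure, counting a faculty member as international if either criterion applies, is sketched below. The field names and the union rule are our assumptions for illustration, not the rankers' or a formally adopted definition.

```python
from dataclasses import dataclass

@dataclass
class Faculty:
    citizenship: str   # country of current citizenship
    phd_country: str   # country where the doctorate was earned

def intl_faculty_rate(staff, host_country):
    """Share of staff counted as international under the combined definition:
    non-citizen of the host country OR doctorate earned abroad."""
    if not staff:
        return 0.0
    intl = [f for f in staff
            if f.citizenship != host_country or f.phd_country != host_country]
    return len(intl) / len(staff)

staff = [Faculty("KR", "US"), Faculty("KR", "KR"), Faculty("US", "KR")]
print(intl_faculty_rate(staff, "KR"))  # 0.666...: two of three count as international
```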
In research and policy practice, researchers often use academic training as a measure of international faculty. For example, Sheehan and Welch (1996) counted international faculty in terms of academic training in their study of Australia. Kim et al. (2011) considered birthplace and academic training to complement the limits of relying only on citizenship status. The OECD (2017) likewise identifies internationally mobile scientists by the country of their first scientific publication, usually the doctoral thesis or related articles, in its Science, Technology and Industry Scoreboard. We believe the proposed approach is relevant to measuring institutional competitiveness while minimizing the systemic bias introduced by selecting a citizenship-based measure.
Conclusion
Global rankers selected the international faculty indicator as a proxy for the global competitiveness of HEIs. THE describes its rationale for the international outlook pillar as follows: "The ability of a university to attract undergraduates, postgraduates and faculty from all over the planet is key to its success on the world stage" (THE, Dec. 9, 2021). QS describes this ability as global "brand" power. This "ability" and "brand power" may represent one dimension of institutional competitiveness, but the logic of attracting international faculty as a proxy for institutional competitiveness does not hold in some systems, because internationalization is approached differently across higher education systems. Nevertheless, the current measure of international faculty pushes HEIs, especially in non-English speaking systems, to hire foreign citizens to enhance their global rankings. However, hiring "foreign" passport holders brings other issues for both the host universities and the invited international faculty members (Gress & Shin, 2020).
This study proposed to mitigate the side effects of measuring international faculty solely by citizenship and recommended combining citizenship-based and academic training-based measures. Global rankers have been updating their indicators and measures, especially the crediting of publications and citations, to reflect scientific evidence, as described on their websites. However, relatively few revisions have been made to the teaching and international outlook scores. Updating the teaching measure is challenging because of serious issues around measurement and data collection, but updating the international faculty measure is not particularly complicated in terms of data collection and analysis, as this study has shown.
This study highlighted an alternative approach to defining international faculty and raised issues related to the possible under- and overestimation of the international faculty rate across higher education systems. However, this study has several limitations arising from the data on which the simulation is based. First, it used relatively old data because of data constraints; follow-up research with more recent data would be more persuasive. Second, the CAP data were collected by an international comparative research project in which each country team was in charge of sampling and data collection, so there may be some selection bias and generalizability may be limited. Third, this study focused on defining the internationalization of faculty members and paid little attention to internationalization activity and its outputs, such as international collaboration. Some global rankings, such as the SCImago Institutions Rankings, the Leiden Ranking, and the University Ranking by Academic Performance (URAP), use international collaboration as a measure of internationalization. An interesting area for future research is how international co-authorship is associated with the rate of international faculty members in each HEI.
References
Aguillo, I. F., Bar-Ilan, J., Levene, M., & Ortega, J. L. (2010). Comparing university rankings. Scientometrics, 85(1), 243–256. https://doi.org/10.1007/s11192-010-0190-z
Altbach, P. (2006). The dilemmas of rankings. International Higher Education, 42, 1–2.
Altbach, P. G. (2009). Peripheries and centers: Research universities in developing countries. Asia Pacific Education Review, 10, 15–27. https://doi.org/10.1007/s12564-009-9000-9
Altbach, P. G., & Knight, J. (2007). The internationalization of higher education: Motivations and realities. Journal of Studies in International Education, 11, 290–305. https://doi.org/10.1177/1028315307303542
Altbach, P. G., & Yudkevich, M. (2017). Twenty-first century mobility: The role of international faculty. International Higher Education, 90, 8–10. https://doi.org/10.6017/ihe.2017.90.9995
Altbach, P. G., Yudkevich, M., & Rumbley, L. E. (2015). Academic inbreeding: Local challenge, global problem. Asia Pacific Education Review, 16, 317–330. https://doi.org/10.1007/s12564-015-9391-8
Barbaric, D. V., & Jones, G. A. (2016). International faculty in Canada. In M. Yudkevich, P. G. Altbach, & L. E. Rumbley (Eds.), International faculty in higher education comparative perspectives on recruitment, integration, and impact (pp. 51–74). Routledge.
Brankovic, J., Ringel, L., & Werron, T. (2018). How rankings produce competition: The case of global university rankings. Zeitschrift Für Soziologie, 47(4), 270–288. https://doi.org/10.1515/zfsoz-2018-0118
Chang, D. (2021). Recruitment of international academics and its challenges in Taiwanese higher education institutions. In F. Huang & A. R. Welch (Eds.), International faculty in Asia (pp. 95–114). Springer.
Chen, S. (2021). Nonforeign foreign academics in Hong Kong: Realities and strategies. In F. Huang & A. R. Welch (Eds.), International faculty in Asia (pp. 33–44). Springer.
de Jonge, J. (2021). International academics at Dutch universities: Policies and statistics. In F. Huang & A. R. Welch (Eds.), International faculty in Asia (pp. 171–184). Springer.
de Wit, H. (2017). Changing rationales for the internationalization of higher education. International Higher Education, 15, 1–3. https://doi.org/10.6017/ihe.1999.15.6477
Dill, D., & Soo, M. (2005). Academic quality, league tables, and public policy: A cross-national analysis of university ranking systems. Higher Education, 49(1), 495–533. https://doi.org/10.1007/s10734-004-1746-8
Dowsett, L. (2020). Global university rankings and strategic planning: A case study of Australian institutional performance. Journal of Higher Education Policy and Management, 42(4), 478–494. https://doi.org/10.1080/1360080X.2019.1701853
Gantman, E. R. (2012). Economic, linguistic, and political factors in the scientific productivity of countries. Scientometrics, 93(3), 967–985. https://doi.org/10.1007/s11192-012-0736-3
Goodman, D. (1983). Philip II’s patronage of science and engineering. British Journal for the History of Science, 16(1), 49–66.
Gress, D. R., & Shin, J. C. (2020). Perceptual differences between expatriate faculty and senior managers regarding acculturation at a Korean university. The Social Science Journal. https://doi.org/10.1080/03623319.2020.1813863
Hazelkorn, E. (2007). The impact of league tables and ranking systems on higher education decision making. Higher Education Management and Policy, 19(2), 87–110. https://doi.org/10.1787/17269822
Horta, H., Sato, M., & Yonezawa, A. (2011). Academic inbreeding: Exploring its characteristics and rationale in Japanese universities using a qualitative perspective. Asia Pacific Education Review, 12, 35–44. https://doi.org/10.1007/s12564-010-9126-9
Hou, Y., & Jacob, W. J. (2017). What contributes more to the ranking of higher education institutions? A comparison of three world university rankings. The International Education Journal: Comparative Perspectives, 16(4), 29–46.
Huang, F. (2021). International faculty at Japanese universities: Main findings from national survey in 2017. In F. Huang & A. R. Welch (Eds.), International faculty in Asia (pp. 45–62). Singapore: Springer.
Huang, F., Daizen, T., & Kim, Y. (2019). Challenges facing international faculty at Japanese universities: Main findings from the 2017 national survey. International Journal of Educational Development. https://doi.org/10.1016/j.ijedudev.2019.102103
Huang, F., & Welch, A. R. (Eds.). (2021). International faculty in Asia. Springer.
Kashioka, T. (1982). Meiji Japan’s study abroad program: Modernizing elites and reference societies. Unpublished Doctoral Dissertation. Duke University.
Kehm, B., & Stensaker, B. (Eds.). (2009). University rankings, diversity, and the new landscape of higher education. Rotterdam: Sense Publisher.
Kim, D., & Jiang, X. (2021). Understanding international faculty in the United States: Who hires them and why? In F. Huang & A. R. Welch (Eds.), International faculty in Asia (pp. 203–233). Singapore: Springer.
Kim, D., Wolf-Wendel, L., & Twombly, S. (2011). International faculty: Experiences of academic life and productivity in US universities. The Journal of Higher Education, 82(6), 720–747. https://doi.org/10.1080/00221546.2011.11777225
Kim, S. (2016). Western faculty ‘flight risk’ at a Korean university and the complexities of internationalisation in Asian higher education. Comparative Education, 52(1), 78–90. https://doi.org/10.1080/03050068.2015.1125620
Knight, J. (2014). Three generations of crossborder higher education: New developments, issues and challenges. In B. Streitwieser (Ed.), Internationalization of higher education and global mobility (pp. 43–58). Symposium Books.
Lee, J., Liu, K., & Wu, Y. (2020). Does the Asian catch-up model of world-class universities work? Revisiting the zero-sum game of global university rankings and government policies. Educational Research for Policy and Practice, 19, 319–343. https://doi.org/10.1007/s10671-020-09261-x
Leydesdorff, L., & Bornmann, L. (2011). How fractional counting affects the impact factor: Normalization in terms of differences in citation potentials among fields of science. Journal of the American Society for Information Science and Technology, 62(2), 217–229. https://doi.org/10.1002/asi.21450
Locke, W. (2011). The institutionalization of rankings: Managing status anxiety in an increasingly marketized environment. In J. C. Shin, R. K. Toutkoushian, & U. Teichler (Eds.), University rankings: Theoretical basis, methodology and impacts on global higher education (pp. 201–228). Springer.
Locke, W., Verbik, L., Richardson, J. T. E., & King, R. (2008). Counting what is measured or measuring what counts? League tables and their impact on higher education institutions in England (Report to HEFCE). Bristol: Higher Education Funding Council for England.
Luque-Martinez, T., & Faraoni, N. (2020). Meta-ranking to position world universities. Studies in Higher Education, 45(4), 819–833. https://doi.org/10.1080/03075079.2018.1564260
Marginson, S., & van der Wende, M. (2007). To rank or to be ranked: The impact of global rankings in higher education. Journal of Studies in International Education, 11(3/4), 306–329. https://doi.org/10.1177/1028315307303544
Mok, K. (2005). The quest for world class university: Quality assurance and international benchmarking in Hong Kong. Quality Assurance in Education, 13(4), 277–304. https://doi.org/10.1108/09684880510626575
OECD. (2017). Science, technology and industry scoreboard 2017. Retrieved from https://www.oecd.org/sti/oecd-science-technology-and-industry-scoreboard-20725345.htm
Pietrucha, J. (2018). Country-specific determinants of world university rankings. Scientometrics, 114, 1129–1139. https://doi.org/10.1007/s11192-017-2634-1
Pustelnikovaite, T. (2021). Locked out, locked in and stuck: Exploring migrant academics’ experiences of moving to the UK. Higher Education, 82(4), 783–797. https://doi.org/10.1007/s10734-020-00640-0
QS. (2021). Data definitions. Retrieved from https://support.qs.com/hc/en-gb/sections/360005980860-Data-Definitions
Safon, V. (2019). Inter-ranking reputational effects: An analysis of the Academic Ranking of World Universities (ARWU) and the Times Higher Education World University Rankings (THE) reputational relationship. Scientometrics, 121, 897–915. https://doi.org/10.1007/s11192-019-03214-9
Safon, V., & Docampo, D. (2021). Analyzing the impact of reputational bias on global university rankings based on objective research performance data: The case of the Shanghai Ranking (ARWU). Scientometrics, 125, 2199–2227. https://doi.org/10.1007/s11192-020-03722-z
Sautier, M. (2021). Move or perish? Sticky mobilities in the Swiss academic context. Higher Education, 82, 799–822. https://doi.org/10.1007/s10734-021-00722-7
Saxenian, A. (2005). From brain drain to brain circulation: Transnational communities and regional upgrading in India and China. Studies in Comparative International Development, 40(2), 35–61. https://doi.org/10.1007/BF02686293
Sheehan, B. A., & Welch, A. R. (1996). The academic profession in Australia. Canberra: DEETYA, AGPS.
Shin, J. C. (2021). International faculty in a research-focused university in South Korea: Cultural and environmental barriers. In F. Huang & A. R. Welch (Eds.), International faculty in Asia (pp. 63–78). Springer.
Shin, J. C., Jung, J., & Lee, S. J. (2016). Academic inbreeding of Korean professors: Academic training, networks, and their performance. In J. F. Galaz-Fontes, A. Arimoto, U. Teichler, & J. Brennan (Eds.), Biographies and careers throughout academic life (pp. 187–206). Springer.
Shin, J. C., Kehm, B. M., & Jones, G. A. (2018). The Increasing importance, growth, and evolution of doctoral education. In J. C. Shin, B. M. Kehm, & G. A. Jones (Eds.), Doctoral education for the knowledge society: Convergence or divergence in national approaches? (pp. 1–10). Springer.
Shin, J. C., Toutkoushian, R. K., & Teichler, U. (Eds.). (2011). University rankings: Theoretical basis, methodology and impacts on global higher education. Springer, Netherlands.
Teichler, U. (2011). Social contexts and systemic consequences of university rankings: A meta analysis of the ranking literature. In J. C. Shin, R. K. Toutkoushian, & U. Teichler (Eds.), University rankings: Theoretical basis, methodology and impacts on global higher education (pp. 55–72). Springer.
Teichler, U. (2015). Academic mobility and migration: What we know and what we do not know. European Review, 23(S1), 6–37. https://doi.org/10.1017/S1062798714000787
Teichler, U., Arimoto, A., & Cummings, W. K. (2013). The changing academic profession. Springer.
THE. (2021). Methodology for overall and subject rankings for the Times Higher Education world university rankings 2022. Retrieved from https://www.timeshighereducation.com/world-university-rankings/world-university-rankings-2022-methodology
Trembath, J. L. (2016). The professional lives of expatriate academics: Construct clarity and implications for expatriate management in higher education. Journal of Global Mobility, 4(2), 112–130. https://doi.org/10.1108/JGM-04-2015-0012
Tung, R. L. (2008). Brain circulation, diaspora, and international competitiveness. European Management Journal, 26, 298–304. https://doi.org/10.1016/j.emj.2008.03.005
US National Science Foundation. (2017). Doctoral recipients from U.S. universities: 2015. Author. Retrieved from https://www.nsf.gov/statistics/2017/nsf17306/
US Immigration Office. (2021). Citizenship vs. nationality: What’s the difference? Retrieved from https://www.immi-usa.com/citizenship-vs-nationality/
van Raan, A., van Leeuwen, T., & Visser, M. (2011). Severe language effect in university rankings: Particularly Germany and France are wronged in citation-based rankings. Scientometrics, 88, 495–498. https://doi.org/10.1007/s11192-011-0382-1
Vosse, W. M. (2019). Reviving Japan through internationalization of higher education: Is there a “New Meiji”? In K. Coates, K. Hara, C. Holroyd, & M. Söderberg (Eds.), Japan’s future and a new Meiji transformation: International reflections (pp. 154–167). Routledge.
Waltman, L., Van Eck, N. J., Van Leeuwen, T. N., Visser, M. S., & Van Raan, A. F. J. (2011). Towards a new crown indicator: Some theoretical considerations. Journal of Informetrics, 5(1), 37–47. https://doi.org/10.1016/j.joi.2010.08.001
Welch, A. (2021). International academics in Australian higher education: People, process, paradox. In F. Huang & A. R. Welch (Eds.), International faculty in Asia (pp. 115–134). Singapore: Springer.
Yudkevich, M., Altbach, P. J., & Rumbley, L. (Eds.). (2017). International faculty in higher education: Comparative perspectives on recruitment, integration, and impact. Routledge.
Ethics declarations
Conflict of interest
The authors declare that they have no known competing financial interests or personal relationships that could have influenced the work reported in this paper.
Appendices
Appendix 1: National data sources on international faculty at the national level
Country | International faculty member rate (%) | Data year | Data source
---|---|---|---
Australia | 38.1 | 2005 | Oishi, N. (2017). Workforce Diversity in Higher Education. The University of Melbourne |
Brazil | 1.5 | 2012 | Yudkevich, M., Altbach, P.G., & Rumbley, L. E. (2016). International Faculty in Higher Education Comparative Perspectives on Recruitment, Integration, and Impact. NY: Routledge |
Canada | 40.8 | 2006 | Canadian Association of University Teachers (2014). CAUT almanac of post-secondary education in Canada 2013–2014. Ottawa, Canada: Canadian Association of University Teachers |
China | 2.3 | 2013 | Huang, F., & Welch, A. R. (2021). International Faculty in Asia. Singapore: Springer |
Finland | 8 | 2015 | Siekkinen, T., Pekkola, E., & Carvalho, T. (2020). Change and continuity in the academic profession: Finnish universities as living labs. Higher Education, 79, 533–551 |
Germany | 6.6 | 2014 | Yudkevich, M., Altbach, P.G., & Rumbley, L. E. (2016). International Faculty in Higher Education Comparative Perspectives on Recruitment, Integration, and Impact. NY: Routledge |
Hong Kong | 58.6 | 2015–2016 | Huang, F., & Welch, A. R. (2021). International Faculty in Asia. Singapore: Springer |
Italy | 2 | – | David, M. E., & Amey, M. J. (2020). The SAGE Encyclopedia of Higher Education |
Japan | 3.4 | 2008 | Huang, F., & Welch, A. R. (2021). International Faculty in Asia. Singapore: Springer |
Malaysia | 8.3 | 2013 | Huang, F., & Welch, A. R. (2021). International Faculty in Asia. Singapore: Springer |
Mexico | 5 | 2007 | Yudkevich, M., Altbach, P.G., & Rumbley, L. E. (2016). International Faculty in Higher Education Comparative Perspectives on Recruitment, Integration, and Impact. NY: Routledge |
Netherlands | 34 | 2016 | Huang, F., & Welch, A. R. (2021). International Faculty in Asia. Singapore: Springer |
Norway | 18 | 2007 | 29% of scientific positions held by foreign researchers (universityworldnews.com). Retrieved from https://www.universityworldnews.com/post.php?story=20210219110006289 |
Portugal | 4.2 | 2010–2011 | Huang, F., & Welch, A. R. (2021). International Faculty in Asia. Singapore: Springer |
South Africa | 8.9 | 2010 | Yudkevich, M., Altbach, P.G., & Rumbley, L. E. (2016). International Faculty in Higher Education Comparative Perspectives on Recruitment, Integration, and Impact. NY: Routledge |
South Korea | 4.7 | 2008 | KEDI (2010). Brief Statistics on Korean Education. MOE, KEDI |
UK | 32.1 | 2014–2015 | Data source link: https://www.hesa.ac.uk/data-and-analysis/staff/table-27 |
USA | 15.8 | 2009–2010 | SOURCE: U.S. Department of Education, National Center for Education Statistics, Integrated Postsecondary Education Data System (IPEDS), Winter 2009–10, Human Resources component, Fall |
Appendix 2: Correlations across different measures of international outlook
 | THE 2011–2012 international outlook score | THE 2011–2012 international outlook rank | THE 2021 international outlook score | THE 2021 international outlook rank
---|---|---|---|---
THE 2011–2012 international outlook score | 1.000 | | |
THE 2011–2012 international outlook rank | − 0.996*** | 1.000 | |
THE 2021 international outlook score | 0.894*** | − 0.897*** | 1.000 |
THE 2021 international outlook rank | − 0.898*** | 0.898*** | − 0.996*** | 1.000