Abstract
This paper introduces the design and implementation of a comparative survey research project,
Keywords
Introduction
This article presents a case study of the challenges and potential benefits, from development and design to implementation and analysis, of conducting cross-cultural research. The focus is
The Context for amistem Implementation: Survey Research in Chinese Higher Education Literature, 1990 to 2012
During the past two decades, scholars in China have increasingly used survey methodology to study dimensions of teaching and learning in higher education. To understand this research context and the challenges it may suggest for cross-national projects like
Themes and Content Areas of Survey Research on Student Learning
The psychological state of college students, language and distance learning, employment issues, and college students’ opinions on current educational policies are the main areas of investigation in our sample. In the domain of the psychological state and development of college students, several research papers focus on the motivations and aspirations of academic success among college students, based on students’ self-reported data. For example, one research paper draws on survey data from 2,007 students in 13 universities to investigate the relationship between procrastination and academic success. 2 The researchers studied the characteristics and factors related to types of procrastination, as well as the relationship of procrastination to students’ learning attitudes, aptitude and motivation, and to characteristics such as the reputation of the university, student year in college, gender and college entrance examination score. Funded by Ministry of Education social science grants, research projects like this one have the capacity to sample students from various institutions, yielding significant results that take into account both institutional and individual characteristics. Self-reported survey data prove an effective choice for studying students’ psychological states and development, which lie in the subjective domain of human development, and allow researchers to identify in students’ responses what may be uniquely Chinese characteristics.
A large portion of our sample projects fall into the domain of psychology and language learning. Questions studied include student suicide and psychological pressure, sometimes with a focus on specific groups of students. 3, 4, 5 In the domain of teaching and learning, distance education and student learning outcomes and satisfaction are prominent research topics. Most online surveys address issues such as why respondents chose distance education or what aspects of distance education attract participants, rather than probing deeply into the processes of teaching and learning. In other words, the focus of the projects is whether and in what areas distance education is needed, rather than how to improve learning outcomes through distance education, which is a crucial area that calls for more research.
Over 30% of the survey research papers in our pool are designed to respond to contemporary social and educational issues and policies at the national level. For example, during the past two decades, the Chinese government initiated state-aided student loan policies or need-based tuition remission policies for students from low-income families. Some survey-based longitudinal research projects traced the influence of these policies, not only in relation to learning outcomes, but also in order to investigate the social-psychological dimensions of students’ development under these policies. 6 Survey research projects in this category provide important empirical evidence for policy-making and implementation processes at both the national and institutional levels, and in so doing have the potential to further the process of creating, critically assessing and communicating “information that is useful in understanding and improving policies.” 7
In contrast to research on national level policies, just over 10% of the papers focus on institutional policy and administrative reforms, ranging from cafeteria renovation to the privatization of dormitory management. Such studies provide important information on student opinions on university reforms, and as such reflect one of the most important purposes of research, that is, to inform practice, especially institutional change.
Student employment is another area to which survey research notably contributes. Most of the surveys in our sample take the form of voluntary online surveys across regions and institutions, with the surveyed population being college students, especially seniors. Questions include categories of career aspiration, relationship between college education and career development, and satisfaction level regarding career development services. Interesting examples from some surveys ask students to rate the level of importance of factors influencing student success on the job market, including academic achievement, family background, appearance (professional image), and overall presentation. These options clearly parallel distinct patterns of hiring priorities and practices in contemporary China. 8, 9, 10
Notably, survey research as a methodology has contributed to our understanding of the processes of teaching and learning, especially from 1990 to 2000, when collecting teaching evaluations was not a common practice in Chinese universities. Surveys in this category serve a significant function of providing systematic feedback on teaching, although most findings in these projects are not generalizable.
Concerns and Challenges of Survey Research
Review of the studies in our sample brought into sharper focus several issues that could potentially affect the validity and credibility of research findings. First, given constraints of funding, some research projects generated findings from a small sample size, weakening the validity of the statistical models used. In some papers, convenience samples came from only one university or only a few classes. Survey research within a single institution does not have the capacity to generate findings that are applicable to a larger population, although qualitative research at one institution can provide meaningful findings on teaching and learning, serve as a springboard for survey questions, and test taken-for-granted assumptions about teaching, learning, and student and faculty experiences.
Meanwhile, in most projects that drew samples from multiple higher education institutions, participants were from tier one universities and did not represent the diversity of Chinese higher education. We expect that researchers will turn more of their attention to student populations in second and third tier universities, whose experiences and learning outcomes are critical for assessing pedagogical and policy reforms.
Additionally, in most projects in our sample researchers considered student age, grade and major as analytical categories when examining correlations with individual learner characteristics. Adding family background to such studies, including socio-economic status, parents’ professions, and geographical origin, could more comprehensively reflect individual dimensions crucial to understanding student learning. On the other hand, some surveys in our data pool include an item asking participants to choose whether they are from urban or rural areas. Questions such as this one may be problematic in projects with a small sample size, in which students’ identities are difficult to protect sufficiently. The question of whether and when to include items referencing student background raises important ethical concerns for broad discussion.
Our sample of studies also raise a question of the relationship between the researcher and the researched. To the best of our knowledge, there is no equivalent structure and process in China to what
Finally, two critical issues emerged from our overall analysis of studies in our pool with direct application to our implementation of
Discussion and Reflection
Through our review, we identified significant survey research projects on learner characteristics, attitudes and aptitude, and how these impact student learning outcomes, topics and findings relevant to the
Some of this type of survey research has been inspired by practical issues identified in the teaching and learning process, whereas a majority of the projects started with certain theoretical models or established surveys in the U.S., U.K., or Germany, but with significant revision to better “fit” the Chinese student population and context. Survey research in this category has the potential to contribute significantly to learning theories by bringing data from the Chinese student population into transnational research. Providing the structures and opportunities for cross-cultural collaboration and data sharing is one of the key goals of our
Gaining Access for amistem: The Challenges of Collecting Survey Data in Cross-National Contexts
In the
In an effort to understand the nature of experiences that generate and maintain interest in
To facilitate participation we decided that for the first “foreign” stages of the study, it made sense to work with institutions where we already had contacts. Thus, in this large-scale “pilot” we sought universities in China and Australia where our School of Education or University had established collaborative research agreements. The rest of this section outlines our experiences in inviting and convincing those universities and students to participate in the study. Our intent is to share our experiences with those who might want to conduct similar work in cross-cultural settings.
Getting Noticed
In order to access students at universities we needed some point of entry. In the
Survey Fatigue
Once we are able to get the attention of an administrator and solicit their participation, the most common response we hear is: “We’re interested in the study and the results, but our students are already over-surveyed and thus we shouldn’t participate and make things worse.” This is akin to the
While we generally sense that smaller institutions are less likely to give us a NAMI response and more likely to be interested in participating, this is not absolute. For conducting survey research, this situation can make it extremely difficult to create any sort of large dataset.
Thankfully, some institutions seem to be forward-thinking and have figured out a solution. We have come across a few schools that are attempting to mitigate the potential for survey fatigue by randomly assigning students to groups each year and placing a quota on the total number of surveys each group (or student) can be solicited for. This means that a given student sees no more than a handful of survey requests each year, but that his/her peers may get requests for different surveys based on the sampling. Researchers request access to one of the sets, and once the sets are assigned for a semester, no more solicitations will be entertained. On the positive side, this can provide access to a randomly selected sample that may have been off limits previously. Conversely, this approach provides access to a sample that may not match the ideal sample. Based on the prevalence of fatigue, we think many of the larger higher education institutions will likely adopt a quota system to control the number of studies students are asked to participate in.
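The quota mechanism described above is simple to sketch in code. The following is a minimal illustration under our own assumptions (the function names and group logic are ours, not any institution's actual system): students are shuffled into groups once per term, and each researcher request consumes one unclaimed group until the term's quota is exhausted.

```python
import random

def assign_survey_groups(student_ids, n_groups, seed=None):
    """Randomly partition students into near-equal survey groups for the term."""
    rng = random.Random(seed)
    ids = list(student_ids)
    rng.shuffle(ids)
    # Deal the shuffled IDs round-robin into n_groups buckets.
    return [ids[i::n_groups] for i in range(n_groups)]

def grant_next_group(groups, granted):
    """Give a requesting researcher the next unclaimed group.

    Returns None once every group has been granted, i.e. no more
    solicitations are entertained this term.
    """
    for idx, members in enumerate(groups):
        if idx not in granted:
            granted.add(idx)
            return members
    return None
```

With, say, 20,000 students and a quota of four studies per term, each student would see at most one of the four solicitations distributed that term.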
Scheduling
Even if we get the opportunity to speak with administrators and they are willing to have their school participate in the study, another hurdle to overcome can be scheduling. Our first two attempts to survey schools in the
As with the quota system mentioned above, some schools have figured out that centralized control of on-campus survey efforts is important. It seems that this task generally falls to offices of institutional research, where they exist. During one visit, such a manager unfolded a large schedule on which all of the survey efforts at his institution were marked. There was not much “space” on the schedule, and he was concerned about the amount of surveying going on that he was unaware of. While this approach may limit collecting data at the same time across institutions, the ability to get in the official queue probably means better chances of success.
Ethics Review
When the stars have aligned, and a university is willing to participate and has an opening in their schedule, the next hurdle may be passing their own ethics review or approval for human subjects research. A lengthy discussion of the need to have human subjects research approved in the
Conducting research internationally adds another layer of complexity.
Building a Sample
In our study we are collecting data from current undergraduate and graduate students, researchers and faculty/academic staff. When schools agree to participate, we need to discuss how a sample of participants will be created. Our primary goal is to collect data from a variety of institutions, so we do our best to be flexible and work with the preferences of each school. What this means is that we are willing to accept a range of sampling options. At one end of the spectrum, the ideal end, schools provide us with contact information for students and faculty in our selected disciplines and we manage the distribution of all contacts. This gives us the opportunity to personalize solicitations, which has been shown to improve participation, and to calculate accurate response rates. Some institutions are willing to provide us with names to conduct a census of all eligible members; others prefer to select a random sample of individuals. Schools generally understand our desire to have access to this information, but they also have concerns about sharing contact details. Thus, we also include a set of schools that do not provide us with contact information but instead distribute a general survey link to students and staff on our behalf.
Spam Filters
When all approvals are granted and someone at the participating institution has agreed to provide contact information, it is possible to move on to soliciting participants. There is a rich literature on the best methods for soliciting survey participants, but due to budget constraints and pragmatism, email is the only logical method for our study. While it is possible to automate distribution of such a large volume of messages, coordinating the sending of initial solicitations and reminders is no trivial task. Our survey software company (Qualtrics) manages distribution of messages, which is quite helpful. These messages are handled on the company’s servers, with the option to specify a Reply-to address. Because these servers send out enormous volumes of messages each day, the company has made efforts to have their servers ‘whitelisted’ by many organizations so that emails originating from them are not blocked by institutional servers or email programs. Qualtrics now reports the number of undeliverable messages that “bounced,” which helps users make more robust estimates of response rate and sampling bias. The whitelisting is definitely helpful in the U.S., but it is of questionable value internationally.
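Bounce counts like these feed directly into response-rate estimates. As a hedged sketch (our own arithmetic, not a Qualtrics feature), undeliverable invitations are removed from the denominator before the rate is computed:

```python
def adjusted_response_rate(invited, bounced, completed):
    """Response rate among invitations that were presumably delivered."""
    delivered = invited - bounced
    if delivered <= 0:
        raise ValueError("no deliverable invitations")
    return completed / delivered

# Hypothetical numbers: 1,000 invitations, 120 bounces, 220 completed surveys.
rate = adjusted_response_rate(1000, 120, 220)  # 220 / 880 = 0.25
```

Without the bounce report, the same 220 completions would look like a 22% rate over 1,000 invitations, understating actual participation among reachable respondents.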
In the
Collecting Data
Once you’ve wended your way through the myriad obstacles mentioned previously, the final hurdle involves getting individuals to complete the survey you spent so much time developing (and maybe translating). When we first initiated discussions with colleagues in China and told them about our plans to conduct an online survey, we were commonly told that the norm in China was paper surveys and not online surveys. Many told us that it would be best if we printed the survey and then had faculty tell their students to complete the questionnaire. While this goes against the requirement of voluntary completion of surveys outlined in our IRB, we were assured that we would get a very high response rate if we adopted this practice. The realities of trying to distribute and collect large numbers of paper surveys in China were logistically and financially daunting. Despite these warnings, we forged ahead with the plan to collect surveys online.
Before we went forward with the survey distribution, a member of the project team (Adam) met with a former mentor who is a methodologist and Dean of a Faculty of Education in China. We discussed the plans for the project and, as if on cue, he commented that paper surveys are de rigueur for collecting data in China. After discussing the difficulties with our approach, we came up with a plan whereby, if online data collection faltered, we would change course and use a short paper version of the survey that students and faculty could complete quickly in a classroom or office environment. This short version would include core survey items and a space where respondents could provide their email contact if they were open to completing more items. We piloted this version during the summer of 2013, and of the 50 students who completed the survey, approximately 45% offered their contact information.
Improving Participation
In our most recent round of surveying in the
Evaluation of the results indicated that there were small positive gains in response based on incentive. We measured response in two ways: survey initiation and the proportion of survey items completed. The most notable difference in initiation across incentive amounts and respondent status (i.e., undergraduate, graduate, faculty) was during the initial solicitation, with small but noticeable improvements in initiation rate. This occurred for both undergraduates and graduate students, but not for faculty. When evaluating the proportion of items completed, there was a modest gain in completion between No Incentive and any incentive across all groups. Interestingly, there was a general increase across levels, with undergraduates completing about 75% of the potential items, graduate students completing just over 80% and faculty averaging greater than 85%. For faculty, there was a small trend where higher incentives were associated with more items completed.
We decided to conduct a similar experiment in China, but based on the prior results we excluded the No Incentive group from the study and adjusted the incentives for relative value. We found no substantive difference in survey initiation (8%) or completion (6%) across groups (RMB 40, 80, 120 and 160). While our results indicate a minimal gain for including an incentive, the gain is likely not worth the effort.
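One conventional way to check whether initiation rates differ across incentive groups is a two-proportion z-test. The sketch below uses plain Python and hypothetical counts (illustrative only, not our actual data):

```python
import math

def two_proportion_z(hits_a, n_a, hits_b, n_b):
    """z-statistic for the difference between two independent proportions."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    # Pooled proportion under the null hypothesis of no difference.
    p_pool = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical: 40/500 initiations in one incentive group vs. 45/500 in another.
z = two_proportion_z(40, 500, 45, 500)
# |z| well below 1.96, so no significant difference at the 5% level.
```

A |z| below about 1.96 is consistent with the kind of "no substantive difference" result we observed across the RMB incentive levels.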
While we planned to do a similar experiment in Australia to determine if results would hold, we discovered an extra hurdle that dissuaded us from doing so. In completing the paperwork for Ethics approval, we determined that any form of incentive would require us to obtain a permit from the Australian Capital Territory to administer the incentive draw for the survey. Given the lack of major gains in response tied to incentives, and the external permits necessary, we decided to forego running the experiment in Australia.
One other major issue came up with our consideration of improving participation. While in the
Translation and Adaptation of stem Instruments
In addition to the steps to successful survey distribution above, translation plays a central role in multilingual survey projects and goes hand-in-hand with survey adaptation. 20 Below we provide detailed illustrations of the process of translating and adapting the English survey questionnaire into Chinese. We raise challenges that emerged from translation and adaptation and explain the measures we took to ensure the accuracy and comparability of the Chinese version of the survey. Such measures demanded of our research team a strong commitment to self-reflexivity, that is, attending to how and what we learn when we compare and think across cultures.
Envisioning Cross-Cultural Survey Research as a Scholarship of Mutual Engagement 21
Team Translation Approach
Informed by the literature on cross-cultural survey translation, 22, 23 we adopted a team translation approach that is particularly useful in addressing the challenges of cross-cultural translation. Our team initially consisted of six bilingual Chinese doctoral and master’s students, with one student specializing in Science Education and the rest specializing in Comparative and International Education. All team members were female. At a later stage of survey refinement, when the team worked on revisions to address issues emerging from the pilot survey conducted at three Chinese universities, a Chinese male doctoral student majoring in Science Education joined the team. The students came from different provinces of China, including one student from Taiwan. Before studying at Indiana University, team members majored in diverse undergraduate and master’s programs, including Education, English Literature and Nursing. One team member obtained her master’s degree in Chinese Language and Literature from Singapore. Team members also had various levels of experience in survey design, translation and implementation. The intention in creating a team of such diversity was that its mix of skills, experience and disciplinary expertise could produce an optimal survey version. Dr. Adam Maltese, the survey’s creator, oversaw the team’s work and served as a consultant.
The team translation procedure was time-consuming, complex, and iterative. Two translators worked independently to produce two versions of the Chinese translation. Two reviewers were appointed based on their previous experience with survey research. Comparing the two versions, each reviewer marked items on which the two translators largely disagreed, as well as items on which she disagreed with both translators. The marked items were discussed by the whole team in order to produce the most accurate translation possible. Based on group discussions, some survey items were revised, deleted, or added to reflect Chinese social and educational contexts. We also purposely kept some items for multi-national comparisons and to capture the phenomenon of increasing student mobility, experience, and understandings of teaching and learning around the globe. 24
Challenges of and Issues in Translation and Adaptation
Language-Driven Adaptation
The first challenge we faced in survey translation was whether we should translate the acronym
Socio-Cultural Adaptation
Socio-cultural adaptation of cross-cultural surveys is generally more difficult than translation involving linguistic expression. Although globalization has facilitated the interaction among nations, each society still has unique political, social, cultural and educational systems. Accordingly, cultural practices in one society might not have an equivalent in another society. The following examples illustrate the challenges the team faced when adapting English survey questions to fit the Chinese cultural context, as well as the approaches the team adopted to solve those challenges.
First, clear differences between the American and Chinese higher educational systems are reflected in professional ranking systems, flexibility in choice of majors by students, and teaching styles. The professional ranking system in
The team also revised some questions to reflect the influence of the Chinese higher education system on students’ choices and teaching styles. For example, Chinese high school seniors are required to select majors when they fill out their college application forms. Switching majors is also relatively difficult at Chinese universities and colleges, although a few universities have begun to implement policies that allow a small number of students to change their major field of study. For most students who intend to change their majors, the best time for doing so is when they apply for graduate school. Therefore, the question in the English language survey, “As an undergraduate how aware are/were you of the job market for employment related to the discipline of your intended major?” might cause confusion among Chinese students, some of whom might think they were being asked whether they understand the job market of their desired majors. After discussion, the team changed “the intended major” to “current major” and translated it accordingly. Similarly, we deleted the word “clickers” in a question that asks whether teachers use quizzes or clickers to provide students with immediate feedback in class, since clicker devices have not been commonly used in lecture halls in Chinese universities and colleges.
Second, we adapted and added some questions to reflect the social structure of Chinese society. When translating demographic questions, we deleted a question that asks respondents’ race, because race has not been a salient issue in Chinese demographic categorization systems and general public perceptions of difference. At the same time, we added a question asking respondents which provinces they are from since there are salient differences in economic and educational development among Chinese provinces and regions.
Meanwhile, cross-cultural translation requires translators to have a deep knowledge of cultural practices in both societies. Although all translators that made up our team have studied in the
Keeping Items for Multinational Comparison
Because
Handling Sensitive Issues
A few questions in the last part of the survey, which is an optional section, ask students about their sexual orientations and the gender of their parents. Because LGBT issues have not been openly discussed at most Chinese universities and colleges, some team members questioned whether the questions should be included in the Chinese version. The team decided to keep the questions since the survey would be conducted primarily online, which gives students privacy to answer such questions. Preliminary analysis of the data collected from the pilot survey conducted in three Chinese universities showed that among 104 respondents who answered the sexual orientation question, 13 students indicated that they did not consider themselves “straight (不是男女同性恋).” 26 In addition, when translating the question: “Were both of your parents/guardians the same gender?” the team encountered a similar challenge. “Parents” in Chinese, “fu mu” (父母), already implies two genders: “father and mother”; a literal translation, that is, whether “father and mother” have the same gender, is obviously awkward. However, no other colloquial word or phrase in Chinese replaces “fu mu” in a gender-neutral way.
Revisions after the Pilot Survey: The Phenomenon of Moderatism
Survey translation, adaptation and modification is perhaps above all an iterative process, 27 as illustrated by one of the more intriguing discussions among the survey team about what we now call the phenomenon of “moderatism.” After the pilot survey, a new doctoral student joined the survey team and enriched the team’s discussion and understanding of the art and science of translation with his skills and expertise. Based on preliminary analyses of the results of the pilot survey, he raised concerns about the appropriateness of the translations for two statements: “You see yourself as a math person” and “You see yourself as a science person.” He pointed out that the current translations, “您认为您是数学达人” and “您认为您是科学达人,” might confuse respondents, leading them to believe they were being asked whether they were math or science geniuses. Chinese cultural norms privileging modesty may prompt respondents to give negative answers to the two statements, even if they indeed are math or science persons. The significant difference in the proportion of respondents who chose “no” (strongly disagree and disagree) to the two statements between those in China and in the
The translation process illustrated above demonstrates that a cross-cultural and diverse team approach to survey translation and adaptation is an effective way to avoid bias and achieve accuracy. Also, bringing new members into the team after a pilot survey will facilitate the revision process, because translators close to the project may not be able to identify subtle translation errors after working on the survey for such a long time. When reflecting on the translation process, Dr. Maltese stressed that a high trust relationship built between translators and the survey designer is critical to ensure genuine communication between the two sides, especially in the case where the survey designer has no direct or deep knowledge of the target language and culture.
In the end, we did not anticipate the extent to which members of the translation team would learn about each other’s educational systems and what is taken for granted when we speak of teacher, learner, and classroom. The extent to which the translation team would provide participants with a significant experience in becoming globally-aware and culturally-sensitive scholars was one of the happy unintended consequences of this study.
Conclusion
Our primary purpose in this paper has been to introduce and share our experiences in preparing for and implementing the
As implied in parts two and three of this paper, it is perhaps ironic that multi-national and cross-cultural survey research is becoming more difficult, time-consuming and costly to conduct in the current technological environment, in which it can be so easy to get surveys in front of potential respondents. For example, each of the project phases discussed above, from translation to getting noticed to sampling, adds to the total time required to gain access and collect data. There are many points along the research chain where a single person determines whether or not the effort can proceed. Nevertheless, it is still possible to conduct such work. In fact, it is essential for work like this to proceed to better inform institutions and policymakers about the myriad experiences of diverse students and faculty and to lay a foundation for greater collaboration on transnational and comparative survey research.
Footnotes
* Authors are listed alphabetically as all contributed equally to this manuscript.
1 . Accessed on September 2, 2010.
10 M. Zhang, Su, B., Li, J., Jiang, L., Chen, M., Li, X., and Gu, Q., “New Perspectives into the Employment Issues of College Students: Survey on psychological stage and the employment realities in Guangzhou in the year ,” Higher Education Exploration (Gaojiao Tansuo), no. 3 (2004): 88-90.
13 B. E. Woolnough, Guo, Y., Leite, M. S., Almeida, M. J. d., Ryu, T., Wang, Z. and Young, D., “Factors Affecting Student Choice of Career in Science and Engineering: parallel studies in Australia, Canada, China, England, Japan and Portugal,” Research in Science & Technological Education 15, no. 1 ( ): 105-121. doi: 10.1080/0263514970150108.
15 Sara Lipka, “Want data? Ask students. Again and again,” The Chronicle of Higher Education, August 7, 2011, .
16 S. R. Porter, Whitcomb, M. E. and Weitzer, W. H., “Multiple surveys of students and survey fatigue,” New Directions for Institutional Research, no. 121 (2004): 63-73. doi: 10.1002/ir.101.
21 We want to thank the contributions of Dr. Yuhao Cen, Assistant Professor, Shanghai Jiaotong University, and Ms. Lei Wang, doctoral candidate, Indiana University, for their assistance in summarizing the translation and adaptation challenges of these projects.
23 Harkness, “Comparative.”
25 Result from the preliminary analysis on the Chinese pilot survey shows that 6 respondents chose the option of “assistant professor” and 18 respondents chose “lecturer.”
26 Among the 13 students who answered that they did not consider themselves “straight,” 1 student indicated that he/she is gay/lesbian (男/女同性恋), 7 indicated that they are bisexual (双性恋), 3 indicated that they are something else (其他), and 2 indicated that they don’t know their sexual orientation (我不知道).
27 Harkness, “Comparative survey research: goal and challenges.”
