Abstract
The purpose of this paper is to provide information that can benefit Higher Education Institutions (HEIs), which operate in a highly competitive environment. Understanding students’ perspectives on quality in higher education, and their areas of dissatisfaction, can redirect an HEI’s strategy to address these concerns. The present study seeks to clarify how students define quality. A business student survey was conducted to understand students’ expectations, satisfaction, and dissatisfaction with their current educational environment. The analysis systematically explores students’ dissatisfaction by categorizing qualitative data using a modified version of the seven dimensions of service quality introduced by Evans and Lindsay. The level of detail provided in this report will assist in developing effective processes to improve student satisfaction at the university. The results indicate that improvements in the completeness of the educational experience, in both classroom and administrative services, together with improvements in the accessibility and convenience of academics and services, will have the greatest impact on student satisfaction. Completeness addresses the quality of learning materials and services; accessibility and convenience address the ease of access to those materials and services. This paper expands the definition of quality in higher education, focusing on student dissatisfaction. The classification of student feedback provides a unique perspective. The limitations of the study include the response rate, area of study, geographic area, and learning modality. Tests of validity were not applied to the seven dimensions of service quality due to the exploratory nature of this study.
Introduction
Service quality directly impacts the success or failure of an organization. Parasuraman et al. (1988) indicate that quality and customer satisfaction are directly linked. Institutions of higher education must be aware of the quality of their educational experience. Quality is one factor affecting enrollment, and with the expected decline in enrollment for HEIs, quality becomes especially important. According to Barshay (2018), the declining birthrate after the 2008 financial crisis is predicted to severely impact HEI enrollment, particularly in regions such as the Midwest. National undergraduate HEI enrollment for Fall 2021 was 3.1% lower than Fall 2020, and Fall 2020 was 3.6% lower than Fall 2019 (National Student Clearinghouse Research Center, 2021). Similarly, Spring 2022 undergraduate enrollment was 4.7% lower than Spring 2021, and Spring 2021 was 4.9% lower than Spring 2020 (National Student Clearinghouse Research Center, 2022). Looking ahead, however, the National Center for Education Statistics (2022) expects undergraduate enrollment to increase 8% between Fall 2020 and Fall 2030.
Recognizing that students are clients of educational institutions (Ravindran & Kalpana, 2012) requires understanding what students see as important. This understanding may provide a strategic advantage for HEIs. Oja (2011) shows that satisfied students perform better and are more likely to continue their education. Student satisfaction is also recognized as key to institutional image, given that displeased students can create a negative image for an HEI (Fitzpatrick et al., 2012). However, Abbas (2020) notes that one key concern when addressing quality in HEIs is identifying characteristics of quality specific to HEIs. It is important to seek out the student perspective to avoid the pitfall of working on non-issues that do not affect students’ level of satisfaction. Smeds (2022) refers to this as tampering: taking action that does not address the fundamental cause of the problem.
Support for improving the quality of the educational experience is provided by accreditation organizations such as the Higher Learning Commission and AACSB. These organizations have standards in place that must be met by institutions of higher education to earn and to maintain accreditation. For example, institutions may have to provide evidence that faculty are conducting quality research and publishing, and that assessment and curriculum processes are in place to improve student learning.
This paper explores the results of a survey conducted by a business school at a Midwest regional university that, at the time, was seeking AACSB accreditation (it has since been accredited) and facing a projected decline in student enrollment. The survey was disseminated to all undergraduate and graduate students enrolled in the business school to better understand their concerns and to provide the school with direction for strategic planning. Students provided feedback on areas of satisfaction and dissatisfaction with their educational experience. Using a modified version of the seven dimensions of service quality defined by Evans and Lindsay (1996), the authors categorize the areas of dissatisfaction expressed in the open-ended survey questions. These dimensions allow qualitative data to be categorized. The seven dimensions are (a) completeness, (b) time and timeliness, (c) courtesy, (d) consistency, (e) accessibility and convenience, (f) accuracy, and (g) responsiveness. The survey captures the voice of the customer: the student engaged in acquiring a higher education and accessing the services of the university.
The results presented in this paper identify areas for improvement in administrative processes and academic services. Several key areas for improvement in both academic services and instruction are identified. The survey reveals that students expect a relevant, current education that is available through multiple modes of instruction. Moreover, students want easy access to and awareness of academic services, internships, and career opportunities.
The rest of this paper includes a background discussion, methodology, findings, conclusions, limitations, and future research. The background section focuses on quality in institutions of higher education, measuring service quality, and quality improvement in HEIs. The methodology section presents the student survey and the analysis of responses to the open-ended questions. The findings, conclusions, and limitations sections discuss the results and limitations of the study. Finally, the authors discuss future research.
Background
Quality in Institutions of Higher Education
Quality in products and services is essential in today’s global environment, since providing a poor-quality product or service erodes a firm’s competitive advantage. This also applies in the competitive environment facing HEIs, where the cost of poor quality is an inability to retain and attract students. Recent literature confirms that service quality influences satisfaction, which in turn influences one’s decision to continue one’s education at the HEI in question (Fernandes et al., 2013; Mihanovic et al., 2016; Teeroovengadum et al., 2019). Arambewela and Hall (2013) show that the quality of services provided by an HEI influences student satisfaction. Additionally, studies have shown that students’ perception of quality impacts student satisfaction, which in turn impacts student retention (Appuhamilage & Torii, 2019; Haerizadeh & Sunder, 2019; Osman & Saputra, 2019). A study of public and private HEIs in Malaysia (Shahijan et al., 2018) suggests that international students’ experiences affect perceived value and service quality, thereby influencing students’ intention of further study. That study draws on the SERVQUAL model to explain the dimensions of service quality. Its implications are that managers and providers should seek to remedy bad past experiences and emphasize the five SERVQUAL dimensions: responsiveness, tangibility, reliability, assurance, and empathy.
HEIs are important in society in that they educate society’s leaders, provide innovative technology, and positively impact economic development (Lu et al., 2017). Excellence in quality is not simply a by-product of “doing it right,” but the result of a management strategy to enhance and to align processes in a way that quality becomes inherent in the way of doing business (Crosby, 1979). The principles of a quality management practice include a customer-oriented focus on quality, where the customer defines quality (Deming, 1986). The customers in this study are business students at an institution of higher learning. Due to increasing competition in HEIs, Calma and Dickson-Deane (2020) and Teeroovengadum et al. (2016) stress the student as a customer and encourage applying a marketing approach to HEIs. However, defining and measuring quality for an HEI is difficult. The objective of this study is to provide direction for an effective strategy to improve the quality of the educational experience.
Quality attributes in services are harder to define than for products. However, Crosby (1979) claims there is no excuse for not doing things right, either for products or services. In the book Quality is Free, Crosby discusses what defines good service, where quality is defined as conformance to requirements, not just doing “good” work. Crosby states “the problem is not what people don’t know, but rather what they do know.”
Quality and its impact on student satisfaction are shown to be one element influencing student performance (McLeay et al., 2017; Mihanovic et al., 2016). McLeay et al. apply the importance-performance analysis (IPA) model to assess the gap between attributes of the HEI experience and performance based on student satisfaction. Duzevic (2020) argues that student satisfaction is an incomplete indication of quality. The author suggests the focus should be on student development.
This study seeks to add to the literature on student satisfaction given that there is a focus in the literature to identify factors impacting student satisfaction. Gregory (2019) uses the SERVQUAL model to survey doctoral students’ satisfaction with their program. Osman and Saputra (2019) investigate service and program quality, as well as institutional image and its impact on student satisfaction in HEI. The authors’ definition of service quality encompasses tangibles, reliability, responsiveness, assurance, and empathy. Likewise, program quality includes academic factors, curriculum, and teaching methods. They find that service quality does not directly impact student satisfaction and suggest that service quality alone is not enough to influence students. On the other hand, program quality was shown to be a significant factor influencing student satisfaction. However, both service and program quality reflect positively on institutional image.
Measuring Service Quality
The SERVQUAL instrument for measuring service quality is the most widely used method to measure quality in HEIs (Campos et al., 2017; Gupta & Kaushik, 2018). SERVQUAL (the service quality gap model) can be used by managers in a variety of sectors. The model is based on gap theory and compares perceived service quality to expected service quality along five dimensions: tangibles, reliability, responsiveness, assurance, and empathy. Cronin and Taylor (1992) suggest that the SERVPERF model, a modification of SERVQUAL, may be a better measure for understanding variability in service quality. SERVPERF uses the same five dimensions but is based on performance and the premise that quality is an antecedent to customer satisfaction. A meta-analysis of both models finds that SERVQUAL and SERVPERF are equally valid measurements of service quality; both rely on the conceptual notion that service quality is an attitude based on a comparison of expectations with performance (Carrillat et al., 2007).
Evans and Lindsay (1996) suggest that customers evaluate services primarily by the quality of the human interaction. Evans and Lindsay’s seven dimensions of service quality include completeness, time and timeliness, courtesy, consistency, accessibility and convenience, accuracy, and responsiveness. Russell and Taylor (2019) provide an enhanced definition of these seven dimensions which adds clarity. Table 1 provides the Russell and Taylor definitions of these dimensions.
Dimensions of Quality for Services with Examples of Student Feedback.
The authors of this research study have expanded the Russell and Taylor (2019) definitions into an academic context. The modified academic definitions are used to help address quality in HEIs and suggest opportunities for improvement. Table 1 also contains the modified academic definitions of the seven dimensions and examples of student feedback from the business survey.
Completeness is defined as “is everything the customer asked for provided?” (Russell & Taylor, 2019). In an HEI context, this may be defined by the questions: are all required learning materials available and appropriate for the subject matter, do they meet students’ learning needs, and are the services that students expect to access actually provided?
Time and timeliness address wait times for service. For an HEI, this may mean asking whether students are waiting for information (from instructors, staff, or administration) in ways that impede their ongoing knowledge acquisition, and whether students are obtaining needed services in a timely manner.
Courtesy examines how customers are treated. In an HEI context this may include the students’ relationships with instructors and university employees. This dimension can be especially critical for attracting or retaining new students.
Consistency is defined as “is the same level of service provided to each customer each time?” (Russell & Taylor, 2019). For an HEI, consistency may affect retention and ongoing satisfaction, since it addresses commonality in course design and approach across all classes. It also addresses equal access to services, regardless of students’ geographical location, instructional mode, or economic circumstances.
Accessibility and convenience pertain to “how easy is it to obtain the service?” (Russell & Taylor, 2019). In an HEI context this may address ease of access to course materials and services and barriers to learning or obtaining services.
Accuracy asks the question “is the service performed right every time?” (Russell & Taylor, 2019). In an HEI, this may ask whether course material reflects current and best practices. For example, is the software used in an accounting class reflective of software used in businesses? Are students provided with accurate information regarding administrative services?
Finally, responsiveness looks at how well organizations respond to unusual circumstances. In an HEI, this may include how well instructors adapt to the individual needs of students. Responsiveness also requires that the organization be flexible and dynamic, responding to unusual circumstances in a timely manner; for example, a quick and effective response to the COVID-19 pandemic was required in 2020.
The authors find that these seven dimensions of service quality better capture the human interaction issues expressed by the students in this research study. For example, students request more consistency among instructors in course design and web-based platforms, a concern that is not easily categorized into one of the five SERVQUAL/SERVPERF dimensions. Cronin and Taylor (1992) note that any scale is a compromise between context and relevance to the context in which it is applied. Parasuraman et al. (1988) recognized that the SERVQUAL instrument could be adapted to specific research needs.
The authors conclude that the Evans and Lindsay (1996) dimensions of service quality as defined in the context of an HEI (see Table 1) are most appropriate for understanding student dissatisfaction in this survey. Given the narrow focus of this survey, these dimensions as applied do not address the gap between what students expect and what they receive. The authors do not seek to measure student performance.
Interaction between students and instructors is critical for student success and learning (McLeay et al., 2017; Mihanovic et al., 2016). Helpful interactions with staff and administration ease the processes of registering for classes, accessing student activities, and searching for internships. Interactions between faculty or staff and students are especially important at a regional campus, where students are often first-generation students or full-time employees with demanding work and family schedules, and such interaction is critical for student retention (Appuhamilage & Torii, 2019; Haerizadeh & Sunder, 2019; Osman & Saputra, 2019). Like SERVQUAL and SERVPERF, the seven dimensions of quality used in this study explore the relative importance of each dimension and allow for a focused effort in areas where students express dissatisfaction.
Quality Improvement in HEIs
A quality strategy must focus on continual improvement and requires planning and organization (Deming, 1986). Developing processes that use quality methods that have been successful in manufacturing, health care, and other service industries can provide a pathway for continual improvement. Lu et al. (2017) point to the need for higher education institutions to become process-oriented organizations and strive for improvement. Focusing on processes and drawing on data is important given that a major challenge for continuous improvement in higher education has been resistance and unwillingness to change (Sunder & Antony, 2018). Data combats this resistance, thus improving the opportunity for success.
One challenge is finding appropriate measures of quality, given that quality must be measurable to implement improvement (Juran, 1979). Managing by facts is discussed in the case study conducted by Doman (2011) who points to the necessity for metrics to track improvements and maintain control. In order to establish measurements, requirements must be developed, as suggested by Bandyopadhyay (2014), in a framework for online education programs. Once requirements are identified, failures can be recognized. Sunder and Antony (2018) note that failures in higher education are more difficult to define. For example, teaching failures could be attributed to multiple defects, such as unacceptable grading turnaround time or failure to respond to student inquiries.
Campos et al. (2017) reveal that student expectations change during one’s educational experience; senior students tend to be more demanding than beginning students. Therefore, schools need to conduct ongoing surveys of their students to better understand needs and concerns and to recognize areas for improvement (Calma & Dickson-Deane, 2020). Sunder and Mahalingam (2018) suggest that quality improvement efforts using a collaborative, team-based approach see increased benefits. Through case studies, they identify stakeholder participation and participative leadership, along with student teams, as important contributors to success. Their recommendations include obtaining information on the voice of both internal and external customers.
Methodology
Business Student Survey
The survey was developed by the business school of the authors of this paper to give students the opportunity to provide feedback on their educational experience. The school distributed the survey to all business students, undergraduate and graduate, enrolled in Spring 2019. The survey was sent to 786 business students, and 88 students (11.2%) responded.
The survey includes questions regarding class standing, cumulative GPA, transfer status, majors and concentrations, and primary learning delivery modes. The main objective of the survey is to understand students’ level of satisfaction with various aspects of their degree and educational experience, identify areas that had a strong impact on student success, find areas of weakness, and provide suggestions for changes to the program and educational experience. The open-ended responses for some of the questions (for example, “What would you like to see changed about your degree program and educational experiences?”) provide a rich source of feedback and information for the school.
The survey response rate was 11.2%, resulting in a sample size of 88. Graduate students represented 6.8% of the respondents, and the remaining 93.2% were undergraduate students. By class standing, 5.7% of respondents were first-year students, 8.0% sophomores, 31.8% juniors, and 47.7% seniors. Thus, the majority of respondents (79.5%) were juniors or seniors.
In addition, of those responding, 73.9% had a GPA of 3.0 or higher, and 68.2% were transfer students from other academic institutions. Furthermore, 36.4% of the respondents had been at the university for less than 1 year, and 23.9% had been at the university between 1 and 2 years. Figure 1 shows the primary learning delivery modes of the students surveyed. The majority of the students were completely online (52.3%), while 35.2% were taking a mixture of campus and online courses. In total, 87.5% of the students took some form of online class.
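As a consistency check, the reported percentages can be reconciled with head counts. The following Python sketch is purely illustrative: the 786 invitations and 88 respondents are from the survey, while the per-class head counts are reconstructions inferred from the rounded percentages reported above.

```python
# Back-of-envelope check of the reported survey figures.
# The totals (786 invited, 88 respondents) come from the survey;
# the per-class head counts are inferred from the rounded
# percentages in the text, so they are reconstructions.
invited, respondents = 786, 88

response_rate = 100 * respondents / invited  # reported as 11.2%
print(f"response rate: {response_rate:.1f}%")

by_standing = {"first-year": 5, "sophomore": 7, "junior": 28, "senior": 42}
for standing, count in by_standing.items():
    print(f"{standing:10s}: {100 * count / respondents:.1f}%")

graduate = respondents - sum(by_standing.values())
print(f"graduate share: {100 * graduate / respondents:.1f}%")  # reported as 6.8%
```

The inferred counts reproduce every reported percentage, including the 6.8% graduate share as the remainder after the undergraduate class standings.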

Primary learning delivery mode.
Students were surveyed about their satisfaction level with their primary mode of learning. The survey used a seven-point Likert scale where seven represents “Very Satisfied” and one indicates “Very Dissatisfied.” The survey did not specify what points two through six indicated, so interpretation of the scale was left to the respondent. Table 2 shows the rankings and reveals that the majority of the students (63.6%) gave either a six or a seven for their satisfaction level.
Satisfaction Level of Primary Learning Delivery Mode.
Analysis of Responses to Open-ended Questions
The goal of the survey is to provide feedback to the business school on students’ educational experience, which can then be used to drive improvements. The study is exploratory and qualitative in nature. The authors analyze the open-ended responses to explore areas where the educational experience can be improved. These questions were designed to give students an opportunity to express their satisfaction or dissatisfaction with their educational experience.
The open-ended questions are as follows:
What aspects of your experience have had a strong positive impact on your success as a student?
What do you consider to be weaknesses in the degree program and educational experiences?
What would you like to see changed about your degree program and educational experiences?
Additional comments
The questions “what aspects of your experience have had a strong positive impact on your success as a student” and “what do you consider to be weaknesses in the degree program and educational experiences” provide feedback on positive impacts and weaknesses in the educational experience. Student responses included not being aware of academic support services and advanced courses only being available online.
The question “what would you like to see changed about your degree program and educational experiences” provides insight into areas where enhancements can be made to improve students’ educational experience. Students requested additional supplemental instruction, more face-to-face classes, more interaction with faculty, more awareness of career and internship opportunities, and more consistent course platforms.
The question “Additional Comments” provided further feedback from students, including requests for more live teaching videos, more timely grading of assignments, better advising, and fewer online courses.
The responses to the open-ended questions are insightful and provide rich qualitative data for this research. They allow the authors to explore areas of satisfaction and dissatisfaction. Responses with suggestions for improvement were analyzed and point to potential areas where gains could be made in student satisfaction. Each applicable response was reviewed, analyzed, and bucketed into one of the seven dimensions of service quality. To reduce bias, each author separately categorized every applicable response into one dimension of service quality (completeness, time and timeliness, courtesy, consistency, accessibility and convenience, accuracy, or responsiveness). When there was doubt about which dimension best applied, each author noted the additional dimensions that might be appropriate.
A method similar to the Delphi Technique was applied after each author had categorized the responses independently. According to Hsu and Sandford (2019), the Delphi Technique “employs multiple iterations designed to develop a consensus of opinion concerning a specific topic.” After each author had reviewed and bucketed each response to a particular service quality dimension, the authors together reviewed discrepancies and items that did not appear to “fit” one dimension. Responses that covered multiple areas of concern were separated and bucketed individually. No area of concern was categorized into more than one dimension. A follow-up discussion further clarified the academic definition and consensus was reached across all responses.
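The independent coding and consensus step described above can be sketched programmatically. The function below is a hypothetical illustration rather than the authors’ actual procedure: each rater’s assignments are compared, and any response on which the raters disagree is flagged for joint, Delphi-style review.

```python
# Hypothetical sketch of the consensus step: each author independently
# assigns every response to one of the seven dimensions; responses on
# which the assignments disagree are flagged for joint review until
# consensus is reached.
DIMENSIONS = {
    "completeness", "time and timeliness", "courtesy", "consistency",
    "accessibility and convenience", "accuracy", "responsiveness",
}

def flag_discrepancies(codings):
    """codings: one dict per rater, mapping response id -> dimension."""
    all_ids = set().union(*(c.keys() for c in codings))
    flagged = []
    for rid in sorted(all_ids):
        labels = {c.get(rid) for c in codings}
        if len(labels) > 1:  # raters disagree (or one skipped the response)
            flagged.append(rid)
    return flagged

# Example with two raters and one disagreement (response 2):
rater_a = {1: "courtesy", 2: "completeness", 3: "accuracy"}
rater_b = {1: "courtesy", 2: "accuracy", 3: "accuracy"}
print(flag_discrepancies([rater_a, rater_b]))  # → [2]
```

Only the flagged responses need the follow-up discussion, which mirrors how the authors resolved items that did not clearly fit a single dimension.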
Findings, Conclusions, and Limitations
Findings
Student satisfaction is an important variable in the success of higher education institutions because of its impact on student retention and performance (Appuhamilage & Torii, 2019; Haerizadeh & Sunder, 2019; Osman & Saputra, 2019). This survey assists with understanding areas of dissatisfaction of business students at a Midwest regional university. These results help to provide a strategic focus for faculty and administration. The research findings are based on an analysis of 80 responses from the four open-ended questions where students provided feedback on their educational experience and improvements that could be made to the academic experience.
The Pareto chart in Figure 2 shows how the 80 responses were bucketed into the areas of accessibility and convenience, completeness, consistency, courtesy, accuracy, time and timeliness, and responsiveness. The largest category is accessibility and convenience, with 35% of the responses. Completeness is the second largest, with 26.3% of the responses. Together, these two categories comprise 61.3% of the responses and indicate where to focus educational improvements.

Suggested areas for improvement.
The remaining dimensions of service quality comprise 38.7% of the responses. Consistency and courtesy are fairly even, with 12.5% and 11.3% of the bucketed responses, respectively, for a combined 23.8%. The final dimensions comprise 15% of the total: accuracy received 7.5% of the bucketed responses, time and timeliness 5.0%, and responsiveness, the smallest category, 2.5%.
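The Pareto ordering reported above can be reproduced from the response counts. In the sketch below, the counts for accessibility and convenience (28) and completeness (21) are stated in the paper; the remaining counts are inferred from the reported percentages of the 80 bucketed responses.

```python
from collections import Counter

# Responses bucketed per service quality dimension (n = 80).
# The top two counts are reported in the paper; the rest are
# inferred from the reported (rounded) percentages.
counts = Counter({
    "accessibility and convenience": 28,  # 35.0%
    "completeness": 21,                   # ~26.3%
    "consistency": 10,                    # 12.5%
    "courtesy": 9,                        # ~11.3%
    "accuracy": 6,                        # 7.5%
    "time and timeliness": 4,             # 5.0%
    "responsiveness": 2,                  # 2.5%
})

# Pareto view: dimensions in descending order with cumulative share.
total = sum(counts.values())
cumulative = 0.0
for dimension, n in counts.most_common():
    share = 100 * n / total
    cumulative += share
    print(f"{dimension:30s} {n:3d}  {share:5.1f}%  (cum. {cumulative:5.1f}%)")
```

Sorting by count and accumulating the shares is all a Pareto chart encodes; the first two rows alone account for over 60% of the dissatisfaction responses.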
Students expressed dissatisfaction in multiple categories, with the top category being accessibility and convenience at 28 responses. Student suggestions include increased awareness of academic services, more internships, and additional career service opportunities. Thus, opportunities may exist to improve student awareness of academic services through collaborative efforts between faculty, staff, and administration. Students also request more connection with local businesses and opportunities to participate in clubs, attend speaker events, and meet with faculty. Therefore, opportunities exist to increase engagement for both online and campus students. Some students requested more on-campus classes, pointing to other areas for enhancement, such as offering a variety of learning modalities. In many instances the opportunities for engagement, and information about them, already exist, but communication to and awareness among students are lacking. The school needs to focus on the processes for sharing information, making it easily accessible, and marketing the opportunities the school provides.
The 21 responses related to completeness include suggestions for more videos of professors teaching, for mentors, and for more supplemental instruction. This information is particularly valuable to faculty, who can individually improve their courses given their autonomy to adjust course materials. Research on effective online learning and engagement should be encouraged. Pursuing course certifications, such as QM (Quality Matters), for all required business courses may address student concerns. Examining the variety of processes that promote student learning and development needs to be a focus for the school.
Consistency and courtesy concerns were far less frequent than those regarding accessibility and convenience or completeness. However, given the exploratory nature of this research, these dimensions are worth discussing. Concerns categorized under consistency address accurate estimates of course workload, course design, and web-based platforms across all instructors. Instructors can improve their courses by using the course analytics frequently provided by learning management systems and online learning sources. Courtesy might be improved through increased faculty interaction and an increased focus by academic advisors on student concerns. Recognizing the importance of personal interactions and the processes for sharing information may help faculty and staff be more attentive and aware of student needs.
Conclusions
The purpose of this research is to better define students’ expectations for a quality educational experience. This requires understanding and measuring student satisfaction, using qualitative data. Student satisfaction is important due to its impact on student performance and retention. Recently this business school achieved AACSB initial accreditation and is in the process of developing a long-range strategic plan to attract and to retain students. The results of this survey are critical for developing an effective plan that focuses on processes promoting student success and engagement. The seven dimensions for service quality provide a means to understand the processes that engage students. Quality improvement is achieved when faculty and staff address the processes involved in providing services and educational material (Crosby, 1979). This paper, using the seven dimensions of service quality, assists faculty and staff to see their services as processes and to focus on improving these processes to provide a quality educational service.
The results from this study partially address the concerns of Sunder and Antony (2018), who discuss the difficulty of defining failures and encourage metrics to combat resistance. Once the definitions of completeness and of accessibility and convenience are made specific, measurables can be developed and tracked. Bandyopadhyay (2014) and Doman (2011) advocate for metrics as a driver of quality improvement in HEIs. However, questions remain: how does an HEI improve in these areas?
These results raise additional questions requiring further clarification: What learning materials should be required for courses? How do we define service accessibility? What does it take to make information readily available to students and to assist them in accessing services? Measurables for these problem areas can be developed once the definitions of completeness and of accessibility and convenience are better clarified.
Limitations
The limitations of the study include the response rate, area of study, geographic area, and learning modality. The response rate was 11.2%, and the survey was sent only to undergraduate and graduate business students enrolled at a university in the Midwest. The majority (87.5%) of the respondents were taking at least one online class. Thus, the results may not be applicable to students in other fields of study, other learning modalities, or other geographic areas. In addition, tests of validity were not applied to the seven dimensions of service quality due to the exploratory nature of this study. A follow-up with non-respondents, which would have checked for non-response bias, was not conducted.
Direction for Future Research
Although the objective of the research is accomplished and focus areas have been identified, the definition of quality can be further refined. Future surveys need to focus on what makes for a good or bad experience. Based on the work of Elken and Stensaker (2018), who discuss quality work as a process that focuses on practices and local definitions, it is important to understand the concerns of a specific HEI student body. Thus, the authors propose further, more detailed surveys to expand the definitions of (a) completeness and (b) accessibility and convenience, thereby translating student expectations into measurable action items. These results may address the question of the measurability of quality indicators discussed by Calma and Dickson-Deane (2020).
Lean Six Sigma methods may help to drive educational process improvements. There is a significant amount of literature that details case studies using Lean or Six Sigma principles to improve the higher education system (Haerizadeh & Sunder, 2019; Li et al., 2019; Koromyslova et al., 2018; Sunder, 2016; Sunder & Mahalingam, 2018). Sunder and Mahalingam (2018) suggest that Lean Six Sigma serves as a template for problem-solving and promotes improvement in quality. Their research highlights key takeaways for implementing Lean Six Sigma in HEI and addresses the benefits and challenges. Lu et al. (2017) point to the need for higher education leadership to use new methods to reduce the cost of education and increase graduation and retention rates.
To et al. (2018) find that having a customer focus has the largest total effect on quality improvement. This concurs with George (2003) who advocates for organizations to hear the “Voice of the Customer.” The authors of this paper focus on the “Voice of the Student” and thus this research provides new insights for administrators, staff, and faculty in higher education who seek to improve the quality of the educational experience.
To summarize, there is a vast amount of literature covering quality in institutions of higher education. Most research is based on the perceived needs of the student as opposed to using the voice of the student to define student expectations. The authors know of no other research that attempts to define student expectations of quality using the Evans and Lindsay (1996) seven dimensions of service quality. Therefore, this research addresses a gap in the literature and provides an expanded definition of quality from the student perspective.
Acknowledgements
Not applicable
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
Ethical Approval
Not applicable
Data Availability Statement
Data sharing not applicable to this article as no datasets were generated or analyzed during the current study.
