ABSTRACT
This study examines service and learning quality across demographic variables in Punjab’s state universities. With the rapid expansion of technical education, understanding student perceptions is crucial. Analyzing responses from 720 students across five universities and disciplines—Management, Computer Science, Engineering, Pharmacy, and Hotel Management—the study uses a structured questionnaire to assess service quality dimensions (Assurance, Tangibility, Reliability, Empathy, and Responsiveness) alongside learning quality indicators. Findings show gender has no significant impact, suggesting an inclusive environment. Postgraduates report higher satisfaction in Assurance, Reliability, and Learning Quality, while undergraduates perceive greater Empathy. Course-wise differences are minimal, but universities vary, with Punjabi University, Patiala excelling in Reliability and Learning Quality. The study highlights the need for tailored educational services to enhance student satisfaction. Addressing disparities in service reliability and learning quality can improve academic outcomes, guiding policy improvements, faculty development, and infrastructure enhancement in Punjab’s technical education sector.
Keywords: Service Quality, Learning Quality, Student Satisfaction, Technical Education, India.
Introduction
Punjab, a state located in the northern part of India, has been making remarkable strides in improving its educational infrastructure, particularly in the realm of technical education. As one of the country’s rapidly developing regions, Punjab is facing a significant demand for skilled labor, especially in sectors such as engineering, manufacturing, and information technology. The state’s industrial growth, spurred by both private and public investments, has created a substantial need for technical professionals. However, addressing this demand has posed several challenges, primarily related to the quality and accessibility of education, particularly in the growing number of technical institutions such as engineering colleges and polytechnics. As these institutions expand, the focus on service quality and learning outcomes becomes increasingly critical, particularly in the face of challenges such as unregulated technical academies and overcrowded classrooms (Pardeep, 2008; Palshikai, 2010).
Service quality, defined as the degree to which educational institutions meet or exceed student expectations, has become a key determinant of student satisfaction and learning outcomes. Several studies have explored the relationship between service quality and learning quality, emphasizing various aspects that influence students’ educational experiences. Kwan and Ng (1999) found that students from different cultural backgrounds place varying levels of importance on different aspects of educational services, such as faculty engagement, support services, and infrastructure. Similarly, Barnes (2007) extended the SERVQUAL framework to include elements such as university support and mentoring, which further enhance the student learning experience. Stodnick and Rogers (2008) applied the SERVQUAL model to traditional educational settings and concluded that factors such as tangibility and responsiveness have a lesser impact on student satisfaction compared to other service quality dimensions like reliability and assurance. This finding has important implications for understanding how service quality affects students’ perceptions of learning quality in educational institutions.
In the context of Punjab’s technical education system, the service quality of institutions plays a pivotal role in shaping learning outcomes. The rapid expansion of both public and private technical institutes has led to varying levels of infrastructure, teaching quality, and administrative support, which in turn affects the overall learning experience for students. Students who enroll in these institutions expect high standards of education, which include well-equipped classrooms, up-to-date learning materials, and faculty with industry-relevant knowledge. However, challenges such as outdated infrastructure, limited resources, and faculty shortages often hinder the delivery of high-quality education (Pardeep, 2008). Thus, understanding how service quality impacts learning outcomes in Punjab’s technical education sector is crucial to improving the state’s educational standards and addressing the skills gap.
Learning quality refers to the effectiveness of the educational process in terms of students’ knowledge acquisition, skill development, and overall personal growth. It has been widely acknowledged that learning outcomes are influenced by various institutional factors such as course design, faculty support, and availability of resources (Kintu & Zhu, 2016). Furthermore, demographic variables such as age, gender, academic background, and socioeconomic status significantly shape students’ learning experiences and their ability to achieve academic success. Kintu and Zhu (2016) demonstrated that factors such as age, family support, and attitude play a crucial role in students’ engagement and success in blended learning environments. In line with this, several studies have also emphasized the importance of personalized learning experiences and peer interaction in enhancing learning quality (Zafiropoulos and Vrana, 2008; Diette & Raghav, 2015).
Demographic factors also play a significant role in students’ perceptions of service quality and learning outcomes. For instance, the HEDPERF model introduced by Abdullah (2006) specifically addresses the role of student demographics in shaping service expectations and experiences. Zafiropoulos and Vrana (2008) examined how faculty and student perceptions of service quality differ, finding that demographic factors such as age, background, and academic standing significantly influence students’ expectations of educational services. In a similar vein, Kintu and Zhu (2016) highlighted the influence of family support, workload management skills, and academic ability on students’ performance in blended courses. These findings suggest that demographic variables must be considered when evaluating both service quality and learning quality, as they can significantly impact students’ satisfaction, engagement, and academic success.
In Punjab, the rapid expansion of technical education has been accompanied by significant demographic changes, with an increasing number of students from diverse backgrounds seeking admission to state universities. The demographic composition of the student body, including factors such as age, gender, family support, and digital literacy, influences how students perceive and experience both service quality and learning quality. Byungura et al. (2018) highlighted that disparities in digital access and prior exposure to technology create significant barriers to e-learning adoption, which is increasingly important in the modern educational landscape. In addition, research by Patra et al. (2021) has demonstrated that students’ digital literacy and network accessibility significantly affect their engagement with online learning platforms. Given the growing reliance on digital tools and online learning in Punjab’s educational institutions, addressing these disparities is crucial to ensuring equitable access to quality education.
The relationship between service quality, learning quality, and demographic factors is of particular relevance in the context of Punjab’s evolving technical education system. As the demand for skilled labor in industries such as engineering, IT, and healthcare continues to rise, it is imperative to examine how state universities in Punjab are meeting the needs of their diverse student populations. This research aims to explore the relationship between service quality and learning quality across different demographic variables in Punjab’s state universities. By identifying how demographic factors influence students’ perceptions of service quality and their learning outcomes, this study seeks to provide valuable insights into how institutions can enhance their educational offerings and improve overall student satisfaction.
The goal of this study is to investigate how demographic differences shape students’ experiences in Punjab’s state universities regarding service quality and learning quality. With the state’s technical education sector expanding rapidly, understanding the intersection of service quality, learning quality, and demographic variables will be essential to improving the quality of education and ensuring that students are equipped with the skills required to succeed in a competitive labor market. This research will contribute to the ongoing efforts to modernize and reform Punjab’s higher education system, ensuring that it meets the needs of both students and industry stakeholders.
Literature review
Service quality and learning quality
Numerous studies have examined how service quality affects learning quality, highlighting the aspects that shape students’ experiences. Students from different cultural backgrounds place varying value on different components of educational services, according to Kwan and Ng’s (1999) comparison of service quality indicators in China and Hong Kong. To evaluate the quality of learning among Chinese students, Barnes (2007) extended the SERVQUAL framework by adding elements such as university support and mentoring. Applying SERVQUAL to conventional educational settings, Stodnick and Rogers (2008) concluded that tangibility and responsiveness had less of an impact in predicting student satisfaction and learning quality.
Zafiropoulos and Vrana (2008) evaluated faculty and student views of service quality and found that staff expectations were typically higher than those of students. Udo et al. (2011) adapted SERVQUAL to evaluate online learning, renaming tangibility as website content and finding that Reliability was the least important component. Kintu and Zhu (2016) investigated the connection between learning outcomes and service-related elements such as student support and course design; their research highlighted how peer interaction, course design, and student characteristics all influence the quality of learning. Tere et al. (2020) further modified the SERVQUAL model, adding learning material quality and LMS quality as new criteria for evaluating online learning services. Abu-Rumman and Qawasmeh’s (2021) analysis of the relationship between e-learning service quality and learning quality emphasized the significance of the SERVQUAL dimensions. According to Stankovska et al. (2024), responsiveness has the largest gap score, although other aspects of service quality have a favourable impact on learning quality. Collectively, this research confirms the close connection between service quality and learning quality and highlights the importance of institutional support in improving student experiences.
Service quality and demographic variables
Student characteristics and demographic factors play a crucial role in shaping learning quality, engagement, and service quality in education. Abdullah (2006) introduced the HEDPERF scale, tailored for higher education, incorporating academic and non-academic factors to measure service quality and highlighting the role of student demographics in shaping service expectations. Zafiropoulos and Vrana (2008) examined faculty and student perceptions of quality in education, revealing that demographic factors such as age and background significantly influence students’ expectations and experiences. Kintu and Zhu (2016) found that factors such as age, attitude, and family support significantly impact student success in blended courses. Similarly, Sánchez-Mena and Martí-Parreño (2017) highlighted motivation as a key driver for gamification, while knowledge gaps hinder engagement, emphasizing the role of individual learning traits.
The digital divide remains a significant challenge, as disparities in access to technology, infrastructure, and policy limitations create major barriers to e-learning adoption (Byungura et al., 2018; Kibuku et al., 2020; Mishra et al., 2020). Danjuma et al. (2018) compared the SERVQUAL, SERVPERF, and HEDPERF models in higher education, assessing their strengths and applicability in measuring service quality across diverse student demographics. Mobile learning, influenced by learner demographics and social factors, was explored by Ng and Wong (2020), who used the FRAME model to assess its effectiveness among Chinese students. The impact of digital literacy on virtual class effectiveness was further analyzed by Inan and Karaca (2021), who found that access to technology and ICT skills, often shaped by demographic factors, significantly affect engagement in online learning (Wolverton et al., 2020; Ansari et al., 2020). Collectively, these studies underscore the importance of addressing demographic disparities to enhance learning experiences and service quality in education.
Learning quality and demographic variables
Several studies have explored the relationship between learning quality and demographic variables, highlighting how factors such as student background, institutional support, and digital accessibility influence learning quality. Kintu and Zhu (2016) found that student demographics significantly impact intrinsic motivation, knowledge acquisition, and satisfaction with course design. Diette and Raghav (2015) and Koc and Çelik (2015) investigated the impact of class size on learning quality, particularly among students with varying academic abilities. Both studies concluded that first-year students, low-performing students, and high-achievers are the most affected by larger class sizes, emphasizing the importance of maintaining an appropriate student-to-teacher ratio to enhance learning quality.
Nakayama et al. (2017) analyzed the effect of note-taking support services on student self-efficacy and learning quality, finding that students who received note-making instructions exhibited improved self-regulated learning, linked to Big Five personality traits. Byungura et al. (2018) addressed digital divides in online learning, emphasizing that prior exposure to digital tools, computer training, and access to technology impact student learning outcomes. Similarly, Patra et al. (2021) identified digital literacy and network accessibility as major factors affecting e-learning adoption among students and faculty, indicating that demographic differences in digital competency can significantly influence learning quality. Collectively, these studies demonstrate that demographic variables, including student background, technological access, academic standing, and personality traits, play a crucial role in shaping learning experiences and outcomes.
Methods
Participants and procedures
The study covers five state universities in Punjab that offer courses across a range of subjects, namely Management Studies, Computer Science, Engineering, Pharmacy, and Hotel Management & Catering: Punjabi University, Patiala; Panjab University, Chandigarh; Guru Nanak Dev University, Amritsar; Maharaja Ranjit Singh Punjab Technical University, Bathinda; and IKG Punjab Technical University, Kapurthala. Primary data were collected from undergraduate students in Engineering, Pharmacy, and Hotel Management, and from postgraduate students in Management and Computer Science.
A total of 720 students were selected, 30 from each of the five disciplines at each institution (rather than 750, because IKG PTU, Kapurthala does not have a pharmacy department). This sampling strategy, which corresponds to 50% of a typical class unit of 60 students, complies with NAAC and AICTE standards. The systematic sample guarantees broad representation across institutions and fields and provides a comprehensive dataset for evaluating service quality, learning quality, and students’ demographic profile.
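For illustration, the short sketch below reproduces the arithmetic behind this quota design. It is a minimal, hypothetical reconstruction (the university and course labels are abbreviations chosen here), not the authors’ sampling code.

```python
# Quota design: 30 students per discipline per university,
# minus the one missing Pharmacy cell at IKG PTU, Kapurthala.
universities = ["PUP", "PUC", "GNDU", "MRSPTU", "IKGPTU"]
courses = ["Management Studies", "Computer Science", "Engineering",
           "Pharmacy", "Hotel Management"]
quota_per_cell = 30  # 50% of a typical class unit of 60 students

# Planned quota for every university-course cell.
planned = {(u, c): quota_per_cell for u in universities for c in courses}
planned.pop(("IKGPTU", "Pharmacy"))  # no Pharmacy department at IKG PTU

total_sample = sum(planned.values())
print(total_sample)  # 720
```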
Measures
Service quality
The 15-item Service Quality scale is divided into five subscales. Respondents’ opinions on service quality and learning quality were recorded on a 7-point scale and categorized into three groups: “P < E” (service performance falls short of expectations), “P = E” (service performance meets expectations), and “P > E” (service performance exceeds expectations). Assurance, the first of the five subscales, comprises three items relating to collaborative learning experiences, prompt assessment feedback, and efficient knowledge transfer, ensuring that students receive dependable educational support. This scale’s Cronbach’s alpha was 0.852.
The three components of Tangibility are the availability of links to downloadable content on the LMS, the cleanliness of the student facilities, and the simplicity of LMS installation. The Cronbach’s alpha for this scale was 0.918.
Three elements are used to measure Reliability, and representative items include the capacity of teachers to capture students’ attention again, the clarity of their explanations of concepts, and the timely distribution of information through effective podcasting. This scale had a Cronbach’s alpha of 0.787.
Three components make up Empathy: giving each student individualized attention, encouraging mutual respect through positive learning assistance, and letting students evaluate and reflect on their own work. This scale’s Cronbach’s alpha was 0.862.
Three components make up Responsiveness: quick communication between teachers and students, attention to misbehaviour, and a readiness to hear students’ perspectives. This scale’s Cronbach’s alpha was 0.869.
Learning quality
Nine factors make up learning quality: exposure to the industry, student control over learning activities, flexible exam design, and the development of a skill set in students that includes higher-order thinking, creativity and critical thinking, problem-solving and innovation, self-managed learning, interpersonal and leadership skills, and global competitiveness. Learning Quality’s reliability estimate was 0.782.
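For readers who wish to reproduce such internal-consistency estimates, the sketch below implements the standard Cronbach’s alpha formula. It is a generic illustration using a small hypothetical response matrix, not the study’s actual data or analysis code.

```python
import numpy as np

def cronbach_alpha(responses):
    """Cronbach's alpha for a (respondents x items) matrix of item scores."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]                               # number of items in the subscale
    item_variances = responses.var(axis=0, ddof=1)       # sample variance of each item
    total_variance = responses.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Hypothetical example: three 7-point Assurance items answered by five students.
example = np.array([
    [5, 6, 5],
    [4, 4, 5],
    [6, 7, 6],
    [3, 4, 3],
    [5, 5, 6],
])
print(round(cronbach_alpha(example), 3))
```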
Demographic profile of students
The demographic profile of the study participants covers their distribution by gender, class, course, and university. Gender representation was fairly balanced, with 337 male students (46.80%) and 383 female students (53.2%). By academic class, 300 students (41.67%) were postgraduates and 420 (58.33%) were undergraduates. Enrolment was spread across five streams: Management Studies, Computer Sciences, Engineering and Technical Course, and Hotel Management and Catering each contributed 150 students, while Pharmacy contributed 120. Similarly, the distribution across universities was even, with 150 students each from Guru Nanak Dev University, Amritsar (GNDU); Punjabi University, Patiala (PUP); Panjab University, Chandigarh (PUC); and MRSPTU, and 120 students from IKGPTU, which has no pharmacy department. This breakdown ensures adequate representation for meaningful analysis and interpretation of the study results.
Analysis
Understanding the varied viewpoints and experiences of students in the technical education programmes offered by Punjab’s state universities requires analysing learning quality and service quality in relation to demographic variables. These variables cover a range of characteristics, including age, gender, academic class, study programme, and university affiliation. The analysis below seeks to identify potential differences in students’ perceptions of learning quality and service quality across these demographic groups.
Analysis - gender and class wise
The gender-wise and class-wise analysis examines possible differences in students’ perceptions of technical education programmes according to these two demographic variables. Educational institutions seeking to improve the overall quality of education need to understand how gender and academic level may shape perceptions of learning quality and service quality.
Table 1: Independent samples test (Gender wise)
| Factor | Male (N=337) | Female (N=383) | T-Value | Sig. (2-tailed) |
|---|---|---|---|---|
| Assurance | 3.87 | 3.89 | -0.208 | 0.836 |
| Tangibility | 3.80 | 3.74 | 0.602 | 0.548 |
| Reliability | 4.05 | 4.04 | 0.013 | 0.989 |
| Empathy | 3.99 | 3.92 | 0.681 | 0.496 |
| Responsiveness | 3.98 | 3.98 | 0.089 | 0.929 |
| Learning Quality | 3.84 | 3.77 | 0.911 | 0.363 |
The independent samples test revealed no significant differences between male and female students across Assurance, Tangibility, Reliability, Empathy, Responsiveness and Learning Quality, suggesting gender may not significantly influence perceptions within technical education programs.
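The gender-wise comparison in Table 1 corresponds to a two-sided independent-samples t-test on each dimension. A minimal sketch using SciPy is given below; the data frame, column names, and simulated scores are illustrative assumptions, not the study’s dataset.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical survey data: one row per student with a 7-point Assurance score.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "gender": rng.choice(["Male", "Female"], size=720),
    "assurance": rng.integers(1, 8, size=720).astype(float),
})

male = df.loc[df["gender"] == "Male", "assurance"]
female = df.loc[df["gender"] == "Female", "assurance"]

# Two-sided independent-samples t-test, as reported for each dimension in Table 1.
t_stat, p_value = stats.ttest_ind(male, female)
print(f"t = {t_stat:.3f}, p (2-tailed) = {p_value:.3f}")
```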
Table 2: Independent samples test (Class wise)
| Factor | Under-graduate (N=420) | Post-graduate (N=300) | T-Value | Sig. (2-tailed) |
|---|---|---|---|---|
| Assurance | 3.78 | 3.99 | -2.265 | 0.024 |
| Tangibility | 3.67 | 3.87 | -1.916 | 0.056 |
| Reliability | 3.94 | 4.16 | -2.021 | 0.044 |
| Empathy | 4.06 | 3.83 | 2.272 | 0.023 |
| Responsiveness | 3.87 | 4.11 | -2.506 | 0.012 |
| Learning Quality | 3.58 | 4.00 | -3.492 | 0.001 |
Table 2 presents the class-wise analysis, which revealed significant differences between undergraduate and postgraduate students on several dimensions of service quality and learning quality. Specifically, undergraduate students reported significantly lower mean Assurance scores (M = 3.78) compared to postgraduate students (M = 3.99, p = 0.024), indicating a potential disparity in the perceived level of Assurance between the two groups. While there was no statistically significant difference in Tangibility scores (p = 0.056), undergraduate students had lower mean scores (M = 3.67) compared to postgraduate students (M = 3.87), suggesting a trend toward higher Tangibility perceptions among postgraduates. Furthermore, undergraduate students reported significantly lower mean scores in Reliability (M = 3.94) compared to postgraduates (M = 4.16, p = 0.044), indicating a perceived difference in the reliability of service delivery between the two groups. In terms of Empathy, undergraduate students had significantly higher mean scores (M = 4.06) compared to postgraduates (M = 3.83, p = 0.023), suggesting a stronger perception of Empathy among undergraduate students. Additionally, undergraduate students reported significantly lower mean Responsiveness scores (M = 3.87) compared to postgraduates (M = 4.11, p = 0.012), indicating a potential difference in the responsiveness of educational services. Moreover, undergraduate students had significantly lower mean scores in Learning Quality (M = 3.58) compared to postgraduates (M = 4.00, p = 0.001), suggesting variations in the perceived quality of learning experiences between the two groups. These findings underscore the importance of addressing and bridging potential disparities in service quality and learning quality across different academic levels to ensure equitable and enriching educational experiences for all students.
Analysis with respect to courses
The purpose of the analysis done in relation to various academic courses is to identify any possible differences in how students perceive the aspects of service quality and the quality of learning across a range of disciplines.
Table 3: ANOVA results with respect to courses
| Attribute | Management Studies | Computer Sciences | Engineering Technical | Pharmacy | Hotel Mgmt Catering | Total | F Value | Sig. |
|---|---|---|---|---|---|---|---|---|
| Assurance | 3.85 (1.27) | 3.85 (1.41) | 4.02 (1.19) | 3.81 (1.08) | 3.92 (1.12) | 3.88 (1.24) | 0.579 | 0.678 |
| Tangibility | 3.77 (1.24) | 3.62 (1.35) | 3.88 (1.35) | 3.86 (1.17) | 3.76 (1.31) | 3.77 (1.28) | 0.967 | 0.425 |
| Reliability | 4.17 (1.49) | 3.88 (1.59) | 4.25 (1.32) | 3.98 (1.38) | 3.97 (1.42) | 4.04 (1.46) | 1.760 | 0.135 |
| Empathy | 3.94 (1.35) | 4.05 (1.41) | 3.85 (1.20) | 4.04 (1.25) | 3.83 (1.36) | 3.96 (1.32) | 0.817 | 0.514 |
| Responsiveness | 4.01 (1.15) | 3.86 (1.31) | 3.95 (1.33) | 3.95 (1.24) | 4.18 (1.29) | 3.98 (1.26) | 1.223 | 0.300 |
| Learning Quality | 3.80 (1.54) | 3.85 (1.61) | 3.71 (1.62) | 3.76 (1.56) | 3.72 (1.55) | 3.78 (1.57) | 0.198 | 0.939 |
Each value is presented as Mean (Standard Deviation).
The ANOVA results with respect to different courses, as depicted in Table 3, provide insights into potential differences in the mean scores of service quality Dimensions and learning quality across various academic disciplines.
As shown in Table 3, none of the course-wise differences reach statistical significance at the 0.05 level: Assurance (F = 0.579, p = 0.678), Tangibility (F = 0.967, p = 0.425), Reliability (F = 1.760, p = 0.135), Empathy (F = 0.817, p = 0.514), Responsiveness (F = 1.223, p = 0.300), and Learning Quality (F = 0.198, p = 0.939). Although students of Engineering and Technical courses report somewhat higher mean Assurance (4.02) and Reliability (4.25) scores, and Hotel Management and Catering students report the highest mean Responsiveness score (4.18), these differences are not substantial enough to be statistically significant. Overall, students appear to perceive service quality and learning quality in a broadly similar manner regardless of their course of study; the post hoc comparisons in Table 4 examine whether any individual pairs of courses differ.
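The course-wise comparisons in Table 3 correspond to separate one-way ANOVAs for each dimension. The sketch below shows how such a test can be run with SciPy; the data frame and scores are simulated placeholders, not the study’s data.

```python
import numpy as np
import pandas as pd
from scipy import stats

courses = ["Management Studies", "Computer Sciences",
           "Engineering and Technical", "Pharmacy", "Hotel Management and Catering"]

# Hypothetical data: one row per student with a 7-point Empathy score.
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "course": rng.choice(courses, size=720),
    "empathy": rng.integers(1, 8, size=720).astype(float),
})

# One-way ANOVA across the five course groups, analogous to Table 3.
groups = [df.loc[df["course"] == c, "empathy"] for c in courses]
f_value, p_value = stats.f_oneway(*groups)
print(f"F = {f_value:.3f}, p = {p_value:.3f}")
```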
Table 4: Post hoc analysis with respect to courses
| Dependent Variable | Comparison | Mean Difference (I-J) | Std. Error | Sig. | 95% Confidence Interval (Lower - Upper) |
|---|---|---|---|---|---|
| Empathy | Management Studies - Computer Sciences | 0.29244 | 0.25155 | 0.773 | -0.3989 to 0.9837 |
| | Management Studies - Engineering and Technical course | -0.08885 | 0.24250 | 0.996 | -0.7553 to 0.5776 |
| | Management Studies - Pharmacy | -0.15481 | 0.31799 | 0.989 | -1.0287 to 0.7191 |
| | Management Studies - Hotel Management and Catering | -0.57147 | 0.24991 | 0.153 | -1.2583 to 0.1153 |
| | Computer Sciences - Management Studies | -0.29244 | 0.25155 | 0.773 | -0.9837 to 0.3989 |
| | Computer Sciences - Engineering and Technical course | -0.38129 | 0.27295 | 0.630 | -1.1314 to 0.3688 |
| | Computer Sciences - Pharmacy | -0.44725 | 0.34177 | 0.686 | -1.3865 to 0.4920 |
| | Computer Sciences - Hotel Management and Catering | -0.86392 | 0.27955 | **0.019** | -1.6322 to -0.0957 |
| | Engineering and Technical course - Management Studies | 0.08885 | 0.24250 | 0.996 | -0.5776 to 0.7553 |
| | Engineering and Technical course - Computer Sciences | 0.38129 | 0.27295 | 0.630 | -0.3688 to 1.1314 |
| | Engineering and Technical course - Pharmacy | -0.06596 | 0.33517 | 1.000 | -0.9871 to 0.8552 |
| | Engineering and Technical course - Hotel Management and Catering | -0.48263 | 0.27143 | 0.389 | -1.2286 to 0.2633 |
| | Pharmacy - Management Studies | 0.15481 | 0.31799 | 0.989 | -0.7191 to 1.0287 |
| | Pharmacy - Computer Sciences | 0.44725 | 0.34177 | 0.686 | -0.4920 to 1.3865 |
| | Pharmacy - Engineering and Technical course | 0.06596 | 0.33517 | 1.000 | -0.8552 to 0.9871 |
| | Pharmacy - Hotel Management and Catering | -0.41667 | 0.34057 | 0.738 | -1.3526 to 0.5193 |
| | Hotel Management and Catering - Management Studies | 0.57147 | 0.24991 | 0.153 | -0.1153 to 1.2583 |
| | Hotel Management and Catering - Computer Sciences | 0.86392 | 0.27955 | **0.019** | 0.0957 to 1.6322 |
| | Hotel Management and Catering - Engineering and Technical course | 0.48263 | 0.27143 | 0.389 | -0.2633 to 1.2286 |
| | Hotel Management and Catering - Pharmacy | 0.41667 | 0.34057 | 0.738 | -0.5193 to 1.3526 |
| Teaching Quality | Management Studies - Computer Sciences | 0.13200 | 0.15989 | 0.923 | -0.3074 to 0.5714 |
| | Management Studies - Engineering and Technical course | 0.19470 | 0.15414 | 0.714 | -0.2289 to 0.6183 |
| | Management Studies - Pharmacy | 0.00256 | 0.20212 | 1.000 | -0.5529 to 0.5580 |
| | Management Studies - Hotel Management and Catering | -0.30785 | 0.15885 | 0.300 | -0.7444 to 0.1287 |
| | Computer Sciences - Management Studies | -0.13200 | 0.15989 | 0.923 | -0.5714 to 0.3074 |
| | Computer Sciences - Engineering and Technical course | 0.06271 | 0.17349 | 0.996 | -0.4141 to 0.5395 |
| | Computer Sciences - Pharmacy | -0.12943 | 0.21724 | 0.976 | -0.7265 to 0.4676 |
| | Computer Sciences - Hotel Management and Catering | -0.43985 | 0.17769 | 0.100 | -0.9282 to 0.0485 |
| | Engineering and Technical course - Management Studies | -0.19470 | 0.15414 | 0.714 | -0.6183 to 0.2289 |
| | Engineering and Technical course - Computer Sciences | -0.06271 | 0.17349 | 0.996 | -0.5395 to 0.4141 |
| | Engineering and Technical course - Pharmacy | -0.19214 | 0.21305 | 0.896 | -0.7776 to 0.3934 |
| | Engineering and Technical course - Hotel Management and Catering | -0.50256 | 0.17253 | **0.032** | -0.9767 to -0.0284 |
| | Pharmacy - Management Studies | -0.00256 | 0.20212 | 1.000 | -0.5580 to 0.5529 |
| | Pharmacy - Computer Sciences | 0.12943 | 0.21724 | 0.976 | -0.4676 to 0.7265 |
| | Pharmacy - Engineering and Technical course | 0.19214 | 0.21305 | 0.896 | -0.3934 to 0.7776 |
| | Pharmacy - Hotel Management and Catering | -0.31042 | 0.21648 | 0.606 | -0.9053 to 0.2845 |
| | Hotel Management and Catering - Management Studies | 0.30785 | 0.15885 | 0.300 | -0.1287 to 0.7444 |
| | Hotel Management and Catering - Computer Sciences | 0.43985 | 0.17769 | 0.100 | -0.0485 to 0.9282 |
| | Hotel Management and Catering - Engineering and Technical course | 0.50256 | 0.17253 | **0.032** | 0.0284 to 0.9767 |
| | Hotel Management and Catering - Pharmacy | 0.31042 | 0.21648 | 0.606 | -0.2845 to 0.9053 |
The values in bold are significant at the 0.05 level.
Analysis with respect to university
The ANOVA results with respect to various universities as depicted in Table 5 reveal the following results:
Table 5: ANOVA results with respect to university
| Attribute | PUP | PUC | GNDU | MRSPTU | IKGPTU | Total | F Value | Sig. |
|---|---|---|---|---|---|---|---|---|
| Assurance | 3.9248 (1.2603) | 4.0746 (1.1889) | 3.8868 (1.1254) | 3.8782 (1.2449) | 3.7225 (1.4176) | 3.8788 (1.2376) | 0.956 | 0.431 |
| Tangibility | 3.9918 (1.2832) | 3.8171 (1.2658) | 3.6991 (1.1250) | 3.7163 (1.3769) | 3.7674 (1.3920) | 3.7660 (1.2841) | 1.056 | 0.377 |
| Reliability | 4.3092 (1.5008) | 4.1660 (1.4016) | 4.0310 (1.2884) | 4.1520 (1.4382) | 3.5930 (1.7093) | 4.0442 (1.4619) | 4.317 | 0.002 |
| Empathy | 4.2881 (1.3271) | 3.8744 (1.2696) | 3.9508 (1.3096) | 3.9115 (1.3557) | 3.8218 (1.2890) | 3.9566 (1.3229) | 2.069 | 0.083 |
| Responsiveness | 4.1069 (1.1788) | 4.0448 (1.0966) | 3.9718 (1.2753) | 3.9727 (1.2683) | 3.8674 (1.3547) | 3.9795 (1.2579) | 0.557 | 0.694 |
| Learning Quality | 4.0654 (1.5377) | 3.8143 (1.4588) | 3.6827 (1.5436) | 3.7843 (1.6232) | 3.6873 (1.6257) | 3.7777 (1.5745) | 1.176 | 0.320 |
Each value is presented as Mean (Standard Deviation).
Assurance: The mean Assurance score reported by students at Punjabi University, Patiala, is 3.92, whilst the score at Panjab University, Chandigarh, is marginally higher at 4.07. The scores for Guru Nanak Dev University, Amritsar, MRSPTU, and IKGPTU are 3.89, 3.88, and 3.72, respectively. There is no discernible variation between universities in the overall mean Assurance score of 3.88 (F = 0.956, p = 0.431). This suggests that students at these universities hold a consistent degree of confidence in the services provided.
Tangibility: Students at Punjabi University, Patiala, report the highest mean Tangibility score of 3.99, followed by Panjab University, Chandigarh, at 3.82. The mean scores for Guru Nanak Dev University, Amritsar, MRSPTU, and IKGPTU are 3.70, 3.72, and 3.77, respectively. There is no discernible variation between universities in the overall mean Tangibility score of 3.77 (F = 1.056, p = 0.377). This implies that students hold a consistent view of the tangible elements of service quality.
Reliability: With a mean Reliability score of 4.31, Punjabi University, Patiala, leads, followed by Panjab University, Chandigarh, with 4.17. The scores for Guru Nanak Dev University, Amritsar, MRSPTU, and IKGPTU are 4.03, 4.15, and 3.59, respectively. The average Reliability score across all universities is 4.04, with considerable variation between them (F = 4.317, p = 0.002). This underlines the need for targeted improvements in reliability, especially at universities with lower mean scores.
Empathy: The average Empathy score reported by students at Punjabi University, Patiala, is 4.29, whilst the score at Panjab University, Chandigarh, is 3.87. Guru Nanak Dev University, Amritsar, scores 3.95, MRSPTU 3.91, and IKGPTU 3.82. The average Empathy score across all universities is 3.96, with no discernible variation between them (F = 2.069, p = 0.083). This suggests that students perceive a broadly similar level of emotional support across institutions.
Responsiveness: With a mean Responsiveness score of 4.11, Punjabi University, Patiala, leads, followed by Panjab University, Chandigarh, with 4.04. The mean scores for Guru Nanak Dev University, Amritsar, MRSPTU, and IKGPTU are 3.97, 3.97, and 3.87, respectively. There is no discernible variation between universities in the overall mean Responsiveness score of 3.98 (F = 0.557, p = 0.694). This indicates a widespread perception that issues are resolved promptly at all universities.
Learning quality: With a mean Learning Quality score of 4.07, Punjabi University, Patiala, leads, followed by Panjab University, Chandigarh, at 3.81. MRSPTU scores 3.78, IKGPTU 3.69, and Guru Nanak Dev University, Amritsar, 3.68. There is no discernible difference across universities in the overall mean Learning Quality score of 3.78 (F = 1.176, p = 0.320). Although the differences are not statistically significant, the spread in mean scores points to the value of customised improvement strategies at lower-scoring universities.
Overall, there is no substantial difference in Assurance and Tangibility scores amongst universities, suggesting that students perceive the tangible characteristics of services similarly and hold a consistent level of trust across institutions. The absence of statistically significant variation in Empathy scores indicates that students experience a similar level of emotional support everywhere, and the consistency in Responsiveness scores shows that students believe all universities answer their concerns promptly. The notable variation in Reliability scores, particularly at universities with lower mean scores, signals the need for focused improvements in this area. Likewise, the spread in Learning Quality scores, although not statistically significant, highlights the value of customised approaches to improve the overall learning process, especially at lower-scoring universities. Universities can use these insights as a foundation to strengthen particular areas of their educational offerings.
Post hoc analysis with respect to university (Tukey HSD)
The study conducted a post hoc analysis using Tukey’s Honestly Significant Difference (HSD) test, reported in Table 6, to identify finer variations in the Reliability dimension among the universities. For most pairwise comparisons, there were no appreciable differences in Reliability scores between institutions. However, significant differences were found between Punjabi University, Patiala, and IKGPTU, as well as between MRSPTU and IKGPTU, indicating that students perceived Punjabi University, Patiala, and MRSPTU as more reliable than IKGPTU. These results highlight the importance of evaluating and resolving specific reliability issues within individual institutions. Policymakers and university administrators should take note of these differences in order to tailor and improve their approaches to enhancing the reliability of educational service delivery.
Table 6: Post hoc analysis with respect to university (Tukey HSD)
| Dependent Variable | Comparison | Mean Difference (I-J) | Std. Error | Sig. | 95% Confidence Interval (Lower - Upper) |
|---|---|---|---|---|---|
| Reliability | PUP - PUC | 0.14325 | 0.22789 | 0.970 | [-0.4799, 0.7664] |
| | PUP - GNDU | 0.27822 | 0.17228 | 0.488 | [-0.1929, 0.7493] |
| | PUP - MRSPTU | 0.15722 | 0.17298 | 0.894 | [-0.3158, 0.6302] |
| | PUP - IKGPTU | 0.71626 | 0.19336 | **0.002** | [0.1875, 1.2450] |
| | PUC - PUP | -0.14325 | 0.22789 | 0.970 | [-0.7664, 0.4799] |
| | PUC - GNDU | 0.13497 | 0.20109 | 0.963 | [-0.4149, 0.6848] |
| | PUC - MRSPTU | 0.01397 | 0.20169 | 1.000 | [-0.5375, 0.5655] |
| | PUC - IKGPTU | 0.57301 | 0.21942 | 0.069 | [-0.0270, 1.1730] |
| | GNDU - PUP | -0.27822 | 0.17228 | 0.488 | [-0.7493, 0.1929] |
| | GNDU - PUC | -0.13497 | 0.20109 | 0.963 | [-0.6848, 0.4149] |
| | GNDU - MRSPTU | -0.12100 | 0.13574 | 0.900 | [-0.4922, 0.2502] |
| | GNDU - IKGPTU | 0.43804 | 0.16091 | 0.052 | [-0.0020, 0.8780] |
| | MRSPTU - PUP | -0.15722 | 0.17298 | 0.894 | [-0.6302, 0.3158] |
| | MRSPTU - PUC | -0.01397 | 0.20169 | 1.000 | [-0.5655, 0.5375] |
| | MRSPTU - GNDU | 0.12100 | 0.13574 | 0.900 | [-0.2502, 0.4922] |
| | MRSPTU - IKGPTU | 0.55904 | 0.16166 | **0.005** | [0.1170, 1.0011] |
| | IKGPTU - PUP | -0.71626 | 0.19336 | **0.002** | [-1.2450, -0.1875] |
| | IKGPTU - PUC | -0.57301 | 0.21942 | 0.069 | [-1.1730, 0.0270] |
| | IKGPTU - GNDU | -0.43804 | 0.16091 | 0.052 | [-0.8780, 0.0020] |
| | IKGPTU - MRSPTU | -0.55904 | 0.16166 | **0.005** | [-1.0011, -0.1170] |
The values in bold are significant at the 0.05 level.
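The pairwise comparisons in Table 6 follow Tukey’s HSD procedure, applied after the significant university-wise ANOVA on Reliability. A hedged sketch using statsmodels is shown below; the simulated data frame stands in for the study’s dataset and is not the authors’ code.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.multicomp import pairwise_tukeyhsd

universities = ["PUP", "PUC", "GNDU", "MRSPTU", "IKGPTU"]

# Hypothetical Reliability scores, one row per student.
rng = np.random.default_rng(2)
df = pd.DataFrame({
    "university": rng.choice(universities, size=720),
    "reliability": rng.normal(loc=4.0, scale=1.4, size=720),
})

# Tukey HSD pairwise comparisons at the 0.05 level, analogous to Table 6.
result = pairwise_tukeyhsd(endog=df["reliability"], groups=df["university"], alpha=0.05)
print(result.summary())
```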
Discussion
The study on service quality and learning quality in technical education programmes at state universities in Punjab provides a comprehensive analysis of students’ perceptions across various demographic variables such as gender, academic class, course, and university affiliation. The investigation seeks to identify the key differences in how students from different backgrounds view these crucial elements of education and service delivery. Understanding these differences allows universities to better tailor their services and educational experiences to meet the diverse needs of their student populations.
The gender-based analysis of service quality and learning quality revealed interesting insights. The results show that there are no significant differences between male and female students in their perceptions of Assurance, Tangibility, Reliability, Empathy, Responsiveness, and Learning Quality. This finding suggests that gender does not play a substantial role in shaping students’ views of these aspects of technical education. Both male and female students seem to perceive the quality of education and services in a similar manner, which is a positive indication of gender equality in the learning environment. The consistency in perceptions across genders suggests that the educational experience is largely inclusive and that universities in Punjab are providing equitable opportunities and services to all students, irrespective of gender.
This finding also underscores the importance of maintaining and promoting inclusive learning environments where gender does not become a barrier to the quality of education. Institutions can continue focusing on creating gender-neutral policies and practices to ensure that all students, regardless of their gender, have access to high-quality educational services and outcomes. Moreover, the consistency of service quality perceptions across genders should inspire educational administrators to focus on other factors, such as teaching quality, infrastructure, and support systems, rather than gender-based concerns.
When examining the differences between undergraduate and postgraduate students, the results highlighted significant disparities in several dimensions of service quality and learning quality. Postgraduate students, for instance, consistently reported higher scores in several areas, including Assurance, Reliability, Responsiveness, and Learning Quality. This suggests that postgraduate students perceive the educational services and learning experiences in these areas to be of higher quality compared to their undergraduate counterparts.
These differences may be due to the fact that postgraduate students often have more specific academic and professional goals, which could influence their expectations and perceptions of educational services. Postgraduates are likely to have more direct engagement with their courses and instructors, and their expectations regarding service quality might reflect their more advanced academic standing. This could explain their higher ratings in areas such as Reliability, where they may expect more personalized attention and consistent delivery of services.
On the other hand, undergraduate students reported higher perceptions of Empathy, indicating that they feel a stronger emotional connection and support from their educational institutions. This could be attributed to the fact that undergraduate students are in the initial stages of their academic journey and may require more guidance and emotional support compared to their postgraduate peers, who may be more self-sufficient in managing their academic and personal challenges. The significant differences in Learning Quality scores between the two groups further underscore the need for universities to address the diverse learning needs of students at different academic levels. Universities should aim to create tailored educational experiences that account for the varying expectations and requirements of both undergraduate and postgraduate students.
The analysis based on different academic courses revealed that while there were differences in mean scores across various dimensions of service quality and learning quality, none of these differences were statistically significant. This suggests that students, regardless of their course of study, generally perceive the quality of education and services in a similar manner. While certain courses, such as Engineering and Technical Courses, had slightly higher mean scores in some areas like Assurance and Reliability, these differences were not substantial enough to be considered statistically significant.
One possible explanation for the lack of significant differences is that service quality and learning quality are largely perceived in similar ways across courses due to the standardization of educational services across different disciplines. Universities may be providing similar infrastructure, teaching methods, and support services to all students, regardless of their academic discipline. The consistency in service quality perceptions across courses could also be a reflection of the growing efforts by universities to ensure that all students receive a high standard of education, regardless of their field of study.
However, the study’s results suggest that universities may need to pay closer attention to course-specific nuances in service delivery. While the overall perception of quality may be similar, students in certain courses may have unique expectations and needs that are not fully addressed. For example, technical and engineering students may require more specialized resources, such as state-of-the-art laboratories, while students in social sciences or humanities may need a different set of learning support services. Tailoring services to specific academic disciplines could further enhance students’ experiences and ensure that each group receives the support and resources they need to succeed.
The analysis with respect to university affiliation provided valuable insights into the variations in service quality and learning quality across different universities. While most of the dimensions, such as Assurance, Tangibility, and Responsiveness, showed no significant differences across universities, notable variations were observed in Reliability and Learning Quality. Punjabi University, Patiala, for instance, reported the highest mean scores in Reliability and Learning Quality, indicating that students at this university perceived these aspects to be of higher quality compared to students at other universities. In contrast, IKGPTU and MRSPTU had lower scores, suggesting that students at these institutions may be less satisfied with the reliability and quality of their learning experiences.
The differences in service quality and learning quality perceptions across universities highlight the need for targeted improvements at specific institutions. While some universities appear to be performing well in certain areas, others may require additional resources or reforms to address the concerns of their students. The variation in Reliability scores, in particular, indicates that some universities may need to focus more on delivering consistent and dependable educational services. Students’ perceptions of learning quality also suggest that there is room for improvement, especially at universities with lower scores in this area.
The post-hoc analysis using Tukey’s Honestly Significant Difference (HSD) test further revealed that significant differences in reliability scores were observed between IKGPTU and Punjabi University, Patiala, as well as between MRSPTU and IKGPTU. This finding emphasizes the importance of addressing specific aspects of reliability within individual universities. The identification of these differences provides actionable insights that universities can use to enhance their service quality in the areas where students perceive shortcomings.
Implications
The findings of this study highlight the importance of understanding the diverse needs of students across demographic variables such as gender, academic class, course, and university affiliation. Since gender did not significantly affect students’ perceptions of service and learning quality, institutions can maintain their efforts to promote gender equality and inclusivity within academic environments. However, significant differences between undergraduate and postgraduate students in areas like Assurance, Responsiveness, and Learning Quality suggest that tailored interventions are necessary. Postgraduate students tend to report higher satisfaction, possibly due to more personalized academic experiences, which implies that similar enhancements could be beneficial for undergraduates, such as offering more focused mentoring and specialized resources. Additionally, although the analysis of courses did not reveal major disparities, recognizing the unique needs of students in different disciplines could lead to more targeted improvements in infrastructure and academic support. The study also uncovered variations in perceptions across universities, particularly in terms of reliability and learning quality, signaling a need for institutions with lower scores to address these issues through enhanced service delivery and faculty development. By embracing these findings, universities can better meet student expectations, foster a supportive learning environment, and ensure that all students, regardless of their background, receive an equitable and enriching educational experience.
Conclusions
In conclusion, the study provides a detailed examination of service quality and learning quality in the technical education programmes of state universities in Punjab, with a particular focus on demographic variables. The findings suggest that while gender does not significantly influence students’ perceptions, academic class, course, and university affiliation play important roles in shaping these perceptions. To improve the overall quality of education, universities must consider the diverse needs of their students and work towards providing a more personalized and tailored learning experience. Focusing on the areas of service quality that require improvement, such as reliability and learning quality, will help universities enhance student satisfaction and academic outcomes. By addressing these issues, educational institutions in Punjab can foster a more supportive and effective learning environment that meets the needs of all students.
References
- Abdullah, F. (2006). The development of HEdPERF: A new measuring instrument of service quality for the higher education sector. International Journal of Consumer Studies, 30(6), 569–581. https://doi.org/10.1111/j.1470-6431.2005.00480.x
- Abu-Rumman, A., & Qawasmeh, R. (2022). Assessing international students' satisfaction at a Jordanian university using the service quality model. Journal of Applied Research in Higher Education, 14(4), 1742–1760. https://doi.org/10.1108/JARHE-05-2021-0166
- Ansari, J. A. N., & Khan, N. A. (2020). Exploring the role of social media in collaborative learning: The new domain of learning. Smart Learning Environments, 7(1), 1–16. https://doi.org/10.1186/s40561-020-00118-7
- Barnes, B. R. (2007). Analyzing service quality: The case of postgraduate Chinese students. Total Quality Management & Business Excellence, 18(3), 313–331. https://doi.org/10.1080/14783360601152558
- Byungura, J. C., Hansson, H., Muparasi, M., & Ruhinda, B. (2018). Familiarity with technology among first-year students in Rwandan tertiary education. Electronic Journal of e-Learning, 16(1), 30–45. https://files.eric.ed.gov/fulltext/EJ1175337.pdf
- Danjuma, I., Bawuro, F. A., Vassumu, M. A., & Habibu, S. A. (2018). The service quality scale debate: A tri-instrument perspective for higher education institutions. Expert Journal of Business and Management, 6(2), 127–133. https://business.expertjournals.com/23446781-612/
- Diette, T. M., & Raghav, M. (2015). Class size matters: Heterogeneous effects of larger classes on college student learning. Eastern Economic Journal, 41, 273–283. https://doi.org/10.1057/eej.2014.31
- Inan, S., & Karaca, M. (2021). An investigation of quality assurance practices in online English classes for young learners. Quality Assurance in Education, 29(4), 332–343. https://doi.org/10.1108/QAE-12-2020-0171
- Kibuku, R. N., Ochieng, D. O., & Wausi, A. N. (2020). E-learning challenges faced by universities in Kenya: A literature review. Electronic Journal of e-Learning, 18(2), 150–161. https://doi.org/10.34190/EJEL.20.18.2.005
- Kintu, M. J., & Zhu, C. (2016). Student characteristics and learning outcomes in a blended learning environment intervention in a Ugandan university. Electronic Journal of e-Learning, 14(3), 181–195. https://files.eric.ed.gov/fulltext/EJ1107126.pdf
- Koc, N., & Celik, B. (2015). The impact of the number of students per teacher on student achievement. Procedia - Social and Behavioral Sciences, 177, 65–70. https://doi.org/10.1016/j.sbspro.2015.02.335
- Kwan, P. Y., & Ng, P. W. (1999). Quality indicators in higher education: Comparing Hong Kong and China's students. Managerial Auditing Journal, 14(1/2), 20–27. https://doi.org/10.1108/02686909910245964
- Mishra, L., Gupta, T., & Shree, A. (2020). Online teaching-learning in higher education during the lockdown period of the COVID-19 pandemic. International Journal of Educational Research Open, 1, 100012. https://doi.org/10.1016/j.ijedro.2020.100012
- Nakayama, M., Mutsuura, K., & Yamamoto, H. (2017). How note-taking instruction changes students' reflections upon their learning activity during a blended learning course. Electronic Journal of e-Learning, 15(3), 200–210. https://files.eric.ed.gov/fulltext/EJ1146029.pdf
- Ng, E., & Wong, H. (2020, April). Comparing university students' mobile learning practices in China. International Journal on E-Learning, 19(2), 181–203. Association for the Advancement of Computing in Education (AACE). https://eric.ed.gov/?id=EJ1246647
- Patra, S. K., Sundaray, B. K., & Mahapatra, D. M. (2021). Are university teachers ready to use and adopt e-learning systems? An empirical substantiation during the COVID-19 pandemic. Quality Assurance in Education, 29(4), 509–522. https://doi.org/10.1108/QAE-12-2020-0146
- Sánchez-Mena, A., & Martí-Parreño, J. (2017). Drivers and barriers to adopting gamification: Teachers' perspectives. Electronic Journal of e-Learning, 15(5), 434–443. https://files.eric.ed.gov/fulltext/EJ1157970.pdf
- Stankovska, G., Ziberi, F., & Dimitrovski, D. (2024). Service quality and student satisfaction in higher education. In Education in developing, emerging, and developed countries: Different worlds, common challenges (Vol. 22, pp. 153–160). BCES Conference Books. https://bces-conference.org/onewebmedia/2024.153-160.Gordana_Stankovska_et_al.pdf
- Stodnick, M., & Rogers, P. (2008). Using SERVQUAL to measure the quality of the classroom experience. Decision Sciences Journal of Innovative Education, 6(1), 115–133. https://doi.org/10.1111/j.1540-4609.2007.00162.x
- Udo, G. J., Bagchi, K. K., & Kirs, P. J. (2011). Using SERVQUAL to assess the quality of the e-learning experience. Computers in Human Behavior, 27(3), 1272–1283. https://doi.org/10.1016/j.chb.2011.01.009
- Wolverton, C. C., Hollier, B. N. G., & Lanier, P. A. (2020). The impact of computer self-efficacy on student engagement and group satisfaction in online business courses. Electronic Journal of e-Learning, 18(2), 175–188. https://doi.org/10.34190/EJEL.20.18.2.006
- Zafiropoulos, C., & Vrana, V. (2008). Service quality assessment in a Greek higher education institute. Journal of Business Economics and Management, 9(1), 33–45. https://doi.org/10.3846/1611-1699.2008.9.33-45