Overemphasis On Numbers, Overlooking Teaching Quality: Study Flags Discrepancies In NIRF Rankings
The study highlights that while NIRF rankings aim to enhance transparency and accountability, several inconsistencies inherent in the framework raise concerns about their reliability

Overemphasis on bibliometrics, overlooking of teaching quality, inadequate transparency in methodology, huge fluctuations in rankings and partial evaluation of online education are some of the major “inconsistencies” in the Indian institutional rankings flagged in a research paper.

The study, authored by Prof V Ramgopal Rao, former IIT-Delhi director and currently the group vice-chancellor of the BITS, Pilani campuses, aims to suggest ways to increase transparency in the rankings framework.

The National Institutional Ranking Framework (NIRF) rankings report 2024 was released by Union Education Minister Dharmendra Pradhan on August 12. The framework was developed by the central government in 2015 and released its first report in 2016 to assess higher education institutions in the country on different parameters.

The study highlights that while NIRF rankings aim to enhance transparency and accountability, several inconsistencies inherent in the framework raise concerns about their reliability. The study, titled ‘Unpacking Inconsistencies in NIRF’ and authored by Rao and Abhishek Singh, also from BITS, Pilani, is published in the Current Science journal.

The study suggests that the identified inconsistencies underscore the need for an ongoing dialogue and refinement of the ranking framework. “It is essential to acknowledge that the rankings, by their nature, subtly influence perceptions. Thus, the identified issues, if left unaddressed, may impact the credibility and relevance of the NIRF rankings, potentially affecting the perceptions of stakeholders such as students, parents and policymakers,” the paper stated.

Challenges in NIRF’s Bibliometric Approach

According to the study, the rankings’ overemphasis on numbers falls short of encompassing critical elements like relevance, innovation, social impact and contributions beyond traditional publishing. This approach raises concerns about the comprehensiveness of the evaluation process, especially within the research and professional practice metric.

“The focus just on numbers like how many research papers or citations doesn’t capture the actual contribution of an institution. For example, if a researcher has developed a technology that has helped improve life in rural areas, it would go unaccounted for,” said Prof Rao, speaking to News18.

“A truly effective assessment of institutional performance should consider a broader spectrum of characteristics to avoid an incomplete and skewed representation of the contributions of an institution to academia. Exacerbating these problems is the bibliometric methodology employed by the NIRF, which relies entirely on commercial databases for data collection. This dependence reveals shortcomings in terms of scope, precision and the incorporation of non-traditional research outcomes. Consequently, there is a risk of overlooking institutions that contribute significantly in non-traditional ways, thus undermining the diversity of academic contributions,” the paper stated.

Overlooking Teaching Quality

The study points out that the NIRF rankings lack specific mechanisms to directly assess teaching quality, overlooking crucial aspects such as classroom observations, student evaluations and alumni feedback.

“The omission of these evaluation methods hinders a comprehensive assessment of teaching effectiveness, leading to an incomplete depiction of the educational prowess of an institution. Also, the limited focus of the NIRF rankings on practical training elements, such as hands-on projects and internships, leads to an undervaluation of institutions that prioritize experiential learning,” the paper stated.

According to Prof Rao, student feedback is crucial to assess teaching quality. “Another important way of assessing teaching quality is by looking at the placements, the quality of jobs that students have achieved and the median salaries,” he said.

Partial Evaluation of Online Education

The paper stated that while the online education sub-metric encompasses data pertaining to both the Online Completion of Syllabus and Exams, and Swayam (an initiative by the Government of India aimed at achieving access, equity and quality in education), the assessment is singularly focused on the quantity of courses developed and made available on the Swayam platform. This is discernible through instances where certain institutions with extensive online programmes receive no score, despite providing data for the Online Completion of Syllabus & Exams.

“The rankings only capture those hosted on the Swayam platform, which is possible only for the IITs since it has been developed in such a way. This completely overlooks the online education being provided by other institutions. For example, BITS runs the Work Integrated Learning Programme, which is one of the country’s largest such programmes, but it’s not accounted for in the current rankings framework,” said Prof Rao.

Inadequate Transparency in Methodology

According to the paper, while the NIRF does provide insights into its rating methodology, there is a need for more exhaustive and accessible documentation. To mitigate ambiguity and potential misinterpretations, it is imperative to establish unambiguous and explicit definitions of metrics, especially those which capture financial data. Formulating clear and well-defined rules and criteria is essential to ensure a standardized and equitable assessment. This proactive step not only promotes a consistent and fair evaluation, but also eliminates disparities that may arise from varying interpretations.

Limited Global Benchmarking

The study highlights limited global benchmarking as a significant limitation of the NIRF rankings. The absence of a robust mechanism for international comparison impedes institutions that aspire to attain global recognition. “In an era where cross-border collaborations and the exchange of academic ideas are increasingly prevalent, the NIRF rankings fall short in providing a comprehensive evaluation that extends beyond national borders,” it stated.

Singh, co-author of the paper, said that unlike global rankings, Indian rankings are still evolving and are expected to gain a global reach over the years. “It is crucial to reset the parameters and incorporate a more detailed view of an institution. It should also increase the number of stakeholders taking the survey, including students, alumni and industry,” said Singh.
