
How We Can Have A Credible NIRF Ranking

By Professor N RAVICHANDRAN
April 06, 2023 14:54 IST

Whenever there is an evaluation scheme, it is a human tendency to score as high as possible.
No one likes to highlight their shortcomings, and everyone likes to highlight their positives.
This is what is happening in the context of rankings as well, points out Professor N Ravichandran.

Illustration: Dominic Xavier/Rediff.com

There have been news reports showing the National Institutional Ranking Framework* (NIRF) ranking systems and processes in a poor light.

It appears that individuals or groups of individuals have had access to the source code of the NIRF ranking system and were therefore able to inflate their rankings.

This has lowered the credibility of the ranking system itself.

The courts have started an investigation into this, and the regulators concerned, with the government's support, will do whatever is necessary to fix the problem and improve the system in the near future. Therefore, I will not delve into that aspect of the story in this note.

Indian institutions -- whether focused on technology, science, management, biological sciences, law or architecture -- have generally been ranked based on international ranking frameworks.

It is a welcome addition that we have our own ranking system. It brings much-needed focus on the context in which Indian institutions operate and reduces the international biases on certain parameters.

Also, the use of the purchasing power parity construct for placement, admission, salaries, course fees and many other activities may not be very appropriate.

The ranking process and the associated outcomes have evolved into something more complicated than anticipated.

Whenever there is an evaluation scheme, it is a human tendency to score as high as possible.

No one likes to highlight their shortcomings, and everyone likes to highlight their positives. This is what is happening in the context of rankings as well.

It is also a human tendency that, given a frame of reference, one would try to outsmart that frame of reference as much as one can.

That brings its own inherent biases in responses and in the presentation of data, a lack of transparency, and so on. Somehow, some actors (institutions) have gone to the extent of influencing their rankings by accessing or manipulating the source code.

This is a serious breach of the system and should be avoided at any cost. Regardless of how much supervision we put in place, it will be difficult to ensure that the system is 100 percent foolproof.

The stakeholders involved are the institutions, the students, the alumni, the community, the government, the regulatory environment and so on. Each has its own agenda and objectives, which makes it very difficult to enforce accountability in the process and to ensure that whatever is submitted on the ranking portal is authenticated.

This is the crux of the problem.

The time has come to look at some innovative and radical solutions. The solutions can be on the following lines:

Instead of the regulatory agency coming up with a frame of reference and asking the institutions to fill out or upload the data, one may consider a system where any institution that wishes to be part of the ranking process uploads the information on its own Web site, with a mirror copy marked to a central Web site.

The various components of the ranking can be indicated.

The procedure by which marks are allocated to the individual components can be specified, along with the weights used to combine them.

The system will then automatically compute the overall index.
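To make the arithmetic concrete, here is a minimal sketch of such an index computation. The component names, scores and weights below are illustrative placeholders, not the actual NIRF parameters or weightings:

```python
# Minimal sketch of an overall-index computation from component marks.
# All component names, scores and weights are illustrative placeholders,
# not the actual NIRF parameters or weightings.

def overall_index(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of component scores (each on a 0-100 scale)."""
    total_weight = sum(weights.values())
    return sum(scores[c] * weights[c] for c in weights) / total_weight

# Hypothetical submission by one institution.
scores = {
    "teaching_and_resources": 72.5,
    "research": 61.0,
    "graduation_outcomes": 80.3,
    "outreach_and_inclusivity": 55.0,
    "perception": 48.7,
}
weights = {
    "teaching_and_resources": 0.30,
    "research": 0.30,
    "graduation_outcomes": 0.20,
    "outreach_and_inclusivity": 0.10,
    "perception": 0.10,
}

print(f"Overall index: {overall_index(scores, weights):.2f}")  # 66.48
```

Since the components, the weights and the supporting documents would all be public, anyone could recompute the index and verify an institution's claimed score.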

It is the institution's responsibility to make statements/provide data for every line item and to hyperlink the documents necessary to support its response.

This will be made accessible to all stakeholders.

The regulatory agency will have no role to play except to keep a mirror copy of the data uploaded on the institutions' respective Web sites.

This change would put significant pressure on the institutions since they would now be accountable for what they say, and all the information would be public.

The student community, the staff, the faculty and other stakeholders will have a role to play since they will be able to examine whether the claims made by the respective institutions are true.

There can also be a whistle-blower policy to bring out discrepancies, if any, between the information on the Web site and the documents uploaded to support it.

Whistle-blower complaints can be used to investigate what is really happening and, based on the severity of the situation, appropriate warnings/penalties can be imposed on the institution.

Institutions can be required to display the content and the number of warnings they have received on their Web sites.

This, in some sense, will bring public accountability and automatically ensure transparency.

Institutions will have the right to choose whether or not to participate in the ranking.

This is a very radical way to put the onus of the ranking on the institution itself rather than the regulatory agency.

Obviously, this may require enacting relevant provisions of law to deal with persistent deviations.

This means comprehensive exposure and education of the various stakeholders, opening the ranking system to individual institutions and, if need be, creating a legislative framework by which they can be held accountable should they consistently report wrong information.

Such a system will evolve dynamically and stabilise over a period of time.

If, despite all this, people get cheated, the responsibility will not lie with the regulatory agency. It will lie with the institution.

The whistle-blower policy can be effectively used to keep a check on the system.

In short, I suggest that an eminent committee be formed to address the whole issue of implementing the ranking system.

Let's not get into how the current situation can be fixed.

Let's address the larger issue of what a relevant ranking system for institutions in our country should look like.

I have experimented with this kind of self-management system in the small setting of a classroom.

I asked students to grade their own answer scripts against a rubric, and invariably they came up with a conservative assessment rather than an inflated one.

When one's responses are transparent and subject to evaluation by peers, people tend to converge to what is appropriate.

We may want to try a similar approach. This is a very simple, yet powerful, change.

Professor N Ravichandran is a former professor at the Indian Institute of Management, Ahmedabad.

*Wikipedia states: 'The National Institutional Ranking Framework is a methodology adopted by the Ministry of Education, Government of India, to rank institutions of higher education in India. The framework was approved by the MHRD and launched by the minister of human resource development on 29 September 2015.'

Feature Presentation: Ashish Narsale/Rediff.com
