Top Ten Reasons Why HEC Abandoned Quality Ranking of Universities

HEC's recent decision to abandon its quality ranking of universities [1] is to be welcomed. The ranking was an initiative motivated by hubris, with scant regard for societal and industry requirements. It was conceived and designed by those who wanted all universities to become research universities and clones of each other. It was a whip with which to make every university fall in line, ignoring their local, geographical, societal, and industry-specific contexts and distinct advantages. The criteria appeared to have been designed to satisfy Western industry and societal requirements, not national and regional ones. The ranking was called a quality ranking but was nothing more than a measurement of quantities, devoid of any qualitative input. It was plagued by political considerations in the form of a quota system. It had the audacity to measure the quality of teaching by counting PhD degrees, and it did not discriminate at all across disciplines with their unique requirements. Further, there was no mechanism for independent verification of data.

The ranking criteria had too many infirmities and were modified each time under political pressure. They gave too much weight to inputs and too little to outcomes and impact, on the assumption that output measurements would suffice. Measurement of societal and industry impact was conspicuous by its absence: bureaucratic measures of impact were made to substitute for feedback from industry and society. There was little recognition of the quality of university processes. Above all, HEC, being both a funding agency and a regulator, could not simultaneously act as a ranking agency, because it would tend to give more weight to its own funding decisions. Hence, the ranking criteria became little more than a yardstick for measuring HEC's own funding decisions.

[Work in Progress]

Distortions Caused in Higher Education Focus

Measuring Quality vs Measuring Quantity

The designers of the ranking criteria focused on the convenience and availability of data rather than on trying to measure qualitative parameters. There was no measure of process quality at all.

Equated Teaching Quality with Number of PhDs

The HEC ranking naively assumed that holding a PhD automatically improves the quality of teaching and engagement with students. It ignored the fact that a PhD is only an acknowledgement of having attained the status of an independent researcher; it says nothing about the quality of teaching. In fact, good researchers often make poor teachers.

Quota System for the Provincial and Public Sector
Assumed All Disciplines Are the Same and Numbers Have the Same Significance Across Disciplines
No System for Independent Verification of Data

Criteria Disconnect with Society and Industry Needs

Too Much Criteria Weight Given to Inputs
Too Much Criteria Weight Given to Outputs, Ignoring Outcomes and Process
Negligible Measurement of Societal and Industry Impact
Didn't Consult or Measure Industry and Employers' Feedback
A Regulator and Funding Agency Can't Be a Ranking Agency


[1] HEC Abandons Ranking of Universities
