Monday, December 14, 2009

On the Absurdity of the use of Journal Impact factors as a measure of Individual Academic Excellence

Recently the University Grants Commission of India, as well as our University, has begun the process of using the impact factor of the journals in which faculty publish as a basis for rewarding academic performance. The truth, however, is that the impact factor of a journal, while perfectly reasonable as a way of evaluating journal performance or rank, is almost completely discredited as a means of evaluating individual performance. To quote the Wikipedia (ImpactFactor):

"The impact factor is often misused to evaluate the importance of an individual publication or evaluate an individual researcher. This does not work well since a small number of publications are cited much more than the majority - for example, about 90% of Nature's 2004 impact factor was based on only a quarter of its publications, and thus the importance of any one publication will be different from, and in most cases less than, the overall number. The impact factor, however, averages over all articles and thus underestimates the citations of the most cited articles while exaggerating the number of citations of the majority of articles. Consequently, the Higher Education Funding Council for England was urged by the House of Commons
Science and Technology Select Committee to remind Research Assessment Exercise panels that they are obliged to assess the quality of the content of individual articles, not the reputation of the journal in which they are published."

Yet the UGC, and following it Indian universities including ours, have blithely set in motion an elaborate exercise precisely to use journal impact factors as a metric for academic performance! Such belated imitation of long passé international fashions is not unknown in our ex-colonial country! Many issues are involved in the absurdity of implementing this "reform'' long after the Journal Impact Factor (JIF) has been completely discredited as a measure of individual scientific performance, not the least of which is an implicit acceptance that Indian academics can never hope to achieve globally competitive academic status. It is interesting that individual citation measures, based on the same idea as the JIF but applied to individuals, are simply not considered either by the UGC or by this University. One example is the number of citations received by a particular paper in each of the preceding three years; another (just to cook up a new one I have not seen used before) is the change in the square of the H-index over the previous year. Though also imperfect, such measures are much more reliable for gauging individual performance than journal impact factors, which only compare journals on the average and have nothing to say on individual merit. Could this be because they do not lend themselves to the misinterpretations pointed out above and below?


To appreciate the cogency of these remarks one must first recall the definition of the impact factor. The Wikipedia defines it thus:

"In a given year, the impact factor of a journal is the average number of citations to those papers that were published during the two preceding year. For example, the 2003 impact factor of a journal would be calculated as follows:

A = the number of times articles published in 2001 and 2002 were cited by indexed journals during 2003

B = the total number of "citable items" published in 2001 and 2002. ("Citable items" are usually articles, reviews, proceedings, or notes; not editorials or Letters-to-the-Editor.)

2003 impact factor = A/B

(Note that 2003 impact factors are actually published in 2004; they cannot be calculated until all of the 2003 publications have been received by the indexing agency.)"
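The arithmetic in this definition is simple enough to sketch in a few lines of Python. All the numbers below are invented purely for illustration:

```python
# Sketch of the impact-factor ratio defined above.
# All citation counts here are invented for illustration only.

def impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
    """JIF for year Y = (citations during Y to items from years Y-1 and Y-2)
    divided by (citable items published in years Y-1 and Y-2)."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# Hypothetical journal: 90 citable items published in 2001-2002,
# cited 180 times in total during 2003.
A = 180  # citations during 2003 to 2001-2002 articles
B = 90   # citable items published in 2001 and 2002
print(impact_factor(A, B))  # → 2.0
```

Note that nothing in the ratio refers to any single article, let alone any single author; it is a journal-level average by construction.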

Now, with such a definition, the absurdity of using the impact factor as a measure of individual performance becomes easy to demonstrate with quite realistic-looking examples. Imagine an author (say a physicist, for concreteness) who publishes a paper in 2005 in a ``low'' impact factor journal (say JIF=2.0), and that the paper receives 20 citations in 2005, 25 citations in 2006 and 30 citations in 2007 (clearly he will have set his subfield agog!). At the same time another author publishes an article in 2008 in a ``high'' impact factor journal (say JIF=3: the actual cutoff suggested for Physics faculty); but this JIF is based on the journal's performance in 2005 and 2006 as revealed in citations during 2007 (see the definition above). Then, according to academic evaluation criteria based on the Journal Impact Factor, the performance of the faculty member whose article is cited 20 times or more each year, and in fact 30 times in 2007, is judged less worthy of recognition and encouragement than the mere fact of publication in a "high impact factor'', i.e. JIF=3, journal; yet that article is statistically nearly certain (like most publications in so-called high impact factor journals) never to be cited even 3 times a year (see the first quote from Wikipedia)! This makes it evident that the proposal to implement JIF-based assessment of academic performance is actually not based on the identification of excellence but on the dressing up of mediocrity with a pseudo-objective measure actually applicable to the rating of completely different entities, namely journals, not individuals at all.
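The comparison in this example can be made concrete in a few lines of Python, using the citation counts invented in the text and the JIF=3 cutoff mentioned there:

```python
# Yearly citations to the hypothetical 2005 paper from the example above.
citations_by_year = {2005: 20, 2006: 25, 2007: 30}

# The journal-level cutoff suggested for Physics faculty in the text.
jif_cutoff = 3.0

average_per_year = sum(citations_by_year.values()) / len(citations_by_year)
print(average_per_year)               # → 25.0
print(average_per_year > jif_cutoff)  # → True
# The paper in the "low impact" journal exceeds the JIF cutoff by nearly
# an order of magnitude -- yet JIF-based evaluation ranks its author below
# anyone who merely publishes in a JIF=3 journal.
```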
This ``category mistake" is aided by an implicitly subservient attitude that sees simply being accepted by the established scientific powers, by being allowed to publish in their journals, as an acceptable ("realistic/objective") "consolation substitute" for an objective measure of scientific excellence as revealed by consistent citation by one's peers. The issue of using actual citation measures adapted to, and designed for, the evaluation of individual scientific performance is simply brushed under the carpet, because it is implicitly assumed to be too hard to achieve a high score on these. However, such half measures can never actually lead to scientific excellence, since the first precondition of a dynamic and self-confident science is truthfulness.


The UGC and university authorities should wake up to this developing absurdity, which will soon entrench itself and become established wisdom entailing another 50 years of academic mediocrity. They should adopt numerical measures -- if at all they have the ability and self-confidence to use numerical measures objectively, and not à la Disraeli, i.e. as the evil third in ``lies, damn lies and statistics" -- that are in line with the best metrics available globally for measuring individual performance, and not further nourish the absurd laurels for mediocrity that are the bane of the quest for excellence in the Indian academic system.

Sunday, November 29, 2009

Why the web-page ? 29 November 2009

CHARANJIT SINGH AULAKH'S BLOG


Well here I am, after prolonged resistance, also trying to be an active presence on the web. So why, after resisting for nearly 15 years, did I finally decide to put up a web page, start this blog, and do all the things that are such a routine part of generation NeXt? Well, maybe it's only that I don't like to fall more than a technology wave or two behind the surging masses of the cyberalive.
Moreover, there is much to be said for the view that all persons in public positions -- including University faculty, who daily impinge upon the mental formation of coming generations of social elites -- have an obligation to make available the facts concerning their academic performance, for the benefit of informed choice by students and the public seeking connection with the University system.

However, what actually finally pushed me to make the effort and overcome my rather archaic distaste for self-publicity was a rather tendentious, even meretricious, article by Neha Miglani, a reporter for The Tribune, a newspaper based in Chandigarh, India, where I work. The article is objectionable to several faculty in my University because it gives a partial, distorted, even laughable account of the citation status of Panjab University faculty. From the omissions and inclusions it seems designed to promote as stupendous achievements what are, to any informed person, rather routine and mundane levels of citational visibility. Unfortunately, possibly in order to perform meretricious services, the reporter omits all mention of several faculty members in the Physics Department (and possibly others elsewhere) whose citations are much higher than those artificially elevated (by this omission) to the top of the list actually published. In itself this might have been ascribed to a minor lack of diligence on the part of the reporter. However, quite in keeping with the traditional hauteur of the print media -- especially amplified in India by their cozy and well-oiled nexus with the fonts of power -- not only did the Editor of the Tribune refuse to even acknowledge an email from me, but even a joint statement by three senior Professors and a very well cited Reader at my university, and a further open letter (with the same content and four authors) to the Tribune ``Letters to the Editor'', elicited absolutely no acknowledgment. Thus they made clear that there would be no attempt at rectifying the misleading and tendentious article, which by its omissions had obviously and unjustifiedly impacted perceptions of our academic status in the community. Now, all this will come as no surprise to world-weary and cynical academics or the general public, at least in India.
Indeed the reaction, for as long as one can remember, would always be ``I told you it's no use protesting...''.

However, many things that seemed permanent absolutes -- such as the power and invulnerability of the newspaper media -- have begun to wobble in the Internet age. The Internet offers a recourse against oppression by the print media to those who for all too long would have had none. Therefore, to make a modest contribution to the struggle against the insensitivity of the established local media to any questioning of the dubious authenticity of their publications, I decided to put my basic academic data on record on my webpage, along with a link to the article. The hope is that this will begin to take the discussion beyond merely establishing the correct raw citation data (which right now are imagined to be easily falsifiable by the likes of Neha Miglani!). I hope this will aid more informed discussion, locally and perhaps even in India generally, of issues concerning academic performance and its objective rating using the slew of quantitative citation indices that have been developed. The need to improve the quality of academic and research performance in India is a perpetual occasion for moaning, hand-wringing and crocodile tears generally, by all concerned, but almost no effective steps are ever actually taken, because that would imply noting too many inconvenient facts and implementing incommodious reforms, not to speak of treading on various naked toes! It may also contribute a little to discouraging the blatant perversion of facts often indulged in by local media, for dubious local motives, secure in the knowledge that affected (localized) individuals are simply incapable of dispersing the distorted images and facts propagated by such local journalists and "sources'' with ulterior motives.

Issues concerning the value of the manifold new forms of quantitative ranking of academic performance are ever more to the forefront of academic attention these days. In my specialization (Theoretical High Energy Physics) we have a long-established and very complete bibliographic and citation research service called HEP-SPIRES, which is openly accessible and free of charge. Unfortunately, the other services available, like SCOPUS and Web of Science, are paid for and hence hard to access for the general public. Various measures, including the famous H-index (2005) and the latest Phys author rank based on the Google PageRank algorithm, have caught the attention and imagination of many in the academic community, and the percolation of these measures -- which can be seen as part of the list-making frenzy prevalent on the Internet generally -- is an important instance of the new modes of social interaction that have arisen with the advent of the Internet. We scientists are only human, all too human, and all too ready to agonize over our place in the scheme of things. I hope to write an article explaining these issues to beginners, so this is as good a place as any to start putting down the relevant arguments. Readers are welcome to comment and weigh in for or against any particular method, or for all.
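Of the individual measures mentioned above, the H-index is the simplest to state: an author has index h if h of his papers have each been cited at least h times. A minimal sketch in Python, with an invented citation record:

```python
# Minimal sketch of the H-index: the largest h such that the author
# has h papers with at least h citations each.

def h_index(citation_counts):
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still has >= rank citations
        else:
            break
    return h

# Hypothetical author with six papers and these citation counts:
print(h_index([25, 8, 5, 4, 3, 1]))  # → 4
```

Unlike the JIF, this number is computed from the individual's own citation record, which is precisely why it is the kind of measure relevant to evaluating individuals.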

The hope is also that I may use this blog from time to time to express my musings on diverse subjects: as one grows long in the tooth, one feels the need to ramble on, but a live audience is harder and harder to corral. So you, dear clickers, will be my not-so-captive ears if and when I have something to say: hopefully it will not always be a rant about injustices done to me personally!