Archive for the ‘Physics’ Category

Physics and political correctness

November 24, 2011

I have been meaning to write this since I saw the excerpt from Lubos Motl’s post in Peter Woit’s blog, but for reasons pointed out in an earlier post, I was unable to connect to and read Motl’s post in full. I did manage to read the post today, but (apart from a confusion between 10^(10^100) and the headquarters of Google, Inc.) what interested me was already in that quote:

I think that all the people are being bullied into not criticizing the junk written by other people who are employees of the academic system, especially if the latter are politically correct activists. And be sure, some of the authors of this nonsense are at the top of it.

I would guess, from the experience of occasionally reading his posts, that by `politically correct activists’, Motl means people whose political beliefs are left of center and who support liberal positions, including feminist and climate-change positions. Thus he seems to suggest that if physicists who are politically on the left write junk, most of their colleagues are `bullied into’ silence.

I would be very surprised if people like Witten, Strominger, Polchinski or Sen (to name a few) could be `bullied into’ silence. And on the flip side, I would consider myself as much on the political left in my beliefs as any leading physicist, and there has been no shortage of criticism of my work even though it was not `junk’. A likelier explanation is that political beliefs are not very important in the criticism, or the lack of it.


Bhagavadgita in IIT

November 14, 2011

It seems that in a thermodynamics class at IIT Kanpur, the teacher referred to the Bhagavadgita — a reversible process is apparently what the book had in mind when it said `do your work without any hope of benefit’. Shouldn’t that be an irreversible process?

New way of doing physics

May 21, 2011

There is a new way of doing physics. It involves doing no calculations and making no predictions. At least not in the sense in which generations of physicists have been taught to understand calculations and predictions. It involves intermixing various phrases from physics and the philosophy of science, and reaching the inevitable conclusion that while numerous predictions can be made in principle, not one specific prediction can be made at this time.

To be fair, this is not an entirely new phenomenon; the `foundations of quantum mechanics’ people have been writing such papers every now and then. On the other hand, string theory and supersymmetry have been making predictions from the very beginning, but not one of the specific predictions has been observed, so the scale at which these predictions become observable has been pushed to higher and higher energies. Now, with the LHC producing voluminous data showing that there is nothing at low energies except the Standard Model (and maybe not even that, if the Higgs fails to show up), more and more people from the `currently unobservable predictions’ camp are moving into the `unspecific predictions’ camp.

The latest paper of this sorry genre is discussed here and here.

Indices again

April 23, 2011

Via Nanopolitan: Current Science carried a letter by Diptiman Sen and S. Ramasesha, two physicists at the IISc, pointing out that the h-index is not a good `scientometric’ index. Unfortunately, the arguments they used to establish that are not all equally good. In particular, they suggest that the Nobel Laureate V. Ramakrishnan has a low h-index. This was picked up on by two other scientists, who point out that the actual figures are not particularly low. And between the poor arguments and the rejoinders, many other important arguments against using the h-index got lost.

Some of these other arguments were given here and here, and many more can be found all over cyberspace and the blogosphere. My arguments against scientometric indices in general, and the h-index (and journal impact factor) in particular, are similar to those given in these links, but also try to take into account the conditions special to doing science in India. These are, in no particular order, the following.

1. Most (or all?) such indices are based only on the number of citations (total or per year), so no matter how a metric is designed, it is only the number of actual citations that enters it; any of these metrics is ultimately calculated from only one parameter. Or perhaps two, since the rate of growth of citations may be included. Some metrics include the number of papers (total or per year) as well. That is another parameter, but the number of citations is not independent of the number of papers, and the number of papers is usually not a good measure of their quality, so including that number does not improve the quality of the index either.

2. Even if different ways of using the citation count (and paper count) lead to qualitatively different indices, the standard indices are still numbers associated with only one individual (or one institution), the one being evaluated. This cannot make sense, since high or low values may be systemic. For example, mathematics has fewer papers than medicine, or even than specialized branches of medicine like oncology, and consequently fewer citations as well. So any index that is to be applied to both mathematics and medicine has to take into account the behavior specific to each field, and thus requires some sort of comparison within the field. As far as I can gather, this is never done, either in the construction or in the usage of these indices.

3. Even if we can make a comparative index, for example by taking ratios or percentiles within a field, it is not likely to hold up against historical data. That is, given some index — h, g, or whatever — for some string theorist, we can come up with a `normalized’ one by taking its ratio with the same index for Witten, but the same normalization is not likely to make any sense for Born or Einstein, say. Of course they were not string theorists, and neither is `normal’ a word one should use for any index related to Witten. Still, the explosion of citations is a relatively recent phenomenon, and related to the explosion of papers, so the variation of any index with time — for individuals as well as within fields — needs to be taken into account.

4. Many scientists work across disciplines, many more work across subfields. It makes no sense for such people’s work to be evaluated by a single index, as the index may have different ranges in the different fields or subfields. For example, someone working mostly in mathematics and occasionally in string theory may end up with an index which is low compared to string theorists and high compared to mathematicians. How should something like that be used?

5. Indices are used for different purposes at different career stages. So it does not make sense to use the same index for people at different stages of their career.

6. There may be `social’ factors in the rise of citation count of specific papers or individuals — some are obvious and `nearly academic’ ones, like the popularity or currency of a field or a problem — the bandwagon effect. Then there is the `club’ effect — I cite you, you cite me, friends cite friends — which can work wonders in small subfields. There may also be less academic and more career-oriented reasons — it is very likely that papers cite probable referees more often, so that a paper does not come back for revisions simply because `relevant literature was not cited.’ I would not be surprised if this mechanism gets reinforced for people with many collaborators — a paper might be rejected if it did not cite the papers of the referee’s collaborators.
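Point 1 above can be made concrete: the h-index is computed from the citation list alone. Here is a minimal sketch in Python, with made-up citation counts for two hypothetical authors:

```python
def h_index(citations):
    """h-index: the largest h such that the author has
    h papers with at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts, for illustration only.
author_a = [100, 90, 7, 6, 5]   # a couple of very influential papers
author_b = [6, 6, 6, 6, 6, 6]   # several moderately cited papers

print(h_index(author_a))  # 5
print(h_index(author_b))  # 6
```

Author B comes out ahead of author A despite A’s two papers with around a hundred citations each; the index flattens very different records into nearby numbers, using nothing but the citation counts.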

There are also several issues special to Indian science, which have to do with how appointments and promotions are usually effected in India. As noted by G. Desiraju in the letter to Current Science,

it was possible, in the days before we had scientometric indicators, for committees of wise men to simply declare an incompetent as an outstanding scientist.

Unfortunately, it is still possible. But that is another discussion.

Not Physics

September 28, 2010

Some years ago, in a discussion about physics (in India, but it could be generalised easily) I described the work of some physicists as “they check the consistency of conjectured properties of non-existent objects.” I was talking about some very well-known physicists working in string theory and related fields.

This paper is somehow worse, as the authors turn a non-existent non-object (a cutoff) into something physical, and then use what appears to be very poor logic to extract results from it.

For more discussion, see here or here.


Chinese competition

August 13, 2010

Apparently the Dept. of Science and Technology has sent out a question to all the research institutes under it: how can India be more competitive with China and Japan in science and technology? Apparently this was a question asked in Parliament.

To many, a big part of the answer is obvious: dedicate more resources to school and college education, open more universities, and do not concentrate finances in research institutes. But the research institutes will not say this.

A smaller part of the solution is to support Indian science journals.

Spending without a research grant

June 6, 2010

We all get to hear stories about how research grants are spent or wasted. Recently I came to know of another channel of flow. Every research institute buys instruments from research grants, but also outside research grants. The latter purchases come from the institute’s `research budget’, which is usually finagled out of the total budget by the Director. This usually means that there is no money for other things, whether it is a students’ hostel or library subscriptions.

In some places, or perhaps some cases, the allocation of money to buy instruments (or computers) from the institute’s budget follows a procedure similar to a grant application in which a detailed budget has to be made by the principal investigator (PI) and the research proposal has to be defended in front of a committee. Usually this committee is not as strict as the grant committees of the national agencies, but still, there has to be a defence.

A few days ago, I saw some figures for one of the smaller research institutes in the city. In the last five years, this institute has spent about Rs. 100,000,000, mostly in foreign exchange, to buy instruments on its own. Yes, that is 10 crores: on average 2 crores a year. This is without taking into account the instruments bought from sponsored grants, but that figure is not comparable. In other words, granting agencies did not agree to give similar grants to this institute for buying instruments. And I was told that in many cases, these instruments were paid for by the centre because the granting agencies refused to fund them.

The annual maintenance of all these instruments costs more than a crore. The support expenses, for power, air conditioning, and consumables, are probably of the same order. All of this is less than what it would cost to promote all the scientists there to the highest pay grade. But that procedure requires an application, a CV, letters of recommendation, and an interview committee. Why should similar amounts of money be spent without any checks?

Someone suggested that it is because cut money is involved in the purchases. I do not wish to believe him, but I do not have a satisfactory answer either.