Bryan Lawrence : Bryan's Blog 2004/12/07


Citation

I recently had to make a statement on the impact of publishing in a particular journal. I did it, but didn't like doing it. Most scientists probably have a subliminal hierarchy of journals, something like (for me):

  1. Wow (Nature)

  2. Impressive (Tellus, J. Atmos. Sci., Q. J. Roy. Met. Soc., J. Climate, Climate Dynamics)

  3. Very Good (J. Geophys. Res., Geophys. Res. Lett., Annales Geophysicae)

  4. Good (J. Atmos. Terr. Phys.)

  5. Worth Doing (Weather, Bull. Am. Met. Soc.)

  6. Not worth it (censored :-) ).

The order has no real meaning to anyone but me, and reflects my interests at the moment (when I concentrate on mesospheric work, for example, I rate JASTP higher and the climate journals lower). But these sorts of lists seem to resonate rather strongly with bean counters; one such list is the in-cites journal impact factors (my local copy). These impact factors are critically important to the evaluation of university research (e.g. the UK RAE), but I wonder how sensible they really are. Of course it's easy to produce such a list, but given that my own priorities change, surely the community's do too - are we really a normal distribution? What about people working in areas where there are simply fewer people working? Should academic research rankings depend on the herd? I'm sure the RAE normalises these things somehow, but it's still rather discomforting.

And what about Open Access publishing? Those in the scholarly publications game know that the world is moving towards open publishing. There is evidence that open access publishing increases impact, but most of us haven't yet matched our subliminal ranking of journals to our actions. It is certainly true that I am now far more likely to read a journal article if I can get to it online, and that is starting to sway my decisions about where to publish (actually, I don't get time to think about publishing science at the moment, so this is a bit theoretical).

Of course, wearing my data hat, I expect the really big changes in ranking scientific output will come when we have effective methods of citing datasets! One very interesting project doing work in that area is introduced here. Of course, there is formal citation, and there is Google ... and now http://scholar.google.com. Hopefully we can move towards the same "easy" form of data citation too.

by Bryan Lawrence : 2004/12/07 : Categories climate curation (permalink)


DISCLAIMER: This is a personal blog. Nothing written here reflects an official opinion of my employer or any funding agency.