Tag Archives: publishing

What does arXiving mean?

What does it mean to post a paper to arXiv?  More specifically, a paper that has not been accepted to a peer-reviewed venue; more generally, to any easily searchable, time-stamped, respected repository.

Scenario A: You have a result, but there is no decent deadline for another few months.  Maybe you know that a ‘competing’ team is working on the same result.  Should you post to arXiv?  Would that actually protect you from being scooped if someone else published the result in the meantime (perhaps at a venue that you deemed unsuitable)?

Scenario B: You are building on a result that has appeared on arXiv, but has not (yet?) been accepted at a peer-reviewed venue.  You have verified the work yourself.  Can you cite a work that has not been traditionally published?

Scenario C: You are reviewing a paper C and, being a diligent reviewer, you brush up on the latest in the area.  You find a very relevant paper posted on arXiv, paper X, dated before paper C would have been submitted.  Paper C makes no reference to paper X.  What do you do if paper C seems awfully similar (similar techniques, similar results) to paper X?  Does your opinion change if paper C is a subset or superset of paper X?
I suppose as a reviewer, you would review the paper and point out paper X to the editor/PC member.  But as an editor/PC member, what do you do?  After all, it is possible for independent researchers to come up with the same result using similar techniques at the same time (I have seen this happen).

What does arXiving mean?  Does it do more than provide an easy repository for papers?  Do we (in TCS) treat arXiv differently than other areas?

Journals ranked by turnover times: now with colour!

Based on David’s link to the AMS data on journal backlogs in my last post (thanks, Dave!) and the ISI Web of Knowledge citation report, I’ve wasted some time making the following fancy graph. Some obvious journals are missing because I didn’t have the data for them: Theory of Computing (no impact factor), JACM (no backlog times), etc. If you have this data, I would be happy to add them.

Right now, the plot shows time from submission to acceptance against impact factor (IF), with journals coloured by publisher and sized by their volume in number of articles. All data is for 2008. It’s interactive! Switch to the 5-year impact factor! Fun!

Now, I know that impact factors have little meaning in our field. I’d be happy to switch to some other, more meaningful ranking. Feel free to comment with your suggestions.

But what do you think: would this chart actually stop you from submitting to the SIAM Journal on Discrete Math?

[iframe http://oj0ijfii34kccq3ioto7mdspc7r2s7o9.spreadsheets.gmodules.com/gadgets/ifr?up__table_query_url=http%3A%2F%2Fspreadsheets.google.com%2Ftq%3Frange%3DB2%253AI13%26headers%3D1%26key%3DttBltOeX1ZK-992JA1GeOiA%26gid%3D4%26pub%3D1&up_title=Journal+wait+times&up_initialstate=%7B%22duration%22%3A%7B%22timeUnit%22%3A%22Y%22%2C%22multiplier%22%3A1%7D%2C%22nonSelectedAlpha%22%3A0.4%2C%22yZoomedDataMin%22%3A6%2C%22yZoomedDataMax%22%3A17.7%2C%22iconKeySettings%22%3A%5B%5D%2C%22yZoomedIn%22%3Afalse%2C%22xZoomedDataMin%22%3A0.421%2C%22xLambda%22%3A1%2C%22time%22%3A%222008%22%2C%22orderedByX%22%3Afalse%2C%22xZoomedIn%22%3Afalse%2C%22uniColorForNonSelected%22%3Afalse%2C%22sizeOption%22%3A%227%22%2C%22iconType%22%3A%22BUBBLE%22%2C%22playDuration%22%3A15000%2C%22dimensions%22%3A%7B%22iconDimensions%22%3A%5B%22dim0%22%5D%7D%2C%22xZoomedDataMax%22%3A2.336%2C%22yLambda%22%3A1%2C%22yAxisOption%22%3A%223%22%2C%22colorOption%22%3A%222%22%2C%22showTrails%22%3Atrue%2C%22xAxisOption%22%3A%225%22%2C%22orderedByY%22%3Afalse%7D&up__table_query_refresh_interval=300&url=http%3A%2F%2Fwww.google.com%2Fig%2Fmodules%2Fmotionchart.xml&mid=4&nocache=1&synd=spreadsheets 550 450]

(The above works for me on Safari; I’m not sure how the gadget will work under other browsers. If you can’t see the embedded gadget, try this published spreadsheet.)

Update: I forgot to “give props” to Hans Rosling and GapMinder.org for popularizing these graphs.  The graph was created in Google Spreadsheets using the “motion graph” gadget.

Update: JACM added thanks to Dave pointing me to JACM’s self-reported backlog. It is also nicely consistent with the impact factor/wait time correlation.  I’d like to comment more on this in a later post: I don’t think this happens in other fields.

Journals ranked by turnover times?

I searched the blogs and the web at large for any evidence (anecdotal or otherwise) about the turnaround times of TCS (and friendly) journals.  Short answer: I couldn’t find much.  I would (and I am sure many other people would) appreciate any help in deciding which journal to submit to if you are particularly in favour of short turnaround times.  Of course, I am sure most would also not want to sacrifice quality – at least not too much.  Thanks in advance!