Journal of Transport and Health and Publication Bias?

Scientific journals exist as an outlet for the promotion and discussion of scientific ideas and the presentation of evidence that can support or refute such ideas. This is a crucial aspect of scientific discourse as it allows for the dissemination of scientific research to a wide audience (even if that research is not understood by the general populace). Otherwise, for example, Gosset’s (aka Student’s) t-test might only have been known to those taking a tour of the Guinness brewery at St James’s Gate (although I can attest that the best milieu for discussing statistics is over a few beers).

The peer-review process, in which one’s scientific peers judge the work, is used to decide what gets published and what does not. Importantly, those contributing to the peer-review process should check their personal biases at the door and judge the quality of the evidence presented. This is, generally speaking, the mantra of the highly respected Public Library of Science (PLoS) journals.

This brings me to the Journal of Transport and Health (JTH), a new Elsevier journal which published its first issue this month. The journal is officially affiliated with the Transport and Health Study Group (THSG), which has as one of its policy objectives “To promote a more balanced approach to cycle safety and oppose cycle helmet legislation.” On the THSG website is an article that questions even the promotion of helmet use by way of the common straw man argument that motor vehicle drivers and passengers should also be compelled to wear helmets. This argument ignores the many safety features of modern motor vehicles that aren’t even possible on a bicycle. Think about it this way: if I turn the argument around a bit and ask “Why mandate airbags (or any other safety feature) for cars when we don’t make them mandatory for cyclists?”, is that a valid argument?

As you would expect, the editorial board of the JTH is littered with members of the THSG (it’s their journal, so why not?). Below is a list of JTH editorial board members and their roles in the THSG (thanks go out to Tim Churches for supplying the information in this table). This is as far as can be discerned from their website, and I suppose it’s entirely possible that others are simply members of the THSG.

Name | Institution | JTH role | THSG role
J. Mindell | University College London (UCL), London, England, UK | Editor-in-Chief | Co-Chair
S. Alvanides | Northumbria University, Newcastle, England, UK | Associate Editor | Executive Committee
A. Davis | University of the West of England, Bristol, UK | Associate Editor | Executive Committee
S. Gray | University of the West of England, Stapleton, Bristol, England, UK | Associate Editor | Executive Committee
S. Handy | University of California at Davis, Davis, CA, USA | Associate Editor |
R.A. Kearns | University of Auckland, Auckland, New Zealand | Associate Editor |
T. Sugiyama | Baker IDI Heart and Diabetes Institute, Melbourne, VIC, Australia | Associate Editor |
L.B. Andersen | University of Southern Denmark, Odense, Denmark | Editorial Board |
J. Dill | Portland State University, Portland, OR, USA | Editorial Board |
R. Ewing | The University of Utah, Salt Lake City, UT, USA | Editorial Board |
L. Frank | University of British Columbia, Vancouver, BC, Canada | Editorial Board |
J. Hine | University of Ulster, Newtownabbey, UK | Editorial Board |
S. Inoue | Tokyo Medical University, Tokyo, Japan | Editorial Board |
L.R. Int Panis | Flemish Institute for Technological Research (VITO), Mol, Belgium | Editorial Board |
R. Mackett | University College London (UCL), London, UK | Editorial Board | Executive Committee
C. Perez | Agencia de Salut Publica de Barcelona, Barcelona, Spain | Editorial Board | European Committee
J. Pucher | University of North Carolina at Chapel Hill, Chapel Hill, NC, USA | Editorial Board |
C. Rissel | The University of Sydney, NSW, Australia | Editorial Board |
L. Rizzi | Pontificia Universidad Católica de Chile, Santiago, Chile | Editorial Board |
D. Rodriguez | University of North Carolina at Chapel Hill, Chapel Hill, NC, USA | Editorial Board |
H. Rutter | London School of Hygiene & Tropical Medicine, London, England, UK | Editorial Board |
J.F. Sallis | University of California at San Diego (UCSD), San Diego, CA, USA | Editorial Board |
Y. Shiftan | Technion – Israel Institute of Technology, Haifa, Israel | Editorial Board |
N. Silverstein | University of Massachusetts at Boston, Boston, MA, USA | Editorial Board |
M. Wardlaw | Edinburgh, Scotland | Editorial Board | Executive Committee
S. Watkins | Stockport Metropolitan Borough Council, Stockport, England, UK | Editorial Board | Co-Chair

What is surprising here is the lack of balance on the editorial board: there are no members who could provide a counterpoint to the anti-helmet views of the THSG. Even among board members with no listed THSG role, Chris Rissel is an outspoken advocate against bicycle helmet legislation (see here, for example). Another JTH editorial board member, Malcolm Wardlaw, sits on the editorial board of the anti-helmet website Bicycle Helmet Research Foundation (by the way, I’m not convinced just anyone with a gripe can legitimately establish an editorial board). I searched for Wardlaw on the academic search engine Scopus (also owned by Elsevier) and found 12 documents: 10 appeared to be commentaries against helmets, one was a magazine article, and one appeared to be an original research article.

What really concerns me is this: will a paper that says anything positive about helmets or helmet legislation (and is backed by evidence) be peer-reviewed without the biases of the THSG influencing the decision to publish? Will authors shy away from citing peer-reviewed articles that present evidence supportive of helmets or helmet laws? Will reviewers insist on citing anti-helmet arguments that have no evidence base, such as the DAI (diffuse axonal injury) hypothesis? This is important, as selective citation appears to be common practice when publishing on this topic.[1]

The very first issue contains a commentary from the Editor-in-Chief, Jennifer Mindell (also co-chair of the THSG), and another by Stephen Watkins (the other co-chair of the THSG). On page 3 of that first issue, Watkins states[2]

“We still find ourselves having to fight defensive campaigns, such as resisting proposals for compulsory cycle helmets which will do little for cycle safety but will have a serious adverse effect on cycling levels, not least because they feed misconceptions about cycle safety.”

This comment is not surprising, as Watkins has also been openly critical of the Spanish government for concluding (original in Spanish, translated with the help of Google) that “the scientific evidence of the decrease of morbidity and mortality due to the helmet use by cyclists is absolutely conclusive.” This makes me wonder whether an article submitted to JTH would be rejected outright for such a statement, even if it were supported by evidence.

As far as I can tell, the only justification given for the THSG’s position on bicycle helmets comes from this presentation, in which Mindell, Wardlaw and Franklin discuss a paper I co-authored.[3] Regarding trends in cycling injuries around the NSW helmet law, the authors present this graph:

[Figure: head injury trend plot reproduced from the Mindell, Wardlaw and Franklin presentation]

and state

“It has repeatedly been claimed that such improvements have occurred. But to date, not a single such claim has stood the test of close inspection. In New Zealand, promotion and legislation increased helmet wearing to nearly universal use, yet there is no noticeable improvement in the %HI trend for cyclists (red) relative to the control (black). In NSW, ratios of head to arm and head to leg injuries were compared before and after the helmet law. The helmet law rapidly increased the helmet wearing rate from about 20% to 80%. But again, it is difficult to discern any particular reduction in head injuries to cyclists (red) compared with pedestrians (blue), although the data are rather “noisy”.”

A similar argument is also presented by the BHRF, an anti-helmet advocacy group. A major problem with this plot is that it doesn’t actually correspond to the plots or the data we used in our paper, as discussed elsewhere.[4] Instead, both time series have been rescaled and shifted so that they overlap, and are therefore not comparable (this is evidenced by the lack of units on the y-axis). The plot is also problematic on its own, as the trend in either head or arm injuries is unclear. Considering this plot and the comments made by the authors, I wonder if they understand what “noisy” means in this context.

Time series data, when finely aggregated, often follow an up-and-down pattern. This is not necessarily random variability (or “noise”); it can reflect cyclical or seasonal patterns in the data. For example, there may be more cycling injuries during the summer months and fewer during the winter (due to more or less cycling). Such patterns can be modeled or removed prior to analysis (as we did in our original paper). The amount of scatter around the fitted model then gives us a measure of “noise”. In our analysis, we found very little “noise” relative to the effect of the helmet law.
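To make that concrete, here is a minimal sketch in Python. It uses simulated monthly counts rather than the NSW data, and it is not the exact model from our paper: month-of-year terms in a Poisson regression absorb the seasonal cycle, a step term captures the law, and the scatter left around the fit is the “noise”.

```python
# A hypothetical sketch (simulated counts, not the NSW data): seasonal month-of-year
# terms plus a step change at an assumed law date, fitted with Poisson regression.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
months = pd.date_range("1989-07", periods=36, freq="MS")         # 36 monthly time points
season = 1 + 0.3 * np.sin(2 * np.pi * (months.month - 1) / 12)   # summer peak, winter trough
law = (months >= "1991-01").astype(int)                          # step indicator (assumed law date)
mu = 100 * season * np.where(law == 1, 0.7, 1.0)                 # assume a 30% drop after the law
counts = rng.poisson(mu)                                         # observed monthly injury counts

df = pd.DataFrame({"count": counts, "month": months.month, "law": law})

# Month-of-year indicators absorb the seasonal cycle; 'law' estimates the step change.
fit = smf.glm("count ~ C(month) + law", data=df, family=sm.families.Poisson()).fit()
print(np.exp(fit.params["law"]))    # rate ratio after vs before the step
# fit.resid_deviance is the scatter left after the model: the actual "noise".
```

The seasonal terms soak up the regular ups and downs, so what remains around the fit is the genuinely random variability.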

The appearance of this up-and-down pattern can be lessened by aggregating the data at a coarser level (e.g., quarterly or yearly). This is usually not a good idea for the analysis itself due to the reduction in efficiency (in this case, the number of time points would decrease from 36 to 12 for quarterly aggregated data). Below is a plot of the head and arm injury data when aggregated by month, quarter and semester (the data can be found here; the coarser aggregations were computed by averaging monthly injury rates over 3- and 6-month periods, along the lines of the sketch below).
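That averaging is easy to reproduce; here is a short sketch assuming a monthly rate series like the simulated one above (not the actual NSW file):

```python
# Sketch: average a monthly rate series over consecutive 3- and 6-month blocks.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
monthly = pd.Series(rng.normal(30, 5, size=36))                     # hypothetical monthly injury rates

quarterly = monthly.groupby(np.arange(len(monthly)) // 3).mean()    # 36 -> 12 points
semesterly = monthly.groupby(np.arange(len(monthly)) // 6).mean()   # 36 -> 6 points
print(len(quarterly), len(semesterly))
```

Averaging consecutive blocks like this is purely for display; the formal analysis should still use the monthly series.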

[Figure: NSW head and arm injury rates aggregated by month, quarter and semester]

It is clear that as the aggregation gets coarser, the apparent noisiness disappears. It is also clear that there was a profound change in head injuries from the third to the fourth semester, which correspond to either side of the helmet law. As I stated before, it would be unwise to analyze the semesterly data (there are only six time points and the full model estimates eight parameters), but it may be useful to present the data visually in this manner to non-statisticians, to avoid confusing systematic with random variability. My guess is our paper would have been rejected by the JTH editors due to their apparent lack of statistical understanding, whether biased against helmets or not.

With those issues in mind, I wonder whether an anti-helmet advocacy group should be in charge of a scientific journal. I have written several original research articles and commentaries on the evidence around bicycle helmet legislation. Since much of my work has found positive effects of helmets and helmet laws, I’ve been labeled “pro-helmet” by some. I find this categorization disturbing, as I never set out to demonstrate that helmets or helmet laws are beneficial. It always comes down to drawing scientific conclusions from rigorous analysis of the best available data. If I find strong evidence to the contrary, I will adjust my views accordingly, as would any scientist. However, I’m not convinced any of my research would be given a fair shake at the JTH.

This is my opinion and I could certainly be mistaken. What is your take on this journal?

  1. Olivier, J. (in press). The apparent ineffectiveness of bicycle helmets: A case of selective citation. Gaceta Sanitaria.
  2. Watkins, S.J. (2014). The Transport and Health Study Group. Journal of Transport & Health, 1, 3-4.
  3. Walter, S.R., Olivier, J., Churches, T. & Grzebieta, R. (2011). The impact of compulsory cycle helmet legislation on cyclist head injuries in New South Wales, Australia. Accident Analysis and Prevention, 43, 2064-2071.
  4. Olivier, J., Grzebieta, R., Wang, J.J.J. & Walter, S. (2013). Statistical errors in anti-helmet arguments. 2013 Australasian College of Road Safety Conference – “A Safe System: The Road Safety Discussion”, Adelaide.