More Cherry-Picking from Cycle-Helmets.Com

Last month I posted a commentary regarding incorrect information on cycle-helmets.com. I attributed a problem in their analyses of cycling surveys in South Australia to a “transcribing problem”. However, the issues appear to be much more serious than that.

In the comment section of an article I authored on The Conversation, Dorothy Robinson stated

Australians generally consider cycling a healthy activity, so the discrepancy between the two sets of tables in the South Australian report might reflects a reluctance to admit they cycled less because of helmet laws. The “really big” boo-boos Linda talks about were caused by her looking at the wrong tables. Table numbers are now included in so that others will not make the mistake of attributing the differences between these tables to “transcribing errors”.

The website now refers the reader to Tables 5a and 5b (Destination of bicycle trips in the last 7 days). This isn’t completely correct, as the totals for responders were taken from Tables 1a and 1b (Frequency of bicycle riding).

The totals for cycling in the past week do not match up between Tables 1a/1b and 5a/5b. This is likely due to (near) complete responses for amount of cycling but missing responses for destinations. Such missingness is common when conducting surveys and highlights the problem with combining these tables, especially when there is no need to do so. In other words, if you’re really interested in comparing cycling rates before and after helmet legislation, why would you not use the frequency of cycling tables?

There is also the issue of throwing away usable data. Tables 1a and 1b contain information for four categories of cycling frequency (“At least once a week”, “At least once a month”, “At least once every 3 months”, “Less often or Never”). Most of this information is thrown out when the total responses for destinations in Tables 5a and 5b are combined with the total cyclists in Tables 1a and 1b. Here is a summary of the proportions of cycling frequency in South Australia, combined across age groups and gender, for the years 1990 and 1993.

Cycling in South Australia (% of respondents)

                           1990    1993
At least weekly            21.8    21.0
At least monthly            5.2     6.0
At least every 3 months     3.9     4.4
Less often or never        69.1    68.6

These results suggest the SA helmet law had no impact on the amount of cycling. The suggestion by Robinson that responders are reluctant “to admit they cycled less because of helmet laws” is unsubstantiated. If someone is reticent to admit they don’t exercise, this would apply to both the 1990 and 1993 surveys.

I’d like to be wrong about this, but the analysis on this website reeks of fishing for results that support a pre-determined conclusion.


Transcribing Problems with Analysis

I recently discussed problems replicating the results of an assessment of mandatory helmet legislation in Australia published in Accident Analysis and Prevention (Robinson, 1996). That problem was brought to my attention by Linda Ward, who has since pointed to a related issue.

The anti-helmet website has a page titled “Spinning” helmet law statistics. Under the heading Measuring changes in cycle use, the webpage states

Similarly, in South Australia a telephone survey found no significant decline in the amount people said they cycled but there was a large, significant drop in how much they had actually cycled in the past week.[24] In 1990 (pre-law), 17.5% of males aged at least 15 years reported cycling in the past week (210 out of 1201), compared to 13.2% (165 out of 1236) post-law in 1993. For females, 8.1% (102 out of 1357) had cycled in the past week in 1990 compared to 5.9% (98 out of 1768) in 1993.[24]

These reductions (24% for males, 26% for females aged at least 15 years) are statistically significant (P < 0.005 for males, P = 0.025 for females).
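For what it’s worth, the significance tests quoted above can be reproduced from the webpage’s counts with two-sample tests of proportions (no continuity correction):

```r
# Reproducing the webpage's significance tests from its quoted counts
# (which, as discussed below, do not match the original report).
prop.test(c(210, 165), c(1201, 1236), correct = FALSE)  # males:   p < 0.005
prop.test(c(102, 98),  c(1357, 1768), correct = FALSE)  # females: p ~ 0.025
```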

The citation given is a technical report that evaluated the introduction of helmet legislation in South Australia.[1] Table 1 of the SA report gives frequencies of bicycle riding from two surveys, one in 1990 and the other in 1993, for those aged 15 years or older, separated by gender. In this survey, the amount of cycling was split into four categories: “At Least Once A Week”, “At Least Once A Month”, “At Least Once Every 3 Months” and “Less Often or Never”. The SA helmet law went into effect on 1 July 1991.

The main problem here is the numbers in the above quote don’t match up to the data in the original report. Here is a screenshot of the table.

[Screenshot: Table 1 of the SA report, frequency of bicycle riding by gender for the 1990 and 1993 surveys]
When these numbers are corrected and a comparison is made for those cycling at least once a week versus everyone else, the p-values are 0.279 and 0.450 for males and females respectively. Additionally, the relative risks are 0.90 (95% CI: 0.76,1.08) and 0.91 (95% CI: 0.71, 1.17) for males and females respectively. The point estimates for changes in the proportion cycling in the past week are much less than those reported on the webpage.
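The relative risk and its Wald 95% confidence interval can be computed as below; the `rel_risk` helper is my own illustration, not code from the report. Applied to the webpage’s quoted counts, it returns the roughly 24% reduction claimed for males:

```r
# Relative risk of past-week cycling (1993 vs 1990) with a Wald 95% CI.
# rel_risk is a hypothetical helper function, not taken from the report.
rel_risk <- function(x1, n1, x2, n2) {
  rr <- (x2 / n2) / (x1 / n1)            # risk ratio, period 2 vs period 1
  se <- sqrt(1/x1 - 1/n1 + 1/x2 - 1/n2)  # standard error of log(RR)
  c(RR    = rr,
    lower = exp(log(rr) - 1.96 * se),
    upper = exp(log(rr) + 1.96 * se))
}
rel_risk(210, 1201, 165, 1236)  # webpage's male counts: RR about 0.76
```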

In addition to using the wrong data, I don’t agree with the analysis. The four cycling frequency categories have been collapsed into two: those who cycle at least once a week and everyone else. A lot of information is needlessly removed from the data. Instead, a chi-square test for independence could have been performed, and individual changes assessed through an investigation of the residuals.

The Pearson residuals for an individual cell from a chi-square test are

r = (O − E) / √E
where O is the observed frequency and E is the expected frequency under an assumption of independence, i.e., no relationship between helmet legislation and the amount of cycling. These residuals are approximately standard normal, so residuals with absolute value greater than 1.96 may be considered “statistically significant”. The sign indicates whether a cell was observed more often than expected (positive) or less often than expected (negative).

When analyses are performed on the full tables, the chi-square tests give p-values of 0.20 and 0.85 for males and females respectively. None of the residuals have absolute value anywhere near 1.96. The largest residual pair is for males cycling “at least once every 3 months”. The signs of the residuals indicate there is less cycling than expected in 1990 (r=-1.04) and more cycling than expected in 1993 (r=1.02) if there is no relationship between helmet legislation and amount of cycling. Here is some R code to do those analyses.
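A sketch of the stratified analysis for males follows. The four cell counts per year are hypothetical placeholders (only the totals of 1201 and 1236 respondents are taken from the report), so substitute the actual Table 1 frequencies:

```r
# Chi-square test of independence with Pearson residuals, males only.
# NOTE: the individual cell counts are hypothetical placeholders; only
# the row totals (1201 in 1990, 1236 in 1993) come from the report.
males <- matrix(c(205, 64, 48, 884,   # 1990: weekly, monthly, quarterly, less/never
                  175, 74, 54, 933),  # 1993
                nrow = 2, byrow = TRUE,
                dimnames = list(Year = c("1990", "1993"),
                                Frequency = c("Weekly", "Monthly",
                                              "Every 3 months", "Less often/never")))
test <- chisq.test(males)
test$p.value    # overall test of independence
test$residuals  # Pearson residuals; |r| > 1.96 suggests a notable cell
```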





The analyses above are stratified by gender, but we could perform a unified analysis using Poisson regression. This model is essentially

log(E[COUNT]) = β0 + β1 YEAR + β2 SEX + β3 FREQ + β4 (YEAR × SEX) + β5 (YEAR × FREQ) + β6 (SEX × FREQ) + β7 (YEAR × SEX × FREQ)
I’ve simplified things a bit here because the variable FREQ has four categories and therefore gets estimated by three dummy variables.

The important comparison here is the interaction between YEAR and FREQ. If significant, this would indicate that helmet legislation and the amount of cycling are associated. Using the given South Australian data, the three-way interaction was non-significant and so was removed from the model. The remaining interaction between YEAR and FREQ is also not statistically significant (p = 0.41).
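That model can be sketched in R as follows; the COUNT values are hypothetical placeholders standing in for the Table 1 frequencies:

```r
# Log-linear Poisson model for counts by year, sex and cycling frequency.
# NOTE: COUNT values are hypothetical placeholders, not the report's data.
d <- expand.grid(YEAR = factor(c(1990, 1993)),
                 SEX  = factor(c("Male", "Female")),
                 FREQ = factor(c("Weekly", "Monthly", "Quarterly", "Less/never")))
d$COUNT <- c(205, 175, 110, 100, 64, 74, 60, 66,
             48, 54, 50, 55, 884, 933, 1137, 1547)

fit  <- glm(COUNT ~ YEAR * SEX * FREQ, family = poisson, data = d)
fit2 <- update(fit, . ~ . - YEAR:SEX:FREQ)  # drop the 3-way interaction
anova(fit2, test = "Chisq")                 # examine the YEAR:FREQ term
```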

No analysis I’ve performed indicates a significant relationship between helmet legislation and amount of cycling in South Australia among those 15 years or older when using the correct data.

Note: The anti-helmet website is maintained by Chris Gillham. I previously discussed problems with this website here. If you download the PDF version of this report, the author is listed as “Dorre”, whom I believe to be Dorothy Robinson. Both Gillham and Robinson are editorial board members of the anti-helmet organisation Bicycle Helmet Research Foundation.

  1. Marshall, J. and M. White, Evaluation of the compulsory helmet wearing legislation for bicyclists in South Australia Report 8/94, 1994, South Australian Department of Transport, Walkerville, South Australia.

Something Amiss in Robinson (1996)

A 1996 article titled “Head Injuries and Bicycle Helmet Laws” published in Accident Analysis and Prevention is one of the most highly cited papers assessing the effect of helmet legislation.[1] (148 citations, Google Scholar, 4 Sept 2014) Additionally, this seems to be the first article purportedly demonstrating a negative impact of such laws. The conclusions of this paper state

Consequently, a helmet law, whose most notable effect was to reduce cycling, may have generated a net loss of health benefits to the nation.

In this paper, secondary analyses were performed on data contained in other reports. I’ve pointed out in a previous paper[2] that NSW adult cycling counts exist from sources cited in this paper although they are not presented. This is curious because the counts of adult cyclists from NSW helmet use surveys increased from pre- to post-helmet legislation which contradicts the conclusions of this paper. Adult cycling also increased by 44% in Victoria following helmet legislation.[3]

Linda Ward has pointed to another issue with this paper regarding a comparison of the proportion of head injury hospitalizations among cyclists before and after legislation in Victoria. Some of the relevant data are given in Table 6.[1] In this table, the proportions of head injuries are 31.4% for 1989/90 and 27.3% for 1990/91 among hospital admissions in Victoria. During this period, there were a total of n = 2300 cycling hospitalizations. The author notes a comparison of these proportions is non-significant by a chi-square test.

The 2×2 table for this data can be reproduced using the source material.[4] Figure 25 of this report gives “All Other Injuries” of about 900 for year 1989/90. This allows us to fill in the rest of the table given below.

Year       Other Injury   Head Injury
1989/90    900            412
1990/91    718            270

The frequencies of the other cells seem to correspond to the other values in Figure 25. The chi-square test for this table results in χ² = 4.49, p = 0.03 and OR = 0.82. This result could be influenced by the need to estimate the number of cases from a plot. We can assess the influence of this estimate by repeating the analysis for other values near 900. Choosing values from 890 to 910 results in the plot of p-values below.

[Plot: p-values from the chi-square test as the 1989/90 “All Other Injuries” count varies from 890 to 910; all fall below 0.05]
As you can see, there is a statistically significant decline in head injury in each instance for cycling injury in Victoria before and after helmet legislation. R code to reproduce these results is given below.



tab <- matrix(c(900, 412,
                718, 270), nrow = 2, byrow = TRUE)
rownames(tab) <- c('1989/90', '1990/91')
colnames(tab) <- c('Other', 'Head Injury')
chisq.test(tab, correct = FALSE)  # X-squared approx 4.49, p approx 0.03

# sensitivity of the result to the count estimated from Figure 25
pvals <- sapply(890:910, function(x)
  chisq.test(matrix(c(x, 412, 718, 270), nrow = 2, byrow = TRUE),
             correct = FALSE)$p.value)
plot(890:910, pvals, xlab = 'Other injuries in 1989/90', ylab = 'p-value')
This re-analysis has important ramifications. First, the author’s conclusions are not fully justified. Cycling head injuries fell at a rate greater than other cycling injuries following legislation. It is possible there was less cycling following legislation, but head injuries fell at a significantly greater rate. We also found this to be true in NSW in a 2011 paper. Second, organizations that have used this paper to justify their opposition to helmet legislation should reconsider their stance. This includes the Transport and Health Study Group (THSG) which is affiliated with the Journal of Transport and Health (JTH). Finally, the editors of Accident Analysis and Prevention and the journal’s publisher Elsevier should seriously investigate the reproducibility of the analyses in this paper with a keen eye for the information found in the source material that was not included in the paper.

Note: Dorothy Robinson is a patron and editorial board member of the anti-helmet organization Bicycle Helmet Research Foundation.

  1. Robinson, DL (1996) Head injuries and bicycle helmet laws. Accident Analysis and Prevention, 28, 463-475.
  2. Olivier J, Grzebieta R, Wang JJJ & Walter, S. (2013) Statistical Errors in Anti-Helmet Arguments. Proceedings of the Australasian College of Road Safety Conference.
  3. Cameron, M., Vulcan, AP, Finch, CF & Newstead, SV. (1994) Mandatory bicycle helmet use following a decade of helmet promotion in Victoria, Australia — An evaluation. Accident Analysis and Prevention, 26, 325-337.
  4. Cameron, M, Heiman, L & Neiger, D. (1992) Evaluation of the bicycle helmet wearing law in Victoria during its first 12 months. Report No. 32, Monash University Accident Research Centre.