DOT-Certified Helmets & Government Performance Tests

Since 2014, 124 helmets claiming DOT compliance have been tested; 52 (41.9 percent) of them failed performance tests. Want to know more? Read on.

We recently wrote about several of the world’s motorcycle helmet performance standards.

The next step is to see how the system—one that affects those of us who wear helmets in the United States—is working.

Federal helmet test data provides insight into how the mandatory helmet standard known as DOT FMVSS 218 and its testing and enforcement mechanisms work.

For consumers interested in getting the best personal protective gear no matter what they ride, be it an on-road or off-road motorcycle, scooter, ATV/UTV, snowmobile, mini-bike or mini-cycle, helmet selection can be a little confusing.

The descriptions of the technical aspects of the various helmet performance standards in many ways seem the same—yet they are each a little different.

Not every helmet standard has criteria for every aspect of performance, which makes it difficult to identify the best or most protective helmet for each type of application. Not every helmet standard includes performance testing for face shields, either, which some consumers may wish to know about.

For stateside helmet purchasers, if the helmet is to be used on public roads, it has to be self-certified by the manufacturer as meeting the U.S. Department of Transportation Federal Motor Vehicle Safety Standard 218 (DOT FMVSS 218).

The standard is enforced—if indirectly—by the National Highway Traffic Safety Administration (NHTSA).

Enforcement is carried out by post-marketing inspection. NHTSA sends a list of helmets to be checked to a third-party testing lab (currently ACT Labs in California; previously Southwest Research Institute in Texas), which then acquires off-the-shelf examples and tests them against the DOT standards to verify compliance.

The rationale for this approach is that it is intended to prevent the manufacturer from selecting or producing a special sample of helmets designed specifically to pass the tests.

This assumes that it would be economically feasible for the manufacturer to whip up a batch of custom-made helmets that would be superior to the helmets they mass produce in order to meet the DOT standards. That assumption seems both unrealistic and self-defeating for the manufacturer.

If the lab’s testing shows a helmet fails to meet the standards, the manufacturer may face fines and have to pull the product off the market by recall, which has significant financial implications.

There are two problems with this type of after-the-fact system.

First, the manufacturer doing its own performance testing may make an honest mistake in its interpretation of the test methods and/or results data, leading to helmets later being found to be non-compliant.

Second, by the time the post-marketing testing catches up with a non-compliant product, many of the helmets may already be out on the streets. Unless the consumer registers the helmet purchase with the manufacturer for warranty coverage, or is reachable by some other mechanism, they may not receive a recall notice from the manufacturer, should a recall be initiated.

To learn how effective the post-marketing testing system is, NHTSA test reports from 2014 to 2018 were reviewed.

Testing data shows that between 2014 and 2018 a total of 124 helmets were subjected to post-marketing testing by ACT Labs or Southwest Research Institute on behalf of NHTSA for compliance to FMVSS 218 performance standards.

Results data from reports available on the NHTSA website as of February 15, 2019, are summarized in the table below. Duplicate or incomplete reports were not counted.

Summary of NHTSA Helmet Testing Data, 2014 to 2018

Year    | Total Tested | Total Failed | Failed on Performance       | Failed Labeling Only | Investigations | Recalls
2018*   | 2            | 1            | 1 (also failed labeling)    | 0                    | 0              | 0
2017    | 34           | 17           | 10 (5 also failed labeling) | 7                    | 14             | 0
2016    | 24           | 13           | 9 (5 also failed labeling)  | 4                    | 12             | 3
2015    | 33           | 19           | 16 (5 also failed labeling) | 3                    | 14             | 6
2014    | 31           | 19           | 16 (9 also failed labeling) | 3                    | 3              | 1
Totals  | 124          | 69           | 52 (41.9% of those tested)  | 17                   | 43             | 10

*Reports posted to date, 2/15/2019.
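For readers who want to check the totals, the per-year figures can be tallied with a quick Python sketch (the numbers are transcribed from the NHTSA data summarized above):

```python
# Per-year NHTSA FMVSS 218 post-marketing test results, 2014-2018.
# 2018 covers only reports posted through 2/15/2019.
results = {
    # year: (total_tested, total_failed, failed_on_performance)
    2018: (2, 1, 1),
    2017: (34, 17, 10),
    2016: (24, 13, 9),
    2015: (33, 19, 16),
    2014: (31, 19, 16),
}

total_tested = sum(r[0] for r in results.values())
total_failed = sum(r[1] for r in results.values())
failed_performance = sum(r[2] for r in results.values())

print(f"Total tested: {total_tested}")                 # 124
print(f"Failed (any reason): {total_failed}")          # 69
print(f"Failed on performance: {failed_performance}")  # 52
print(f"Performance failure rate: {failed_performance / total_tested:.1%}")
```

Summing the performance failures and dividing by the total tested reproduces the 41.9 percent figure cited in this article.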

To find that 41.9 percent of the helmets tested since 2014 failed on actual performance criteria (i.e., impact attenuation, penetration, and retention system, each tested at ambient temperature, low temperature, high temperature, and with water immersion) could be a concern to helmet purchasers.

The helmets tested were labeled such that they indicated compliance with DOT standards, yet many were found not to pass the required tests once they were on the market.

The reports don’t indicate how many of the failed helmets had already been sold to consumers, how long they had been on the market before being found not to pass the required tests, or what percentage of the failed helmets that were sold were ever recovered under the 10 recalls that have occurred.

That also made me wonder whether consumers whose helmets were recalled returned them and either got a full refund or received a replacement helmet model that had been tested and proven to pass all the tests.

One recall example I found data on called for the helmet owner to follow a process to verify that their helmet was among those subject to the recall and get a replacement. The owner had to submit a form. If the manufacturer confirmed the helmet was subject to recall, the company would then send a prepaid shipping label to the owner.

The owner was directed not to use the helmet; a new replacement of the same helmet model would be sent once the manufacturer received the recalled helmet. If the owner doesn’t have another helmet on hand to use during the exchange process, the options are to ride without a helmet, where legal, or not ride at all. The recall process is more than a little inconvenient.

It is also possible for a helmet manufacturer to initiate a recall on its own that won’t necessarily be reflected in the NHTSA data.

The documents reviewed did not provide details on the 43 investigations noted in the website data, nor on whether financial penalties that can be imposed were used. That data was not requested.

Clearly, the post-marketing testing and self-certification approach to quality assurance for helmets has some serious limitations with potentially even more serious implications for consumers.

The NHTSA data indicates that retrospective validation of performance can allow substandard products to reach the street before they are discovered and removed. It provides evidence that a number of substandard products have reached consumers in the past five years, all while displaying labels that would lead consumers to believe the helmets meet DOT requirements.

Self-certification by the manufacturer might be likened to allowing doctors to self-license for medical practice. None of the other standards systems considered here—there are others not included in this coverage of helmet performance standards—allow self-certification. Other standards typically require a third party to do the performance testing prior to the helmet being allowed to be labeled as compliant.

In addition, any changes to a helmet’s design specifications that could affect performance require recertification, and post-marketing inspection may also occur to verify continued compliance.

Also, there is the issue of the FMVSS 218 performance standards themselves. At ≤400g, the DOT standard has the highest allowable peak acceleration of any of the standards considered here. Impact attenuation—the degree to which the helmet prevents impact energy from reaching the brain—is one of the key functions of any helmet. If judged on that criterion, the DOT standard allows the highest impact energy to reach the rider among the current standards considered.

ECE 22.05 and Snell allow only ≤275g peak acceleration (depending on the test, some limits are even more stringent), which is 31.2 percent lower than what the DOT standard allows. FIM FRHPhe-1 allows peak acceleration of ≤208g (depending on linear/oblique sample), which is 48 percent lower than DOT. The 2020 FIM FRHPhe-2 standard will allow peak acceleration of ≤160g, which will be 60 percent lower than the DOT standard. The DOT standard does not test for rotational energy absorption; ECE and FIM FRHPhe, for example, do.

Conversely, FMVSS 218, FIM FRHPhe-1 and Snell standards call for testing the helmet’s protective capability against penetration, while ECE 22.05 standards do not include a shell penetration test.

Instead, the ECE 22.05 standard tests for overall rigidity of the helmet shell. ECE 22.05 also includes testing for faceshields and includes standards for children’s helmets, a category not addressed in any other standard.

Snell has a test for emergency removal of the helmet that no other system includes. In the DOT system, faceshields must meet what is called VESC 8 (Vehicle Equipment Safety Commission Standard 8), but that testing is not done as part of NHTSA helmet testing.

Despite the potential limitations of the aging DOT FMVSS 218 standard (first issued March 1, 1974, with the most recent update on May 13, 2013), it remains the mandatory minimum performance standard for helmets sold for road-going use in the United States.

The problem with the minimum standard approach is that in many instances, the minimum becomes the maximum; it becomes the design target and going beyond that point may be regarded as adding unnecessary cost due to overdesign.

Some would argue that nothing prevents a helmet manufacturer from designing its helmets to meet more rigorous standards in addition to DOT standards, and it is evident that a number of them do. That argument overlooks the equipment, staffing, and materials costs required to do the testing to meet the DOT standards in-house or to contract the testing out to a third party.

Simply stated, if a manufacturer produces a helmet that meets more rigorous standards than the DOT standard prescribes—and proves it by the product passing pre-marketing testing required for certification under a more stringent standard—why should it have to bear the cost of additional testing to prove DOT compliance?

Ideas for improvement

The data suggests that there is room for improvement in the system. When you buy a helmet with the DOT label on it, you have no assurance that it has actually been tested and been proven to pass all the performance tests. If it is tested later and found to fail in any aspect of performance, what does that really mean?

The system as it exists now is pass/fail, so you do not know whether a helmet that failed only one of the impact, penetration, or retention system tests is as deficient as one that failed under two, three, or all four test conditions (ambient temperature, low temperature, high temperature, and water immersion).

Here are some ideas for potential system improvement. One involves modifying FMVSS 218 in a way that may require no major investment by any stakeholder and could result in lower cost for consumers as well as wider choice among helmet options.

A cost-effective approach like that already used by the federal government in healthcare quality could be applied to reforming FMVSS 218. Under current federal policy, if a hospital, nursing home, or other facility or provider achieves accreditation by JCAHO, NCQA, AAAHC or other recognized private accreditation body, the federal inspection process is not done. Instead, the accredited provider is “deemed” to have met the federal standards that apply by achieving that accreditation. Duplicative administrative costs are eliminated.

Why not do the same for helmets? Here’s an example of how that could be done—add provisions to FMVSS 218 that state something along these lines:

Any helmet having been certified by recognized testing laboratories as defined by the certifying authority as meeting the applicable performance standards for motorcycle helmets intended for use on public roads shall be deemed as compliant with all provisions of this part. Helmets approved under this provision must display labeling indicating the certification or approval the helmet has, but are not required to bear the DOT label. This provision shall apply to helmets compliant with the following standards: UNECE 22.05, FIM FRHPhe-1, Snell M2020 D or R, JIS T8133 2015, NBR-7471:2001 (and/or others that substantially meet or exceed FMVSS 218).

This would allow a helmet buyer to go to the internet or their local dealer and buy an ECE 22.05 compliant, Snell Memorial Foundation approved, or FIM FRHPhe-1 (or other deemed standard system) compliant helmet with or without the DOT label. That would enable consumers to get the helmet with the performance and features they prefer, without being constrained by the presence or absence of a DOT certification label that may or may not guarantee a certain level of performance.

Manufacturers that are set up to do their own DOT compliance testing, or that have arrangements to contract it out, could continue to do so if they wish. Those manufacturers already pursuing Snell Memorial Foundation, ECE 22.05, or FIM FRHPhe-1 certification could continue that as well, but they would not have to do DOT testing if they achieve compliance under another deemed standard. They would label their helmets as they currently do when they achieve compliance under those other systems.

ACT Labs, the independent lab that does the most recent FMVSS 218 compliance testing for NHTSA, is also qualified to conduct compliance testing to ECE 22.05 standards and may include FIM FRHPhe standards testing in the future.

This entails some trade-offs. Helmets certified to ECE 22.05 standards may offer more impact attenuation capability, but are not tested for resistance to shell penetration as DOT standards call for. Instead, ECE 22.05 standards call for testing overall shell rigidity, which DOT standards do not address.

The ECE 22.05 certified helmet is a known quantity. It must pass the applicable tests before it can be placed on the market. A helmet under DOT standards may not have been correctly tested or proven by the manufacturer or its testing contractor to meet the penetration, impact, or retention system standards prior to going on the market as the post-marketing test failure data suggests. In fact, if the manufacturer is not required to submit any evidence of that testing for self-certification, the testing may not have been done at all.

So, another update to FMVSS 218 that shouldn’t be particularly burdensome to manufacturers already testing for compliance: NHTSA could require submission of the manufacturer’s test results (or those of its testing contractor) for each helmet model to be labeled as compliant with DOT standards prior to marketing or, if the helmet is already on the market, a one-time submission of completed test data to confirm compliance.

The DOT label could be applied only after the test procedures and results are reviewed by NHTSA. Post-marketing testing as directed by NHTSA could still be used to verify continued compliance.

If the helmet design or specifications change in such a way that performance could be affected, a new test report could be submitted. For example, if shell material or impact liner material or dimensions change, then retesting and submission of the new results could be required.

There is a range of options that could be used to make helmet standards and regulation more consistent, more effective, and less costly. There are also ways to allow consumers greater choice in the helmet market and make things a little less confusing. These are just a few ideas to start a conversation that the data suggests is due.

To have a look at the NHTSA helmet test reports that provide results for each helmet tested in detail, visit this NHTSA page: http://www.nhtsa.gov/cars/problems/comply/

To view motorcycle helmet compliance test reports:

1. Select “Equipment” and choose to search by “FMVSS.”
2. Click “Submit Search.”
3. Select “218” in the FMVSS menu.
4. Select the desired year.
5. Click “Submit Choices.”



Acknowledgments: Special thanks to Paloma Lampert, Safety Compliance Engineer, NHTSA Office of Vehicle Safety Compliance, U.S. Department of Transportation and Julie Kelly of SOAR Communications for their kind assistance in development of this article.

NOTE: The views expressed here are those of the author alone, and don’t necessarily represent an official position of Ultimate Motorcycling.