Cannabis field research gets worse by the minute, and the ridiculous headlines attached to retraction-ready ‘research’ are at an all-time high. Case in point: a new article out of Canada claiming research shows cannabis is associated with a higher risk of driving accidents. Read on to understand why the methodology of a study really matters, and why this one is flawed.
Study on cannabis and driving injuries
We’ll get to the part where nothing makes sense in a minute. But first, let’s take a look at the study in question. It’s called Cannabis-Involved Traffic Injury Emergency Department Visits After Cannabis Legalization and Commercialization, and it set out to answer whether cannabis use has had an impact on driving accidents.
The study did not collect its own data; rather, it pulled from existing data sources. I always like to remind readers that, in these cases, investigators are merely looking at records, with no ability to ask questions of patients, emergency room staff, or anyone else involved, like witnesses or other participants. It involved no informed consent. It focused only on the province of Ontario, and looked at records for all emergency visits related to traffic injuries, which included cyclists and pedestrians as well. Only the records of those 16 years and above were used.
The data ran from January 2010 to December 2021. The investigators broke this down into three separate time frames for comparison: 1) January 2010-September 2018, which accounted for the period before legalization; 2) October 2018-March 2020, which accounted for the beginning of legalization without a large-scale market; and 3) April 2020-December 2021, which accounted for the period after the market expanded.
Investigators looked at reports of traffic injuries via the International Statistical Classification of Diseases and Related Health Problems, Tenth Revision (ICD-10). According to researchers, “We then identified traffic injuries with a documented diagnosis of cannabis involvement when an ICD-10 code for a mental and behavioral disorder due to cannabis use (F12.X) or for cannabis poisoning (T40.7) was listed as the main or contributing reason for the traffic injury ED visit.” Realistically (and logically), cannabis involvement can only be established for certain through a positive cannabis test, physical product found at the scene, or a person admitting they used it.
This is important. As are the terms ‘main’ and ‘contributing.’ ‘Main’ means it’s the reason something happened, while ‘contributing’ doesn’t have to carry any real weight at all. For example, if a person goes to the hospital because they feel sick, and they’re diagnosed with lead poisoning as the main reason, that makes sense. But if that person also shows a positive cannabis test, the cannabis could then be considered a contributing factor, even though realistically, we know it was the lead.
In fact, the researchers continue, “We also considered traffic injury ED visits to have cannabis involvement if a cannabis code was used during admission to the hospital or transfer to another ED.” Which means, if this is identified at the time of admission to the hospital, it ONLY means cannabis was found at the scene of the accident, or a person confirmed they used it, as no other tests could have been performed by that time. These are important things to remember.
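To make that selection rule concrete, here is a rough sketch of the kind of flagging logic being described; the function, field layout, and example codes are my own illustration, not the authors’ actual code:

```python
# Rough illustration of the cohort-selection rule described in the paper.
# The function name and example diagnosis lists are hypothetical.

def is_cannabis_involved(diagnosis_codes: list[str]) -> bool:
    """Flag a traffic-injury ED visit as 'cannabis-involved' if any listed
    ICD-10 code (main OR contributing) is F12.x or T40.7."""
    return any(code.startswith("F12") or code.startswith("T40.7")
               for code in diagnosis_codes)

# A cannabis code noted alongside unrelated main diagnoses still flags the visit:
print(is_cannabis_involved(["V43.5", "S06.0", "F12.90"]))  # True
print(is_cannabis_involved(["V43.5", "S06.0"]))            # False
```

Under a rule like this, a cannabis code listed only as a contributing factor counts exactly the same as one listed as the main reason for the visit, which is the problem the lead-poisoning example above points to.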
Results of the study
Within the time frame investigated, researchers report there were 947,604 emergency room visits from traffic accidents. According to researchers, 426 of these were defined as ‘traffic injury ED visits’ involving cannabis. That’s about 0.04%, compared to the 7,564 involving alcohol, which accounts for about 0.8%. Then the writers give a gender breakdown, but change the number of cannabis-documented cases to 418, calling them ‘individuals with documented cannabis involvement.’ Perhaps only one of the two figures counts cases with actual documentation? It’s unnecessarily unclear.
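For a quick sanity check on those shares (a minimal sketch using only the counts reported above, not the study’s actual data files), the arithmetic works out like this:

```python
# Quick arithmetic check on the shares, using only the counts reported in the paper.
total_visits = 947_604     # all traffic-injury ED visits, Jan 2010 - Dec 2021
cannabis_visits = 426      # visits coded as cannabis-involved
alcohol_visits = 7_564     # visits coded as alcohol-involved

print(f"cannabis share: {cannabis_visits / total_visits:.3%}")  # ~0.045%
print(f"alcohol share:  {alcohol_visits / total_visits:.3%}")   # ~0.798%
```

In other words, cannabis-flagged visits make up roughly one in every 2,200 traffic-injury ED visits in the dataset.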
Investigators claim, “Annual rates of cannabis-involved traffic injury ED visits increased 475.3% over the study period.” Apparently, during the same time, alcohol-related emergency room visits went up 9.4%. In plain terms, a 475.3% increase means the cannabis-involved rate ended the period at roughly 5.75 times its starting value, while the alcohol-related rate ended at roughly 1.09 times its starting value. The paper is saying that cannabis-involved driving injuries went up when weed was legalized, at a far higher rate than alcohol-related driving injuries did during the same time.
The authors conclude: “This cross-sectional study found large increases in cannabis involvement in ED visits for traffic injury over time, which may have accelerated following non-medical cannabis commercialization. Although the frequency of visits was rare, they may reflect broader changes in cannabis-impaired driving. Greater prevention efforts, including targeted education and policy measures, in regions with legal cannabis are indicated.”
Why this study on cannabis and traffic injuries should be retracted
This study is a prime example of one that needs to be retracted. Not only does it fail to provide any new information, it seemingly seeks to make connections that aren’t there, in an effort to make a statement about cannabis being unsafe for roadways, even though the numbers themselves are essentially negligible. Of course, like all bad research, it fails on many levels. Let’s get into it.
- People drive on cannabis all the time, and often while making other bad, and unrelated, decisions. Like drinking alcohol. Like talking on a phone. Like eating. Like changing a music playlist. It’s like the example I gave of going to the hospital with lead poisoning. The lead poisoning is the issue, whether the person turns up positive for a cannabis test or not. The cannabis is not always (or seemingly, generally) relevant. But if you pull all hospital visits involving a positive cannabis test or a situation where product was found, tada! There it is. This leads to the second point.
- We have zero other information about what else was going on. For example, how many of the people identified for cannabis swerved to avoid hitting a deer? How many were first-time drivers, or driving in horrible snow storms? It’s not just that the weed’s presence doesn’t have to be a factor; we don’t know literally anything else about the environment. There are about a million confounding variables, from deer to ice, and we’re essentially just being told who showed up positive for weed in an emergency room visit, or had it found in their car. It snows a lot in Canada. Let’s remember that.
- This study goes by the logic that no one smoked before legalization, only a few smoked during the beginning period, and then everyone else joined in after. This, of course, is known to be untrue. In fact, the push for legalization came about because of the massive size of the illicit market. People didn’t just start smoking and driving. It’s downright silly to even assert that legalization increased the number of people driving stoned at all. It’s just a difference of whether they used legal or illicit weed to do it. It was already happening. Weed was already everywhere.
- I saw no mention of anything related to testing frequency for cannabis in road accidents. As in, perhaps at a certain point, testing for cannabis picked up speed, when previously it wasn’t tested for consistently. Likewise, did all hospitals always test traffic accident victims for weed? Or did this increase at points during the time frame investigated? If testing practices changed at any time, literally none of this is relevant. That information is painfully, vitally important; and it wasn’t included.
- There is mixed research on this topic, but the other studies making similar claims relied on similar methodology. When looking at it in terms of insurance premiums (you know, the fees that go up when you drive badly?), this study found that locations with medical cannabis had decreased premiums, which implies fewer accidents and injuries. That was in the US. This isn’t to say that that study is definitely the right one. But if cannabis were so confidently related to so many accidents and injuries, wouldn’t we be talking about higher premiums for drivers, like, all the time?
Research retractions
These days, it’s easier to find examples of studies that should be retracted than ones that provide any kind of decent or useful information. To be fair, I deal with big headlines, and the kind of research that can be advertised to the public. There are plenty of studies that investigate topics that aren’t interesting to the public, and which might not be fishing for headlines as much. I can only attest to what I come across, and I am certainly not in the world of heavy medical research. However…
Research is retracted all the time because of faulty research methodology (this study is an example, in my mind), flubbed results, and actual human error; though the latter accounts for the least, unfortunately, according to reporting on retractions. This was documented recently in an article by the Guardian, drawing on data from the watchdog site Retraction Watch. Though 2022 had close to 5,500 retractions, this only accounted for about one in one thousand pieces of research. However, many, including the authors of the article, believe this number should be 10 times higher, or even more.
The reality is that we’re in a fast-paced world where things come up, and pass by, incredibly fast. And a study needs a large negative response to get to the point of a retraction. Even though this study received a lot of coverage, it’ll be out of the headlines soon enough, because it really doesn’t offer much. Which means the scrutiny probably won’t be great enough to force a retraction. Publications probably use this to their advantage at this point.
A further reality is that publications don’t want their work questioned, and will go to great lengths to protect what they publish, even if it’s a study like this, with a million holes. Publications are just as competitive as the researchers they publish; and retractions hurt their reputations. So suffice it to say that there are a lot of studies out there like this one that are not only useless, but possibly damaging; and which will likely remain.
Conclusion
Does cannabis promote a greater number of driving accident injuries? It doesn’t seem likely, but we certainly aren’t going to know from this investigation. Perhaps if the effect were as obvious as alcohol’s, this would all make sense. But it’s not, and studies like this seem like nothing more than groundwork to inform policy. I can totally see new fines being based on faulty research like this.
Hello all ye faithful! Cool that you’re here with us at Cannadelics.com; where we work to report on the best stories in the cannabis and psychedelics spaces. We’re always here, so come on by regularly for updates. And subscribe to the Cannadelics Weekly Newsletter; so you’re always up on the latest news.