This week's POLL article for discussion is:
Timing and Completeness of Trial Results Posted at ClinicalTrials.gov and Published in Journals, by Riveros et al.
The authors searched ClinicalTrials.gov for trials with posted results, then tracked down the published articles corresponding to those trials.
They looked at a random sample of 600 trials with results. 50% had no published article.
Of the 202 that did, "...the median time between primary completion date and first results publicly posted was 19 months." The corresponding time to journal publication was 21 months.
As for what was reported, "Reporting was significantly more complete at ClinicalTrials.gov than in the published article for the flow of participants (64% versus 48% of trials, p<0.001), efficacy results (79% versus 69%, p = 0.02), adverse events (73% versus 45%, p<0.001), and serious adverse events (99% versus 63%, p<0.001)."
85% of the trials were industry funded.
The authors know what percentage of the trials they examined posted results, adverse events, and serious adverse events. What they can't know is what percentage of the actual results, adverse events, and serious adverse events were reported, even in those trials. In other words, all the data they had to go on was what the trials voluntarily reported.
Why is reporting more complete at ClinicalTrials.gov? Is the expectation that fewer people will check it than the published literature? The two sources often don't agree, and the results tend to look more favorable in publications than on ClinicalTrials.gov.
Serious adverse events reported in only 63% of publications? Oy vey.
My experience with ClinicalTrials.gov has been that results are rarely posted at all. What percentage of registered trials reported any results?
This wasn't what the above article was about, so I replicated the search terms they used to find the trials they considered. They searched for "'Closed study' in the Recruitment field, 'with results' for Study Results, 'Interventional studies' for Study Type, and 'Phase III and IV' for Phase." Then they excluded certain studies.
First, I ran the same search, except with "all studies" rather than "with results," and got 29,261 studies. Then I searched again with "with results" for Study Results, and got 4885 studies. That means roughly 17% of these registered closed Phase III and IV interventional trials posted results.
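The back-of-the-envelope arithmetic from the two searches can be checked in a couple of lines (the counts are the ones from my searches above; they will drift as new trials are registered):

```python
# Counts from the two ClinicalTrials.gov searches described above:
# closed Phase III/IV interventional studies, with and without posted results.
all_studies = 29_261   # "all studies" for Study Results
with_results = 4_885   # "with results" for Study Results

fraction = with_results / all_studies
print(f"{fraction:.1%} of these trials posted results")  # → 16.7%
```

So the "17%" figure is 4885/29,261 ≈ 16.7%, rounded up.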
This despite the fact that:
"Section 801 of the Food and Drug Administration Amendments Act (FDAAA 801) requires Responsible Parties to register and submit summary results of clinical trials with ClinicalTrials.gov."
"The International Committee of Medical Journal Editors (ICMJE) requires trial registration as a condition for the publication of research results generated by a clinical trial."
This means that the vast majority of registered trials aren't bothering to post results on ClinicalTrials.gov, despite being required by law to do so. And when they do post results, it's generally after the one-year time limit. So the ClinicalTrials.gov administrators are not enforcing their own rules.
Furthermore, the ICMJE requires trial registration as a condition of publication, but it doesn't generally check whether results have been posted or, if they have, whether they agree with the results in the articles it is about to publish.
If you read the Riveros article, and I encourage you to do so and comment on POLL, you'll see that there are a lot more problems than the ones I've just cited.
What a mess.