Evidence Methodology
Transparency matters. Here is exactly how we evaluate, rate, summarize, and organize the research in the RethinkTHC database.
Database at a Glance
The RethinkTHC research database is one of the most comprehensive publicly accessible cannabis research collections available. As of March 2026:
- 6,500+ peer-reviewed studies
- 365 evidence-based articles
- 20 consensus reports
- 200+ topic tag mappings
How We Rate Evidence Strength
Every study in our database receives one of three evidence strength ratings based on its design, sample size, and replicability.
Strong: Meta-analyses, systematic reviews, or large randomized controlled trials (RCTs) with consistent results across multiple independent research groups. Sample sizes typically exceed 1,000 participants or aggregate data from 10+ individual studies. Findings have been replicated and are widely accepted in the research community.
Moderate: Well-designed cohort studies, smaller RCTs, or cross-sectional studies with adequate sample sizes (typically 100–1,000 participants). Results are consistent but may not yet be replicated across multiple research groups or populations. The methodology is sound, but the evidence base is still developing.
Preliminary: Pilot studies, case reports, animal studies, or early-phase clinical trials. Sample sizes are typically small (under 100 participants), or the study uses non-human models. Findings are suggestive but require further research before firm conclusions can be drawn. We include these because they represent the cutting edge of cannabis science, but they should be interpreted with caution.
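As a rough illustration, the three-tier rating logic above can be sketched as a small classifier. The thresholds come directly from the descriptions (1,000+ participants or 10+ pooled studies; the 100–1,000 band; non-human models), but replication and cross-group consistency, which the real ratings also weigh, are omitted here.

```python
# Hypothetical sketch of the three-tier evidence rating. Design names and
# thresholds mirror the descriptions above; replication checks are omitted.
PRELIM_DESIGNS = {"pilot study", "animal study", "case report"}

def rate_evidence(design: str, participants: int = 0, pooled_studies: int = 0) -> str:
    design = design.lower()
    if design in PRELIM_DESIGNS:
        return "Preliminary"              # non-human models, small early work
    if participants > 1000 or pooled_studies >= 10:
        return "Strong"                   # large RCTs, meta-analyses, reviews
    if participants >= 100:
        return "Moderate"                 # adequate but still-developing base
    return "Preliminary"                  # under 100 participants
```

In practice the editorial rating is a judgment call informed by these signals, not a mechanical cutoff.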
Study Type Hierarchy
Not all study designs carry equal weight. We classify every study by its design type, ranked here from strongest to weakest in terms of causal evidence.
- Meta-Analysis — statistically combines results from multiple studies to derive pooled conclusions
- Systematic Review — comprehensive search and critical appraisal of all available evidence on a question
- Randomized Controlled Trial — participants randomly assigned to treatment or control groups
- Longitudinal Cohort — follows a group over time to track outcomes and exposures
- Cross-Sectional — measures variables at a single point in time across a population
- Case-Control — compares individuals with a condition to matched controls
- Observational — researchers observe without intervention or manipulation
- Review — narrative overview of existing literature (not systematic)
- Pilot Study — small-scale preliminary study to test feasibility
- Animal Study — research conducted on non-human subjects
- Case Report — detailed description of a single patient or small group
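The hierarchy can be encoded as a rank lookup (1 = strongest causal evidence), which is convenient when sorting or comparing studies programmatically. The labels are the ones used in the list above; the numeric ranks simply restate its order.

```python
# The design hierarchy above as a rank lookup (1 = strongest causal evidence).
STUDY_TYPE_RANK = {
    "Meta-Analysis": 1,
    "Systematic Review": 2,
    "Randomized Controlled Trial": 3,
    "Longitudinal Cohort": 4,
    "Cross-Sectional": 5,
    "Case-Control": 6,
    "Observational": 7,
    "Review": 8,
    "Pilot Study": 9,
    "Animal Study": 10,
    "Case Report": 11,
}

def stronger(a: str, b: str) -> str:
    """Return whichever of two design types ranks higher in the hierarchy."""
    return a if STUDY_TYPE_RANK[a] < STUDY_TYPE_RANK[b] else b
```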
How We Select Studies
We do not attempt to index all cannabis research. Instead, we curate studies that are directly relevant to our 20 research pillars. Inclusion criteria:
- Published in a peer-reviewed journal indexed by PubMed or a comparable database
- Directly relevant to at least one of our 36 controlled topic tags
- Cited or referenced in at least one RethinkTHC article (current or planned)
- Available in English or with an English-language abstract
We prioritize recent research (published within the last 10 years) but include older landmark studies when they remain the best available evidence on a topic. Approximately 2,100 studies in our master database are intentionally excluded from public display because they are too niche, too narrowly focused on non-consumer-relevant topics, or otherwise fall outside the scope of our editorial coverage.
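The four inclusion criteria can be expressed as a single predicate. This is a sketch only: the field names (`indexed_in`, `tags`, `cited_in_articles`, `has_english_text`) are illustrative rather than the actual database schema, and the set of "comparable databases" shown is an assumption.

```python
from dataclasses import dataclass

# "PubMed or a comparable database" -- the extra entries are assumptions.
INDEXES = {"PubMed", "Scopus", "Embase"}

@dataclass
class Study:
    indexed_in: set        # indexes the journal appears in
    tags: set              # controlled topic tags assigned to the study
    cited_in_articles: int # RethinkTHC articles (current or planned) citing it
    has_english_text: bool # full text or English-language abstract

def meets_inclusion_criteria(study: Study, controlled_tags: set) -> bool:
    return (
        bool(study.indexed_in & INDEXES)        # peer-reviewed and indexed
        and bool(study.tags & controlled_tags)  # relevant to a controlled tag
        and study.cited_in_articles >= 1        # cited in at least one article
        and study.has_english_text              # accessible in English
    )
```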
Tag Taxonomy
Every study in the RethinkTHC database is assigned one or more of 36 controlled topic tags. These tags are mapped to 20 research pillars that organize our article content.
- 36 controlled tags — a fixed vocabulary that ensures consistent categorization across all 6,500+ studies
- 20 research pillars — higher-level topic clusters that organize tags into coherent editorial areas (e.g., “Withdrawal & Recovery,” “Mental Health,” “Neuroscience,” “Harm Reduction”)
- 200+ topic tag mappings — tag-to-pillar and tag-to-article mappings that connect every study to relevant content
Tags are assigned during the editorial processing stage. Each study receives tags based on its primary research topic, methodology, population studied, and key findings. Tags are never inferred from titles alone — they are assigned after review of the study abstract and methodology.
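Structurally, the taxonomy is a many-to-one mapping from tags to pillars (each study then inherits the pillars of its tags). A minimal sketch, where the pillar names come from the examples above but the tag names themselves are hypothetical:

```python
# Illustrative tag-to-pillar mapping; real tag names may differ.
TAG_TO_PILLAR = {
    "cannabis-withdrawal": "Withdrawal & Recovery",
    "sleep-disruption": "Withdrawal & Recovery",
    "anxiety": "Mental Health",
    "psychosis-risk": "Mental Health",
    "cb1-receptor": "Neuroscience",
    "lower-risk-use": "Harm Reduction",
}

def pillars_for(tags):
    """Resolve a study's controlled tags to its research pillars."""
    return sorted({TAG_TO_PILLAR[t] for t in tags if t in TAG_TO_PILLAR})
```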
Consensus Report Methodology
RethinkTHC publishes 20 Research Consensus reports — one for each research pillar. These reports synthesize findings from all studies in a given pillar to provide an aggregate view of what the research says on a topic.
How consensus reports are generated
- Study aggregation. All studies tagged to the pillar are collected. Each study contributes its key findings, evidence strength rating, and study type classification.
- Weighting by evidence quality. Findings from meta-analyses and systematic reviews carry more weight than findings from pilot studies or case reports. Strong evidence findings are given greater influence than preliminary findings.
- Finding categorization. Findings are sorted into three categories: strong findings (well-replicated, consistent across multiple study types), debated areas (mixed evidence with legitimate disagreement in the literature), and research gaps (important questions where insufficient evidence exists).
- Synthesis and review. The consensus report is drafted with explicit sourcing for each finding. Human editors verify that the synthesis accurately represents the underlying studies and does not overstate or understate the evidence.
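Steps 1–3 above can be sketched as a weighted aggregation: each finding contributes a vote scaled by its evidence strength, and each claim is then sorted into one of the three categories. The weights and thresholds here are hypothetical, not the editorial values, and the real process also weighs study type and replication.

```python
from collections import defaultdict

# Assumed weights; the editorial process uses qualitative judgment, not these.
EVIDENCE_WEIGHT = {"Strong": 3.0, "Moderate": 2.0, "Preliminary": 1.0}

def categorize_findings(findings):
    """findings: iterable of (claim, evidence_strength, supports) tuples."""
    net = defaultdict(float)    # weighted support minus contradiction
    total = defaultdict(float)  # total weighted evidence per claim
    for claim, strength, supports in findings:
        w = EVIDENCE_WEIGHT[strength]
        total[claim] += w
        net[claim] += w if supports else -w
    categories = {}
    for claim, t in total.items():
        if t < 4.0:                          # too little evidence overall
            categories[claim] = "research gap"
        elif abs(net[claim]) / t >= 0.7:     # consistent direction
            categories[claim] = "strong finding"
        else:                                # mixed, legitimate disagreement
            categories[claim] = "debated area"
    return categories
```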
Consensus reports are updated when significant new research shifts the evidence balance in a pillar. View all consensus reports on the Research Consensus page.
How Summaries Are Written
Every study summary in our database follows a consistent structure: what the study found, why it matters, key numbers, methodology, and limitations. Summaries are:
- Structured drafting — each summary is written within our internal editorial framework to ensure consistent formatting across 6,500+ entries
- Editor-verified — every summary is checked against the original publication for accuracy before publishing
- Framework-driven — the writing framework prioritizes practical takeaways, honest limitation reporting, and plain-English readability
Summaries are not a replacement for reading the original study. We always provide direct links to PubMed and DOI so readers can access the full text.
Study-Article Linking Methodology
Every study in the database is linked to relevant articles using a multi-stage process that ensures comprehensive, accurate connections between research and educational content.
- Tag-based matching. Studies and articles share the same controlled tag vocabulary. Primary matches are generated automatically based on overlapping tags.
- Relevance scoring. Each study-article link receives a relevance score based on tag overlap density, evidence strength, and study type. Higher-quality studies with more tag matches rank higher.
- Manual curation. High-impact studies — particularly meta-analyses and systematic reviews — receive manual review to ensure they are linked to all relevant articles, including edge cases that automated matching may miss.
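A minimal version of the relevance score in step 2 might combine the three named signals multiplicatively. The Jaccard-style overlap measure and both weight tables below are assumptions, not the production formula.

```python
# Hypothetical relevance score combining tag overlap density, evidence
# strength, and study type. Weights and formula are illustrative only.
EVIDENCE_WEIGHT = {"Strong": 3, "Moderate": 2, "Preliminary": 1}
TYPE_WEIGHT = {"Meta-Analysis": 3, "Systematic Review": 3,
               "Randomized Controlled Trial": 2}   # all other designs: 1

def relevance_score(study_tags, article_tags, evidence, study_type):
    study_tags, article_tags = set(study_tags), set(article_tags)
    union = study_tags | article_tags
    density = len(study_tags & article_tags) / len(union) if union else 0.0
    return density * EVIDENCE_WEIGHT[evidence] * TYPE_WEIGHT.get(study_type, 1)
```

Under this sketch, a strong meta-analysis with dense tag overlap outranks a preliminary pilot study with a single shared tag, which matches the ordering described in step 2.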
The result is a network of 120,000+ study-article links that allows readers to trace any article claim back to its supporting research and discover related studies across topics.
Quality Assurance
Before any content is published, it passes through automated quality checks designed to catch errors, inconsistencies, and incomplete metadata.
- Automated preflight checks — verify formatting consistency, frontmatter completeness, and structural integrity for every article and study summary before publication
- Structural validation — confirms that all required fields (title, summary, evidence strength, study type, tags, publication year, journal) are present and correctly formatted
- Metadata completeness — verifies SEO metadata, schema markup, canonical URLs, and publication/modification dates are present on all pages
- Link validation — checks that PubMed links, DOI references, and internal cross-references resolve correctly
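The structural-validation step above amounts to checking a record against a required-field list. A minimal sketch, using the fields named in that step (the function name and range check are illustrative):

```python
# Required fields from the structural-validation step above.
REQUIRED_FIELDS = ["title", "summary", "evidence_strength", "study_type",
                   "tags", "publication_year", "journal"]

def preflight(record: dict) -> list:
    """Return a list of problems; an empty list means the record passes."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS
                if not record.get(f)]
    year = record.get("publication_year")
    if isinstance(year, int) and not (1900 <= year <= 2026):
        problems.append("publication_year out of range")  # assumed sanity bound
    return problems
```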
Data Access
We believe research data should be accessible. Our research database is available for download in CSV and JSON formats under the Creative Commons Attribution 4.0 International (CC BY 4.0) license.
You may share and adapt this data for any purpose, including commercial use, provided you give appropriate credit to RethinkTHC. Download links are available on the Research Data Download page.
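Because the export is plain CSV (or JSON), it works with standard tooling. A loading sketch with an inline sample; the column names and semicolon-delimited tag convention are hypothetical, as the real download defines its own schema:

```python
import csv
import io

# Hypothetical sample rows; actual column names come from the download itself.
SAMPLE = """rthc_id,title,evidence_strength,tags
RTHC-0001,Example study,Strong,withdrawal;sleep
RTHC-0002,Another study,Preliminary,anxiety
"""

def load_studies(text):
    """Parse CSV text into dicts, splitting the tag column into a list."""
    rows = list(csv.DictReader(io.StringIO(text)))
    for row in rows:
        row["tags"] = row["tags"].split(";")
    return rows

strong = [r for r in load_studies(SAMPLE) if r["evidence_strength"] == "Strong"]
```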
Update Frequency
The RethinkTHC research database is updated on a rolling basis as new studies are identified and processed through our editorial pipeline. Articles are reviewed when significant new research emerges in their topic area. Consensus reports are updated when new evidence shifts the balance of findings in a research pillar.
We do not commit to a fixed update schedule because research publication is unpredictable. Instead, we prioritize responsiveness to significant new findings, particularly meta-analyses and large RCTs that may change the evidence landscape in a topic area.
How We Handle Corrections
If you find an error in any study summary — a misquoted statistic, an incorrect evidence rating, a broken link, or any other inaccuracy — we want to know about it.
Email corrections to corrections@rethinkthc.com with the RTHC ID of the study and a description of the issue. We review all submissions and publish corrections within 7 days when warranted. All corrections are logged in our changelog.
Last updated: March 2026