Built for FRCR trainees and practicing radiologists

Your training should not rely on an AI that makes things up.

You ask a radiology question and get a confident answer with a citation. Then you check it and discover the source trail is weak or missing.

RadAssistant is designed to do the opposite: transparent evidence quality, explicit uncertainty, and source-traceable responses.

Try RadAssistant free

No credit card. Start with verified workflows.

New workflow

Create medically tuned presentations, not generic slide dumps.

Build journal clubs, FRCR teaching decks, MDT summaries, and consultant-facing radiology talks with evidence-aware structure, dark PACS-style imaging slides, and citation-ready summaries.

Formats

Journal club, case presentation, grand rounds, radiology teaching, MDT, conference talk.

Design language

Dark navy + warm orange palette, executive medical layouts, concise stat callouts, and image-first radiology slides.

What happens with general-purpose AI.

Fabrication

It invents statistics

Ask for a sensitivity figure and it gives one confidently. If there is no source, that number should not be trusted.

Hallucination

It cites papers that do not exist

Fake authors, fake journals, fake DOI details. Formatted perfectly, but impossible to verify.

Overconfidence

It rarely says "not enough evidence"

Answers often sound authoritative whether backed by a systematic review or weak anecdotal data.

Generic

It is not built for UK radiology workflows

NICE, RCR, NHS pathways and practical radiology context should be first-class signals, not afterthoughts.

Same question. Different answers.

We asked both tools the same clinical question. See what comes back.

Query

23 y/o woman with convulsions. Bilateral punctate subcortical white matter T2/FLAIR hyperintensities in the parietal lobes. What are possible differentials?

ChatGPT (no verification)

Confident but unverified. When challenged, it admitted its citations were fabricated.

RadAssistant (verified + re-analysis)

Source-traceable, confidence-scored, with built-in re-analysis for stronger evidence.

We built an AI that does the opposite.

Hallucination checks, evidence tiers, and source tracing are part of the product itself, not an afterthought.

How RadAssistant works.

Traced

Every key statistic links back to source text

Guidelines and peer-reviewed sources are referenced directly so you can verify claims yourself.

Checked

Claims are verified before they reach you

Hallucination checks cross-reference responses against curated medical evidence and flag uncertainty.

Honest

Evidence strength is shown, not hidden

Answers are tiered by evidence quality so confidence matches the quality of the source material.

UK-first

NICE and RCR guidance is prioritized

Built for radiology training and reporting workflows used in real UK clinical pathways.

A day in your training. Covered.

On call, exam prep, reporting workflow, or deep research: the platform is designed to fit how radiologists actually work.

Morning round

Consultant asks for imaging features that help distinguish lymphoma from liver metastases.

-> Verified Chatbot: Tiered answer with clear source citations.

Lunch break

Twenty minutes free and FRCR Part 2A is close.

-> MCQ Practice: Curated FRCR-style questions from trusted sources.

Afternoon list

You dictated a complex trauma report and want a second pass on structure and clarity.

-> Report Critique: Actionable feedback on missing descriptors and flow.

Evening revision

You need viva-style practice under pressure with realistic cases.

-> Viva Practice: AI examiner workflow with real DICOM context.

On call, 2am

Incidental finding needs fast escalation guidance.

-> Guideline Checker + Differential Generator: Evidence-aware pathways and ranked differentials.

Teaching prep

You need a consultant-ready journal club or MDT deck by tomorrow morning.

-> Medical Presentations: Medically tuned slides with radiology structure, citations, and PACS-style imaging layouts.

Evidence tiers on every answer.

Tier 1

Guidelines, RCTs, systematic reviews

Highest-quality evidence and direct citation coverage.

Tier 2

Cohort and case-control studies

Strong observational support with transparent limits.

Tier 3

Case reports and expert opinion

Clearly labeled lower-confidence evidence.

13 specialized tools. One platform.

Everything a radiology trainee needs, from daily Q&A to exam prep, reporting quality, guidelines, and deep evidence workflows, including medically tuned presentation generation.

01

Verified Chatbot (core)

Evidence-tiered radiology Q&A with source citations and transparent confidence.

02

Search

Targeted search across guidelines, reviews, and papers with evidence weighting.

03

Report Critique

Structured feedback on report clarity, completeness, and reporting standards.

04

MCQ Practice

FRCR-style questions from curated sources for focused exam revision.

05

Deep Research

Multi-phase research workflow for planning, synthesis, and verification.

06

Open Notebook

Organize saved findings, sources, and personal notes in one place.

07

Viva Practice

Oral exam simulation with case-based prompts and performance feedback.

08

Guideline Checker

Yes/no clinical decision support grounded in radiology guidelines.

09

Differential Generator

Transforms findings into ranked differential diagnoses and next steps.

10

Protocol Lookup

Quick reference for CT, MRI, and ultrasound protocols by indication.

11

MRI Sequences

Sequence reference for practical scanning and interpretation workflows.

12

Learning Dashboard

Track progress, practice activity, and revision consistency over time.

13

Medical Presentations (new)

Generate journal club, FRCR teaching, MDT, and conference decks with medically tuned slide structure and evidence-aware design.

90,000+

curated medical documents

400+

yearly active users

3

evidence tiers per answer

What trainees are saying.

"I stopped trusting generic AI answers the day I chased a citation that did not exist. This fixed that."

— ST3 trainee, London deanery

"Evidence tiers changed how I revise. I can immediately tell what is robust and what needs caution."

— ST5 trainee, Yorkshire deanery

"The viva workflow feels much closer to real exam pressure than anything else I have used."

— ST4 trainee, West Midlands

Built by a radiologist, for radiologists.

Start with the free tier and explore the tools with transparent evidence and source-traceable outputs.

Create your account

Sign in and start using the platform.

Access verified chatbot answers, exam tools, report critique, guidelines, notebook, and research workflows in one account.

By continuing, you agree to the Terms and Privacy Policy.