MedBayes-Lite: Bayesian Uncertainty Quantification for Safe Clinical Decision Support
arXiv:2511.16625v2 Announce Type: replace
Abstract: We propose MedBayes-Lite, a lightweight Bayesian enhancement for transformer-based clinical language models that improves reliability through uncertainty-aware prediction. The framework operates without retraining, architectural modification, or additional trainable parameters, and integrates three components: Bayesian Embedding Calibration via Monte Carlo dropout, Uncertainty-Weighted Attention for reliability-aware token aggregation, and Confidence-Guided Decision Shaping for abstention under uncertainty. Across MedQA, PubMedQA, and MIMIC-III, MedBayes-Lite improves calibration and trustworthiness, reducing overconfidence by 32-48%. In simulated clinical settings, it further supports safer decision-making by flagging uncertain predictions for human review, particularly under distribution shift. For closed API models, the framework remains applicable through sampling-based predictive uncertainty and confidence-guided abstention, while full embedding- and attention-level uncertainty propagation is evaluated on open-weight transformer models.
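The sampling-based predictive uncertainty and abstention described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the tiny linear classifier, the dropout rate, and the entropy threshold `tau` are all assumed placeholders, and Monte Carlo dropout is simulated with explicit Bernoulli masks over the input features.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny classifier (16 features -> 3 classes); weights are
# random stand-ins, not anything from MedBayes-Lite.
W = rng.normal(size=(16, 3))

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def mc_dropout_predict(x, n_samples=50, p_drop=0.2):
    """Monte Carlo dropout: keep dropout active at inference time and
    average the per-sample softmax outputs into a predictive distribution."""
    probs = []
    for _ in range(n_samples):
        mask = rng.random(x.shape) > p_drop       # Bernoulli dropout mask
        x_d = x * mask / (1.0 - p_drop)           # inverted-dropout scaling
        probs.append(softmax(x_d @ W))
    mean = np.stack(probs).mean(axis=0)           # predictive mean
    entropy = -(mean * np.log(mean + 1e-12)).sum(axis=-1)  # predictive entropy
    return mean, entropy

def decide(x, tau=0.8):
    """Confidence-guided abstention: defer to human review when the
    predictive entropy exceeds a threshold tau (an assumed value)."""
    mean, entropy = mc_dropout_predict(x)
    if entropy.item() > tau:
        return "abstain", mean
    return int(mean.argmax()), mean

label, dist = decide(rng.normal(size=(1, 16)))
```

Because only repeated forward passes and output probabilities are needed, this abstention scheme applies equally to closed API models sampled at nonzero temperature, which is the fallback mode the abstract mentions.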