
Roche WSI Validation Shows Comparable Primary Diagnosis Performance

04/17/2026

Two studies evaluated the Roche Digital Pathology Dx system for primary surgical pathology diagnosis: a precision (repeatability/reproducibility) study and an accuracy (method-comparison) study. The validation program paired a precision and interlaboratory reproducibility assessment of 23 histopathology features with a method-comparison analysis of 2,047 clinical cases. The authors conclude that Roche Digital Pathology Dx is noninferior to manual microscopy for primary diagnosis in surgical pathology.

The system comprised the VENTANA DP 200 scanner, the browser-based Roche uPath enterprise software, and an ASUS ProArt Display PA248QV monitor. Researchers selected archival cases (one per patient), deidentified them, scanned the slides, and performed quality checks before digital review. To limit bias, the study randomized case and modality reading order (with washout periods between reads) and used adjudication by reviewers blinded to modality, anchored to the reference sign-out diagnosis, keeping digital and manual assessments separate during evaluation.

For precision, the authors report intersite and intersystem precision of 89.3%, interday precision of 90.3%, and interreader precision of 90.1%. Lower confidence bounds for these primary precision endpoints were all reported at or above 85%, meeting the prespecified threshold. For accuracy, the digital-minus-manual difference (DR–MR) versus the reference sign-out diagnosis was −0.61%, with a lower 95% confidence bound of −1.59% against a −4% acceptance margin. The reproducibility and accuracy endpoints were reported as meeting the predefined acceptance framework.
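The noninferiority logic described above can be illustrated with a minimal sketch. The numbers are the study's reported values; the helper function and its name are assumptions for illustration, not the authors' actual analysis code.

```python
# Illustrative noninferiority check (hypothetical helper, not the study's code).
# Noninferiority is met when the lower confidence bound of the
# digital-minus-manual difference stays above the acceptance margin.

def noninferior(lower_ci_bound: float, margin: float) -> bool:
    """Return True if the lower confidence bound clears the margin."""
    return lower_ci_bound > margin

# Reported accuracy endpoint: DR-MR difference of -0.61% with a
# lower 95% confidence bound of -1.59%, against a -4% margin.
lower_bound = -1.59
margin = -4.0

print(noninferior(lower_bound, margin))  # -1.59 > -4.0, so noninferiority holds
```

The same comparison applies to the precision endpoints, where each lower confidence bound had to sit at or above the prespecified 85% threshold.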

Mean case reading times were nearly identical: 2.33 minutes for digital review versus 2.34 minutes for manual microscopy. Initial scan acceptability was also high, with 96.8% of hematoxylin and eosin slides and 93.6% of ancillary slides judged acceptable for diagnosis. After repeat scanning where needed, overall acceptability reached 99.9%, and readers requested additional 40× images for only 6 slides. Across these workflow and image-adequacy measures, the two modalities showed no meaningful separation.

In reviewing major disagreements in breast, lung, bladder, kidney, and stomach cases, investigators report no digital-modality-specific root cause. Feature-level agreement varied, with lower agreement reported for findings such as nuclear grooves. The authors describe misidentification patterns when a region of interest contained several plausible features or when surrounding histologic context supported a different interpretation. They characterize these discrepancies as context-dependent rather than evidence of a recurring digital failure mode.

Key Takeaways

  • The system was evaluated in paired precision and method-comparison studies for primary surgical pathology diagnosis.
  • Prespecified reproducibility and noninferiority endpoints were reported as met within the study framework.
  • Reading time, scan adequacy, and disagreement review were reported as broadly similar, without a unique digital-specific pattern.