Digital Pathology Podcast

214: AI and Automation in Modern Hematologic Diagnostics

Aleksandra Zuraw, DVM, PhD, Episode 214


Paper Discussed in this Episode: Molecular Pathology, Artificial Intelligence, and New Technologies in Hematologic Diagnostics: Translational Opportunities and Practical Considerations. Alnoor F, Mukherjee S, Menon MP, Ng D, Li P, Ohgami RS. Diagnostics 2026.

Episode Summary: In this deep dive, we explore how hematology labs are tackling a massive rise in diagnostic complexity combined with persistent staffing shortages. The solution isn't just working harder—it's an entirely new workflow powered by robotics and AI. We unpack a comprehensive 2026 review that looks at the cutting-edge transformation of hematopathology, moving from manual microscopes to collaborative robots (cobots), digital morphology, and AI-driven genomic analysis. Can machines handle the grueling pre-analytical work and help experts diagnose leukemia faster and more accurately?

In This Episode, We Cover:

The Modern Lab Crisis: How the latest WHO and International Consensus Classification (ICC) frameworks demand high-volume, multi-modal genomic and morphologic data, stretching human pathologists to their limits.

Enter the "Cobots": Collaborative robots are taking over the repetitive benchwork. We discuss systems like the UR5 cobots in Denmark that sort 3,000 blood tubes a day, and the Pramana Spectral HT robotic-arm scanners that digitize over 1,000 slides daily, freeing up human staff for higher-level tasks.

The Digital Eye (Morphology & AI): How platforms like CellaVision and Scopio turn glass slides into AI-analyzed data.
◦ Peripheral Blood: AI pre-classifies cells with 85-98% concordance to manual microscopy, prioritizing blasts and abnormal cells for expert review to improve efficiency.
◦ Bone Marrow: Deep learning isn't just counting cells; it's accurately quantifying reticulin fibrosis and identifying leukemia subtypes with human-level performance.

Flow Cytometry Gets an Upgrade: High-dimensional flow cytometry data meets deep learning. AI models are now achieving expert-level performance in classifying mature B-cell neoplasms and accurately distinguishing acute leukemias from non-leukemic samples.

The Molecular Frontier: AI is making sense of complex genomic datasets. We discuss breakthroughs like the MARLIN neural network, which achieves rapid epigenomic classification of acute leukemia in under two hours, and how AI assists in tracking measurable residual disease (MRD) longitudinally.

The Economics of Automation: Digital pathology is a smart financial investment. We review projections showing potential savings of $18 million over five years for integrated health systems, driven by improved efficiency, higher throughput, and fewer diagnostic errors.

Key Takeaway: The integration of artificial intelligence and robotics is not meant to replace hematopathologists; rather, these technologies serve as essential scaling tools designed to absorb grueling physical labor and routine analytical tasks. By building a workflow where machines handle the sorting, scanning, and initial pattern recognition, experts can focus their time on final diagnostic synthesis—ultimately delivering faster, more precise patient care.

Get the "Digital Pathology 101" FREE E-book and join us!

Imagine having a multi-million dollar quantum computer sitting right on your desk.


Sounds pretty nice, honestly.


Right. But to actually get it to run, you were forced to feed it data using paper punch cards.


Yeah. That completely defeats the purpose.


Exactly. And if you were working in hematopathology or really any clinical lab right now, I mean, that is exactly what your daily workflow probably feels like.


Oh, without a doubt. It's a massive disconnect.


So, welcome to the Digital Pathology Podcast. We are jumping into a journal club style deep dive today.


And this is specifically for all you trailblazers out there who are, you know, navigating this wild frontier of laboratory medicine.


Yeah, we are breaking down a massive, just incredibly comprehensive review. It was published in March 2026 in the MDPI journal Diagnostics,


right?


And it's titled Molecular Pathology, Artificial Intelligence, and New Technologies in Hematologic Diagnostics. It was put together by Alnoor, Ohgami, and just a huge team of colleagues.


It is a phenomenal synthesis. I mean, what they accomplished here is taking AI and lab automation out of the realm of like futuristic pilot programs.


This is where it feels like it's been stuck forever.


Exactly. And they prove that these are mature deployable tools right now. Tools that are actively solving that massive disconnect you just mentioned.


So, let's talk about that disconnect. Yeah.


Because it is uh it's the core anxiety for almost every trailblazer listening today.


Oh, for sure.


We are all trying to practice medicine under these incredibly complex modern diagnostic frameworks, specifically the World Health Organization's fifth edition, the WHO-HAEM5.


Right. And the International Consensus Classification, the ICC.


Exactly. And these frameworks are brilliant for patient care. Yeah.


But they demand just an unprecedented level of genomic and molecular data.


Yeah. To properly classify a blood cancer and figure out how aggressive it is, the diagnostic requirements have completely leaped into the 21st century. We are no longer just looking at the shape of a cell. We're categorizing diseases based on highly specific recurrent genomic abnormalities,


which means pathologists are essentially being asked to synthesize massive volumes of really heterogeneous data.


Right. You have morphology, you have flow cytometry, targeted sequencing,


and you have to integrate all of it under tighter time constraints than ever before.


Exactly.


But then, and here's the punch card problem. You walk away from your high-res screen. You go down into the physical laboratory, and the reality looks like it did in, I don't know, 1985.


Yeah, it really does.


You still have these brilliant, highly trained technologists manually operating microtomes, literally turning a wheel by hand to shave off microscopic slices of tissue,


floating those slices in water baths,


manually preparing smears. People are physically walking plastic racks of glass slides and blood tubes down the hall from one bench to another,


which is I mean that is the fundamental bottleneck we have to address before we can even talk about deploying advanced AI algorithms.


Right. Because the AI needs the data to be perfect.


Exactly. Before you can have a smart laboratory, you have to fix the physical handling of the specimens. AI requires consistent standardized digital inputs.


So if your physical prep is chaotic


or heavily dependent on, you know, which technician happens to be on shift that day, your digital output is going to be incredibly noisy.


I want to push on that a bit actually.


Why did anatomic pathology and hematopathology get left so far behind?


Well,


because if you look at core chemistry labs, they adopted total laboratory automation decades ago. They have those cool robotic tracks moving samples around.


Why are we still carrying glass slides by hand?


That's a great question, and it comes down to the substrate. The substrate is entirely different.


A core chemistry lab is dealing mostly with uniform vacuum tubes filled with fluid. A machine can easily spin that down, sample it, move it.


It's highly predictable.


Exactly. But hematopathology deals with bone marrow biopsies, fragile tissue cores, and incredibly variable peripheral blood smears.


Oh yeah,


it is physically messier. It is incredibly heterogeneous and just exponentially harder to standardize.


But the paper is saying the technology has finally caught up to that messiness.


It has. Yes.


So, how do we actually fix this punch card problem?


Because if the substrate is that messy, we can't just like slap a conveyor belt in the middle of the room and call it a day.


No, you don't. The transition actually starts with a very practical on-ramp. You start with automated liquid handlers.


Okay, benchtop systems,


right? Systems that do the aliquoting, the reagent addition, the PCR setup. They are relatively low cost. They just sit on an existing bench


and the laboratory staff can program them without needing some software engineer on site.


Yeah, exactly.


It's essentially automating the repetitive pipetting work that just destroys technologists' thumbs over a 20-year career


and that reduces fatigue while dramatically improving precision. But the real game-changer, the huge leap, happens when we step up to the collaborative robots.


The cobots.


The cobots. Yeah.


Let's look at the data coming out of Copenhagen University Hospital in Gentofte on this.


Oh, this is a great example.


Because they brought in two UR5 collaborative robotic arms for their core hematology lab. And these aren't just moving things blindly, are they?


No, they use advanced vision systems to actually pick up blood tubes from a bulk conveyor.


Wow.


Yeah. They position them to read the barcodes, detect the specific cap colors, sort them into the correct analytical racks, and then load the analyzers.


And what's the volume like?


They are processing 3,000 samples a day.


3,000.


And the critical metric here, they are delivering over 90% of their results within 1 hour, even after taking on a massive increase in testing volume.


And they manage that massive volume spike without adding a single extra staff member, right?


Not one. That is exactly what clearing a pre-analytic bottleneck looks like in the real world.
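As a side note, that "over 90% within one hour" figure is just a percentile computed over per-sample turnaround times. Here is a minimal sketch; the timing distribution below is simulated and purely illustrative, not the hospital's real data.

```python
# Sketch of the turnaround-time (TAT) metric: the share of results
# reported within 60 minutes. The gamma distribution and its parameters
# are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)
tat_minutes = rng.gamma(shape=4.0, scale=8.0, size=3000)  # ~3,000 samples/day

within_1h = (tat_minutes <= 60).mean()
print(f"{within_1h:.1%} of results delivered within 1 hour")
```

The same one-liner works against real timestamps: subtract receipt time from result time and take the fraction under the target.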


But it's not just blood tubes. These robotic handlers are being integrated into high throughput slide scanning, too. The review details the Pramana Spectral HT cluster.


The scale of that system is just hard to wrap your head around.


It really is. It's a cluster of four single slide line scanners arranged around one central robotic arm. Right.


Yep. The arm is continuously loading and unloading slides based on real-time scanning status. At Caris Life Sciences, they integrated these systems into a laboratory that processes 1.5 million slides a year.


Wait, 1.5 million fragile glass slides?


Yeah.


Okay. So, the immediate fear when a trailblazer hears about an industrial robotic arm moving millions of glass slides is that it's going to crush them.


Oh, absolutely.


Or worse, that it's an assembly line takeover designed to just replace the technologist entirely.


And that anxiety is exactly why the industry uses the term cobot. These are fundamentally designed for collaboration,


not replacement.


Right. They have torque limited joints and haptic feedback. If a cobot arm accidentally bumps into a human technologist or if it senses too much resistance when gripping a slide, it instantly stops.


Oh wow.


Yeah. They are designed to safely occupy the exact same workspace as your staff.


So no chain link fences required around the machine.


None at all. And regarding job security, think about what the cobot is actually doing. It is absorbing the ergonomically destructive tasks


like capping and uncapping thousands of tubes manually,


which causes severe repetitive strain injuries, or loading slides into a scanner for 6 hours straight, which causes debilitating neck and back pain.


So, the cobot takes the physical grind.


Exactly. It elevates the human technologist to a supervisory role, managing the system, handling complex troubleshooting, and overseeing quality control.


Okay, so the cobots have essentially solved the physical traffic jam. They've perfectly prepared the samples and scanned the glass slides into a standardized digital format,


right?


But pixels on a screen are just dumb data. How does the lab actually process all that visual information without just, you know, shifting the bottleneck from the physical microscope to the computer monitor?


Well, that is where the software takes over. The transition from glass to pixels opens the door for digital morphology platforms


like the CellaVision systems.


Exactly. Systems like the CellaVision DM series and the newer full-field imaging platforms like Scopio's X100.


Yeah. The detail in the paper about full-field imaging capturing the entire smear, including the feathered edge. That really stood out to me.


This is so important.


For anyone listening who isn't physically smearing blood every day, why is that feathered edge so critical?


It really just comes down to physical fluid dynamics. When a technologist physically pushes a drop of blood across a glass slide to create a smear, the mechanical force pushes the largest, heaviest cells to the very end of the smear,


which is the feathered edge,


right? And in hematopathology, those large cells are often the leukemic blasts or the most abnormal cells.


Okay?


So, if your digital system only scans the thick center of the slide and misses that delicate feathered edge, you could literally miss the leukemia,


which is terrifying,


truly.


But the validation data on these new full-field systems is incredibly strong. The paper was showing concordance rates between 85 and 98% for major leukocyte populations when compared to a human doing traditional manual microscopy. And keep in mind, that is just peripheral blood. The review highlights how AI is being deployed in much more complex environments too, like bone marrow biopsies.


Oh right.
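As an aside, the "concordance" behind that 85 to 98% figure is essentially a per-class agreement rate between the AI pre-classification and the manual differential. A toy sketch; the cell counts below are made up:

```python
# Per-class concordance: for each manually assigned cell class, the
# fraction of those cells the AI labeled the same way.
from collections import Counter

def per_class_concordance(manual, ai):
    """manual, ai: parallel lists of cell-class labels for the same cells."""
    totals, agree = Counter(), Counter()
    for m, a in zip(manual, ai):
        totals[m] += 1
        if m == a:
            agree[m] += 1
    return {cls: agree[cls] / totals[cls] for cls in totals}

# Invented 100-cell differential: the AI mislabels 2 neutrophils.
manual = ["neutrophil"] * 90 + ["lymphocyte"] * 8 + ["blast"] * 2
ai     = ["neutrophil"] * 88 + ["lymphocyte"] * 10 + ["blast"] * 2

result = per_class_concordance(manual, ai)
print(result)  # neutrophil concordance is 88/90 here
```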


There is a collaboration between ARUP Laboratories and Pramana utilizing a deep learning model called DeepHeme. It was trained on tens of thousands of single-cell images and can classify over 20 different classes of marrow cells.


And it's not just counting cells either. It's analyzing the actual architecture of the tissue.


Yes. Exactly.


Like reticulin fibrosis in a bone marrow biopsy. Historically, a pathologist looks at a silver stain through a microscope and gives it a semi-quantitative grade,


right? You look at the dark intersecting lines and say, "Um, that looks like an MF1 or maybe an MF2,"


which is highly subjective and human eyes get fatigued.


Two different pathologists might give you two different grades on the exact same slide,


which is a huge problem when you are trying to track disease progression over time,


right?


The machine learning models discussed in the review shift this entire paradigm. They provide a continuous indexing of fibrosis.


So it's not just eyeballing it anymore.


No, the AI is calculating the exact pixel level density and coverage of those silver stained fibers. It provides a highly reproducible quantitative metric that extracts granular data a human simply cannot consistently measure.
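At its simplest, that kind of continuous index boils down to the fraction of fiber-positive pixels in a binarized silver stain. This is our own illustrative sketch, not the paper's actual pipeline, and the toy image and threshold are invented:

```python
# Continuous fibrosis index: fraction of pixels darker than a threshold
# in a grayscale reticulin (silver) stain, instead of a subjective
# MF0-MF3 grade. Real pipelines add stain normalization and tissue masks.
import numpy as np

def fibrosis_index(stain, threshold):
    """stain: 2D grayscale array (low values = dark fibers).
    Returns the fraction of fiber-positive pixels."""
    fiber_mask = stain < threshold
    return fiber_mask.mean()

# Toy 100x100 field where roughly 12% of pixels are dark fibers.
rng = np.random.default_rng(0)
img = np.full((100, 100), 240)
fibers = rng.random((100, 100)) < 0.12
img[fibers] = 30

print(round(fibrosis_index(img, threshold=128), 3))
```

Because the output is a number rather than a grade, two measurements months apart become directly comparable for tracking progression.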


It really feels like the AI is acting as the ultimate sous chef in a high-end restaurant.


Oh, I like that analogy.


Yeah, think about it. It gets to the kitchen at 4:00 a.m. It chops all the vegetables, preps all the base sauces, and completely organizes the station.


Right.


It pre-classifies all those marrow cells. It does the tedious pixel counting for the fibrosis and it flags the anomalies.


But the pathologist is still the executive chef.


Yes.


You walk in, taste the dish, synthesize the patient's clinical history, and sign off on the final diagnosis.


That is the perfect way to look at it, especially regarding workflow efficiency. When the AI performs that pre-classification, it actively prioritizes the abnormal cells. It bubbles them to the top.


Exactly. Think about a technologist spending 10 minutes manually clicking a counter for 100 perfectly normal neutrophils just to find the one reactive lymphocyte.


It's exhausting.


The AI does that instantly and presents the clinically significant findings right at the top of the screen. So review time plummets because you are only spending your highly compensated time looking at what is diagnostically critical.


But morphology, what the cells actually look like, is only the visual layer. To complete a diagnosis under those massive WHO frameworks we talked about earlier, labs rely heavily on deep, nonvisual data,


right? Things like multiparameter flow cytometry.


And because flow cytometry generates inherently structured multi-dimensional data, it is the absolute perfect playground for deep learning, isn't it?


It really is.


Let's break down how we currently handle flow cytometry versus how the AI does it, because the contrast is wild.


It is. So when a human looks at flow cytometry data, we use a process called manual gating. We draw two-dimensional boxes on a computer screen to isolate specific cell populations based on two markers at a time. Right?


It is iterative. It takes a lot of time, and mathematically it forces us to ignore the higher-dimensional relationships between all the other markers.


It's like trying to understand a massive symphony by only listening to the violins and the cellos in isolation, completely ignoring the brass and the percussion.


Exactly. What these deep learning models do is listen to the entire orchestra simultaneously. They ingest the full event-level data across all dimensions at once.


Wow.


They pick up on harmonic relationships between cells that a human drawing 2D boxes would completely miss.
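To make the contrast concrete, here is a minimal sketch. The marker panel, thresholds, and data are all invented for illustration: manual gating slices two dimensions at a time, while a deep model consumes the full event-by-marker matrix unreduced.

```python
# Manual 2D gating vs. full-dimensional input for a deep model.
import numpy as np

rng = np.random.default_rng(1)
markers = ["CD45", "SSC", "CD19", "CD5", "CD10", "CD34"]  # illustrative panel
events = rng.normal(size=(10_000, len(markers)))  # one row per cell event

# Manual gating: a rectangular box drawn on CD45 vs SSC, two markers only.
cd45, ssc = events[:, 0], events[:, 1]
gate = (cd45 > 0.5) & (ssc < 1.0)
print("events inside the 2D gate:", int(gate.sum()))

# A deep learning model instead ingests all six dimensions per event at once.
full_input = events  # shape (10000, 6), fed to the network unreduced
print("model input shape:", full_input.shape)
```

The point of the sketch is the shape: the gate discards four of the six coordinates, while the model sees the whole matrix and can exploit correlations across every marker pair.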


And the results speak for themselves. The review cites studies where deep learning models classified mature B-cell neoplasms and acute leukemias at a performance level matching expert hematopathologists


purely by analyzing that high-dimensional space.


And then the technology pushes even deeper into the molecular realm. When we are dealing with acute leukemias, we are dealing with clinical emergencies. But our current testing paradigm is agonizingly slow,


painfully slow. Flow cytometry data comes back today. Standard karyotyping takes a few days. Next generation sequencing might take a week or more.


And that diagnostic latency is the enemy of good patient care.


Exactly. It delays the initiation of optimal targeted therapy.


Which brings us to a part of the review that genuinely blew my mind. They discuss a neural network model called MARLIN.


Yes. It performs rapid epigenomic classification of acute leukemias in under two hours.


Incredible, right?


Let me stop you right there because epigenomic classification in under two hours sounds physically impossible based on how traditional sequencing works.


How does MARLIN actually pull that off?


It comes down to the hardware software synergy. Traditional sequencing takes DNA, chops it into tiny little fragments, reads them, and then a computer has to painstakingly puzzle those millions of fragments back together over several days,


right?


MARLIN uses long-read sequencing technology, specifically nanopore sequencing. It pulls massive unbroken strands of DNA through a microscopic pore. Okay.


As the DNA passes through, it changes the electrical current, and the AI can instantly read not just the genetic code, but the epigenetic tags, the methylation marks, those on-and-off switches attached to the genes, and it does it in real time.


So, for a patient sitting in a hospital bed waiting to find out what kind of leukemia they have, we move from waiting a week for the puzzle to be put together to having a complete epigenomic profile on day one.


It eliminates the diagnostic latency entirely. The AI scales and integrates that complex data instantly.


That's huge.


And the review also points out how AI is transforming how we track measurable residual disease or MRD.


Right. The incredibly difficult task of trying to find one single leukemia cell hidden among tens of thousands of normal cells after a patient has gone through chemo.


Exactly. Molecular MRD assays generate highly complex longitudinal data. And one of the biggest challenges for a pathologist is distinguishing true residual leukemia from a phenomenon called clonal hematopoiesis.


Which is what exactly?


Clonal hematopoiesis is when aging stem cells acquire genetic mutations, but they aren't actually acting like leukemia. They're just noisy background interference.


So how does the AI tell the difference between background noise and a ticking time bomb?


It models the clonal dynamics over time. The AI analyzes the growth curves and the specific patterns of mutation acquisition across multiple test points.


Ah,


it can mathematically separate the slow benign drift of aging cells from the aggressive exponential signature of relapsing leukemia.
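One simple way to picture that separation, our own illustration rather than the paper's model, is fitting a growth rate to serial variant allele fraction (VAF) measurements: benign clonal drift gives a near-flat slope on a log scale, relapse gives a steep one. All numbers below are invented.

```python
# Fit the slope of log(VAF) vs time: a crude per-day growth rate that
# separates slow clonal-hematopoiesis drift from exponential relapse.
import math

def growth_rate(days, vafs):
    """Least-squares slope of log(VAF) against time, in units of 1/day."""
    logs = [math.log(v) for v in vafs]
    n = len(days)
    mean_t = sum(days) / n
    mean_y = sum(logs) / n
    num = sum((t - mean_t) * (y - mean_y) for t, y in zip(days, logs))
    den = sum((t - mean_t) ** 2 for t in days)
    return num / den

# Invented quarterly MRD draws for two hypothetical patients.
chip = growth_rate([0, 90, 180, 270], [0.020, 0.021, 0.022, 0.023])    # drift
relapse = growth_rate([0, 90, 180, 270], [0.001, 0.004, 0.016, 0.064])  # 4x/quarter

print(f"CHIP-like slope:    {chip:.5f} per day")
print(f"Relapse-like slope: {relapse:.5f} per day")
```

The relapse clone here quadruples every 90 days, so its fitted slope is roughly thirty times the drifting clone's, the kind of mathematical daylight the AI exploits across many mutations at once.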


Okay, here's where I have to play the skeptic a bit


because the review introduces a concept called multimodal integration. That sounds a little bit like reading tea leaves.


Oh, I know what you're going to say.


The authors cite studies showing that AI models are now capable of predicting specific molecular alterations, like an NPM1 or FLT3 mutation in acute myeloid leukemia, purely by analyzing the morphologic patterns on a standard digitally scanned H&E slide.


Yes.


Hold on. You're telling me the AI can read DNA mutations just by looking at the pink and blue shapes of a standard smear. How is that mechanically possible?


It sounds like magic, but it is actually just architectural fallout.


Architectural fallout.


Yeah. A genetic mutation fundamentally changes how a cell behaves, right? That behavioral change alters the cell's internal structure. It changes how the cell clumps with its neighbors. It changes the physical texture of the chromatin inside the nucleus.


Oh, so the DNA mutation leaves a physical footprint.


Yes. The human eye cannot reliably see or quantify those micro patterns across thousands of cells, but the AI can.


Wow.


It maps those tiny visual breadcrumbs back to the exact genomic mutation that caused them. It literally bridges the microscopic scale and the genomic scale. The promise of this technology is incredible. But trailblazers, we have to ground this in reality.


Always


healthcare administrators and laboratory directors operate in a world of extremely tight budgets, massive overhead, and strict return on investment requirements.


You can't just walk into the C-suite and say, "I want a robot because it reads architectural fallout."


Exactly. Do the economics actually justify overhauling a laboratory's entire physical and digital workflow?


Well, the authors dedicated a significant portion of the review to the translational realities. And the economic data is highly encouraging.


Good.


They cite a financial projection from a large integrated health system that estimated full digital pathology implementation could save them $18 million over 5 years.


$18 million is not pocket change.


Where are those savings actually coming from?


A massive portion comes from improved workflow efficiency and a reduction in costly diagnostic errors. But it is also the physical logistics


like storage.


Think about the costs associated with physically storing millions of glass slides, the climate controlled square footage, the staff time required to retrieve archive slides.


Oh, true.


Think about the courier fees and shipping insurance for sending fragile glass slides across the country for expert consults. Digitization eliminates all of that overhead.


The review also highlights a study of eight European laboratories that implemented digital pathology. They found that the investments reached a positive net present value within 7 years


and they were actually cash flow positive in just about 3 years.
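Those two claims, NPV-positive within 7 years and cash-flow-positive in about 3, are standard capital-budgeting arithmetic. A toy sketch; every dollar figure below is invented for illustration:

```python
# Net present value and undiscounted payback period for a digital
# pathology investment. All cash flows are made-up example numbers.
def npv(rate, cashflows):
    """NPV of yearly cash flows; cashflows[0] is the year-0 outlay."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cashflows))

def payback_year(cashflows):
    """First year where cumulative (undiscounted) cash flow turns positive."""
    total = 0.0
    for year, cf in enumerate(cashflows):
        total += cf
        if total > 0:
            return year
    return None

# Year 0: scanner and cobot outlay; years 1-7: net annual savings.
flows = [-4_000_000] + [1_500_000] * 7
print("payback year:", payback_year(flows))
print("NPV at 8% over 7 years:", round(npv(0.08, flows)))
```

With these illustrative numbers the project pays back in year 3 and stays NPV-positive even after discounting, which is the shape of argument a lab director can actually take to a CFO.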


If you are a lab director trying to build a pitch deck for your hospital's CFO to buy a fleet of slide scanners and cobots, that "cash flow positive in three years" metric is the ultimate mic drop. It changes the conversation entirely.


It really does. And we also have to factor in operational costs. There is a study from an anatomic pathology laboratory in Malaysia noted in the review, which found that salaries accounted for over 40% of their total laboratory expenses.


Which makes sense. Your biggest cost is highly specialized, brilliant human labor.


Precisely. When human labor is your primary expense, any automation that absorbs the repetitive manual work and allows your staff to operate at the top of their license pays for itself very quickly.


Absolutely.


But the authors are incredibly clear on the human element here. The goal of all this technology is not to build an autonomous AI doctor that issues pathology reports in a vacuum.


Right? Successful translation requires disease-specific validation and heavy expert oversight.


The ultimate vision is a highly integrated hybrid workflow. Cobots handle the physical prep and clear the pre-analytic bottleneck. AI modules pre-classify the multimodal data and highlight the anomalies, and human experts synthesize that data, oversee the final interpretation, and generate the clinical report.


What a phenomenal road map for the field. Let's recap the journey this source material just took us on.


Let's do it.


We started by looking at the massive data demands placed on modern hematopathology by the WHO and ICC frameworks.


Right?


We explored how collaborative robots are stepping in to handle the messy physical reality of the lab, sorting thousands of tubes and perfectly scanning millions of slides without fatigue.


We saw how digital morphology and AI act as our ultimate sous chefs, mapping the feathered edge and quantifying fibrosis.


We broke down the mechanics of deep learning in flow cytometry, and how MARLIN's long-read sequencing delivers epigenomic classifications in under 2 hours.


And finally, we saw the hard economic data proving that this digital revolution makes financial sense.


It fundamentally proved that the tools to modernize the laboratory are here today. So to all the trailblazers listening, take a hard look at your own lab this week.


Ask yourself the hard questions.


Are you still operating with a punch card physical workflow while your clinicians are demanding quantum level diagnostic expectations? It might be time to start advocating for that physical reboot.


And as you think about the future of your lab, I want to leave you with a final forward-looking thought inspired by that multimodal AI research we discussed.


Lay it on us.


If AI becomes consistently highly accurate at predicting specific genetic mutations purely by analyzing the visual arrangement of cells on a digitally scanned slide.


Yeah.


Could the current gold standard of expensive, time-consuming next generation sequencing eventually be downgraded?


Oh wow.


Could NGS become just a secondary confirmation tool, used only when the morphology AI flags an ambiguity? It would completely upend the economics and the speed of molecular diagnostics.


A world where the glass slide holds the genomic answers if you just have the right algorithm looking at it. That is definitely something to think about.


It really is.


Thank you so much for joining us for this deep dive on the digital pathology podcast. We hope this breakdown of the latest MDPI diagnostics review gave you the insights you need to keep pushing the boundaries in your own practice.


Stay curious out there,


keep blazing trails, and we will see you next time.