

 


A New Frontier in the Operating Room: Real-Time Histology for Robotic Surgery

Steve Bell on real time histology in surgical robotics

Imagine this…


A surgeon sits at the next upgraded surgical robot (just sayin'). They are operating on a cancer patient, flicking between modes of advanced vision. With AI assistance, the robot has helped identify the area of the tumour, and surgeon and "assistive" robot zoom in on it. The surgeon starts the tumour resection.


Imagine the surgeon is in the middle of the tumour resection, trying to decide: should I peel off another millimetre here? Should I stop? The ideal would be to sample the tissue and know whether all the tumour has been removed. Take too little and you risk recurrence. Take too much and you can severely compromise the patient's function. This is the fine balance that precision surgery must strike. But to strike it, you need histology to tell you where you are.


Traditional histology offers an answer, but only after tens of minutes (or more), and only if you can get fresh frozen sections. That delay means wasted anaesthesia time, uncertainty, possible under-resection (leaving tumour behind), or over-resection (damaging normal tissue). In robotics the arms are precise, but they aren't smart. They still wait on the surgeon's decision. (Today...)


But with real-time histology feedback, the robotic system becomes part of a closed loop: "you interrogate, you see, you act" - instantly, rather than pulling out, waiting, and coming back.


In short: real-time histology is the missing microscope inside the OR, there to guide the surgeon's or the robot's hand.


For many cancers, margins are everything. A positive margin means recurrence, more surgery, and worse outcomes. Yet we still rely on techniques born before the robotic era: frozen sections, sending slides out, waiting for pathology. That friction is a drag on innovation. Eliminating it brings better OR flow, more confidence, and ultimately, for the patient, safety.


A little background

For those not familiar with how histology works, here's some background:

  • The gold standard is still formalin fixation + paraffin embedding + sectioning + staining with H&E (Haematoxylin & Eosin) and immunostains. This gives exquisite detail, but it takes hours (or days) and is incompatible with intraoperative decisions.

  • To get faster, we use frozen sections - cryostat cutting, quick staining, pathologist review - all intraoperative. But that still eats ~20–40 minutes, needs a dedicated pathology lab and staff, and may distort tissue. (Not everyone has this available for all procedures.)

  • Some centres digitally scan whole slides and send them to remote pathologists (telepathology), but you still wait on cutting and staining.

  • The new wave is slide-free, optical, or chemical methods: you image or analyse the tissue (fresh or in vivo) and get a diagnostic read in seconds to minutes - no glass slides, no embedding, (and importantly) no wait.


H&E staining

These new methods can be grouped roughly by how they “see” tissue:

  1. Optical imaging that gives cellular contrast (confocal, OCT, Raman, fluorescence)

  2. Chemical fingerprinting (mass spectrometry, molecular probes)

  3. Hybrid or augmented systems that combine modalities


And then there is in vivo imaging (you don’t even remove tissue) vs ex vivo imaging of specimens or surfaces (margins, biopsy cores) while in the OR.


State of the art


Optical and slide-free histology

Stimulated Raman Histology (SRH)

This is molecular vibrational imaging. You shine lasers, get contrast from chemical bonds, and render images that look like H&E. The big player is Invenio Imaging with their NIO platform. They already deploy in neurosurgery; their AI tools are growing. You can go from fresh tissue to a digital "histology slide" in under 3 minutes. No cutting, no staining - yet pathologic detail.

Because there's no slide, you preserve the specimen for downstream tests. That's a huge win, as many histology techniques are destructive to the specimen.


Invenio Imaging for histology

Confocal / fluorescence surface scanning

Companies such as SamanTree (Histolog) use fluorescence contrast plus confocal scanning to image the exposed surface of excised specimens. The idea: before the specimen leaves the OR, scan all surfaces and instantly flag suspicious spots. In breast surgery, studies show a reduction in re-operation rates. This is excellent when you don't need penetration depth but care about lateral margins.

Confocal microscopy is a way of imaging tissue that gives you crisp, high-resolution optical "slices" without actually cutting anything. Here's how it works in simple terms: a focused laser scans across the tissue, and only light from a very thin focal plane is detected - everything above and below that plane is filtered out. That means you can see real cellular detail at specific depths, layer by layer, almost like a virtual biopsy. In surgery, confocal endomicroscopy uses tiny probes to bring that same principle into the patient, so you can view living tissue architecture (cells, glands, fibres) in real time, right there in the OR.
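The depth-filtering idea behind the pinhole can be sketched as a toy calculation. Everything here (the function, the layer depths, the rejection figure) is invented for illustration - it models the principle, not any real instrument:

```python
# Toy illustration of confocal optical sectioning: the pinhole passes light
# from the focal plane and rejects most out-of-focus light from other layers.
# Purely conceptual - real confocal imaging involves point scanning and optics.

def confocal_read(tissue_layers, focal_depth, rejection=0.95):
    """Return the detected signal: the focal layer plus a small out-of-focus leak.

    tissue_layers: dict mapping depth (um) -> fluorescence intensity
    focal_depth:   the depth (um) the objective is focused on
    rejection:     fraction of out-of-plane light the pinhole blocks
    """
    signal = 0.0
    for depth, intensity in tissue_layers.items():
        if depth == focal_depth:
            signal += intensity                    # in-focus light passes
        else:
            signal += intensity * (1 - rejection)  # out-of-focus leak
    return signal

layers = {0: 5.0, 20: 50.0, 40: 5.0}            # bright structure at 20 um
print(confocal_read(layers, focal_depth=20))     # dominated by the 20 um layer
```

Sweeping `focal_depth` over the layers is the "layer by layer, virtual biopsy" behaviour described above.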


SamanTree Histolog Scanner

Wide-field OCT + AI for margins

Optical coherence tomography (OCT) gives you micrometre-scale cross-sectional "optical slices." Perimeter Medical is pushing this hard for breast lumpectomy margins. Their S-Series is in use now; their B-Series (with AI for cancer detection) is in pivotal trials in 2025. The advance here is coverage: you can scan a whole surface, get maps of likely tumour, and guide your resection right then. Not one field, many fields.


Perimeter S series OCT

MUSE / UV surface excitation

Microscopy with UV surface excitation (MUSE) is an elegant slide-free method. You excite the tissue surface with UV, get contrast that mimics H&E, and read directly. It doesn't damage deeper tissue and gives rapid results. This is more at the translational/early-commercial stage, but promising. Some groups also offer "optical sectioning" variants (InstaPath type) that give you shallow-depth images almost as if you had cut a thin slice.


MUSE UV imaging

Lechpammer, M.; Todd, A.; Tang, V.; Morningstar, T.; Borowsky, A.; Shahlaie, K.; Kintner, J.A.; McPherson, J.D.; Bishop, J.W.; Fereidouni, F.; et al. Neuropathological Applications of Microscopy with Ultraviolet Surface Excitation (MUSE): A Concordance Study of Human Primary and Metastatic Brain Tumors. Brain Sci. 2024, 14, 108. https://doi.org/10.3390/brainsci14010108


In vivo (or in situ) microscopy

Probe confocal / laser endomicroscopy

ZEISS's CONVIVO is a surgical confocal probe used in neurosurgery. You bring the probe to the brain, see cellular patterns live, and make judgements on tumour infiltration. This is already FDA cleared. In GI, lung, head and neck, and other fields, systems like Cellvizio (Mauna Kea) let you see the mucosal or submucosal layer in vivo with microscale resolution. You inject fluorescent dyes, bring the probe, and interpret what you see. Tiny field of view, but immediate.


Zeiss Convivo In Vivo pathology


As a note: These tools won’t replace resection; they guide you where to cut or check margins on the go.


Chemical / spectral fingerprinting

REIMS / iKnife (electrosurgical mass spec)

The "iKnife" concept: hook your electrosurgical device to a mass spectrometer. When you cut, the heated aerosol is sucked into the mass spec, analysed in seconds, and the system classifies the tissue type (tumour vs. normal). It's very sexy. There are strong academic data across cancer types. But bringing that into OR workflow (mass spec footprint, calibration, robustness) is nontrivial. Still, as we see robots getting "smoke extraction" systems, I do start to wonder how small we can get the detectors. Hmmm, maybe some wild fantasy - but…
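That "classify in seconds" step is, at heart, pattern matching on a spectrum. A toy sketch of the idea - the peaks, centroids, and labels are entirely made up, and real REIMS classifiers use far richer spectra and statistical models:

```python
# Conceptual sketch of spectral tissue classification: match a measured
# mass spectrum against reference centroids for "tumour" vs "normal".
# All intensity values are invented for illustration only.

import math

REFERENCE = {  # mean normalised intensities at a few m/z peaks (invented)
    "tumour": [0.8, 0.1, 0.6],
    "normal": [0.2, 0.7, 0.3],
}

def classify_spectrum(spectrum):
    """Nearest-centroid classification by Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(REFERENCE, key=lambda label: dist(spectrum, REFERENCE[label]))

print(classify_spectrum([0.75, 0.15, 0.55]))  # tumour
```

The speed comes from the fact that, once reference profiles exist, a single read is just a distance computation - which is why a seconds-scale intraoperative answer is plausible.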



MasSpec Pen / molecular touch

This is a handheld pen: touch the tissue surface for a few seconds, the device sucks up molecules, runs mass spec, and returns a classification (tumour vs healthy). Early results in pancreatic, ovarian, and other cancers are favourable. The challenge is fitting it into real-world sterile workflows, speed, and integration with imaging or robot arms. But imagine if a robotic end effector gets one of these pens and we make it work. It starts to feed even more information to the brains of the robot… and, well, we all see where this is going, right?



What all this could mean for robotic surgery

My wow moment is this: all these modalities become yet more sensors in a robotic system. You don't just see anatomy (via camera, fluorescence, depth) - the surgeon (and the robot's AI) see actual pathology. The robot (or the surgeon through the robot) can then adapt mid-course. Initially I imagine you will get visual guides of "cut to this line" - but then, with "auto resect", the robot could do a better job of following the tumour margin boundary as set by the pathology sensor, in combination with the millions of cases of outcome data from the robotic hive that has trained it. You get where this could go? The ultimate robotic precision resections with real-time margins.


Imaginary image of a da Vinci MasSpec Pen
Don't get excited - this is an imaginary image

Picture this workflow:

  1. The robotic arm holds a tumour; you ask (yourself or maybe the robot), “is that margin clean over there?”

  2. You deploy a confocal probe, OCT beam, or mass spec pen to that area. The robot brings it in and the sensors kick in.

  3. You immediately get a readout (normal vs tumour).

  4. If it's tumour, the robot advances by 1 mm and cuts.

  5. If normal, it stops.

  6. Rinse and repeat.
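The loop above can be sketched as a minimal controller. Everything here is hypothetical - the function names, the 1 mm step, the string labels - it illustrates the closed-loop logic, not any real or safe surgical system:

```python
# Illustrative sketch of a closed-loop margin-resection controller.
# All names (resect_to_clean_margin, the "tumour"/"normal" labels) are
# hypothetical; no real robot or sensor API is implied.

STEP_MM = 1.0      # resection advance per positive margin read
MAX_STEPS = 20     # hard safety limit on total advance

def resect_to_clean_margin(sensor_reads, advance):
    """Advance in 1 mm steps while the margin sensor reports tumour.

    sensor_reads: iterator yielding "tumour" or "normal" per interrogation
    advance:      callback that moves/cuts by the given millimetres
    Returns total millimetres resected before a clean margin was seen.
    """
    total_mm = 0.0
    for step, read in enumerate(sensor_reads):
        if read == "normal":       # clean margin: stop
            break
        if step >= MAX_STEPS:      # safety stop even if still "tumour"
            raise RuntimeError("margin never cleared within safety limit")
        advance(STEP_MM)           # tumour: advance 1 mm and cut
        total_mm += STEP_MM
    return total_mm

# Toy run: three tumour reads, then a clean margin
reads = iter(["tumour", "tumour", "tumour", "normal"])
print(resect_to_clean_margin(reads, lambda mm: None))  # 3.0
```

The interesting design question isn't the loop itself but the `advance` callback: that is where force feedback, segmentation, and the surgeon's veto would live in any real system.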


You’ve created a closed loop intraoperative histology + actuation system.

Combine that with advanced computer vision, tissue segmentation, augmented reality, and force feedback, and you get robots that don’t just cut on preoperative plans but adjust on the fly to tissue microstructure.


Also, many of these modalities integrate naturally into robotic instruments. The same endoscope, the same boom, can carry OCT or confocal modules. This reduces instrument swaps. Robotics companies are already shipping multimodal vision systems (white light, NBI, fluorescence, 3D stereo). In my mind the next generation is optical histology paired with those modes.


Because robotics is inherently modular and sensor-rich, I think it’s the perfect host for real-time histology. The day you can “see cancer at the cellular level through the robot’s optics” you shift from gross resection to precision oncology.


These are just speculations and musings by the author for education purposes only.


Copyright & Fair Use Notice

All content on this website is the copyright of Steve Bell S.R.L. and remains the proprietary property of the authors. This site utilizes third-party media in good faith for educational commentary and transformative analysis under Fair Use provisions.

1. Mandatory Notice and Takedown: I respect intellectual property. If you believe any material is used outside of fair use, your sole and exclusive remedy is to contact steve@howtostartupinmedtech.com for immediate removal. I will act promptly on any reasonable request.

2. Prohibition of Automated Enforcement: I explicitly prohibit the use of automated "copyright enforcement" bots (including but not limited to PicRights, Copytrack, Pixsy, AP, or Reuters bots) to scrape this site for speculative invoicing.

3. GDPR & Contractual Agreement: By bypassing technical headers (robots.txt) or scraping this site, you (the "Scraper") enter into a binding legal contract with Steve Bell S.R.L. You acknowledge that this site contains my Personally Identifiable Information (PII) and personal likeness. As a result:

You are legally deemed a Data Processor under EU GDPR.

Any automated claim sent to Steve Bell S.R.L. or Steve Bell  must be accompanied by a full Article 15 Disclosure Report detailing exactly how my PII was processed and whether it was transferred outside the EEA.

Failure to provide this report or bypassing my technical blocks will be reported to the Italian Data Protection Authority (Garante per la protezione dei dati personali) as a breach of privacy law.

Anti-Scraping & Automated Enforcement Clause

Contractual Agreement: By bypassing technical headers (robots.txt) or scraping this site to generate a claim, you (the Scraper) explicitly enter into a binding contract with Steve Bell S.R.L. You agree to waive all automated "settlement" fees in favor of the Notice and Takedown protocol above.

Reinforced statement: GDPR & International Data Transfer: This site contains my personal likeness and identifiable data (PII). Any entity scraping this data becomes a Data Processor under EU law. Any automated claim must be accompanied by a full GDPR Article 15 report detailing where my PII is stored and confirming compliance with international data transfer regulations (EEA to non-EEA). Unauthorized processing will be reported to the Garante per la protezione dei dati personali.

Notice to scraping services: Please read the scraping policy 


©2023-2026 by How to Startup in MedTech 

Steve Bell S.R.L. (unipersonale) – Sede legale: Via Nalles 95, 00124 Roma (RM)  Italy

P.IVA / C.F. 18199441009  – REA RM- 1768831  – Cap. Soc. €10.000 i.v. – PEC: stevebellsrl@pec.it
