Surgical Robotics - Does It Make Sense?
- Steve Bell

- 2 days ago
- 5 min read

Oh, a witty little title to drag you into the blog. But what is this one all about? It's about the next phase of surgical and health robotics, where we dive into the sensor layer. I've dropped a video overview below with some thoughts and comments. But here I want to go a bit deeper into some of the specifics of where the sensor layer will be critical to robots.
Don't get me wrong - it's not that surgical robots today don't have various "sensors" in them. But I'm talking here about the next generation of sensors that form rich data inputs to help surgeons and clinicians make more informed decisions. It will provide insights beyond what they can "know today." And beyond that, it will give way more information to the coming AI systems (did you see NVIDIA's GTC 2026 presentation, by the way?) so that they can make better decisions based on multimodal data inputs - different types of data such as vision, pressure, touch, and temperature.
I’ve pulled this post out now because of a few recent events that have made me think “now is the time to talk about this.”
What's got me thinking?
First came JURA - the in-development imaging system by Intuitive. Imaging, in effect, is a rich "sensor" system: it uses imaging sensors to obtain data, then converts it and provides outputs and insights. The new JURA system is thought to include things like multispectral or hyperspectral imaging, for one. This is an array of camera sensors that can capture light well beyond what the human eye can see, then colorise the output and display it on screen to reveal things a human could not normally perceive, such as blood flow and more. It's often not thought of as a sensor layer - but it is part of the rich sensor data that can be brought into a system.
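To make that idea concrete, here is a minimal sketch of how a hyperspectral cube can be colorised into a false-colour image for display. Everything in it - the cube shape, the band wavelengths, and which bands map to which screen channels - is an illustrative assumption on my part, not a detail of JURA.

```python
# Illustrative sketch: render a hyperspectral cube as a false-colour RGB image.
# Cube shape, wavelengths, and band choices are assumptions, not JURA specifics.
import numpy as np

def false_colour(cube: np.ndarray, wavelengths: np.ndarray,
                 bands_nm=(850.0, 660.0, 540.0)) -> np.ndarray:
    """Map a (H, W, B) hyperspectral cube to an RGB image for display.

    Picks the spectral band nearest each requested wavelength. Here a
    near-infrared band is mapped to the red channel, so structures that
    reflect NIR light stand out on screen in a way the naked eye misses.
    """
    rgb = np.stack(
        [cube[:, :, int(np.argmin(np.abs(wavelengths - nm)))] for nm in bands_nm],
        axis=-1,
    ).astype(np.float64)
    # Normalise each channel independently to 0..1 for display.
    rgb -= rgb.min(axis=(0, 1), keepdims=True)
    rgb /= rgb.max(axis=(0, 1), keepdims=True) + 1e-9
    return rgb

# Example: a synthetic 64x64 cube with 32 bands between 450 and 950 nm.
wl = np.linspace(450, 950, 32)
cube = np.random.rand(64, 64, 32)
image = false_colour(cube, wl)   # (64, 64, 3), ready for imshow
```

The interesting part is the band selection: map an invisible band onto a visible channel and the display shows the clinician something their eyes never could.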

Next, I was at the amazing ORSI AI & Surgery Days and watched session after session about AI and the systems processing the data. I saw clinical application after clinical application. And more and more I heard: "The more input data we have, the better AI outputs we can get." I saw that many companies are already working out how to get rich data layers into the AI systems beyond vision - such as haptics, force, and pressure. And I started to see that for specific applications there were specific sensors, gaining rich new data sources they could feed into the AI models. Having vision, force, temperature, pressure and so on makes for a more complete "real world" sense that allows the complex NVIDIA foundation models to be used in different ways - and gives new information to users to make better pre-, peri- and post-operative care decisions.
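As a toy illustration of what "multimodal input" means in practice, here is a sketch of simple late fusion: each sensor stream is boiled down to a feature vector, and the vectors are concatenated into one input for a downstream model. The stream names, shapes, and statistics are invented for illustration - real pipelines are far more sophisticated.

```python
# Toy sketch of multimodal "late fusion": per-stream features are
# concatenated into one vector a downstream model can learn from.
# Stream names and sizes are invented for illustration.
import numpy as np

def summarise(stream: np.ndarray) -> np.ndarray:
    """Collapse a (T, C) time series into simple per-channel statistics."""
    return np.concatenate([stream.mean(axis=0), stream.std(axis=0),
                           stream.max(axis=0)])

streams = {
    "vision":      np.random.rand(120, 8),   # e.g. embeddings per video frame
    "force":       np.random.rand(500, 3),   # Fx, Fy, Fz at the instrument tip
    "pressure":    np.random.rand(500, 1),
    "temperature": np.random.rand(500, 1),
}

fused = np.concatenate([summarise(s) for s in streams.values()])
print(fused.shape)  # one feature vector combining all four modalities
```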
BREAKING NEWS: Sensome & Robocath
The final push was a meeting with Franz Bozsak from Sensome last week at the MD1010 meeting in Paris. He showed me an incredible technology where a micro sensor is embedded into a guide wire to make a new class of product: a sensory wire. Tiny, tiny, tiny - but able to throw off some very, very interesting data.


Watch my video - it has a section all about this breakthrough. With this type of sensor they are able to measure inside the clot for a stroke. They can enter the clot and see its makeup, its start and finish - and hence the length of the clot. It can help show whether you are in the right arterial branch for the clot, and where to place the suction catheter against the clot for better removal. The punchline was seeing the video (see my video) of this new sensor wire being used with Robocath's next-generation endoluminal robot. It adds a new dimension to navigating the robot - and gives the user data back. Think of what this will help with as the AI navigates, assesses, and removes the clot - as these robots get smarter based on this sort of sensor data.
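For a feel of how "start, finish - and hence length" might fall out of such a wire, here is a hypothetical sketch: as the wire advances, a scalar tissue reading is sampled at known positions, and the clot shows up as a sustained shift in that reading. The signal model and threshold are my assumptions for illustration, not Sensome's actual method.

```python
# Hypothetical sketch of reading clot extent off a sensing guidewire.
# The signal model and threshold are illustrative assumptions only.
import numpy as np

def clot_extent(position_mm: np.ndarray, signal: np.ndarray,
                threshold: float) -> tuple[float, float, float]:
    """Return (start_mm, end_mm, length_mm) of the region above threshold."""
    inside = signal > threshold
    if not inside.any():
        raise ValueError("no clot-like region detected")
    idx = np.flatnonzero(inside)
    start, end = position_mm[idx[0]], position_mm[idx[-1]]
    return start, end, end - start

# Synthetic pullback: baseline blood signal with a clot between 20 and 34 mm.
pos = np.linspace(0, 60, 600)
sig = 0.1 * np.random.rand(600)
sig[(pos > 20) & (pos < 34)] += 1.0
print(clot_extent(pos, sig, threshold=0.5))  # roughly (20.0, 34.0, 14.0)
```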
When I start to see real-world examples of this, I start to see the future of robotics and how the sensor layer adds a deeper and richer layer to the robot - bringing greater and greater value, and a bigger reason to abandon old manual practices. And it makes the AI even more powerful in the future. Instead of losing all that generational know-how and data, we collect it, analyse it and drag insights from it. Think of this for training in the future: you can "feel" how much force an expert applied, and you can benchmark yourself against them.
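Here is a toy sketch of that benchmarking idea: record an expert's force trace for a manoeuvre, then score a trainee's trace against it. The metric - RMSE after resampling both traces onto a common time base - is a simplistic stand-in for whatever a real training platform would actually use.

```python
# Toy sketch of force benchmarking against an expert trace.
# The RMSE metric is a simplistic stand-in for a real training score.
import numpy as np

def force_score(expert: np.ndarray, trainee: np.ndarray, n: int = 200) -> float:
    """RMSE between two force traces resampled onto a common time base."""
    t = np.linspace(0.0, 1.0, n)
    e = np.interp(t, np.linspace(0, 1, len(expert)), expert)
    s = np.interp(t, np.linspace(0, 1, len(trainee)), trainee)
    return float(np.sqrt(np.mean((e - s) ** 2)))

expert_trace = np.abs(np.sin(np.linspace(0, np.pi, 400)))  # newtons
trainee_trace = expert_trace[::2] * 1.3                    # presses 30% harder
print(f"deviation from expert: {force_score(expert_trace, trainee_trace):.3f} N")
```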
But it didn't stop there. I saw that Intuitive just bought Ruby Robotics - an interesting company that uses peri-operative imaging for biopsy assessments. Ideal for Ion, but it could also be introduced to da Vinci. This sensor-and-AI platform combination will be a powerful add-on to systems like Ion.

But we are going to go well beyond this. I've seen so many of these micro sensors coming that could change things like robotic instruments - giving them true haptic feedback: texture, density, wetness, temperature - all the things that surgeons' fingers sense automatically today. With a robot that's lost - and it's valuable information. Being able to use sensors to bring this back to the surgeons will be huge. And it will set robotic systems apart... just saying: those with a vast, sensor-rich data system versus those "dumb" robots.
Think about stroke robots that are navigating thin guide wires and catheters through tortuous vessels with varying calcifications, false aneurysms and narrowings. Today, with a catheter, the clinician has direct feel of the catheter against the tissue. They feel resistance and pressure; when they get to a vessel junction, the image and the feel work together to allow navigation of the wires. This sense of wire touch is critical.
In robots, by adding pressure and force sensors into the wires and catheters, we can replicate that touch and give haptic sensations back to the users at the console. Think of Sentante and what they are doing with their haptic feedback - emulating the feel of a guidewire and catheter at the console. They need really good sensors for that.
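As a minimal sketch of the control idea (not Sentante's implementation - the gains and limits below are invented), force-reflecting haptics can be as simple as smoothing, scaling, and clipping the sensed tip force into a resistance command for the console actuator:

```python
# Minimal sketch of force-reflecting haptics. Gains, limits, and filter
# constant are invented illustration values, not any vendor's tuning.
import numpy as np

class HapticReflector:
    def __init__(self, gain: float = 2.0, max_force: float = 5.0,
                 alpha: float = 0.2):
        self.gain, self.max_force, self.alpha = gain, max_force, alpha
        self._smoothed = 0.0

    def update(self, tip_force_n: float) -> float:
        """Map one tip-force sample (N) to a console actuator command (N)."""
        # Low-pass filter so sensor noise doesn't buzz the operator's hand.
        self._smoothed += self.alpha * (tip_force_n - self._smoothed)
        return float(np.clip(self.gain * self._smoothed, 0.0, self.max_force))

reflector = HapticReflector()
for f in [0.0, 0.1, 0.4, 1.2, 0.9]:          # wire meets growing resistance
    print(f"{reflector.update(f):.2f} N at the console")
```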
But more than that, all of this rich data - the combination of imaging, pressure, force and so on - is a source that starts to inform learning systems and AI. If you want better autonomous and rapid navigation in the future, you cannot rely on 2D imaging alone. With the sensor layer, the AI can act more like a "human" - and beyond. The robot will see and feel its way to the pathology, just as a clinician does today. And it will be able to replicate the skills of the best in every system worldwide, raising the level of skill for all operators.
Here's the video I made last week. It contains some breaking news from Sensome and Robocath, with a great new clip of what they are doing. (Thank you to Sensome and Robocath for giving me early access.)
Well, I hope you are as excited as I am about the upcoming integration of smaller and smarter sensors. And you might also ask "But why now?" from a technical point of view.
I think the answer is that the technology and miniaturisation are now here: the ability to transmit clean signals over long pathways, the ability to miniaturise connection wires, and micro-electronics that can be placed right at the tip of wires and instruments to gain incredible data from such small packages. That is only available now. And we are in the middle of the AI revolution that can take all that data, make sense of it, and present it back to users in meaningful ways - or help drive towards autonomy in the future. Yes, surgical robots make sense. (Pun intended.)
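One small postscript for the engineers, on that "clean signals over long pathways" point: a raw sensor line picks up noise on its way up the wire, and even a simple digital filter recovers the underlying trend. The noise level and filter window below are arbitrary illustration values.

```python
# Illustrative sketch: denoising a sensor line from a long wire run.
# Noise level and window size are arbitrary illustration values.
import numpy as np

def moving_average(x: np.ndarray, window: int = 25) -> np.ndarray:
    """Denoise a 1-D signal with a simple boxcar moving average."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

t = np.linspace(0, 1, 1000)
clean = np.sin(2 * np.pi * 3 * t)            # the physiological signal
noisy = clean + 0.4 * np.random.randn(1000)  # what arrives at the console
recovered = moving_average(noisy)
print(f"residual RMS: {np.sqrt(np.mean((recovered - clean) ** 2)):.3f}")
```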
These are just musings and thoughts of the author for educational purposes.



