Digital ecosystem or surgical robot - which is most important?
- Steve Bell
- Nov 20
- 15 min read

This is a common question I get asked - especially by those who are struggling with their digital ecosystem. I think they inherently know the answer… but let’s dig in…
There are now over 36 soft tissue laparoscopic surgical robots on the market or in development today. There are numerous endoluminal robots and a handful of open robots for microsurgery or ophthalmics. The list grows with each passing 24 hours.
There are vast numbers of specialised navigation and robotics systems, from spine to prostate to neuro - and all are competing for market share (or will be soon).
There is a wave of robots coming for new applications like heart valves and stroke.
So how important is the robot and how important is the digital ecosystem that needs to be built along side it?
Let me talk software for a second
I’ll distinguish between two types of software, but then explain why I think they are deeply connected. First there is the embedded system software of the robot. This is the code that turns the robot on, puts up on-screen menus, takes the inputs from the user and converts them into the movements of the robot. It’s a bit like the iOS on an iPhone.
The second bucket of software (I will butcher in how I describe it) is the surrounding software. Things like software that captures the video, puts it to the cloud and then analyses it. Software that records the procedures, or even talks to the electronic medical records system. Software in simulators that allows photorealistic procedures or more of a training environment. Software that is in the “app” on the phone or the web browser that pulls all the data together and presents it in insightful graphs, soundbites and infographics for the users - be that surgeons with surgery data or hospital administrators with an overall picture.
The software engineers are just rolling their eyes - but hey, stay with me. I need to make this content accessible - it’s not a software course.

This is, roughly, the surrounding software that brings insights and value to customers… but the company needs this type of software too. Software that predictively indicates if a fault is about to occur with one of the robot arms, automated and smart servicing schedules, and insights that help the R&D team to improve things. Some of that embedded telemetry data is useful; but so is some of the user data that the robot doesn’t collect itself.
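To make the predictive-servicing idea concrete, here is a minimal sketch of a drift check on arm telemetry. Everything in it - the field names, the current readings, the three-sigma rule - is invented purely for illustration; real systems use far richer models than this.

```python
# Illustrative sketch: flag a robot arm for servicing when its motor-current
# telemetry drifts above a healthy baseline. All values are hypothetical.
from statistics import mean, stdev

def needs_service(baseline_amps, recent_amps, sigma=3.0):
    """Flag the arm if recent average current draw exceeds the healthy
    baseline mean by more than `sigma` standard deviations."""
    mu = mean(baseline_amps)
    sd = stdev(baseline_amps)
    return mean(recent_amps) > mu + sigma * sd

healthy = [1.02, 0.98, 1.01, 0.99, 1.00, 1.03, 0.97, 1.01]
drifting = [1.30, 1.28, 1.33, 1.31]      # bearing wear shows up as extra load
print(needs_service(healthy, drifting))  # → True
```

The design point is simply that the robot’s own telemetry, streamed back to the company, is enough to schedule a service call before the arm fails in theatre.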
One beauty of a robot is that it is a very complex computer sitting between the patient and the users - crammed with all kinds of sensors and data-producing hardware. And all of that data (if the system and the hospital allow) can flow back to the company - be collected, processed, and turned into insights that can then be delivered to the clinical team (surgeons, nurses, hospitals), or the company’s support teams, sales teams, engineers and more.
I cannot stress enough to hospitals to let the data flow - you will ultimately (directly or indirectly) benefit from it. It’s not an “oil” that you can monetise in the way you think.
Different companies - different digital solutions for surgical robotics
As I scan the current robot offerings - across all types of surgical robots - I see very different digital ecosystems associated with each robot. From the class-leading Intuitive Insights, to CMR Surgical and Versius Connect, to Touch Surgery. And then I see many companies with virtually nothing as a digital ecosystem. They simply have “the robot.”
So why is this?
Part of the ability to have a comprehensive digital ecosystem lies simply in “cash and resources” - as this part of the robotic architecture is costly and resource intensive. Much of the data infrastructure relies on things like cloud storage (which for video gets expensive quite quickly) and of course software has very fast update cycles compared to hardware. The digital ecosystem just absorbs time, people and money - and if you are just getting your robot out… it is often an afterthought.
It also comes down to the hardware. If the company didn’t think about future ecosystems, they may not have put in all the sensors the system needed. The robot might be quite “dumb” - just an electromechanical marionette. And many of those decisions needed to be made a decade ago - and are not easy to retrofit. So some systems just have limited capability in what data they can provide to their digital ecosystem. In fact some are closed off on purpose for cyber security reasons. And that can mean limited data coming out - but often no ability to write into the robot either, with things like over-the-air updates or user preferences.
It helps with cyber security when you are just trying to get FDA clearance - but it often hobbles your system in terms of user functionality.
So on many systems, today, we have a wide range of digital capabilities - including telesurgery and teleproctoring. That is all part of the digital infrastructure - and some robots are built to do it, others are just not.
So we end up with a massively wide digital ecosystem definition - and each and every system will tick some boxes and not others.
But does that matter? Who cares? Surely it is just about if the robot works or not for most users?
Class leading sets the baseline in surgical robotics
Of course I’m going to talk about the class leader here in digital ecosystems: Intuitive, across both their da Vinci lineup and Ion lineup. But let’s focus in on the da Vinci side of the apps for now.

From day one the team at Intuitive has understood the power of the computer between the surgeon and the patient. And it was suspected for years that a lot of data was being collected - and little came back to the users. For years I heard this theory… and the frustrations it brought with it. But in recent years that has all changed. Intuitive has shown dominance in the field of connected apps and data around their system with things like Intuitive Insights Plus. They have a whole suite of digital products and an ecosystem that is making the robot way more than just a clever surgical tool.
Quick interjection: I was accused this week of being a mouthpiece for Intuitive. I was asked if I was actually paid by them to write these things… because I always hold them up so high. Simple answer - NO. I just speak what I see. For me I’m just trying to remove any sugar coating about the surgical robotic space and say it as it is. Hope that’s clear.
And to add to that: I was involved in the development of a digital ecosystem. I think for a brief moment the absolute class leader in digital apps, and the company that set the pace, was CMR Surgical. For a moment we had the class-leading app, with data insights, case insights, video given back, kinematic data, instrument usage and more. But with vast resources (and vast data - which I’ll come on to) Intuitive started to nudge ahead, and have since put a big distance between themselves and everyone else. But I do want to give credit where credit is due - team Versius! The pioneers of this.

A second team at the time, under the guidance of Scott Huennekens at VERB, was also “data first” - and that gives me some hope and insight that has spilled over into JNJ’s OTTAVA system, which by all accounts should be a heavily data-centric system. I mean, they built the entire Polyphonic data ecosystem to take all that data (and more). They are also late to the party - which is not always a negative. One advantage is that they can see what others are doing, what is working, what is valuable… and emulate it. Building hardware later also ensures enough sensors and data in/out are built into the heart of the system. I have high hopes that OTTAVA will be a very smart and well-built data system.
But back to Intuitive - and why their system actually matters the most today.
Firstly, they are the market leader, with the biggest installed base at 10,800 surgical robots. That is meaningful from a network-effect perspective, and it scales the mass of data they have generated and are generating daily.
Second, they have their robots “sorted” and have an entire division dedicated to the data science and data ecosystem side of the business. What I mean is: they are not stuck in the early “just get the damned robot out” phase. They are further along the evolution, looking at complete systems.
Third - they set the benchmark in tenders, offers, and user experience.
If, as a user, you get sucked into their digital ecosystem - it’s a bit like Apple or Android. It is hard to switch out. You might lose all that case history and insight, as most systems don’t talk to each other.
You get used to how it works - where to find stuff, which tabs to press… and the connection to the community. If you have six surgeons in your hospital all feeding data to a hospital dashboard - it’s hard to be the odd one out, right?
How do you explain to management that you don’t want to participate any more in their crucial data pool? How do you work with the residents on the Intuitive app if you don’t have it? It’s a super clever lock-in that hardware companies do. Smart.
The depth and quality of what Intuitive’s digital ecosystem brings to all users is quite staggering. Video insights - kinematic (movement) insights - now with force and tension insights - outcomes are linked - and a deep, deep set of case data and system usage data.
All presented by gorgeous interfaces that make it “easy” to access data; and more importantly insights about you, your surgery and your patients. They have amazing digital simulation, amazing engineering feedback data to keep uptime at 99%, they have inventory data and usage data for hospitals - and all of these data streams feed in to demonstrate better outcomes, better efficiency and a class-leading experience. Hell, now you can even set up your system before you enter the OR.
You can get crucial software updates “over the air.”
You can get insights by the time you have taken your gloves off and sat down for a coffee. It is class leading - and so sets the bar. And all other companies are going to have to chase that bar. Because YES, it is now an important part of what a robotic service in a hospital means and expects. In a minute I’ll come onto open and manual lap and data - as that too is now a growing part of this ecosystem. It’s not lost on Intuitive.
But have no doubts - everyone’s digital ecosystem will be measured, sooner or later, against Intuitive Insights. And Intuitive has one big (and I mean big) advantage over everyone else due to their installed base: the number of cases they have to date.
The mass of data counts in AI and autonomous surgical robotics
17 million procedures and counting. And that means a shit load of data associated with those cases to feed their immense data engines. I mean a LOT. The next closest company might have about 40,000 cases. Just let that scale difference sink in.
No one can catch up - ever! So do not delude yourself that “once we hit the market we will.” No… you will not.

And this really matters, as the best ML training comes from bigger data sets. Not that bigger is always better - but in this case it is. How much is “big enough”? I’ll let the data scientists speak to that - but if memory serves me right, it takes millions of data points, if not hundreds of millions, to train these systems accurately. And Intuitive wins outright there. And no, they are not sharing that anytime soon.
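The scale argument can be sketched with a toy power-law learning curve, the shape empirical ML scaling studies often report (error falling roughly as a power of the number of training examples). The constants below are invented purely to illustrate the gap between a 40,000-case and a 17-million-case data lake - they are not real benchmark numbers.

```python
# Toy scaling-law sketch: model error ~ a * n^(-b).
# Constants a and b are made up for illustration only.
def model_error(n_cases, a=100.0, b=0.35):
    """Hypothetical relative error of a model trained on n_cases examples."""
    return a * n_cases ** (-b)

for n in (40_000, 17_000_000):
    print(f"{n:>12,} cases -> relative error {model_error(n):.2f}")
```

Whatever the true constants are, the point survives: on a curve like this, two and a half orders of magnitude more cases is a lead that money alone cannot buy back quickly.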
In my mind, some of that data gap can be closed - and in some ways it could also offer “different” insights. If a platform (such as Medtronic’s Touch Surgery) collects both manual lap and robotic cases, you start to get a better data set, in my opinion. You start to have AI systems train on “surgery”, not just “robotic surgery.” And as a surgery fan… and a patient-first fan… I think that really understanding (ideally with open cases too) insights that say these procedures are best performed open, these ones by manual lap and these ones by robot, would enrich our understanding of what is best for the patient.
Of course if you are a company owning the manual lap market, you might have some unconscious bias towards manual laparoscopy - but I think the data will start to become obvious.
Clearly that is not lost on Intuitive. When the DV5 tower came out I postulated that they would drop these in for use in hand-held lap cases.
Some of you laughed and said they care about robotic only… Well - they have dropped them into lap use cases, and are scooping up the data (as far as I can see), loading themselves up with manual data as well as da Vinci data. It all adds to their massive data lake and starts to allow them to look for “well actually, these manual cases have worse outcomes than if you do it with a robot.” Because they too will see the world one way.
For me it doesn’t matter - what counts is the mass of accurate data to train the models and then let the data speak (with as little bias as possible) and give the surgical community insights.
Now… for the naive ones out there - this might seem like a “really nice” thing to do for improving surgeons. And yes it is. But don’t be fooled. This mass of data of course counts for giving back insights and raising the bar - there is no doubt, and it is happening today. But that same data lake can also be used for another purpose: the training of AI to provide support, and a steady stepwise march to automation and then autonomy. And whoever holds the biggest and best data sets will have the best-trained system. Period.
Data learning for AI and autonomy in surgical robotics
I’ll break this out here as I cannot stress enough that we are heading at full speed towards autonomy. It may be supervised autonomy, and it may come in small steps - a bit of needle driving, a bit of grasper force reduction. Small assistive steps that help users avoid complications. Then no-fly zones, anatomy identification and avoidance or targeting.
“Hey da Vinci… cut around that tumour with perfect margins…”

This is where the digital and data ecosystem is actually driving the robot. It is training the systems to give insights to the user and the “robot” at the same time. And for those who are scoffing and saying “Not in my lifetime!” - I’d like to point you to the demo that Intuitive just showed at Italian Innovation Week. Yes, a “theoretical demo”, but be assured it is coming to an OR sooner than you think. (Check the article here.)
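The “no-fly zone” idea above is, at its simplest, a geometry check. Here is a deliberately toy sketch: halt instrument motion if the tooltip would enter a sphere marked around a protected structure. Real systems work against registered 3D anatomy models and far more sophisticated boundaries; the function, coordinates and zone here are all hypothetical stand-ins.

```python
# Toy no-fly-zone check: is the instrument tip inside a protected sphere?
# Coordinates are in mm; the zone and values are invented for illustration.
import math

def violates_no_fly(tip_xyz, zone_center, zone_radius_mm):
    """True if the tooltip is inside the spherical protected region."""
    return math.dist(tip_xyz, zone_center) < zone_radius_mm

ureter_zone = ((10.0, 4.0, -2.0), 5.0)  # hypothetical protected region
print(violates_no_fly((12.0, 5.0, -1.0), *ureter_zone))  # tip inside -> True
```

Chain a check like this into the control loop - refuse or damp motion commands that would cross the boundary - and you have the most basic form of the assistive intervention described above.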
So here’s an interesting thought for you all. One way that other companies could potentially get to this amount of data faster is to pool all their data. Have Medtronic and JNJ pool their data, add in the Chinese robots, SSi and CMR… and you start to get bigger data sets faster. In fact I think Polyphonic by JNJ was an attempt to do that - or rather, is.
But as a side note… and this is just my opinion as usual, this should NEVER have been done under one company. No way competitors to JNJ are going to help train up OTTAVA and let JNJ have access to all that data.
Instead, a smart move would have been to create this as a surgical data foundation. A genuine, independent, non-profit data company where all the companies could pool their data - and all could get the benefit of the pooled learnings. If they would just understand that in surgical robotics it is “Intuitive vs everyone”, then maybe they would understand that cooperation is the only way.
But no… JNJ had to make it a JNJ thing… had to slap red all over it… had to have control - and that means only JNJ will get JNJ data, and it will flop (in my humble opinion). Still time - just sayin’.
Come on guys! Give it up - hand it over - fund it with a grant, but let a society like the SRS own it and run it independently. And allow all companies that contribute data to then mine the data for insights. Have industry pay for it but not control it. Anyhow, I fantasise as I grumble.
It is clear that many of the companies are aggressively creating a digital ecosystem, and many are using that as training data for autonomy. Each week I see neat demos of some kind of autonomy - it is coming… and with the Chinese companies it will come very fast indeed. Just look at their industries in robots, drones and electric cars. They ain’t hanging around.
But it is not just the robotic companies enabling this shift - NVIDIA has release after release of new chipsets, new hardware and new real-world physics models. They are massively empowering the robotic companies by giving them the AI and autonomy tools. Some of this will reduce the need for the robot companies’ own data. Recently JNJ showed a neat demo video of how the three-arm PCNL Monarch was trained via digital twins on NVIDIA real-world anatomical models. Quite impressive. This “shortcut” from NVIDIA is helping the industry get there faster. But it will still need some of the data from the actual robot to complete the picture.
Every robot is different in how it works - kinematics, control software and imaging. You can get help, but you can’t get the full solution.
So what is more important - Surgical Robot or Digital ecosystem?
I started this post by asking this question. And the answer is not “both are the same.”
I’m with the next generation of TechMed thinkers on this. The robot will become the capable delivery system; and it will be important. But who has the best digital ecosystem and software will have the most success with the next generation of physicians. The tech savvy, AI ready generation that will just shake their head when an old school surgeon says “Pahh AI will never be as good as me. Who would want all that assistance?”
Rapid capability advances are tied to software, not hardware. Of course you have to have credible, reliable, functional hardware. But the software around it - the apps, training systems, AI guidance and autonomy - will become the new currency of the next generation.
Let’s take the DV5 and the Toumai from Medbot. Both X-booms. Both closed consoles - very similar in design aesthetics and capabilities at a physical level.
Imagine that tomorrow Toumai gains full AI-powered autonomy that really works. Any surgeon on the planet can suddenly perform like the greatest of the great, anywhere. The system never makes mistakes; it gives world-class guidance to anyone and everyone. It intervenes when needed - reducing grip force, or stopping a cut into the wrong structure. It shows the boundaries of the tumours and then cuts with perfection - under guidance. It sutures to absolute perfection - and all of this from its global knowledge database, the localised healthcare system, the user surgeon’s knowledge, and most importantly the electronic medical record and history of that patient. It guides… it acts… and it produces the best results you can get - repeatedly.
Complications in the hospital related to surgery crash. Cost savings are realised - and efficiency increases.
With the next drop it gets almost full autonomy - 3-hour procedures drop to 2, complications are almost non-existent, more complex cases can be done by anyone. The shortage of surgical talent matters less, as the new autonomous robots allow hospitals to work at full capacity.
All of this data feeds back into the digital ecosystem (a virtuous circle), improving OR management, supply chain, efficiencies and scheduling. The hospital’s youngest and newest surgeons become efficient supervisors almost instantly, and even as the older surgeons retire… none of the knowledge is lost. There is no drop in efficiency or standards of care…
So… Medbot takes the lead in this. DV5 is still good - but remains a robot as it is today.
So, which is now more important? The robot hardware or the robot software and digital ecosystem around the robot? You tell me.
And which part of the system will get regular upgrades and drops faster?
Software can be updated way faster than hardware. It is easier to deploy to the field than changing hardware. And way, way more cost efficient for both the hospital and the company.
In my mind - when all hardware becomes more or less equal - it is the best software that will start to win. And yes I understand all of the regulatory hurdles - but I’m looking 10 years from today. And yes (I can hear the gnashing of teeth) Intuitive is actually the one currently leading in this area of robotics and software and AI.
But I want to give hope to all, by reminding us that the rear-view mirror is not the future. If companies can push ahead in software and digital ecosystems, they can become more and more competitive - and give hospitals compelling reasons to consider other systems.
Hell - if one company can absolutely take the lead in AI & Autonomy - they could get way more market share than I have predicted in the past. It is possible!
Fantasy? No. Unless you’ve been sleeping you will already know that many of the companies are deep into digital ecosystems. And a few of them have started to show convincing demos of AI assistance and initial steps of automated help. The slow, steady march to autonomous procedure steps (or surgical actions), and then to chaining these together for more and more autonomy - it’s happening now. The FDA is working this out too - so stop hiding behind “It’s regulatory that will block it…” It won’t. It will of course slow it - but this is being worked out… NOW!
And I will not go into it today - but this also spills into education, simulation, feedback loops, and distance telerobotic AI assistance (commands from a distance - and actions at a local level). I myself was raising an eyebrow just two years ago… but today, having seen what I have seen in development… I can assure you this is almost here.
Get ready for a software driven robotic future.
These are just the speculations and opinions of the author, for education purposes only.


