I’ve been thinking about robots, dogs, sci-fi, and the intravenous drugs hospitals are pushing into patients’ veins.
2030. A robot, a dog, and a pharmacist walk into a bar.
A chatty bartender learns that his three customers comprise the entire pharmacy department of the large hospital in town.
When asked what each does, the robot starts, “I prepare medications.”
The dog follows, “I keep the pharmacist away from the robot.”
The pharmacist confesses, “I feed the dog.”
At my first American Society of Health-System Pharmacists Annual Meeting (1993), I stumbled onto a prototype unveiling of the first IV-preparation robot in America. In the same exhibit hall, I observed a tabletop model of a “revolutionary” medication storage-and-retrieval robot that would eventually find a home in about 10 percent of our nation’s hospitals.
During an education session at that same meeting, a hospital pharmacy leader and technology guru, forecasting the future of pharmacy practice, noted pharmacists’ fears of being replaced by robots.
Two decades later, we’re still a long way from robots replacing pharmacists. If anyone should fear being replaced, it’s technicians. But even then, the pharmacy robotics of the present and immediate future that I’ve seen require technicians to operate, “feed,” and pamper them.
In the meantime, robots, along with less sophisticated semiautomated bar-code-driven technologies, have proven capable of preparing IV medications more accurately than humans left to themselves. See IN THE CLEAN ROOM: A review of technology-assisted sterile compounding systems in the US (Jerry Fahrni, Pharm.D., and Mark Neuenschwander).
I recently stumbled onto science-fiction author Isaac Asimov’s Three Laws of Robotics, quoted from his Handbook of Robotics, 58th Edition, 2058 A.D. Though penned in 1942, Asimov’s laws are still used by ethicists today.
The first law states: “A robot may not injure a human being or, through inaction, allow a human being to come to harm.”
I’d argue that engineers have a moral, not to mention legal, responsibility to make sure their robots neither harm their operators nor, through inaction, allow them to come to harm. Interestingly, one oft-stated purpose of today’s IV-preparation robots is to protect operators from exposure to toxic chemicals and to relieve them of the wear and tear of repetitive manual tasks.
Patients who receive intravenous medications, nurses who administer them, and pharmacy technicians who prepare them all share the same concern: that no errors creep in along the way.
So, why are only 5 percent of our nation’s hospitals using available and proven IV-preparation technologies to ensure accuracy when making IVs?
The most common response I get is, “We can’t afford them.” Okay, if our only options were half-million- or million-dollar robots, I’d understand. But as I said, there are semiautomated systems, which are essentially as safe and cost 80 to 90 percent less than the robotic options.
Actually, I doubt the foot-dragging stems from lack of funds so much as from hospitals failing to recognize the errors occurring in their clean rooms. I wonder if this isn’t what Aldous Huxley called “vincible ignorance,” which he defined as “not knowing because we don’t want to.”
In any case, too few health systems seem to have the courage, or take the trouble, to measure their clean-room practices against the literature on preparation errors.
One five-hospital observational study on the accuracy of preparing small- and large-volume injectables, chemotherapy solutions, and parenteral nutrition showed a mean error rate of 9 percent, meaning almost one in ten products was prepared incorrectly prior to dispensing.
Before I close my laptop, could I rerun Asimov’s first law for robots? “A robot may not injure a human being or, through inaction, allow a human being to come to harm.”
Hospitals seem to understand the first clause. Everyone gets Hippocrates’ “First, do no harm.” But I’d suggest that by failing to adopt sterile compounding technologies, hospitals are, through inaction, allowing human beings to come to unnecessary harm. Should not Asimov’s “inaction” warning apply to people (hospital leaders) as much as to robots?
Edmund Burke’s well-known no-brainer comes to mind: “The only thing necessary for the triumph of evil is for good men to do nothing.”
Finally, lest we put all the blame on healthcare providers, I wonder if we laypeople don’t share some culpability by failing to expect our hospitals to use proven bar-code scanning systems in their clean rooms.
Ask your hospital if it uses bar-code verification technology when preparing IVs for you and your loved ones. If it doesn’t, ask why.
What do you think?
Mark Neuenschwander aka Noosh
P.S. At the 1933 Chicago World’s Fair, the science building featured a robot that, among other things, could smoke. Smoking is one activity humans would benefit from turning over to robots. Then again, Asimov’s laws must have some corollary against humans bringing harm to robots.