Doctors juggle many different tasks each day, but these can usually be broken down into three categories. Incidentally, each closely resembles a different kind of fire.
The first is the fire in your own garden: an inpatient who is deteriorating rapidly, or a patient who urgently needs a scan despite a long waiting list. The second is a fire down the street: ensuring sufficient staffing to maintain at least the minimum standards of safe care, or keeping outpatient clinics from building up unacceptable delays. The third is the fire far away, the distant priority, and it is this kind of fire where there is the most room for service improvement. Ideal Health Consultants encourage adopting new technologies such as artificial intelligence (AI) precisely because these priorities so often take a backseat to direct patient care, for good reasons of course.
Adam Kay’s brilliant book “This Is Going To Hurt” lays out in vivid detail what life is really like for a doctor. For a consultant’s perspective on how hard it is to work inside the system, also check out Henry Marsh’s “Do No Harm”. Both are excellent reads on doctors’ dedication to giving patients the care they need and deserve, and on why doctors also need systems that empower them.
Patients must always be prioritised over all else. Sadly, our ability to work with data and make progress on an industry-wide digital transformation is seriously curtailed by workforce limitations. Too few doctors are even participating in the conversation, and if that continues, meaningful advances in patient care will simply not be possible.
AI is sensationalised by the modern media to some degree. However, recent surveys show that healthcare organisations and their executives increasingly understand where AI can add value. There are operational, clinical and even life-critical functions that could adopt AI technology, and some of these areas can do so without adding stress for clinicians or their patients. A measured approach can prevent much of the disappointment that inevitably follows when a new technology is oversold and then rolled out into the real world.
Nor can a machine take full responsibility for clinical duties; that is not yet even technically possible. The ethical implications also need to be seriously considered and appropriately managed. Human clinicians are trained and held to account by professional bodies in ways that machines simply cannot match, and it will be a long time before any algorithm is handed control of full clinical decisions. People can and do make mistakes, but, right or wrong, people are held to expectations that are not appropriate for machines. Until then, clinical artificial intelligence must keep learning from humans so it can improve, while the risk to patients is minimised along the way. In truth, we are focused not so much on artificial intelligence as on augmented intelligence.
The paradox resembles that of a first-time job seeker, who needs professional experience to find employment yet cannot gain that experience because no one will employ them. How can any health system start incorporating AI to help doctors who are currently too stretched to offer the clinical guidance AI requires for training and learning? No doctor could ever be expected to prioritise the development of AI over patient care. Even so, many more doctors should be engaged, so that we draw on the ideas of the many rather than the current handful. Most doctors face overcrowded clinics full of patients struggling both to understand and to fight their diseases. Doctors in such settings need to know that their tight-knit care teams are not going to be replaced by machines.
We should look at ways of improving such clinics. Can certain appointments happen via telemedicine, and if so, which ones? Do patients have access to common digital tools, and can these be employed? Analytics are needed for insight into the healthcare experience of the workforce and patients alike, and some of that insight can be applied immediately to putting out the fires mentioned at the start. That should free doctors and nurses to spend more time with their patients.