Monday, September 7, 2015

"Extinction of the radiology report" or the radiologist ?!

  As I await my hand-tooled, leather-bound, numbered and signed copy of Curt Langlotz's book, The Radiology Report (Amazon), I am most intrigued by the next-to-last section of Chapter 12, "The Possible Extinction of the Radiology Report." I, myself, have been giving this a lot of thought in the context of the acquisition of Merge by IBM Watson Health. Their tag line for the acquisition, picked up immediately by the media, is, "Watson to Gain the Ability to 'see' with Acquisition of Merge Healthcare."
  I do see (no pun intended) (OK, going forward, all puns intended) how this could be interpreted in several ways. Some will say that they acquired the legendary 'Merge DICOM Toolkit(tm)', one of the first and finest(?) DICOM SDKs in existence. So, in that sense, Watson can ingest, digest, and expel(?) (exgest? vomit?) medical images. (It is hard to stop anthropomorphizing Watson.) But that has been easy to do for decades now. I would argue, as is my wont, that a DICOM toolkit is not truly 'seeing'. A DICOM toolkit is, to a robot or information system, perhaps, the 'retina' of the image-seeing process: transmitted 'energy' transformed into a representation that can be processed by the 'brain'. I may be going too far here (bear with me, I'm getting to a point).
  I would continue to argue, as is also my wont, that computers do not see merely by ingesting data. Computers 'see' by algorithms. We are all familiar with some of those algorithms, such as those that detect and classify breast calcifications and masses. We are also all familiar with new niche CAD applications in development. IBM and Watson surely already have access to scads of laboratories working in this area. When people come up to me at cocktail parties (admittedly a rare occurrence; the cocktail parties, not people coming up to me) and say, "Aren't you afraid of losing your job as a radiologist to a computer?", I respond, "That ain't going to happen in my lifetime." So, for now, we can consider Watson a child, born blind, who is beginning to perceive the outside world and may just be able to recognize a few, very specific, objects.
  There are, then, probably some who think that Watson can 'see' because they acquired billions of images AND their associated radiology reports under management by Merge systems at thousands of Merge customers. What a tantalizing training set! Of course, there is a tremendous amount of image processing and manipulation to do, as well as a ton of NLP (even if you created UIMA). Now, others, as is certainly their wont, will argue over who owns this field of haystacks. As many lawyers as can dance on the head of a pin could debate this, but I, and many others, would approach the bench and say, "Your Honor, the patient owns their data." The health care providers are merely stewards, curators, users and librarians of the data, accessioning, analyzing and reporting to the patient. The Merges of the world are merely contracted file cabinet salesmen and managers. So, it is not inconceivable to imagine a horde of IBM lawyers (some of whom are aware of the impact of Watson on their own profession) descending on Merge customers to negotiate new Common Rule ways to approach their patients to ask them to donate (?!) their images and reports to the medical education of Watson.
  I don't see it that way at all. To my mind, Watson, through Merge, acquired 'desktop' software access to hundreds, if not thousands, of radiologists. That is the most intriguing prospect, and it strikes at the core of what it means to be a radiologist. Ginni Rometty, herself, predicts that "every decision that mankind makes is going to be informed by a cognitive system like Watson." Broadly speaking, radiologists do five kinds of work: Clinical, Research, Education, Administration and Management (which explains why we are the CREAM of the crop). Make no mistake about it, though: what radiologists are paid, generously, primarily to be is eye-brain systems. We make image feature observations and derive inferences therefrom. We are not better than other humans at finding Waldo; rather, "expertise in medical image perception is domain specific and dependent on the extensive training that radiologists receive in that domain."
  One chronic problem we have with information systems, in general, is that we still ask computers to do things at which they are not good when a human is better and available, and we similarly continue to ask humans to do things at which they are not good when a computer is better suited to the task. That is the crux of the opportunity. We are very good at making those image observations, and slightly less good, I bet, at making the inferences, but we are very bad at, for example, searching for patient information (even in a connected EMR) in a useful and efficient manner, knowing the entire patient context, and knowing all the myriad details of a broad list of gamuts.
  Watson, I would also bet, is very good at these latter tasks. I suspect he is, or will be, a near-perfect Bayesian. Through Explorys (another recent IBM acquisition), Watson will have access to a ton of "Data Related to the Delivery and Cost of Healthcare." Access to EMRs will not be far off. Through Phytel (yup, another recent IBM acquisition), Watson will have access to population health data. Watson will know far more than any human about the patient, about diseases and disease management, and about how that specific patient fits into precise population metrics and experience. And in a very Deep Blue-ish way, Watson will be able to find the most cost-effective path to the 'end game' for that specific patient.
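As a loose illustration of what "near-perfect Bayesian" means here (the numbers below are invented for the sketch, not anything of Watson's), this is the textbook update that turns a population prior and the accuracy of a single imaging finding into a post-test probability. The same positive finding means very different things depending on the prior, which is exactly the patient-and-population context a human reader rarely has at hand:

```python
def posterior(prior, sensitivity, specificity):
    """Bayes' rule: probability of disease given one positive imaging finding."""
    p_pos_given_disease = sensitivity          # P(finding+ | disease)
    p_pos_given_healthy = 1 - specificity      # P(finding+ | no disease)
    numerator = p_pos_given_disease * prior
    return numerator / (numerator + p_pos_given_healthy * (1 - prior))

# The identical finding, read against two different priors:
print(round(posterior(0.01, 0.90, 0.95), 3))  # rare disease      → 0.154
print(round(posterior(0.20, 0.90, 0.95), 3))  # high-risk cohort  → 0.818
```

The point of the sketch: the image observation is only one term; the prior, which a population-scale system can estimate precisely, dominates the answer.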
  Watson, however, also has needs. I believe the appropriate reference is Feed Me. Watson needs input, especially in imaging, from humans where the long arm of electronics cannot yet reach. So, if I were among Watson's keepers, I would be adding structured image annotation to Merge's workstation software as fast as humanly possible. The good news is that a surprisingly good-looking group of researchers was funded several years ago by the NIH NCI AIM Resources to develop a 'standard' information model and tools to represent structured biomedical image annotation and markup, called AIM (Classic AIM demo video; Daniel Rubin AIM-related projects).
Structured image annotation will be far more important than structured reporting going forward.
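To make "structured image annotation" concrete: the real AIM model is a rich, XML-serialized information model, but as a deliberately simplified, hypothetical stand-in, an annotation in this spirit ties a coded image-feature observation and its region of interest to a specific image and observer, rather than burying the observation in report prose. All names and UIDs below are made up for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class ImageAnnotation:
    # Hypothetical, simplified stand-in for an AIM-style annotation;
    # the actual AIM model is far richer and XML-based.
    study_uid: str           # DICOM study the annotation belongs to
    sop_instance_uid: str    # the specific image within that study
    observer: str            # who (or what) made the observation
    finding: str             # coded image-feature observation
    roi: list = field(default_factory=list)  # outline points (x, y)

ann = ImageAnnotation(
    study_uid="1.2.840.99999.1",         # made-up UIDs for illustration
    sop_instance_uid="1.2.840.99999.1.7",
    observer="radiologist:jdoe",
    finding="spiculated mass",
    roi=[(102, 88), (110, 92), (105, 99)],
)
```

Because each observation is machine-readable and anchored to pixels, it can feed a learning system directly, which a narrative report cannot.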
  Now, allow me to finish this extravaganza by putting on my nerdish, science-fiction-fanatic hat. What could the end game look like for radiologists? "We will control the horizontal. We will control the vertical... For the next hour, sit quietly and we will control all that you see and hear." There will be no reading worklist, no EMR, no reporting tools as we know them. There will be image display software with embedded, speech-driven,
[Yes, speech driven, even though speech recognition is becoming a commodity. (I speculate that Watson, with cash resources nearly as large as its compute resources, bought 'eyes' with Merge rather than 'ears' with, for example, Nuance Healthcare, since it already has NLP technology.)]
AIM annotation tools, period. Images prioritized for human evaluation (Watson will be an excellent case manager) will be displayed in a pre-determined fashion optimized for the feature detection task at hand. No preferences, no configuration. Your job as a radiologist will be to make image observations and annotate them. Think CAPTCHA: telling humans and computers apart, automatically, on steroids. You will still derive some inferences of value and annotate those as well, but the vast majority of the inferences, conclusions, diagnoses and recommendations will be made by Watson, taking into account vast amounts of information of which you could not possibly have knowledge. As the CAD algorithms improve, you will note that certain images (perhaps mammograms first) no longer flash onto your screen as the algorithms take over. Don't forget that Watson will rely not only on the image processing results and their accuracy but also on everything else known about the patient and their population. Watson will not need perfection in image interpretation to be (statistically) perfect in diagnosis.
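That last claim is really just evidence accumulation. A minimal naive-Bayes-style sketch (the likelihood ratios are invented; independence between sources is assumed) shows how a mediocre image classifier, combined with modest clinical-context signals no single reader tracks, can land well past what any one source achieves alone:

```python
import math

def logit(p):
    """Convert a probability to log-odds."""
    return math.log(p / (1 - p))

def combine(prior, likelihood_ratios):
    """Naive-Bayes-style evidence combination in log-odds space.
    Each likelihood ratio is an independent evidence source:
    an imperfect CAD score, a lab value, a population risk factor..."""
    log_odds = logit(prior) + sum(math.log(lr) for lr in likelihood_ratios)
    return 1 / (1 + math.exp(-log_odds))

# A mediocre image classifier (LR 3) plus two modest clinical signals
# (LR 4 and 2.5) push a 5% prior to a majority-probability diagnosis:
p = combine(0.05, [3, 4, 2.5])
print(round(p, 2))  # → 0.61
```

No single term is impressive; the product of independent terms is. That, in miniature, is why the image-interpretation piece does not need to be perfect.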
  But wait, Watson: "it slices, it dices, and so much more". I believe it was a wise radiologist, Merrill Sosman, to whom is attributed the saying, "You see what you look for; you look for what you know" (young, in-the-process-of-being-minted whippersnapper radiologists, take heed). No one will know better than Watson what Watson knows about a given patient. No one will know better than Watson which piece of information is most critical to improving the power of his calculation. So I also imagine that you will have an earpiece in place when you are on duty. Two hundred milliseconds after the image is displayed and you vocalize some annotations, Watson will whisper in your ear, "But did you see 'endosteal scalloping'?"
  Do not mourn, prematurely, the passing of the radiologist. We will still, for now, take responsibility for and manage patient safety, radiation safety, and technologist quality control (garbage in / garbage out still applies). Research in radiology will continue to develop new modalities and techniques that create new image features to be observed, but research will decide which are for Watson and which still need to be done by humans. Residents will be taught only the latter. We have evolved and adapted with technology, perhaps better than any other medical specialty, over the 120 years since X-day. I imagine we will adapt to this change as well without becoming Melkotians. New possibilities will arise (making image observations inside the darkened interior of your autonomous vehicle, for example).
  Every other participant in health care delivery will have to adapt to these changes as well. Just consider: one day Watson will be whispering into the ear of some internal medicine specialist, "OK, now insert your finger...". "I, for one, welcome our new interpretation overlord."