Excellence in Software Engineering
AI to Democratize Healthcare
28 March 2019

Author:  Dr. Deniz KATIRCIOGLU ÖZTÜRK, Project Manager (Healthcare & Transportation)

Whether you are a minor player or a major provider in the wide-ranging healthcare business, you presumably know how unforgiving this system is, how volatile the regulations are and, hence, how harsh the market is. As the tech industry strives to fulfill the needs of consumer health, it becomes immensely challenging to meet the ever-changing demands of the human population. Disease spectra shift perpetually, clinical data continue to soar, the versatility of standards gets out of control, and that is not even scratching the surface.

Healthcare providers must metamorphose in a manner coherent with how full-blown clinical care should be rendered. Especially at large scale, strategies must be revised to distill years of enterprise experience. One prevailing way to achieve this is to zealously adopt Artificial Intelligence (AI), in some form, as a crucial part of the foundation. Nowadays, even if you half-heartedly imply that "human intelligence should take precedence over AI", you might easily be proven otherwise with counterexamples and reprimanded with a tsk-tsk.

AI is no longer a catchy movie theme, a leitmotif or a fancy genre. It is simply a fitting way to handle, effectively, the several quintillion (10^18) bytes of data produced per day. Within this extent, hospitals alone are responsible for mass data production on the order of petabytes (10^15) per year, and a solid 90% of all healthcare data come solely from medical imaging.

During clinical interpretation, the impact of image data is beyond dispute. Among the heterogeneous sets of medical data, RAD-imaging*, regardless of its modality, brings additional value to a patient's snapshot in the spatiotemporal dimension. For this particular reason, it is not surprising that medical AI is proceeding with firm steps through this prolific data type.

 

Rumor has it that, with the help of its close companions "machine learning" and "high-performance computing", AI will replace a plethora of medical roles, positions and workflows. AI is said to sneak its way into the veins of the healthcare continuum, becoming the master of its gigantic data and an indispensable part of decision-making workflows. So, how are these workflows and the related roles affected?

Well, enter “tele-medicine”.

Although not used interchangeably, "tele-medicine", "tele-health" and "tele-care" [1] all refer to the common nomenclature for remote, ICT-based healthcare services. In terms of medical jurisprudence, this system can be defined as "the transfer of medical information and expertise via telecommunications and computer technologies, to facilitate diagnosis, treatment and management of patients" [2, p. 3].

Conventional methods are evolving towards virtual care, and "consumer tele-health" emerges as a new buzzword. Financial restrictions are being loosened, and strict regulations are being redesigned to accommodate tele-medicine and remote monitoring services, especially in the domain of radiology. The era of monochromatic, orthochromatic or panchromatic films is long gone; conventional prints are now buried in the dusty pages of history. Lately in the U.S., tele-radiology and "conventional" radiology are often treated under equivalent regulations [3]. The whereabouts of a radiologist, a colleague or a patient no longer matter as long as the RAD-images, examination files and reports are digitally transmitted back and forth.

Probably the most value-added among the myriad of tele-health models, tele-radiology recently got hitched to AI. Congratulations are in order! Remotely monitored RAD-images combined with AI-based decision support (a.k.a. CAD, computer-aided diagnosis) simply herald better diagnoses with less specialty labor (a.k.a. lower costs). A match made in heaven… Along with years of digital transformation, RAD-imaging blends itself with the vast capabilities of advanced image processing and learning from data. Today, as the improvised term "DICOMization" becomes another buzzword, the core term "DICOM" is no longer a niche data format but rather the common standard empowering the whole CAD scheme. Studies on the smart handling of huge multi-modality DICOM repositories are already lining up nicely [4].
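To make the role of DICOM as a concrete file format a bit more tangible: the standard mandates a 128-byte preamble followed by the 4-byte magic marker "DICM" at the start of every conforming file. The sketch below is a minimal, illustrative check for that marker using only the standard library; a real pipeline would use a full DICOM toolkit rather than this hand-rolled test.

```python
import io

DICOM_MAGIC = b"DICM"   # 4-byte marker required by the DICOM file format
PREAMBLE_LENGTH = 128   # fixed-size preamble that precedes the marker

def looks_like_dicom(stream) -> bool:
    """Return True if the stream starts with a DICOM preamble + 'DICM' marker."""
    header = stream.read(PREAMBLE_LENGTH + len(DICOM_MAGIC))
    if len(header) < PREAMBLE_LENGTH + len(DICOM_MAGIC):
        return False
    return header[PREAMBLE_LENGTH:] == DICOM_MAGIC

# Synthetic in-memory examples (no real patient data involved):
fake_dicom = io.BytesIO(b"\x00" * 128 + b"DICM")  # zero-filled preamble + marker
not_dicom = io.BytesIO(b"JFIF" + b"\x00" * 200)   # e.g. some other binary format

print(looks_like_dicom(fake_dicom))  # True
print(looks_like_dicom(not_dicom))   # False
```

In practice, a library such as pydicom would handle this check, plus parsing of the tagged data elements that follow the marker; the point here is only that "DICOMized" data is a well-defined binary standard, not an ad-hoc export.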

While classical learning techniques still dominate the AI paradigm, the use of the highly influential and popular "deep learning" methodology is becoming widespread in both modality-wise and tissue-wise RAD-diagnostics [5, 6], and the applicable techniques can be similarly diversified. Not restricted solely to radiology, tele-medicine is expanding its role towards a more IoT-oriented path, unconditionally backed by mobile health solutions (a.k.a. "mHealth"), and eventually mutating into yet another buzzword, "Io(M)T" (Internet of Medical Things). Although there is much to say about Io(M)T, it is briefly known to eradicate the notion of "unconnected"-ness among medical devices, bio-gadgets and sensors. Ideally, to consolidate point-of-care diagnostics and establish medico-vigilance mechanisms, a high level of "connected"-ness should be the aim. However, this level of connectivity would produce sets of vastly ignored "dark data"** along with the expected subsets of innocent "big data". This, however, is a subject to be elaborated in another blog post.
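The "classical learning techniques" mentioned above can be illustrated with one of the simplest of them all: a k-nearest-neighbour classifier. The toy sketch below votes among the closest labelled examples; the feature vectors (mean intensity, lesion area) and the labels are entirely invented for illustration and have no clinical meaning whatsoever.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    dists = sorted(
        (math.dist(features, query), label) for features, label in train
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Invented feature vectors: (mean intensity, lesion area) -- illustrative only.
training_set = [
    ((0.20, 1.5), "benign"),
    ((0.25, 2.0), "benign"),
    ((0.30, 1.8), "benign"),
    ((0.80, 9.5), "suspicious"),
    ((0.85, 8.7), "suspicious"),
    ((0.90, 10.2), "suspicious"),
]

print(knn_predict(training_set, (0.22, 1.7)))  # benign
print(knn_predict(training_set, (0.88, 9.0)))  # suspicious
```

A real CAD system would of course learn far richer features directly from pixel data (which is precisely what the deep learning methods cited in [5, 6] do), but the principle of classifying a new case by its similarity to labelled past cases is the same.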

In a nutshell, all these arguments may end up in a "connected"-ness jumble. Yet a clever augmentation of all the axes above would surely disrupt healthcare services and the way business is done. Viewed from this angle, the cutting-edge AI paradigms imply a form of "democracy", as they lower overall costs and increase the availability of healthcare for everyone seeking "evidence-based" medical accuracy.

Now is the era of smart, borderless healthcare.

(*) Generic expression for examinations such as computed tomography (CT), digital radiography (DR / X-ray), ultrasound (US), magnetic resonance imaging (MRI), radiographic fluoroscopy (R/F), mammography and angiography (not necessarily including nuclear medicine).
(**) A subset (or sometimes superset) of big data containing untagged or untapped operational samples that are yet to be processed or analyzed, such as bulk logs, call records, unstructured documents, etc.

 

REFERENCES

[1] Woldaregay, A. Z., Walderhaug, S., & Hartvigsen, G. (2017). Telemedicine services for the Arctic: A systematic review. JMIR Medical Informatics, 5(2).

[2] Gorea, R. (2005). Legal aspects of telemedicine: Telemedical jurisprudence. J Punjab Acad Forensic Med Toxicol, 5, 43. (ISSN: 0972-5687).

[3] Bashshur, R. L., Krupinski, E. A., Thrall, J. H., & Bashshur, N. (2016). The empirical foundations of teleradiology and related applications: A review of the evidence. Telemedicine and e-Health, 22(11), 868-898.

[4] Kouanou, A. T., Tchiotsop, D., Kengne, R., Tansaa, Z. D., Adele, N. M., & Tchinda, R. (2018). An optimal big data workflow for biomedical image analysis. Informatics in Medicine Unlocked.

[5] Sahba, F., Tizhoosh, H. R., & Salama, M. M. (2006, July). A reinforcement learning framework for medical image segmentation. In The 2006 IEEE International Joint Conference on Neural Network Proceedings (pp. 511-517). IEEE.

[6] Nivaashini, M., & Soundariya, R. S. (2018). Deep Boltzmann Machine based breast cancer risk detection for healthcare systems. International Journal of Pure and Applied Mathematics, 119(7), 581-590.
