Mendel Team
March 1, 2023

Competence via comprehension: AI for healthcare needs clinical reasoning skills

Artificial intelligence (AI) is playing an increasingly important role in the healthcare industry. But to fully leverage its potential, AI must be equipped with clinical reasoning skills: the ability to truly comprehend clinical data, or in other words, to read it as a doctor would. Only a tool capable of clinical reasoning can effectively process unstructured clinical data.

A healthcare ontology can help AI develop these clinical reasoning skills by providing a structured and standardized representation of healthcare concepts and their relationships. In healthcare, an ontology can be used to represent the relationships between various clinical concepts, such as diseases, symptoms, diagnoses, medications, and procedures.
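As a rough illustration of what such a representation can look like, one common approach is a typed graph of concepts connected by labeled relationships. The sketch below is a minimal Python example with invented concept and relation names; it shows the general idea, not any particular standard ontology.

```python
# A minimal sketch of a healthcare ontology as a typed graph. The concept
# names and relationship labels are illustrative, not drawn from any
# standard vocabulary.
from dataclasses import dataclass, field

@dataclass
class Concept:
    name: str
    category: str  # e.g., "disease", "symptom", "medication", "procedure"
    relations: dict = field(default_factory=dict)  # relation label -> list of Concepts

    def relate(self, label: str, other: "Concept") -> None:
        self.relations.setdefault(label, []).append(other)

# Build a tiny fragment: a disease, one of its symptoms, and a treatment.
diabetes = Concept("type 2 diabetes mellitus", "disease")
polyuria = Concept("polyuria", "symptom")
metformin = Concept("metformin", "medication")

diabetes.relate("has_symptom", polyuria)
diabetes.relate("treated_with", metformin)

# An AI system can traverse these edges to make simple inferences,
# e.g., "which medications are associated with this disease?"
print([m.name for m in diabetes.relations["treated_with"]])  # ['metformin']
```

Traversing relationships like these, rather than matching strings, is what lets a system reason about how clinical concepts connect to one another.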

The value of a healthcare ontology lies in its ability to help AI systems reason about clinical data and make accurate inferences.

But the standard clinical ontologies don’t reflect how clinicians actually describe medical concepts.

The complexity of healthcare data requires a thorough understanding of medical terminology, disease processes, and healthcare delivery systems. Without this foundational knowledge, it can be challenging to comprehend the data accurately. However, vocabulary isn't enough; context is imperative. Doctors use shorthand, paraphrase, and assume domain expertise, all of which are difficult to map onto an ontology rooted in vocabulary alone. Clinical reasoning requires far more than word recognition.

As an example, consider the term EGFR. It can stand for "epidermal growth factor receptor" (a gene) or "estimated glomerular filtration rate" (a kidney test). An ontology that understands both the vocabulary and the context of healthcare can help AI develop clinical understanding, which is the key to structuring unstructured health data accurately, efficiently, and without losing or misinterpreting crucial information.
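To make the ambiguity concrete, here is a deliberately simplified, hypothetical sketch of how surrounding context can select the right sense of "EGFR." The cue words and scoring are assumptions for illustration only; a real system would rely on a full ontology and much richer context.

```python
# Hypothetical context-based disambiguation of the abbreviation "EGFR".
# The cue-word lists are invented for this example.
GENE_CUES = {"mutation", "lung", "tumor", "tyrosine", "inhibitor", "biopsy"}
KIDNEY_CUES = {"creatinine", "renal", "kidney", "ml/min", "ckd", "dialysis"}

def disambiguate_egfr(sentence: str) -> str:
    words = set(sentence.lower().split())
    gene_score = len(words & GENE_CUES)
    kidney_score = len(words & KIDNEY_CUES)
    if gene_score > kidney_score:
        return "epidermal growth factor receptor (gene)"
    if kidney_score > gene_score:
        return "estimated glomerular filtration rate (kidney test)"
    return "ambiguous: needs more context"

print(disambiguate_egfr("EGFR mutation detected on lung biopsy"))
print(disambiguate_egfr("eGFR 45 ml/min consistent with ckd stage 3"))
```

Even this toy example shows why vocabulary alone falls short: the same token maps to entirely different concepts depending on the clinical context around it.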

Where standard NLP gets stuck

While natural language processing (NLP) can be useful for extracting information from clinical documents, it can struggle to reason like a clinician. Standard NLP often gets stuck at the surface level of the text, lacking the capability to move across documents in time and understand concepts that go beyond the text on the page, such as intent. This can lead to inaccuracy and imprecision when deviations from the standard sequence of events occur.

To be maximally useful, an AI tool must understand not just what is written, but what is implied. In order to support real comprehension, ontologies need to model knowledge in a different way.

At Mendel, we’ve developed a proprietary knowledge representation system that reduces concepts to their most basic entities. Using this approach, we created an ontology that AI can reason with – a breakthrough in the field. Furthermore, by mapping back to the standard ontologies in common use, Mendel's approach supports communication with existing healthcare systems.
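To give a flavor of what "reducing concepts to their most basic entities" might look like, the toy sketch below decomposes a composite clinical phrase and keeps a mapping back to a standard code. It is purely illustrative and not a description of Mendel's proprietary system; the entity fields and the ICD-10 code shown are assumptions for the example.

```python
# Purely illustrative decomposition of a composite clinical concept into
# basic entities, with a mapping back to a standard vocabulary. Field names
# and the code are assumptions, not Mendel's actual representation.
composite_concept = {
    "label": "metastatic lung adenocarcinoma",
    "basic_entities": {
        "morphology": "adenocarcinoma",
        "primary_site": "lung",
        "behavior": "metastatic",
    },
    # Mapping back to a standard ontology keeps the representation
    # interoperable with existing healthcare systems.
    "standard_mappings": {"ICD-10": "C34.9"},  # illustrative code choice
}

# Reasoning over basic entities instead of opaque strings lets a system
# recognize that two differently worded phrases describe the same idea.
other_phrasing = {
    "morphology": "adenocarcinoma",
    "primary_site": "lung",
    "behavior": "metastatic",
}
print(other_phrasing == composite_concept["basic_entities"])  # True
```

The design point is that decomposed entities can be compared and reasoned over directly, while the standard-code mapping preserves compatibility with systems that speak ICD, SNOMED, and similar vocabularies.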

For AI to develop clinical reasoning skills, its governing ontology must support learning.

Our reasoning algorithm with a governing ontology does just that – delivering comprehension, not mere recognition. With learning and comprehension support, AI can begin to unlock the full potential of healthcare data.

Our approach to knowledge representation could mean a more accurate and efficient structuring of your healthcare data.

Read our white paper to learn how.
