TECHNOLOGY

Pioneering research for clinician-like AI

Our AI research aims to learn from the journey of every patient with the logic of an expert and the scale of the machine.

Clinician-like AI

01. Processes structured & unstructured data

Over 80% of patient data remains unstructured. Mendel's mission is to ingest both structured and unstructured data, synthesizing the intricacies of clinical language for coherent understanding.

02. Understands at the patient level, not just the document level

A patient's journey is more akin to a book than a page. While generic LLMs grapple with limited context windows, Mendel focuses on processing thousands of pages per patient. This allows us to decode a coherent narrative by connecting the dots amid conflicting information.

03. Doesn't hallucinate

In healthcare, hallucinations are unacceptable. Any failure of a system should be consistent and traceable, not merely a product of chance or probability.

04. Interpretable

Every output our models produce is accompanied by a clear rationale. Users can trace the system's reasoning and dive deeper into its suggestions, fostering trust.
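As a rough illustration of what a traceable output can look like (the field names below are hypothetical, not Mendel's actual schema), each answer can be bundled with its rationale and the evidence spans behind it:

```python
# Hypothetical sketch of a traceable output; field names are illustrative
# and not Mendel's actual schema.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Evidence:
    document_id: str  # source document the supporting text came from
    page: int         # page within that document
    snippet: str      # verbatim text supporting the conclusion


@dataclass
class TraceableAnswer:
    variable: str                 # e.g., "smoking_status"
    value: str                    # the system's answer
    rationale: str                # human-readable reasoning summary
    evidence: List[Evidence] = field(default_factory=list)


answer = TraceableAnswer(
    variable="smoking_status",
    value="former smoker",
    rationale="A 2021 progress note states the patient quit smoking in 2019; "
              "no later document contradicts this.",
    evidence=[Evidence(document_id="progress_note_042", page=3,
                       snippet="Pt quit smoking in 2019.")],
)
print(answer.rationale)
```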

Deep learning alone isn’t getting us to clinician-like AI

LLMs represent cutting-edge deep learning techniques that fuel today's natural language processing (NLP) applications. While they are powerful, standard LLMs are limited in enterprise clinical use cases.

LLMs can converse fluently but often lack deep understanding, leading to superficial reasoning and occasional "hallucinations" in their responses. This makes them unsuitable for critical clinical applications without the necessary controls and optimizations.

Rule-based systems aren’t getting us to clinician-like AI

Simply enumerating every conceivable rule isn't a feasible way to replicate the nuanced judgment and adaptive learning of human clinicians.

Instead of providing a holistic understanding of patient care, such systems can become rigid and overly complex, missing out on the intricacies and evolving nature of medical knowledge and practice.

A hybrid model that couples deep learning with symbol manipulation can get us there

We've spent 5 years investing in R&D to develop a system capable of clinical reasoning through a hybrid approach that combines deep learning with symbolic AI.

We’re employing processes commonly used in logic, mathematics, and computer science to approach clinical thinking as a form of algebra.

Our research aims to sidestep the pitfalls of symbolic systems by using our proprietary generative symbolic learning method, paired with large language modeling.
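As a toy sketch of the neuro-symbolic idea (not Mendel's proprietary generative symbolic learning method), an LLM-style extractor can propose candidate facts while an explicit symbolic rule draws the clinical conclusion, keeping the reasoning step inspectable:

```python
# Toy neuro-symbolic sketch: a stubbed "extractor" stands in for an LLM that
# proposes structured facts; an explicit symbolic rule then draws the
# conclusion, so the reasoning is auditable rather than probabilistic.
import re
from typing import Dict, List


def extract_facts(note: str) -> List[Dict]:
    """Stand-in for an LLM/NLP extractor that returns structured candidates."""
    facts = []
    text = note.lower()
    if "metformin" in text:
        facts.append({"type": "medication", "value": "metformin"})
    match = re.search(r"hba1c\s*([\d.]+)", text)
    if match:
        facts.append({"type": "lab", "name": "HbA1c", "value": float(match.group(1))})
    return facts


def infer_diabetes(facts: List[Dict]) -> Dict:
    """Symbolic rule: the logic is explicit, so every conclusion is traceable."""
    on_metformin = any(f["type"] == "medication" and f["value"] == "metformin"
                       for f in facts)
    elevated_a1c = any(f["type"] == "lab" and f.get("name") == "HbA1c"
                       and f["value"] >= 6.5 for f in facts)
    supported = on_metformin and elevated_a1c
    return {
        "conclusion": "evidence of type 2 diabetes" if supported else "insufficient evidence",
        "fired_rule": "on_metformin AND HbA1c >= 6.5" if supported else None,
        "supporting_facts": facts,
    }


note = "Patient continues metformin 500 mg BID. HbA1c 8.1% today."
print(infer_diabetes(extract_facts(note)))
```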

The Mendel approach surpasses existing LLMs in its advanced reasoning capabilities

In a recent study, we found that relying on large language models (LLMs) like GPT-3.5 alone decreases Mendel's system performance by 64.72% on average across 13 medical variables. By contrast, Mendel's hybrid model, which combines symbolic clinical reasoning models with LLMs, significantly improves interpretation of electronic medical records.

Mendel's R&D not only utilizes foundational models but also pioneers new ones

Using actual medical records annotated by a dedicated team of physicians, combined with symbolic modeling, we are well-positioned to drive innovation in clinical AI.

Our foundational work solves several challenges with greater accuracy

This hybrid approach is the bedrock of our fully integrated suite of AI products for clinical data processing.

We bring together industry-leading NLP models for OCR, document segmentation, document classification, named-entity extraction, relation extraction, de-identification, and LLM-based retrieval and generation (e.g., question answering) with clinical reasoning models to serve compliant intelligence from millions of patient journeys.
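A bare-bones skeleton of how stages like these might be chained (every step below is a placeholder stub, not Mendel's production components):

```python
# Illustrative pipeline skeleton for the stages listed above. Each step is a
# placeholder; the real components would be full NLP and reasoning models.
from typing import Callable, List


def ocr(pages: List[bytes]) -> List[str]:
    return [f"<text of page {i}>" for i, _ in enumerate(pages)]  # scanned page -> text

def segment_and_classify(texts: List[str]) -> List[dict]:
    return [{"section": "progress_note", "text": t} for t in texts]  # label sections

def extract(sections: List[dict]) -> List[dict]:
    return [{"entities": [], "relations": [], **s} for s in sections]  # NER + relations

def deidentify(records: List[dict]) -> List[dict]:
    return records  # strip protected health information before downstream use

def reason(records: List[dict]) -> dict:
    return {"patient_summary": records}  # clinical reasoning over extracted facts


PIPELINE: List[Callable] = [ocr, segment_and_classify, extract, deidentify, reason]

def run(pages: List[bytes]) -> dict:
    data = pages
    for step in PIPELINE:
        data = step(data)  # each stage feeds the next
    return data

print(run([b"scan-1", b"scan-2"]))
```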

Our AI team is more than AI scientists

Our R&D team consists of top-caliber AI scientists and clinical experts. We're building a novel technology using a novel approach: training AI scientists to think like physicians, and physicians to think like AI scientists.

Contact an Expert
