Many people don’t realize that the tongue’s appearance has been used by Chinese doctors for thousands of years as a window into a person’s health. It wasn’t until the 19th century that Western doctors began to understand that the tongue’s appearance and color could help diagnose certain diseases.
A research team from the Middle Technical University in Baghdad and the University of South Australia recently created a new imaging system capable of analyzing the tongue's colors and features for real-time diagnosis of medical conditions. The team's research paper describes how the imaging system was trained on 5,260 images and uses six machine learning algorithms to classify the tongue's color under varying lighting conditions with greater than 98% accuracy.
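To make that setup concrete, here is a minimal sketch of how several off-the-shelf classifiers might be compared on per-image color features. This is not the team's code: the features, labels and the four classifiers shown (the paper evaluates six algorithms) are illustrative stand-ins.

```python
# Minimal sketch (not the authors' code): compare several classifiers
# on per-image color features, echoing the paper's multi-algorithm setup.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Placeholder data: three color features per image and a label for one
# of the seven color groups. Real features would be extracted from the
# 5,260 tongue images; with random data, accuracy will hover near chance.
rng = np.random.default_rng(0)
X = rng.random((5260, 3))          # hypothetical per-image color features
y = rng.integers(0, 7, size=5260)  # hypothetical color-group labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

models = {
    "SVM": SVC(),
    "Naive Bayes": GaussianNB(),
    "k-NN": KNeighborsClassifier(),
    "Random Forest": RandomForestClassifier(),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, accuracy_score(y_test, model.predict(X_test)))
```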
How The Tongue’s Appearance Is Used In Medical Diagnosis
This may sound like superstition or pseudoscience, but it isn’t. Modern science has proven that the tongue’s appearance is a valuable diagnostic tool for revealing illness. For example, diabetes mellitus should be suspected if the tongue has a yellow coating, whereas a purple tongue with a thick fatty layer may be a symptom of cancer and a white tongue may indicate a lack of iron in a person’s blood. These are only a few of the many telltale signs that your tongue can reveal about possible medical conditions. Using the tongue for diagnosis has evolved from stick-out-your-tongue-and-say-ah observations to scientific focus and diagnosis using computer vision and AI.
Studies on this subject between 2016 and 2021 took advantage of computer vision to analyze tongue color as an indicator of diseases such as diabetes or of the severity of Covid-19. Some studies utilized digital cameras to capture images that allowed the tongue's texture to be analyzed in addition to its color. Advanced imaging and infrared thermography also came into play for measuring the tongue's temperature. The most recent studies have combined deep learning, machine learning and hybrid neural networks to provide more accurate diagnoses.
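As an illustration of the kinds of color and texture summaries such studies compute, the sketch below pulls a few coarse features from a single image with OpenCV. The file name and the specific feature choices are my assumptions, not those of any particular study.

```python
# Hedged illustration: coarse color and texture features from one image.
import cv2

def tongue_features(path: str) -> dict:
    img = cv2.imread(path)                      # BGR image from disk
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)  # hue/saturation/value space
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    return {
        "mean_hue": float(hsv[..., 0].mean()),         # dominant color tone
        "mean_saturation": float(hsv[..., 1].mean()),  # color intensity
        # Variance of the Laplacian is a crude proxy for surface texture.
        "texture": float(cv2.Laplacian(gray, cv2.CV_64F).var()),
    }

print(tongue_features("tongue.jpg"))  # hypothetical image path
```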
The number of images needed for training these models has varied on a study-by-study basis. In general, when little data is available, as in this domain, models can still be trained effectively on relatively small datasets. Out of ten tongue-related studies done between 2019 and 2022, the number of images used to train the models ranged from 50 to 700. Even with these relatively small datasets (compared to LLMs), the accuracy of tongue models has been surprisingly high, with some models for diabetes detection achieving over 90% accuracy.
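Two standard ways to make a few hundred images go further are data augmentation and transfer learning. The sketch below shows both in generic form; the folder layout, the transforms and the ResNet backbone are my assumptions, not the pipelines of the cited studies.

```python
# Generic small-dataset recipe (assumed, not from the cited studies):
# augment the images and fine-tune only the head of a pretrained model.
import torch
from torchvision import datasets, models, transforms

augment = transforms.Compose([
    transforms.RandomRotation(15),           # tolerate head tilt
    transforms.ColorJitter(brightness=0.3),  # tolerate lighting changes
    transforms.RandomResizedCrop(224, scale=(0.8, 1.0)),
    transforms.ToTensor(),
])

# Assumes a folder-per-class layout, e.g. tongue_images/red/, .../pink/
data = datasets.ImageFolder("tongue_images/", transform=augment)  # hypothetical path
loader = torch.utils.data.DataLoader(data, batch_size=16, shuffle=True)

# Start from ImageNet weights and retrain only the final layer, a common
# trick when only 50-700 images are available.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False
model.fc = torch.nn.Linear(model.fc.in_features, len(data.classes))
# A standard training loop over `loader` would follow here.
```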
In general, tongue analysis has progressed from basic image analysis to increasingly sophisticated uses of AI. Earlier studies were challenged by varying lighting conditions that often led to errors.
In the latest study (Hassoon et al. 2024—the one linked at the beginning of this article), two datasets were used to evaluate the proposed imaging system. One of the datasets consisted of 5,260 tongue images sorted into seven color groups (red, yellow, green, blue, gray, white and pink). These images were captured under varying light conditions. Eighty percent of these images were used for training and 20% for real-time system testing. The test set also included 60 abnormal images obtained from hospitalized Iraqi patients with varied diagnoses, including diabetes, mycotic infections, asthma, Covid-19, fungiform papillae and anemia.
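An 80/20 split like the one the paper describes takes only a few lines. In the sketch below the paths and labels are placeholders, and the stratification (keeping all seven color groups represented in both splits) is my assumption rather than a detail the article reports.

```python
# Sketch of the paper's 80/20 split; data and stratification are assumed.
from sklearn.model_selection import train_test_split

# Placeholder stand-ins for the 5,260 images and their color labels.
colors = ["red", "yellow", "green", "blue", "gray", "white", "pink"]
image_paths = [f"img_{i}.jpg" for i in range(5260)]  # hypothetical paths
color_labels = [colors[i % 7] for i in range(5260)]  # hypothetical labels

train_x, test_x, train_y, test_y = train_test_split(
    image_paths, color_labels,
    test_size=0.20,         # 20% held out for real-time system testing
    stratify=color_labels,  # keep all seven color groups in both splits
    random_state=0,
)
print(len(train_x), len(test_x))  # 4208 1052
```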
Acquisition Of Experimental Research Images
I was curious how tongue pictures were taken by the research teams. It seemed like it would be an awkward and embarrassing process, but it turned out to be much simpler than expected. As shown in the image above, the subject sits about 10 inches from the camera and extends their tongue. Once the image is centered and captured by the webcam, the system performs a real-time analysis and diagnosis.
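A capture-and-classify loop of that kind can be sketched in a few lines with OpenCV. This is not the team's software; classify_frame is a hypothetical stand-in for their trained model.

```python
# Hedged sketch of webcam capture with per-frame classification.
import cv2

def classify_frame(frame):
    return "pink"  # hypothetical placeholder for the real classifier

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    label = classify_frame(frame)
    # Overlay the predicted label on the live video feed.
    cv2.putText(frame, label, (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("tongue analysis (sketch)", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```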
One issue I did not see mentioned in the research paper was if the tongue images vary by ethnicity, age and lifestyle. If these are factors, they could easily be addressed by training with larger and more diverse datasets.
A Wave Of AI-Powered Medical Tools Is Coming
AI has revolutionized how tongue color and illness studies are conducted. It has increased the accuracy of diagnostic models, made it possible to analyze tongue images taken under varying lighting conditions, and enabled real-time diagnosis.
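How a system copes with lighting is an implementation detail the paper's summary doesn't spell out. One generic color-constancy technique is gray-world white balancing, sketched below; it is my illustrative choice, not necessarily the method the researchers used.

```python
# Gray-world white balance: a generic (assumed) way to reduce
# sensitivity to the color of the light source.
import cv2
import numpy as np

def gray_world(img: np.ndarray) -> np.ndarray:
    # Scale each BGR channel so its mean matches the global mean,
    # canceling a uniform color cast from the illumination.
    means = img.reshape(-1, 3).mean(axis=0)
    gain = means.mean() / means
    return np.clip(img * gain, 0, 255).astype(np.uint8)

balanced = gray_world(cv2.imread("tongue.jpg"))  # hypothetical path
```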
This is just one of many healthcare applications affected by AI. For example, Google's DeepMind has achieved beyond-human-level accuracy in detecting abnormalities in medical images. AI can also analyze global datasets to predict and track worldwide disease outbreaks like Covid, and it can decode genetic information to identify mutations as well as perform precise gene editing. These are small examples compared to what I believe is on the horizon and how AI will eventually impact humanity.
Once standardized, this tongue diagnostic tool could easily be integrated into telemedicine platforms. It has the potential to assist with real-time monitoring of health issues during remote consultations, which are growing in popularity. The system would be particularly beneficial in rural and underserved healthcare areas. According to a press release issued by the University of South Australia, Professor Javaan Chahl, one of the developers, said the system will eventually be available on a smartphone. That's a trend we should expect to see with even more AI-driven medical apps over time.