How AI Is Using Tongue Color to Predict Diseases With High Accuracy

The health care industry often takes ages to adopt new technologies because it must jump through numerous regulatory hoops. However, the reaction to artificial intelligence has been different. Countless professionals are already experimenting with AI’s predictive and analytic abilities to streamline diagnostics.

Researchers recently claimed AI can identify the type of disease a person has — and how far it has progressed — by simply looking into their mouth. This technology has come a long way, but can it predict diseases using only tongue colors for diagnosis?

Research Proves AI Can Use Tongue Color for Diagnosis

Researchers from the Middle Technical University in Baghdad, Iraq, and the University of South Australia in Adelaide, Australia, recently discovered AI technology can analyze tongue colors for diagnoses. They developed a computer vision system that processes and classifies images using color space models, which provide measurable values for hues and luminance.

They used thousands of images for training and testing, many of which came from the Al-Hussein Teaching Hospital and the Mosul General Hospital, both in Iraq. They trained the model on real people with actual diseases, not a synthetic dataset. That distinction is essential when developing a tool intended for diagnostic use.

The researchers classified images into pink, white, red, yellow, green, blue or gray categories so their models could identify colors under any lighting conditions. They trained seven models in total. The highest-performing one was built with Extreme Gradient Boosting (XGBoost), an open-source machine learning library, and achieved over 98% accuracy on average. During testing, it correctly diagnosed 58 out of 60 images, meaning it was right 96.6% of the time.
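
For a sense of what that kind of pipeline looks like in code, here’s a minimal sketch of a seven-class XGBoost classifier. The feature matrix, labels and hyperparameters below are hypothetical stand-ins rather than the researchers’ actual setup, which isn’t published in this article.

```python
# Minimal sketch: a seven-class tongue-color classifier built with XGBoost.
# The feature matrix is random stand-in data; in the study, each row would
# hold color-space statistics extracted from a tongue image, so accuracy
# here will hover near chance rather than near 98%.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

CLASSES = ["pink", "white", "red", "yellow", "green", "blue", "gray"]

rng = np.random.default_rng(0)
X = rng.random((700, 12))                # 12 hypothetical color features per image
y = rng.integers(0, len(CLASSES), 700)   # hypothetical class labels, 0 through 6

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

model = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
model.fit(X_train, y_train)

preds = model.predict(X_test)
print(f"Test accuracy: {accuracy_score(y_test, preds):.1%}")
```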

Their results surprised me. Frankly, I didn’t expect their system to outperform medical professionals. Although various research groups have developed similar diagnostic models for other purposes, few are this accurate. Research shows that even trained physicians with years of experience only get it right 71.4% of the time on average.

What Tongue Color Reveals About Someone’s Health

I’ll admit, I thought looking at the tongue to predict disease was strange. It seemed like some method people came up with before modern medicine existed. To be fair, I was partly right. Using tongue colors for diagnosis is based on a traditional Chinese medical practice that is more than 2,000 years old.

Of all the tongue’s characteristics, including shape, texture and moisture, color is the most important indicator of health. If I stick out my tongue and look in the mirror right now, I’d expect it to be pink. Any other hue could indicate there is something wrong with my mouth, circulatory system or organs.

Redness could mean I have an unusually high fever or a vitamin deficiency. Research shows there’s a potential link between diabetes and yellowing of the tongue. A green tinge usually indicates a fungal infection or bacterial buildup. Blue discoloration may be a sign of a low blood oxygen level or a blood vessel disease. Gray could be anything from fungus to cancer.
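
To show how a predicted color class might translate into something readable, here’s a rough lookup that simply restates the associations above. The function name and structure are my own illustration, not part of the study, and the output is a pointer for follow-up rather than a diagnosis.

```python
# A rough lookup that restates the tongue-color associations described above.
# Purely illustrative: a flagged color points to possible causes worth a
# follow-up, not a diagnosis. The study's "white" class isn't described in
# this article, so it is left out here.
POSSIBLE_CAUSES = {
    "pink":   ["typically healthy"],
    "red":    ["unusually high fever", "vitamin deficiency"],
    "yellow": ["potential link to diabetes"],
    "green":  ["fungal infection", "bacterial buildup"],
    "blue":   ["low blood oxygen level", "blood vessel disease"],
    "gray":   ["anything from fungus to cancer"],
}

def flag(color_class: str) -> list[str]:
    """Return the possible causes associated with a predicted tongue color."""
    return POSSIBLE_CAUSES.get(color_class, ["unrecognized color class"])

print(flag("yellow"))   # ['potential link to diabetes']
```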

While medical professionals have dozens of diagnostic systems, many still check the tongue because relatively few conditions affect its color. If there’s noticeable discoloration, they can narrow down the root of the problem. Compared with a blood test, which provides precise readings but not always a clear answer, a quick tongue check is often a faster first step.

That said, human error often decreases its accuracy. Traditionally, physicians manually inspect patients’ tongues. Even with years of experience, anything from slightly colored overhead lights to the time of day could affect their perception of color. This ambiguous, subjective method has been around for over 2,000 years — it’s time for an upgrade.

How AI Identifies and Predicts Disease Using Tongue Color

The engineers and researchers who developed this diagnostic imaging system had participants stand 20 centimeters away from the machine during its testing phase. The embedded AI then detected each person’s tongue color and predicted their health status in real time, processing hues and luminance with color space models.

The XGBoost algorithm correctly predicted diseases 96.6% of the time during testing. Gradient boosting models like this one are accurate because they keep improving on their initial guesses: the algorithm builds decision trees one after another, fitting each new tree to the errors the current ensemble still makes, so its predictions get steadily closer to the target.
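
To make that intuition concrete, here’s a toy boosting loop on synthetic regression data. It isn’t the researchers’ XGBoost configuration; it only illustrates the core mechanism of fitting each new tree to the errors the ensemble still makes.

```python
# Toy gradient boosting: each shallow tree is fit to the residual errors of
# the current ensemble, so the overall error shrinks round after round.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=300)   # noisy synthetic target

learning_rate = 0.1
prediction = np.zeros_like(y)            # start from a trivial guess of zero
for round_num in range(50):
    residual = y - prediction            # the error the ensemble still makes
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    prediction += learning_rate * tree.predict(X)   # correct part of that error
    if round_num % 10 == 0:
        print(f"round {round_num:2d}  mean squared error {np.mean(residual ** 2):.4f}")
```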

How This Research Group’s Machine Sees Tongue Colors

Cone cells, the photoreceptors in the retina responsible for color vision, are broadly sensitive to red, green and blue (RGB) light. However, the RGB color space doesn’t separate color information from brightness very well, so the computer vision system instead worked in YCbCr, LAB, YIQ and HSV. Unlike humans, it isn’t limited to a narrow visible light spectrum.
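
As an illustration, the sketch below converts an RGB image into those four color spaces with scikit-image and summarizes each channel with its mean. The random image and the mean-based features are placeholders; the paper’s actual tongue segmentation and feature extraction aren’t detailed in this article.

```python
# Convert an RGB image into the color spaces the system used (YCbCr, LAB,
# YIQ, HSV) and summarize each channel with its mean value. The random
# image and the simple mean features are placeholders.
import numpy as np
from skimage.color import rgb2hsv, rgb2lab, rgb2ycbcr, rgb2yiq

image = np.random.default_rng(0).random((128, 128, 3))   # stand-in RGB image in [0, 1]

converted = {
    "YCbCr": rgb2ycbcr(image),
    "LAB":   rgb2lab(image),
    "YIQ":   rgb2yiq(image),
    "HSV":   rgb2hsv(image),
}

features = {}
for space, img in converted.items():
    for channel in range(img.shape[-1]):
        features[f"{space}_ch{channel}_mean"] = float(img[..., channel].mean())

print(len(features), "features extracted")   # 4 color spaces x 3 channels = 12
```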

Other studies show an AI-powered computer vision system can accurately recognize and recreate colors without spectral dispersion, meaning it can effectively distinguish colors we can’t. While my RGB-sensitive cones can only see around one million colors, this technology can see about 16.8 million. It’s incredibly accurate, too, only deviating from “unseen” color spectra about 1% of the time on average.
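
For what it’s worth, the 16.8 million figure lines up with standard 24-bit color, i.e. 8 bits per channel across three channels. That reading is my assumption about where the number comes from, not something stated in the study:

```python
# 24-bit RGB: 256 levels per channel across three channels.
distinct_colors = 256 ** 3
print(distinct_colors)   # 16777216, roughly the 16.8 million cited above
```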

If a machine learning model can pick up on subtle differences in saturation and luminance that are invisible to me, why shouldn’t it be able to see hues I can’t? Naturally, the implications are momentous — AI may be able to outperform doctors consistently.

Why Does Using Tongue Colors for Diagnosis Matter?

Since a single model can serve many people at once, patients wouldn’t need to visit a clinic; they could download an app and use their phone’s camera. And because the research group’s system identifies and predicts diseases regardless of lighting conditions, lighting-related errors should be much less of a concern.

I believe at-home, AI-driven screening could revolutionize health care, making it more affordable and accessible. Millions of people pass away yearly from diseases they would’ve had a fighting chance against if they had caught them earlier. For instance, in the United States, around 12,000 people get diagnosed with gallbladder cancer every year, resulting in an estimated 4,000 deaths annually. AI could diagnose them remotely, helping them receive care in time.

AI’s potential to revolutionize diagnostics could benefit hospitals as much as it would patients. Despite widespread digitalization, medical spending increased by approximately 47% from 2021 to 2022 — totaling $55 billion. The main cost driver was that providers’ average time spent with patients increased.

AI’s capacity for automation could streamline appointments. Evidence shows this technology can help medical facilities save 20%-50% of their annual budgets, so it has real potential here. It could enable hospitals to pass cost savings onto consumers or invest in more lifesaving equipment.

I didn’t think using tongue colors for diagnosis could be so impactful, but it has unparalleled potential. Can AI replace doctors? Probably not. However, I believe it will become a staple in the medical industry as it supplements disease identification, prediction and treatment. Doctors’ expertise, combined with the power of machine learning, would be an unbeatable pairing.

How Health Care Could Use This Technology for the Better

The health care industry is aware of AI and eager to adopt it, so it’s likely only a matter of time before it becomes prevalent. Experts project this technology’s market value in this sector will grow at a compound annual growth rate of 38.4% from 2020 to 2030 — reaching an estimated $208.2 billion — underscoring its rapidly increasing prevalence in the field.
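
As a quick back-of-the-envelope check on that projection, the standard compound annual growth rate formula implies a 2020 baseline of roughly $8 billion, assuming the $208.2 billion figure refers to 2030 and growth compounds annually. Those assumptions are mine, not the report’s:

```python
# Back-of-the-envelope CAGR check: value_end = value_start * (1 + rate) ** years
cagr = 0.384                 # 38.4% compound annual growth rate
value_2030 = 208.2           # projected 2030 market size, in billions of dollars
years = 10                   # 2020 -> 2030
implied_2020_base = value_2030 / (1 + cagr) ** years
print(f"Implied 2020 market size: ${implied_2020_base:.1f}B")   # about $8B
```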

However, while 72% of physicians agree they see the most promise for AI in diagnostics, only 38% use it in practice. Realistically, it could take years before they use model-analyzed tongue colors for diagnosis and prediction. My educated guess is that hospitals will take a decade to jump through hoops instead of cutting through the red tape.

Fortunately, the world of mobile health is growing quickly and is accessible around the clock. While I can’t mention mHealth without bringing up its lack of regulatory oversight and privacy protections, it’d also be senseless to overlook it when discussing the future of medical AI.

Why Health Care Must Be Careful About This Technology

At some point, I caught myself thinking that this breakthrough is too good to be true. What’s the catch? What are the disadvantages of an AI diagnosis? I’ve done my fair share of research into this field, so I knew privacy, ethical and regulatory issues would exist. However, the study that unveiled this breakthrough technology is also worth revisiting under a microscope.

I noticed that the researchers’ work on using AI to analyze tongue colors for diagnosis hasn’t been peer-reviewed yet. Since it was published in June 2024, that’s unsurprising. However, that means their 98% accuracy rate is somewhat up in the air. Their process is sound, but what is science without replication? Unless someone replicates their results, the health care industry may not adopt their invention.

How Will This Breakthrough Impact AI Medical Technology?

Using an imaging system to analyze tongue colors for diagnoses may seem a niche and relatively minor achievement, but it could revolutionize health care. Accurately predicting diseases via an app could save thousands of lives. Moreover, other researchers could use this technology to inspire their own AI-driven diagnostic breakthroughs.

