Although genomics has not developed at the exponential rate experts expected when the Human Genome Project was declared complete, a growing number of potential uses of genomic data in medicine have shown how it might transform our lives. A few months ago, researchers reported that so-called “genetic mugshots” can be reconstructed from DNA: generating a face from a person’s DNA alone, which sounds like pure science fiction.
Now researchers at Oxford University have developed a computer program that can diagnose rare genetic disorders in children simply by analyzing family photos.
One day we might be able to sequence the genomes of newborns immediately after birth (or even before) to tell parents which major conditions the child might have to deal with later in life. As an additional feature, children whose genomes have not been sequenced could get an instant diagnosis simply by looking into a computer’s camera running this algorithm.
An excerpt about how it works:
The program works by recognising certain characteristic facial structures that can be present with certain conditions, including Down’s syndrome, Treacher Collins, Progeria, Fragile X and Angelman syndrome. It combines computer vision and machine learning to scan pictures for similarities to a database of pictures of people with known conditions, and then returns matches ranked by likelihood.
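The matching step described in the excerpt can be sketched as nearest-neighbour ranking over facial-feature vectors. This is only an illustrative toy, not the Oxford group’s actual method: the `REFERENCE` database and all feature values below are invented for demonstration.

```python
import math

# Hypothetical reference database: condition name -> averaged facial-feature
# vector (e.g. normalized distances between facial landmarks).
# All numbers are made up for illustration.
REFERENCE = {
    "Down's syndrome": [0.62, 0.35, 0.71],
    "Treacher Collins": [0.48, 0.52, 0.44],
    "Fragile X": [0.55, 0.60, 0.30],
}

def cosine_similarity(a, b):
    """Similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def rank_matches(photo_features):
    """Return (condition, score) pairs ranked by similarity to the photo."""
    scores = {name: cosine_similarity(photo_features, ref)
              for name, ref in REFERENCE.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

A photo whose extracted features closely match one reference vector would place that condition first in the ranked list, which is what “matches ranked by likelihood” amounts to in this simplified picture.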
The New Scientist published a very interesting report about a new idea and technology that will be showcased at the upcoming Human-Computer Interaction conference in Toronto, Canada.
At the Museum of Arts and Crafts in Paris, a couple wandered in front of a set of dark screens. Staring back at them was an image of themselves – but with the skin stripped away, revealing organs, bones and muscle. Surprised, the woman gasped and covered her breasts, trying to shield herself from view.
Here’s how it works: an individual undergoes a PET scan, an X-ray and an MRI scan to capture high-resolution images of their bones and organs; altogether, collecting this data takes about three and a half hours. Then, when you step in front of the mirror, a Microsoft Kinect motion-capture camera tracks the movement of two dozen joints, including the knees, elbows and wrists. That means the medical images can be animated with the help of graphical processing units, so you can see inside your own body in real time.
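Animating the medical images from tracked joints comes down to geometry on the reported joint positions. The system’s actual pipeline isn’t described beyond the excerpt above, so as a minimal sketch, here is how a single joint angle (elbow flexion, say) could be derived from three Kinect-style 3D coordinates:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by segments b->a and b->c.
    E.g. shoulder-elbow-wrist positions give the elbow flexion angle."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # Clamp to guard against floating-point drift outside acos's domain.
    cos_theta = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_theta))

# A straight arm: shoulder, elbow and wrist on one line -> ~180 degrees.
# Coordinates are illustrative, in metres, as a depth camera might report.
print(joint_angle((0, 0, 2), (0.3, 0, 2), (0.6, 0, 2)))
```

Recomputing angles like this for each of the two dozen tracked joints, every frame, is what lets the GPU pose the pre-captured medical imagery to match your movements.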
A new study analyzing the role of IBM’s supercomputer Watson in medical decision making was just published in Artificial Intelligence in Medicine. While even the most acclaimed medical professionals can keep only a handful of studies in mind, Watson can check millions of them quickly. Instead of fighting such solutions, doctors should realize that we need to include them in everyday medical decision-making.
Using 500 randomly selected patients from that group for simulations, the researchers compared actual doctor performance and patient outcomes against sequential decision-making models, all using real patient data. They found a great disparity in the cost per unit of outcome change: $189 for the artificial intelligence model versus $497 for treatment as usual.
“This was at the same time that the AI approach obtained a 30 to 35 percent increase in patient outcomes,” Bennett said. “And we determined that tweaking certain model parameters could enhance the outcome advantage to about 50 percent more improvement at about half the cost.”
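The quoted figures can be checked with a line of arithmetic: per unit of outcome change, the AI model’s $189 is roughly 38% of the $497 spent under treatment as usual, i.e. less than half the cost even before the reported outcome gains are factored in.

```python
# Cost per unit of outcome change, as quoted in the study summary above.
ai_cost = 189.0     # sequential decision-making (AI) model, in dollars
usual_cost = 497.0  # treatment as usual, in dollars

# The AI model spends roughly 38 cents for every treatment-as-usual dollar
# to achieve the same unit of outcome change.
ratio = ai_cost / usual_cost
print(f"AI cost is {ratio:.0%} of treatment-as-usual per unit of outcome")
```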
I’m very glad they added this message at the end:
“Let humans do what they do well, and let machines do what they do well. In the end, we may maximize the potential of both.”
Being a medical futurist means I work on bringing disruptive technologies to medicine & healthcare; assisting medical professionals and students in using these in an efficient and secure way; and educating e-patients about how to become equal partners with their caregivers.
Based on what we see in other industries, this is going to be an explosive series of changes, and while redesigning healthcare takes a lot of time and effort, the best we can do is prepare all stakeholders for what is coming next. That was the reason behind creating The Guide to the Future of Medicine white paper, which you can download for free.
Please use the Twitter hashtag #MedicalFuture for giving feedback.
In the white paper, there is an infographic featuring the main trends that shape the future of medicine, visualized from three perspectives:
- Which stage of healthcare delivery and medical practice it affects (Prevent & Prepare; Data Input & Diagnostics; Therapy & Follow-up; and Outcomes & Consequences);
- Whether it affects patients or healthcare professionals;
- Its practicability (already available – green boxes; in progress – orange boxes; still needs time – red boxes).
Click here to see the infographic in the original size.
I hope you will find the guide useful in your work or in preparing your company and colleagues for the future of medicine.
I might be too optimistic about the advance of technology in medicine, but I believe we live in a great era. Here is how third-generation, cognitive computers can help cancer centers fight different forms of cancer.
FlowingData, one of my favourite blogs, just featured an entry focusing on how data will be organized in the future.
If there’s anything uniform across all the ideas, it’s ubiquity. In the future, computers won’t feel like computers, and data will not just flow alongside the physical world. Instead, data will intertwine with your day-to-day like threads in a fabric.
They come up with many examples, but I liked the one below the most. Imagine a totally transparent healthcare system in which you can see all the relevant data about doctors, procedures and hospitals (success rates, costs, etc.). You could make a truly wise decision because you would know all the details and data you need.
Microsoft envisioned what 2019 would look like:
And here is a great talk from Minority Report science adviser and inventor John Underkoffler who presents g-speak – the real-life version of the film’s eye-popping, tai chi-meets-cyberspace computer interface.
Just a few articles focusing on the recent H1N1 outbreak.