Research & Innovation

Reality check for IBM's predictions

Within the next five years, computers will be able to help people see things that are invisible today. IBM Research is venturing five predictions.

02 Feb. 2017 Mark Schröder

IBM Research goes time traveling

Technological development is progressing rapidly. IBM Research dares to predict the future. (Photo: Vintage Tone / Shutterstock.com)

Computers understand "baby talk". They help doctors interpret medical imaging. Computers hear better than people and can warn them of danger. A touchscreen can convey the tactile impression of a piece of clothing. And software can identify flavors and create new ones on its own. These five sensory extensions would be reality today, or so IBM Research scientists promised five years ago.

Looking at today's technology, we see that speech recognition still needs work, computers do assist with medical diagnosis, and so far there is no app with super-hearing or touchable virtual fabric. But "Chef Watson" has created a new cocktail and a menu, although computer-generated meals are not yet available for sale. So the IBM scientists' predictions were still ahead of their time.

Now the experts are daring to make new predictions: the "5 in 5". Five innovations that could sustainably change our lives in the next five years. Like the predictions from five years ago, the latest ones involve using computers to expand on human senses. "Based on progress in artificial intelligence and nanotechnology, we want to develop new instruments to help us better understand the invisible connections in today's world", says Dario Gil, Vice President of Science & Solutions at IBM Research.

Listening for health

According to IBM Research, language is a key to better understanding complex processes in the brain. In the next five years, cognitive systems will be able to draw conclusions about a person's mental health from the way they formulate and express themselves. To achieve this, IBM experts are combining written transcripts and audio recordings from patient appointments with machine learning to identify speech patterns that could help predict depression, manic behavior or schizophrenia in the future. Currently, the computer needs only 300 words to reach an initial conclusion.
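To give a flavor of what "speech patterns from a 300-word sample" might mean in practice, here is a minimal, purely illustrative sketch of extracting coarse linguistic features from a transcript. The feature choices (lexical diversity, sentence length, first-person pronoun rate) are common in the research literature but are assumptions for this example, not IBM's actual method.

```python
# Illustrative sketch only: simple linguistic features from a transcript.
# The features and their interpretation are assumptions, not IBM's system.
import re

def speech_features(transcript: str) -> dict:
    """Compute a few coarse linguistic markers from a transcript."""
    words = re.findall(r"[a-zA-Z']+", transcript.lower())
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]
    if not words or not sentences:
        return {}
    return {
        "word_count": len(words),
        # Lexical diversity: distinct words divided by total words.
        "type_token_ratio": len(set(words)) / len(words),
        # Average sentence length in words.
        "mean_sentence_len": len(words) / len(sentences),
        # Share of first-person pronouns, sometimes studied as a marker.
        "first_person_rate": sum(w in {"i", "me", "my", "mine"} for w in words) / len(words),
    }

feats = speech_features("I went out today. I felt fine, mostly. My walk was short.")
print(feats)
```

A real system would feed features like these, plus acoustic cues from the audio, into a trained classifier rather than inspecting them by hand.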

The researchers hope that in the future, similar techniques could also be used for conditions such as attention deficit disorder, autism or post-traumatic stress disorder. Cognitive systems are currently analyzing the statements, intonation, language and syntax of people affected by these conditions. Combined with imaging procedures such as electroencephalography (EEG), this creates a comprehensive picture of the individual, supporting physicians and mental health professionals in reaching diagnoses and determining treatment. With mobile devices, patients could even conduct examinations in their own home to prepare for a doctor's appointment.

Visual assists boost safety

The human eye can detect less than 0.1 percent of the electromagnetic spectrum. Devices using radar or x-rays can expand this range. However, these machines can usually only be operated by specialists, and are expensive to acquire and maintain. IBM Research experts expect that in five years, optical aids will combine with artificial intelligence to allow people to see larger areas of the electromagnetic spectrum. Such devices will be affordable, wearable and available everywhere.

One application scenario, according to IBM Research, is the self-driving car: Computers will be able to analyze obstacles or worsening weather conditions better and faster than today, to steer the vehicle safely to its destination. Another application is in grocery scanners for smartphones, whose sensors could display a product's sell-by date or nutrition information at the touch of a button.

Understanding global relationships

People are often blind even to relationships in their immediate surroundings. The Internet of Things will change that for good, believes IBM Research: Drones, light bulbs, cameras, refrigerators, satellites, telescopes and weather stations already generate exabytes of data every month, most of which has so far gone unused. In the next five years, machine learning algorithms and software will help organize and make sense of this information from the physical world.

One example is farming: Collecting, organizing and analyzing data on cultivation methods, soil composition, groundwater levels and weather will allow farmers in the future to choose the right seeds, select the best locations for planting and optimize yields – without, for example, wasting groundwater reserves.
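The farming idea above boils down to combining several data sources into one decision. The following toy sketch shows that shape: each candidate field is scored from invented soil, rainfall and groundwater numbers, with a weight that penalizes fields needing heavy groundwater irrigation. All field names, weights and values are made up for illustration.

```python
# Toy sketch (not an IBM system): score candidate fields by combining
# soil, weather and groundwater data. Weights and values are invented.

def planting_score(field: dict) -> float:
    """Weighted score: higher means better suited for planting."""
    score = 0.0
    score += 0.4 * field["soil_quality"]        # 0..1, from soil analysis
    score += 0.3 * field["expected_rainfall"]   # 0..1, normalized forecast
    # Favor fields with good groundwater reserves, so planting there
    # does not deplete scarce reserves elsewhere.
    score += 0.3 * field["groundwater_level"]   # 0..1, reserve level
    return round(score, 2)

fields = [
    {"name": "north", "soil_quality": 0.9, "expected_rainfall": 0.4, "groundwater_level": 0.3},
    {"name": "south", "soil_quality": 0.7, "expected_rainfall": 0.8, "groundwater_level": 0.9},
]
best = max(fields, key=planting_score)
print(best["name"])  # prints "south": rain and groundwater outweigh soil
```

A production system would of course learn such weights from historical yield data rather than fixing them by hand.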

Medical laboratory in a chip

Early diagnosis is crucial to successful treatment for many diseases. Yet diseases like cancer or Parkinson's are very difficult to diagnose early, as the IBM researchers note. One possibility for early diagnosis is the analysis of bioparticles in body fluids. These particles are often 1,000 times smaller than the diameter of a human hair, which makes them hard to detect.

In the next five years, IBM Research wants to fit entire medical labs onto a chip. Prototypes already exist today that can separate and isolate bioparticles as small as viruses, measuring just 20 nanometers in diameter. The lab-on-a-chip would allow future consumers to read their own biomarkers and link this information with, for example, smart-watch data in a virtual medical file. Combining these data sets could offer greater insight into their health and perhaps even flag problematic indicators early.
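The "virtual medical file" idea is essentially a join of two personal data streams. This hypothetical sketch links chip-lab biomarker readings with smartwatch heart-rate data by date and flags days where both indicators exceed a limit; the dates, values and thresholds are all invented for the example.

```python
# Hypothetical sketch of a "virtual medical file": join biomarker
# readings with smartwatch data by date. All values are invented.

biomarker_readings = {        # date -> relative biomarker level
    "2017-02-01": 1.1,
    "2017-02-02": 2.6,
}
resting_heart_rate = {        # date -> resting bpm from a smartwatch
    "2017-02-01": 62,
    "2017-02-02": 88,
}

def flag_days(biomarkers, heart_rates, marker_limit=2.0, hr_limit=80):
    """Return dates where both indicators exceed their limits."""
    return sorted(
        day for day in biomarkers
        if day in heart_rates
        and biomarkers[day] > marker_limit
        and heart_rates[day] > hr_limit
    )

print(flag_days(biomarker_readings, resting_heart_rate))  # ['2017-02-02']
```

Requiring both signals to agree is a crude stand-in for the article's point: combining data sets gives more insight than either source alone.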

Reveal pollution

Most pollutants are invisible to the human eye. One example given by IBM Research is methane. When methane escapes into the atmosphere before it is burned, it contributes greatly to global warming. The US Environmental Protection Agency estimates that more than nine million tons of methane were released from natural gas sources alone in 2014 – the equivalent of the greenhouse gases produced by the American aluminum, iron, steel and cement industries combined over the last 100 years.

IBM scientists now predict that inexpensive sensors capable of registering methane leaks will be available within five years. If the widely scattered methane sources and infrastructure are monitored with them, leaks can be discovered within minutes instead of weeks. This technology would help reduce environmental damage and the likelihood of catastrophic events.
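"Discovering leaks within minutes" amounts to spotting an abrupt deviation in a stream of sensor readings. The sketch below flags a reading when it exceeds the running mean of recent readings by a fixed multiple; the sensor values, window size and threshold are illustrative assumptions, not a real IBM algorithm.

```python
# Sketch of leak detection on a stream of methane-sensor readings:
# flag readings far above the recent baseline. Values are invented.
from collections import deque

def detect_leaks(readings, window=5, factor=3.0):
    """Return indices of readings exceeding factor x the recent mean."""
    recent = deque(maxlen=window)   # rolling window of past readings
    alerts = []
    for i, value in enumerate(readings):
        if len(recent) == recent.maxlen and value > factor * (sum(recent) / len(recent)):
            alerts.append(i)
        recent.append(value)
    return alerts

ppm = [2.0, 2.1, 1.9, 2.0, 2.2, 2.1, 9.5, 2.0]   # one reading per minute
print(detect_leaks(ppm))  # [6] -> the 9.5 spike stands out from baseline
```

With one reading per minute, such a baseline check would indeed raise an alarm within minutes of a spike, rather than waiting for a scheduled inspection weeks later.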
